Sample records for "techniques previous work"

  1. Testing a mediation model of psychotherapy process and outcome in psychodynamic psychotherapy: Previous client distress, psychodynamic techniques, dyadic working alliance, and current client distress.

    PubMed

    Kivlighan, Dennis M; Hill, Clara E; Ross, Katherine; Kline, Kathryn; Furhmann, Amy; Sauber, Elizabeth

    2018-01-05

    To test a sequential model of psychotherapy process and outcome, we included previous client distress, therapist psychodynamic techniques, dyadic working alliance, and current client distress. For 114 sets of eight-session segments in 40 cases of psychodynamic psychotherapy, clients completed the Outcome Questionnaire-45 and Inventory of Interpersonal Problems-32 after the first and final session, judges reliably coded one middle session on the Psychodynamic subscale of the Multitheoretical List of Therapeutic Interventions, and clients and therapists completed the Working Alliance Inventory after every session. Results indicated that higher use of psychodynamic techniques was associated with higher levels of the working alliance, which in turn was associated with decreased client distress; the working alliance was also higher later in psychotherapy. There was a significant indirect effect of psychodynamic techniques on decreases in distress mediated by the working alliance. Implications for theory, practice, and research are provided. Clinical or methodological significance of this article: Conducted a longitudinal, latent variable examination of the relationships of psychodynamic techniques and working alliance to client distress. Psychodynamic techniques have an indirect effect on decreases in client distress through the dyadic working alliance.
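    The indirect-effect logic of this record can be illustrated with a minimal mediation sketch: regress the mediator on the predictor (a-path), then the outcome on the mediator controlling for the predictor (b-path), and multiply. The data below are invented; only the variable roles (technique use, alliance, distress) come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Invented stand-ins: x = technique use, m = alliance (mediator), y = distress
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=0.8, size=n)              # a-path
y = -0.6 * m + 0.1 * x + rng.normal(scale=0.8, size=n)   # b-path plus direct effect

def ols(X, y):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(x[:, None], m)[1]               # technique -> alliance
b = ols(np.column_stack([m, x]), y)[1]  # alliance -> distress, controlling for technique
indirect = a * b                        # the mediated (indirect) effect
```

    With positive a and negative b, the product is negative: more technique use predicts lower distress through the alliance, mirroring the mediation result reported above.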

  2. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    PubMed

    Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie

    2015-01-01

    It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing its cost and minimising errors. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is only suitable for certain conditions. Many previous works have reported the use of intelligent methods to predict transformer faults; however, the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have not been combined in previously reported work, this work proposes a combination of ANN and various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and ease of implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with results from the actual fault diagnosis, an existing diagnosis method, and ANN alone. The results of the proposed methods were also compared with previously reported work to show their improvement. The proposed ANN-Evolutionary PSO method was found to yield a higher percentage of correct transformer fault type identification than the existing diagnosis method and previously reported works.
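    The core pairing in this record, an ANN trained by PSO, can be sketched with basic global-best PSO searching the weights of a tiny feed-forward network. Everything below (the features standing in for DGA gas ratios, the labels, the network size) is invented for illustration, and the paper's PSO variants such as Evolutionary PSO use different update rules.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for DGA features -> fault label (illustrative, not real DGA data)
X = rng.uniform(0, 1, size=(60, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(float)   # synthetic "fault" label

def ann(w, X):
    """1-hidden-layer net; w packs a 3x4 weight matrix and a 4-vector."""
    W1, w2 = w[:12].reshape(3, 4), w[12:16]
    return 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ w2)))

def loss(w):
    return np.mean((ann(w, X) - y) ** 2)

# Global-best PSO over the 16 network weights
n_particles, dim, iters = 20, 16, 200
pos = rng.normal(size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

acc = np.mean((ann(gbest, X) > 0.5) == y)
```

    PSO needs only objective evaluations, no gradients, which is what makes it a simple drop-in trainer for the network weights.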

  3. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques

    PubMed Central

    2015-01-01

    It is important to predict incipient faults in transformer oil accurately so that maintenance can be performed correctly, reducing its cost and minimising errors. Dissolved gas analysis (DGA) has been widely used to predict incipient faults in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions because each method is only suitable for certain conditions. Many previous works have reported the use of intelligent methods to predict transformer faults; however, the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have not been combined in previously reported work, this work proposes a combination of ANN and various PSO techniques to predict transformer incipient faults. The advantages of PSO are simplicity and ease of implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with results from the actual fault diagnosis, an existing diagnosis method, and ANN alone. The results of the proposed methods were also compared with previously reported work to show their improvement. The proposed ANN-Evolutionary PSO method was found to yield a higher percentage of correct transformer fault type identification than the existing diagnosis method and previously reported works. PMID:26103634

  4. Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation

    NASA Astrophysics Data System (ADS)

    Sleesongsom, S.; Bureerat, S.

    2018-03-01

    This paper proposes an extension of the path generation concept from our previous work by adding a new constraint handling technique. The proposed approach was initially designed for problems without prescribed timing by avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, a kind of penalty technique. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original TLBO.
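    A static penalty is the simplest instance of the kind of constraint handling described here: infeasibility is added to the objective, scaled by a penalty factor, so any unconstrained optimiser (TLBO in the paper; a crude random search below) can be applied. The toy problem and penalty weight are invented.

```python
import numpy as np

def penalized(objective, constraints, penalty=1e3):
    """Static-penalty constraint handling: feasible when each g(x) <= 0;
    violations are added to the objective, scaled by a penalty factor.
    (One common penalty scheme; the paper's exact formulation may differ.)"""
    def f(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return objective(x) + penalty * violation
    return f

# Toy problem: minimise x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
f = penalized(lambda x: x ** 2, [lambda x: 1.0 - x])

# A crude random search stands in for the TLBO optimiser used in the paper.
rng = np.random.default_rng(2)
best = min(rng.uniform(-2.0, 3.0, size=5000), key=f)
```

    The unconstrained minimum of x^2 is at 0, but the penalty pushes the search to the constrained optimum near x = 1.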

  5. Estimation of minimum miscibility pressure (MMP) of CO2 and liquid n-alkane systems using an improved MRI technique.

    PubMed

    Liu, Yu; Jiang, Lanlan; Song, Yongchen; Zhao, Yuechao; Zhang, Yi; Wang, Dayong

    2016-02-01

    The minimum miscibility pressure (MMP) of a gas-oil system is a key parameter for the injection system design of CO2 miscible flooding. Industrial standard approaches such as the rising bubble apparatus (RBA) experiment, the slim tube test (STT), and the pressure-density diagram (PDD) have been applied for decades to determine the MMP of gas and oil, and theoretical or empirical calculations of the MMP have also been applied to gas-oil miscible systems. In the present work, an improved technique for estimating the MMP by magnetic resonance imaging (MRI), based on our previous research, was proposed. This technique was then applied to CO2 and n-alkane binary and ternary systems to observe the mixing process and to study the miscibility. MRI signal intensities, which represent the proton concentration of n-alkane in both the hydrocarbon-rich phase and the CO2-rich phase, were plotted as a reference for determining the MMP. The accuracy of the MMP obtained with this improved technique was enhanced compared with the data from our previous works, and the results show good agreement with other established techniques (such as the STT) in previously published works. The MMPs increase as the temperature rises from 20 °C to 37.8 °C, and the MMPs of CO2 and n-alkane systems are also found to be proportional to the carbon number in the range of C10 to C14. Copyright © 2015 Elsevier Inc. All rights reserved.
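    One common way to read an MMP off plotted signal-versus-pressure data is to locate the break point where the trend changes; the sketch below fits two line segments and picks the split with the smallest total squared error. The pressure grid, signal model, and break-point rule are all invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical signal-intensity curve: falls with pressure, then levels off
# once miscibility is reached (all values invented).
pressure = np.linspace(6.0, 12.0, 25)                      # MPa
true_mmp = 9.0
signal = np.where(pressure < true_mmp,
                  10.0 - 1.5 * (pressure - 6.0),
                  10.0 - 1.5 * (true_mmp - 6.0))
signal = signal + rng.normal(0.0, 0.05, pressure.size)

def breakpoint_estimate(p, s):
    """Fit two line segments and return the split point with the lowest
    total squared error -- a simple reading of 'where the trend changes'."""
    best_sse, best_p = np.inf, None
    for i in range(3, len(p) - 3):
        sse = 0.0
        for seg_p, seg_s in ((p[:i], s[:i]), (p[i:], s[i:])):
            coef = np.polyfit(seg_p, seg_s, 1)
            sse += float(np.sum((np.polyval(coef, seg_p) - seg_s) ** 2))
        if sse < best_sse:
            best_sse, best_p = sse, p[i]
    return best_p

mmp_estimate = breakpoint_estimate(pressure, signal)
```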

  6. Improved image guidance technique for minimally invasive mitral valve repair using real-time tracked 3D ultrasound

    NASA Astrophysics Data System (ADS)

    Rankin, Adam; Moore, John; Bainbridge, Daniel; Peters, Terry

    2016-03-01

    In the past ten years, numerous new surgical and interventional techniques have been developed for treating heart valve disease without the need for cardiopulmonary bypass. Heart valve repair is now being performed in a blood-filled environment, reinforcing the need for accurate and intuitive imaging techniques. Previous work has demonstrated how augmenting ultrasound with virtual representations of specific anatomical landmarks can greatly simplify interventional navigation challenges and increase patient safety. However, these techniques often complicate interventions by requiring additional steps to manually define and initialize virtual models. Furthermore, overlaying virtual elements onto real-time image data can obstruct the view of salient image information. To address these limitations, a system was developed that uses real-time volumetric ultrasound alongside magnetically tracked tools, presented in an augmented virtuality environment, to provide a streamlined navigation guidance platform. In phantom studies simulating a beating-heart navigation task, procedure duration and tool path metrics achieved performance comparable to previous work in augmented virtuality techniques, and considerably better than standard-of-care ultrasound guidance.

  7. General Aviation Interior Noise. Part 3; Noise Control Measure Evaluation

    NASA Technical Reports Server (NTRS)

    Unruh, James F.; Till, Paul D.; Palumbo, Daniel L. (Technical Monitor)

    2002-01-01

    The work reported herein extends the work accomplished under NASA Grant NAG1-2091 on the development of noise source/path identification techniques for single engine propeller driven General Aviation aircraft. The previous work developed a Conditioned Response Analysis (CRA) technique to identify potential noise sources that contributed to the dominant tonal responses within the aircraft cabin. The objective of the present effort was to improve and verify the findings of the CRA and to develop and demonstrate noise control measures for single engine propeller driven General Aviation aircraft.

  8. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  9. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    NASA Astrophysics Data System (ADS)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined, based on the design of a genetic algorithm GAP and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules were obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation generated rules whose returns in the testing period were far superior to those obtained with customary methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
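    The return-minus-risk fitness used to train GAP rules can be illustrated by scoring one crisp rule (a moving-average crossover, standing in for an evolved rule) on a synthetic price series. The prices, rule, and risk weighting below are invented; only the idea of combining return and risk in one fitness comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic daily prices standing in for an index (not S&P500 data).
prices = 100.0 * np.cumprod(1.0 + rng.normal(0.0003, 0.01, 2000))

def sma(x, w):
    """Simple moving average; element i covers x[i : i + w]."""
    c = np.cumsum(np.insert(x, 0, 0.0))
    return (c[w:] - c[:-w]) / w

def rule_fitness(prices, short=10, long=50, risk_weight=0.5):
    """Score one crisp rule (a moving-average crossover) by total return
    minus a volatility penalty, echoing the two training objectives."""
    s = sma(prices, short)[long - short:]        # align both averages
    l = sma(prices, long)
    pos = (s > l).astype(float)[:-1]             # in the market the next day
    daily = np.diff(np.log(prices[long - 1:]))   # log-returns after warm-up
    strat = pos * daily
    return strat.sum() - risk_weight * strat.std()

fitness = rule_fitness(prices)
```

    In the paper's setting, an evolutionary search would propose candidate rules and rank them by a fitness of this general shape.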

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

    This report provides a summary of the literature review that was performed, based on previous work at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique to perform nuclear fuel accountancy measurements.

  11. CYCLOPS-3 System Research.

    ERIC Educational Resources Information Center

    Marill, Thomas; And Others

    The aim of the CYCLOPS Project research is the development of techniques for allowing computers to perform visual scene analysis, pre-processing of visual imagery, and perceptual learning. Work on scene analysis and learning has previously been described. The present report deals with research on pre-processing and with further work on scene…

  12. A Secure Test Technique for Pipelined Advanced Encryption Standard

    NASA Astrophysics Data System (ADS)

    Shi, Youhua; Togawa, Nozomu; Yanagisawa, Masao; Ohtsuki, Tatsuo

    In this paper, we present a Design-for-Secure-Test (DFST) technique for pipelined AES that guarantees both security and test quality during testing. Unlike previous works, the proposed method keeps all secrets inside while providing high test quality and fault diagnosis ability. Furthermore, the proposed DFST technique can significantly reduce test application time, test data volume, and test generation effort as additional benefits.

  13. Fiber-Optic Sensing for In-Space Inspection

    NASA Technical Reports Server (NTRS)

    Pena, Francisco; Richards, W. Lance; Piazza, Anthony; Parker, Allen R.; Hudson, Larry D.

    2014-01-01

    This presentation provides examples of fiber optic sensing technology development activities performed at NASA Armstrong. Examples of current and previous work that support in-space inspection techniques and methodologies are highlighted.

  14. Swarm Intelligence: New Techniques for Adaptive Systems to Provide Learning Support

    ERIC Educational Resources Information Center

    Wong, Lung-Hsiang; Looi, Chee-Kit

    2012-01-01

    The notion of a system adapting itself to provide support for learning has always been an important issue of research for technology-enabled learning. One approach to provide adaptivity is to use social navigation approaches and techniques which involve analysing data of what was previously selected by a cluster of users or what worked for…

  15. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased.
    We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.
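    The equivalence-class idea above reduces to grouping tasks by identical stack traces and debugging one representative per class. A minimal sketch (the traces below are invented, not STAT output):

```python
from collections import defaultdict

def equivalence_classes(stack_traces):
    """Group task ranks by identical stack traces; debugging then needs
    only one representative per class rather than every task."""
    classes = defaultdict(list)
    for rank, trace in enumerate(stack_traces):
        classes[tuple(trace)].append(rank)
    return dict(classes)

# Invented traces from four tasks: two stuck in MPI_Wait, one computing,
# one flushing I/O.
traces = [
    ("main", "solve", "MPI_Wait"),
    ("main", "solve", "MPI_Wait"),
    ("main", "solve", "compute"),
    ("main", "io_flush"),
]
classes = equivalence_classes(traces)
representatives = [ranks[0] for ranks in classes.values()]
```

    At a hundred thousand tasks, a handful of classes typically remain, which is what makes attaching a traditional debugger to the representatives tractable.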

  16. Erratum to "10 Gbit/s mode-multiplexed QPSK transmission using MDM-to-MFDM based single coherent receiver for intra- and inter data center networking" [Opt. Commun. 391 (2017) 106-110]

    NASA Astrophysics Data System (ADS)

    Asif, Rameez; Haithem, Mustafa

    2018-03-01

    We revisited our previous work "10 Gbit/s mode-multiplexed QPSK transmission using MDM-to-MFDM based single coherent receiver for intra- and inter data center networking" [Opt. Commun. 391 (2017) 106-110] and discovered a mistake in Appendix 'A', i.e., the mode-selective coherent detection technique. In this section, the direct referencing of the previous work at the appropriate points is not adequate (page no. 109).

  17. Correlating Resolving Power, Resolution, and Collision Cross Section: Unifying Cross-Platform Assessment of Separation Efficiency in Ion Mobility Spectrometry.

    PubMed

    Dodds, James N; May, Jody C; McLean, John A

    2017-11-21

    Here we examine the relationship among resolving power (Rp), resolution (Rpp), and collision cross section (CCS) for compounds analyzed in previous ion mobility (IM) experiments representing a wide variety of instrument platforms and IM techniques. Our previous work indicated these three variables effectively describe and predict separation efficiency for drift tube ion mobility spectrometry experiments. In this work, we seek to determine whether our previous findings are a general reflection of IM behavior that can be applied to various instrument platforms and mobility techniques. Results suggest IM distributions are well characterized by a Gaussian model, and separation efficiency can be predicted on the basis of the empirical difference in the gas-phase CCS and a CCS-based resolving power definition (CCS/ΔCCS). Notably, traveling wave ion mobility (TWIMS) was found to operate at resolutions substantially higher than a single-peak resolving power suggested. When a CCS-based Rp definition was utilized, TWIMS was found to operate at a resolving power between 40 and 50, confirming previous observations by Giles and co-workers. After the separation axis (and corresponding resolving power) is converted to cross section space, it is possible to effectively predict separation behavior for all mobility techniques evaluated (i.e., uniform field, trapped ion mobility, traveling wave, cyclic, and overtone instruments) using the equations described in this work. Finally, we are able to establish for the first time that current state-of-the-art ion mobility separations benchmark at a CCS-based resolving power of >300, which is sufficient to differentiate analyte ions with CCS differences as small as 0.5%.
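    The CCS-based definitions can be made concrete with two small helper functions; the 1.7 × FWHM baseline-width factor for Gaussian peaks is a textbook chromatographic assumption, not necessarily the paper's exact formulation.

```python
def ccs_resolving_power(ccs, fwhm):
    """CCS-based resolving power, Rp = CCS / dCCS, with dCCS the full
    width at half maximum of the peak in cross-section space."""
    return ccs / fwhm

def two_peak_resolution(ccs1, fwhm1, ccs2, fwhm2):
    """Peak-to-peak resolution for two roughly Gaussian mobility peaks,
    using the common chromatographic form Rpp = 2*d / (w1 + w2) with
    baseline width w ~ 1.7 * FWHM (an assumption for this sketch)."""
    return 2.0 * abs(ccs2 - ccs1) / (1.7 * (fwhm1 + fwhm2))

# At Rp = 300, two hypothetical ions differing by 0.5% in CCS:
ccs1, ccs2 = 200.0, 201.0
rp = ccs_resolving_power(ccs1, ccs1 / 300.0)
rpp = two_peak_resolution(ccs1, ccs1 / 300.0, ccs2, ccs2 / 300.0)
```

    Under these assumptions a 0.5% CCS difference at Rp = 300 yields Rpp near 0.9, i.e., peaks that are largely, though not baseline, separated, consistent with "sufficient to differentiate" in the abstract.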

  18. Improving cerebellar segmentation with statistical fusion

    NASA Astrophysics Data System (ADS)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system, well known to be involved in motor coordination and with increasingly recognized roles in cognition and planning. Recent work in multi-atlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar anatomy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.
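    A plain weighted vote, the baseline family that Non-Local SIMPLE is compared against, can be sketched per voxel as follows; the label maps and weights are invented, and the paper's non-local variants weight each voxel by local patch similarity rather than globally.

```python
import numpy as np

def weighted_vote(label_maps, weights):
    """Fuse candidate segmentations by weighted voting per voxel: each
    atlas's label map votes with its weight; the highest-scoring label
    wins. (A global-weight vote; non-local variants vary the weights
    voxel by voxel based on patch similarity.)"""
    labels = np.unique(np.stack(label_maps))
    scores = np.stack([
        sum(w * (lm == lab) for lm, w in zip(label_maps, weights))
        for lab in labels
    ])
    return labels[np.argmax(scores, axis=0)]

# Three hypothetical 2x2 label maps (0 = background, 1 = lobule)
maps = [np.array([[0, 1], [1, 1]]),
        np.array([[0, 1], [0, 1]]),
        np.array([[1, 1], [0, 0]])]
fused = weighted_vote(maps, weights=[0.5, 0.3, 0.2])
```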

  19. Multiobjective Resource-Constrained Project Scheduling with a Time-Varying Number of Tasks

    PubMed Central

    Abello, Manuel Blanco

    2014-01-01

    In resource-constrained project scheduling (RCPS) problems, ongoing tasks are restricted to utilizing a fixed number of resources. This paper investigates a dynamic version of the RCPS problem where the number of tasks varies in time. Our previous work investigated a technique called mapping of task IDs for centroid-based approach with random immigrants (McBAR) that was used to solve the dynamic problem. However, the solution-searching ability of McBAR was investigated over only a few instances of the dynamic problem. As a consequence, only a small number of characteristics of McBAR, under the dynamics of the RCPS problem, were found. Further, only a few techniques were compared to McBAR with respect to its solution-searching ability for solving the dynamic problem. In this paper, (a) the significance of the subalgorithms of McBAR is investigated by comparing McBAR to several other techniques; and (b) the scope of investigation in the previous work is extended. In particular, McBAR is compared to a technique called Estimation of Distribution Algorithm (EDA). As with McBAR, EDA is applied to solve the dynamic problem, an application that is unique in the literature. PMID:24883398

  20. Noise Abatement Techniques for Construction Equipment

    DOT National Transportation Integrated Search

    1979-08-01

    The primary objective of this work was to transfer technology developed in the area of truck noise reduction to that of construction equipment. Included is information gathered from previous contracts, surveys of manufacturers, a noise impact ranking...

  1. Do work technique and musculoskeletal symptoms differ between men and women performing the same type of work tasks?

    PubMed

    Dahlberg, Raymond; Karlqvist, Lena; Bildt, Carina; Nykvist, Karin

    2004-11-01

    Musculoskeletal disorders are more common among women than among men. When comparing the difference between men and women in the prevalence of musculoskeletal disorders, methodological problems arise, as men and women seldom perform the same type of activities, either at work or at home. The main objective of this cross-sectional case study was to compare work technique and self-reported musculoskeletal symptoms between men and women performing the same type of work tasks within a metal industry. Other factors, such as leisure activities, were also taken into consideration. Three data collection methods were used: questionnaires, interviews and systematic observations. The results from the observations revealed that women worked more frequently, and for longer times, with their hands above shoulder height than men. Working with hands above shoulder height is considered a risk factor for neck and shoulder disorders according to previous studies. Workplace design factors were probably a reason for the differences in working technique between men and women. A higher proportion of women than men reported shoulder symptoms. Women spent more time on household activities than men, which indicates a higher total workload across paid and unpaid work.

  2. An improved output feedback control of flexible large space structures

    NASA Technical Reports Server (NTRS)

    Lin, Y. H.; Lin, J. G.

    1980-01-01

    A special output feedback control design technique for flexible large space structures is proposed. It is shown that the technique will increase both the damping and frequency of selected modes for more effective control. It is also able to effect integrated control of elastic and rigid-body modes and, in particular, to improve closed-loop system stability and robustness to modal truncation and parameter variation. The technique marks an improvement over previous work on output feedback control of large space structures.

  3. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  4. Expanded Awareness of Student Performance: A Case Study in Applied Ethnographic Monitoring in a Bilingual Classroom. Sociolinguistic Working Paper Number 60.

    ERIC Educational Resources Information Center

    Carrasco, Robert L.

    The case study of the use of a classroom observation technique to evaluate the abilities and performance of a bilingual kindergarten student previously assessed as a low achiever is described. There are three objectives: to show the validity of the ethnographic monitoring technique, to show the value of teachers as collaborating researchers, and…

  5. Job Performance as Multivariate Dynamic Criteria: Experience Sampling and Multiway Component Analysis.

    PubMed

    Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz

    2010-08-06

    Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling, known collectively as multiway component analysis. Two archetypal techniques of multiway component analysis, the parallel factor analysis (PARAFAC) and Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed, as are the substantive multiway component models obtained.
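    PARAFAC itself can be sketched as plain alternating least squares on the three mode unfoldings of the data tensor. The person × item × occasion tensor below is synthetic; real multiway packages add normalisation, convergence checks, and handling of missing entries.

```python
import numpy as np

def khatri_rao(P, Q):
    """Column-wise Khatri-Rao product of (I, R) and (J, R) -> (I*J, R)."""
    return np.einsum("ir,jr->ijr", P, Q).reshape(-1, P.shape[1])

def parafac(X, rank, iters=200, seed=0):
    """Minimal alternating-least-squares PARAFAC fit of a 3-way tensor."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.normal(size=(d, rank)) for d in (I, J, K))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding
    X2 = X.transpose(1, 0, 2).reshape(J, I * K)   # mode-2 unfolding
    X3 = X.transpose(2, 0, 1).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(iters):
        A = np.linalg.lstsq(khatri_rao(B, C), X1.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), X2.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), X3.T, rcond=None)[0].T
    return A, B, C

# Synthetic rank-2 "person x item x occasion" tensor (invented data).
rng = np.random.default_rng(5)
A0, B0, C0 = (rng.normal(size=(d, 2)) for d in (6, 5, 4))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)

A, B, C = parafac(X, rank=2)
rel_err = np.linalg.norm(np.einsum("ir,jr,kr->ijk", A, B, C) - X) / np.linalg.norm(X)
```

    Each factor matrix describes one mode (persons, items, occasions), which is what makes the decomposition attractive for experience-sampling data.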

  6. Paracoccygeal corkscrew approach to ganglion impar injections for tailbone pain.

    PubMed

    Foye, Patrick M; Patel, Shounuck I

    2009-01-01

    A new technique for performing nerve blocks of the ganglion impar (ganglion Walther) is presented. These injections have been reported to relieve coccydynia (tailbone pain), as well as other malignant and nonmalignant pelvic pain syndromes. A variety of techniques have previously been described for blocking this sympathetic nerve ganglion, which is located in the retrorectal space just anterior to the upper coccygeal segments. Prior techniques have included approaches through the anococcygeal ligament, through the sacrococcygeal joint, and through intracoccygeal joint spaces. This article presents a new, paracoccygeal approach whereby the needle is inserted alongside the coccyx and guided through three discrete steps with a rotating or corkscrew trajectory. Compared with some of the previously published techniques, this paracoccygeal corkscrew approach has multiple potential benefits, including ease of fluoroscopic guidance using the lateral view, the ability to easily use a stylet for the spinal needle, and the use of a shorter, thinner needle. While no single technique works best for all patients and each technique has potential advantages and disadvantages, this new technique adds to the available options.

  7. Speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance-Fourier transform-infrared imaging and quantitative energy-dispersive electron probe X-ray microanalysis techniques.

    PubMed

    Jung, Hae-Jin; Malek, Md Abdul; Ryu, JiYeon; Kim, BoWha; Song, Young-Chul; Kim, HyeKyeong; Ro, Chul-Un

    2010-07-15

    Our previous work demonstrated for the first time the potential of the combined use of two techniques, attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis, low-Z particle EPMA, for the characterization of individual aerosol particles. In this work, the speciation of mineral particles was performed on a single particle level for 24 mineral samples, including kaolinite, montmorillonite, vermiculite, talc, quartz, feldspar, calcite, gypsum, and apatite, by the combined use of ATR-FT-IR imaging and low-Z particle EPMA techniques. These two single particle analytical techniques provide complementary information, the ATR-FT-IR imaging on mineral types and low-Z particle EPMA on the morphology and elemental concentrations, on the same individual particles. This work demonstrates that the combined use of the two single particle analytical techniques can powerfully characterize externally heterogeneous mineral particle samples in detail and has great potential for the characterization of airborne mineral dust particles.

  8. [Rehabilitation, ethics and technique].

    PubMed

    De Martini, André

    2011-04-01

    This paper initially includes the presentation of some ideas on the deficiency and the process of rehabilitation, whereby the latter is defined in its "condition" as a process. A few differences in relation to the idea of a program in the strict sense (defined as a fixed set of previously defined procedures or techniques) will be detected, as well as some ethical implications in the social, health or educational fields for professionals working with the disabled. Thus, the handling of the technique and the use of institutional measures will be discussed in this context, inasmuch as they are related to subjective and educational processes inherent to rehabilitation work. In so doing, we hope to contribute to a better understanding of the role pertaining to these professionals.

  9. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
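    The recovery step described here (keeping the randomly acquired samples and exploiting low rank across channels) can be sketched generically. The following is a minimal illustration of matrix completion via iterative singular-value soft-thresholding on a synthetic low-rank "EEG" matrix; it is not the authors' algorithm, and all parameter values are assumptions:

```python
import numpy as np

def complete_matrix(Y, mask, tau=0.5, n_iter=300):
    """Recover a low-rank matrix from the entries observed where mask is True,
    by alternating a data-consistency fill with singular-value soft-thresholding."""
    X = np.zeros_like(Y)
    for _ in range(n_iter):
        X[mask] = Y[mask]                       # enforce the observed samples
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - tau, 0.0)            # soft-threshold singular values
        X = (U * s) @ Vt
    return X

rng = np.random.default_rng(0)
# Synthetic "multichannel EEG": 16 channels mixing 3 shared sources -> low rank
t = np.linspace(0.0, 1.0, 256)
basis = np.stack([np.sin(2 * np.pi * f * t) for f in (5, 9, 13)])
mixing = rng.normal(size=(16, 3))
signal = mixing @ basis

mask = rng.random(signal.shape) < 0.5           # random 50% under-sampling
observed = np.where(mask, signal, 0.0)
recovered = complete_matrix(observed, mask, tau=0.5, n_iter=300)
err = np.linalg.norm(recovered - signal) / np.linalg.norm(signal)
```

With half the entries observed at random, the rank-3 synthetic signal is recovered with small relative error, illustrating why under-sampling at the sensor can save energy without losing the signal.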

  10. Biased Target Ion Beam Deposition and Nanoskiving for Fabricating NiTi Alloy Nanowires

    NASA Astrophysics Data System (ADS)

    Hou, Huilong; Horn, Mark W.; Hamilton, Reginald F.

    2016-12-01

    Nanoskiving is a novel nanofabrication technique for producing shape memory alloy nanowires. Our previous work was the first to successfully fabricate NiTi alloy nanowires using this top-down approach, which leverages thin film technology and ultramicrotomy for ultra-thin sectioning. For this work, we utilized biased target ion beam deposition technology to fabricate nanoscale (i.e., sub-micrometer) NiTi alloy thin films. In contrast to our previous work, rapid thermal annealing was employed for heat treatment, and the B2 austenite to R-phase martensitic transformation was confirmed using stress-temperature and diffraction measurements. The ultramicrotome was programmable and facilitated sectioning the films to produce nanowires with thickness-to-width ratios ranging from 4:1 to 16:1. Energy dispersive X-ray spectroscopy analysis confirmed the elemental Ni and Ti make-up of the wires. The findings revealed that the nanowires exhibited a natural ribbon-like curvature, which depended on the thickness-to-width ratio. The results demonstrate that nanoskiving is a potential nanofabrication technique for producing NiTi alloy nanowires that are continuous, with an unprecedented length on the order of hundreds of micrometers.

  11. An improved diffusion welding technique for TD-NiCr

    NASA Technical Reports Server (NTRS)

    Holko, K. H.

    1973-01-01

    An improved diffusion welding technique has been developed for TD-NiCr sheet. In its most preferred form, the improved technique consists of diffusion welding 320-grit-sanded plus chemically polished surfaces of unrecrystallized TD-NiCr at 760 C under 140 MN/m2 pressure for 1 hr, followed by postheating at 1180 C for 2 hr. Compared to previous work, this improved technique has the advantages of shorter welding time, lower welding temperature, lower welding pressure, and a simpler and more reproducible surface preparation procedure. Weldments were made that had parent-metal creep-rupture shear strength at 1100 C.

  12. Structural response synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozisik, H.; Keltie, R.F.

    The open-loop control technique of predicting a conditioned input signal based on a specified output response for a second-order system has been analyzed both analytically and numerically to gain a firm understanding of the method. Differences between this method of control and digital closed-loop control using pole cancellation were investigated as a follow-up to previous experimental work. Application of the technique to diamond turning using a fast tool is also discussed.
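    The inversion idea can be illustrated on a model second-order plant: given a specified output trajectory, the open-loop input is predicted by substituting the trajectory and its derivatives into the plant equation, then checked by forward simulation. This is a hedged sketch with invented plant parameters, not the paper's method:

```python
import numpy as np

# Model second-order plant: m*y'' + c*y' + k*y = u(t)  (parameters are assumptions)
m, c, k = 1.0, 4.0, 40.0
dt = 1e-3
t = np.arange(0.0, 2.0, dt)

# Specified (desired) output: a smooth raised-cosine step from 0 to 1 over 0.5 s
y_des = 0.5 * (1.0 - np.cos(np.pi * np.clip(t / 0.5, 0.0, 1.0)))

# Open-loop input predicted by inverting the plant model
dy = np.gradient(y_des, dt)
ddy = np.gradient(dy, dt)
u = m * ddy + c * dy + k * y_des

# Forward-simulate the plant with this input (semi-implicit Euler) as a check
y = np.zeros_like(t)
v = 0.0
for i in range(1, len(t)):
    a = (u[i - 1] - c * v - k * y[i - 1]) / m
    v += a * dt
    y[i] = y[i - 1] + v * dt
err = np.max(np.abs(y - y_des))
```

The simulated response tracks the specified trajectory to within the discretization error, which is the essence of conditioning the input signal from a specified output.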

  13. Complex refractive index measurements for BaF2 and CaF2 via single-angle infrared reflectance spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly-Gorham, Molly Rose K.; DeVetter, Brent M.; Brauer, Carolyn S.

    We have re-investigated the optical constants n and k for the homologous series of inorganic salts barium fluoride (BaF2) and calcium fluoride (CaF2) using a single-angle near-normal incidence reflectance device in combination with a calibrated Fourier transform infrared (FTIR) spectrometer. Our results are in good qualitative agreement with most previous works. However, certain features of the previously published data near the reststrahlen band exhibit distinct differences in spectral characteristics. Notably, our measurements of BaF2 do not include a spectral feature in the ~250 cm-1 reststrahlen band that was previously published. Additionally, CaF2 exhibits a distinct wavelength shift relative to the model derived from previously published data. We confirmed our results with recently published works that use significantly more modern instrumentation and data reduction techniques.

  14. Nonholonomic Hamiltonian Method for Meso-macroscale Simulations of Reacting Shocks

    NASA Astrophysics Data System (ADS)

    Fahrenthold, Eric; Lee, Sangyup

    2015-06-01

    The seamless integration of macroscale, mesoscale, and molecular scale models of reacting shock physics has been hindered by dramatic differences in the model formulation techniques normally used at different scales. In recent research the authors have developed the first unified discrete Hamiltonian approach to multiscale simulation of reacting shock physics. Unlike previous work, the formulation employs reacting thermomechanical Hamiltonian formulations at all scales, including the continuum, and employs a nonholonomic modeling approach to systematically couple the models developed at all scales. Example applications of the method show meso-macroscale shock-to-detonation simulations in nitromethane and RDX. Research supported by the Defense Threat Reduction Agency.

  15. Application of response surface techniques to helicopter rotor blade optimization procedure

    NASA Technical Reports Server (NTRS)

    Henderson, Joseph Lynn; Walsh, Joanne L.; Young, Katherine C.

    1995-01-01

    In multidisciplinary optimization problems, response surface techniques can be used to replace the complex analyses that define the objective function and/or constraints with simple functions, typically polynomials. In this work a response surface is applied to the design optimization of a helicopter rotor blade. In previous work, this problem has been formulated with a multilevel approach. Here, the response surface takes advantage of this decomposition and is used to replace the lower level, a structural optimization of the blade. Problems that were encountered and important considerations in applying the response surface are discussed. Preliminary results are also presented that illustrate the benefits of using the response surface.
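    As a hedged illustration of the idea, the sketch below fits a quadratic response surface to samples of a stand-in "expensive" analysis (a hypothetical objective function, not the rotor-blade structural optimization) and uses the cheap polynomial in its place:

```python
import numpy as np

def expensive_analysis(x1, x2):
    """Stand-in for a costly analysis (hypothetical quadratic objective)."""
    return (x1 - 1.2) ** 2 + 2.0 * (x2 + 0.5) ** 2 + 3.0

# Sample the design space with a small full-factorial design
g = np.linspace(-2.0, 2.0, 5)
X1, X2 = np.meshgrid(g, g)
x1, x2 = X1.ravel(), X2.ravel()
f = expensive_analysis(x1, x2)

# Fit a full quadratic response surface by least squares:
# f ~ a0 + a1*x1 + a2*x2 + a12*x1*x2 + a11*x1^2 + a22*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, f, rcond=None)

def surrogate(p1, p2):
    """Cheap polynomial replacement for the expensive analysis."""
    return coef @ np.array([1.0, p1, p2, p1 * p2, p1**2, p2**2])
```

Because the stand-in objective is itself quadratic, the surrogate reproduces it essentially exactly; in practice the fit is approximate, which is why refitting and validation are important considerations.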

  16. In vivo dosimetry for total body irradiation: five‐year results and technique comparison

    PubMed Central

    Warry, Alison J.; Eaton, David J.; Collis, Christopher H.; Rosenberg, Ivan

    2014-01-01

    The aim of this work is to establish if the new CT‐based total body irradiation (TBI) planning techniques used at University College London Hospital (UCLH) and Royal Free Hospital (RFH) are comparable to the previous technique at the Middlesex Hospital (MXH) by analyzing predicted and measured diode results. TBI aims to deliver a homogeneous dose to the entire body, typically using extended SSD fields with beam modulation to limit doses to organs at risk. In vivo dosimetry is used to verify the accuracy of delivered doses. In 2005, when the Middlesex Hospital was decommissioned and merged with UCLH, both UCLH and the RFH introduced updated CT‐planned TBI techniques, based on the old MXH technique. More CT slices and in vivo measurement points were used by both; UCLH introduced a beam modulation technique using MLC segments, while RFH updated to a combination of lead compensators and bolus. Semiconductor diodes were used to measure entrance and exit doses in several anatomical locations along the entire body. Diode results from both centers for over five years of treatments were analyzed and compared to the previous MXH technique for accuracy and precision of delivered doses. The most stable location was the field center with standard deviations of 4.1% (MXH), 3.7% (UCLH), and 1.7% (RFH). The least stable position was the ankles. Mean variation with fraction number was within 1.5% for all three techniques. In vivo dosimetry can be used to verify complex modulated CT‐planned TBI, and demonstrate improvements and limitations in techniques. The results show that the new UCLH technique is no worse than the previous MXH one and comparable to the current RFH technique. PACS numbers: 87.55.Qr, 87.56.N‐ PMID:25207423

  17. In vivo dosimetry for total body irradiation: five-year results and technique comparison.

    PubMed

    Patel, Reshma P; Warry, Alison J; Eaton, David J; Collis, Christopher H; Rosenberg, Ivan

    2014-07-08

    The aim of this work is to establish if the new CT-based total body irradiation (TBI) planning techniques used at University College London Hospital (UCLH) and Royal Free Hospital (RFH) are comparable to the previous technique at the Middlesex Hospital (MXH) by analyzing predicted and measured diode results. TBI aims to deliver a homogeneous dose to the entire body, typically using extended SSD fields with beam modulation to limit doses to organs at risk. In vivo dosimetry is used to verify the accuracy of delivered doses. In 2005, when the Middlesex Hospital was decommissioned and merged with UCLH, both UCLH and the RFH introduced updated CT-planned TBI techniques, based on the old MXH technique. More CT slices and in vivo measurement points were used by both; UCLH introduced a beam modulation technique using MLC segments, while RFH updated to a combination of lead compensators and bolus. Semiconductor diodes were used to measure entrance and exit doses in several anatomical locations along the entire body. Diode results from both centers for over five years of treatments were analyzed and compared to the previous MXH technique for accuracy and precision of delivered doses. The most stable location was the field center with standard deviations of 4.1% (MXH), 3.7% (UCLH), and 1.7% (RFH). The least stable position was the ankles. Mean variation with fraction number was within 1.5% for all three techniques. In vivo dosimetry can be used to verify complex modulated CT-planned TBI, and demonstrate improvements and limitations in techniques. The results show that the new UCLH technique is no worse than the previous MXH one and comparable to the current RFH technique.

  18. SOME STATISTICAL ISSUES RELATED TO MULTIPLE LINEAR REGRESSION MODELING OF BEACH BACTERIA CONCENTRATIONS

    EPA Science Inventory

    As a fast and effective technique, the multiple linear regression (MLR) method has been widely used in modeling and prediction of beach bacteria concentrations. Among previous works on this subject, however, several issues were insufficiently or inconsistently addressed. Those is...
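    For readers unfamiliar with the approach, a minimal MLR sketch on synthetic data (the predictor names, coefficients, and distributions are invented purely for illustration) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
rainfall = rng.gamma(2.0, 5.0, n)       # synthetic 24 h rainfall (mm)
turbidity = rng.gamma(3.0, 2.0, n)      # synthetic turbidity (NTU)
# Synthetic log10 bacteria concentration with known coefficients plus noise
log_conc = 1.0 + 0.05 * rainfall + 0.10 * turbidity + rng.normal(0.0, 0.2, n)

# Ordinary least squares fit of the multiple linear regression model
X = np.column_stack([np.ones(n), rainfall, turbidity])
beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
pred = X @ beta
r2 = 1.0 - np.sum((log_conc - pred) ** 2) / np.sum((log_conc - log_conc.mean()) ** 2)
```

The fitted coefficients recover the generating values closely here; the statistical issues the abstract alludes to (predictor selection, transformation, validation) arise once real, noisier monitoring data are used.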

  19. Teaching Calculus Students How to Study.

    ERIC Educational Resources Information Center

    Boelkins, Matthew R.; Pfaff, Thomas J.

    1998-01-01

    Addresses the problem of poor study habits in calculus students and presents techniques to teach students how to study consistently and effectively. Concludes that many students greatly appreciate the added structure, work harder than in previous courses, and witness newfound success as a consequence. (Author/ASK)

  20. Advances in dual-tone development for pitch frequency doubling

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Somervell, Mark; Scheer, Steven; Kuwahara, Yuhei; Nafus, Kathleen; Gronheid, Roel; Tarutani, Shinji; Enomoto, Yuuichiro

    2010-04-01

    Dual-tone development (DTD) has previously been proposed as a potential cost-effective double patterning technique [1], and was reported as early as the late 1990s [2]. The basic principle of dual-tone imaging involves processing exposed resist latent images in both positive tone (aqueous base) and negative tone (organic solvent) developers. Conceptually, DTD has attractive cost benefits since it enables pitch doubling without the need for multiple etch steps of patterned resist layers. While the concept of the DTD technique is simple to understand, many challenges must be overcome and understood in order to make it a manufacturing solution. Previous work by the authors demonstrated the feasibility of DTD imaging for 50 nm half-pitch features at 0.80 NA (k1 = 0.21) and discussed the challenges lying ahead for printing sub-40 nm half-pitch features with DTD. While previous experimental results suggested that clever processing on the wafer track can be used to enable DTD beyond 50 nm half-pitch, they also suggested that identifying suitable resist materials or chemistries is essential for achieving successful imaging results with novel resist processing methods on the wafer track. In this work, we present recent advances in the search for resist materials that work in conjunction with novel resist processing methods on the wafer track to enable DTD. Recent experimental results with new resist chemistries, specifically designed for DTD, are presented, along with simulation studies that help identify resist properties that could enable DTD imaging and ultimately lead to viable DTD resist materials.

  1. Mapping land cover from satellite images: A basic, low cost approach

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.

    1978-01-01

    Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique, was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover, especially relative to the small investment in skill, time, and money needed to produce the products.

  2. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  3. The Impact of Life Events on Job Satisfaction

    ERIC Educational Resources Information Center

    Georgellis, Yannis; Lange, Thomas; Tabvuma, Vurain

    2012-01-01

    Employing fixed effects regression techniques on longitudinal data, we investigate how life events affect employees' job satisfaction. Unlike previous work-life research, exploring mostly contemporaneous correlations, we look for evidence of adaptation in the years following major life events. We find evidence of adaptation following the first…

  4. An expert system shell for inferring vegetation characteristics: Implementation of additional techniques (task E)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1992-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. The VEG subgoal PROPORTION.GROUND.COVER has been completed and a number of additional techniques that infer the proportion ground cover of a sample have been implemented. Some techniques operate on sample data at a single wavelength. The techniques previously incorporated in VEG for other subgoals operated on data at a single wavelength so implementing the additional single wavelength techniques required no changes to the structure of VEG. Two techniques which use data at multiple wavelengths to infer proportion ground cover were also implemented. This work involved modifying the structure of VEG so that multiple wavelength techniques could be incorporated. All the new techniques were tested using both the VEG 'Research Mode' and the 'Automatic Mode.'

  5. The Persistence of the Gender Gap in Introductory Physics

    NASA Astrophysics Data System (ADS)

    Kost, Lauren E.; Pollock, Steven J.; Finkelstein, Noah D.

    2008-10-01

    We previously showed [1] that, despite teaching with interactive engagement techniques, the gap in performance between males and females on conceptual learning surveys persisted from pre- to posttest at our institution. Such findings were counter to previously published work [2]. Our current work analyzes factors that may influence the observed gender gap in our courses. Posttest conceptual assessment data are modeled using both multiple regression and logistic regression analyses to estimate the gender gap in posttest scores after controlling for background factors that vary by gender. We find that at our institution the gender gap persists in interactive physics classes, but is largely due to differences in physics and math preparation and incoming attitudes and beliefs.

  6. An experimental study of nonlinear dynamic system identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1990-01-01

    A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included, in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches, which usually require detailed assumptions of the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  7. Partial information decomposition as a spatiotemporal filter.

    PubMed

    Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D

    2011-09-01

    Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.

  8. Exploring Mass Perception with Markov Chain Monte Carlo

    ERIC Educational Resources Information Center

    Cohen, Andrew L.; Ross, Michael G.

    2009-01-01

    Several previous studies have examined the ability to judge the relative mass of objects in idealized collisions. With a newly developed technique of psychological Markov chain Monte Carlo sampling (A. N. Sanborn & T. L. Griffiths, 2008), this work explores participants' perceptions of different collision mass ratios. The results reveal…
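    The psychological MCMC technique of Sanborn and Griffiths embeds human choices in the sampler's accept/reject step; as a purely illustrative stand-in, a standard Metropolis sampler over a toy belief distribution on a mass ratio can be sketched as follows (the target density and all parameters are assumptions, not the study's design):

```python
import math
import random

def metropolis(log_p, x0, step, n, seed=0):
    """Standard Metropolis sampler with a Gaussian random-walk proposal."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(prop)/p(x)), in log space
        if math.log(rng.random() + 1e-300) < log_p(prop) - log_p(x):
            x = prop
        samples.append(x)
    return samples

def log_p(r):
    """Toy log-density: a log-normal belief over a mass ratio centered on 2.0."""
    if r <= 0.0:
        return float("-inf")
    return -0.5 * ((math.log(r) - math.log(2.0)) / 0.25) ** 2 - math.log(r)

samples = metropolis(log_p, x0=1.0, step=0.5, n=20000)
burned = samples[5000:]                     # discard burn-in
mean_ratio = sum(burned) / len(burned)
```

In the psychological variant, the accept/reject decision above is replaced by a participant's two-alternative choice, so the chain of accepted stimuli samples the participant's subjective distribution.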

  9. Teaching Shakespeare, II.

    ERIC Educational Resources Information Center

    Salomone, Ronald E., Ed.

    1985-01-01

    Because of the wide and continuing interest in a previous issue on techniques for teaching works by Shakespeare, this journal issue presents 19 additional articles on a broad range of Shakespeare related topics. Following an introduction, the titles of the articles and their authors are as follows: (1) "Making Changes/Making Sense"…

  10. Metallurgical Research Relating to the Development of Metals and Alloys for Use in the High-Temperature Components of Jet-Engines, Gas Turbines and Other Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    1948-01-01

    Considerable work has been done on report preparation. All items listed in the March program report will be reported during July. Fundamental studies are in progress to establish the fundamental processes by which treatments and composition control properties of commercial alloys at high temperatures. As yet work has been confined to Low-Carbon N155 alloy and progress has been reported twice previously. The work is divided into two sections: studies of solution treated and aged material and studies of rolled structures. Electron microscopic work has been started as an additional technique for the studies. Brief descriptions of experimental techniques used, results, and interpretation of the data obtained since the last report covering this field are summarized below. Since the work outlined is to a large extent still in progress, the discussion given is to be considered tentative and subject to further modification as additional data becomes available.

  11. Robust Statistics and Regularization for Feature Extraction and UXO Discrimination

    DTIC Science & Technology

    2011-07-01

    July 11, 2011 … real data: we find that this technique has an improved probability of finding all ordnance in a test data set, relative to previously … many sites. Tests on larger data sets should still be carried out. In previous work we considered a bootstrapping approach to selecting the operating … Marginalizing over x we obtain the probability that the ith order statistic in the test data belongs to the T class (Eq. 55): P(T | x_(i)) = ∫_{−∞}^{∞} P(T | x) p(x …

  12. CyberPetri at CDX 2016: Real-time Network Situation Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin L.; Best, Daniel M.; Burtner, Edwin R.

    CyberPetri is a novel visualization technique that provides a flexible map of the network based on available characteristics, such as IP address, operating system, or service. Previous work introduced CyberPetri as a visualization feature in Ocelot, a network defense tool that helped security analysts understand and respond to an active defense scenario. In this paper we present a case study in which we use the CyberPetri visualization technique to support real-time situation awareness during the 2016 Cyber Defense Exercise.

  13. Severity Summarization and Just in Time Alert Computation in mHealth Monitoring.

    PubMed

    Pathinarupothi, Rahul Krishnan; Alangot, Bithin; Rangan, Ekanath

    2017-01-01

    Mobile health is fast evolving into a practical solution to remotely monitor high-risk patients and deliver timely intervention in case of emergencies. Building upon our previous work on a fast and power-efficient summarization framework for remote health monitoring applications, called RASPRO (Rapid Alerts Summarization for Effective Prognosis), we have developed a real-time criticality detection technique which ensures meeting physician-defined interventional time. We also present the results from initial testing of this technique.

  14. The aggregated unfitted finite element method for elliptic problems

    NASA Astrophysics Data System (ADS)

    Badia, Santiago; Verdugo, Francesc; Martín, Alberto F.

    2018-07-01

    Unfitted finite element techniques are valuable tools in different applications where the generation of body-fitted meshes is difficult. However, these techniques are prone to severe ill-conditioning problems that obstruct the efficient use of iterative Krylov methods and, in consequence, hinder the practical usage of unfitted methods for realistic large-scale applications. In this work, we present a technique that addresses such conditioning problems by constructing enhanced finite element spaces based on a cell aggregation technique. The presented method, called the aggregated unfitted finite element method, is easy to implement and can be used, in contrast to previous works, in Galerkin approximations of coercive problems with conforming Lagrangian finite element spaces. The mathematical analysis of the new method states that the condition number of the resulting linear system matrix scales as in standard finite elements for body-fitted meshes, without being affected by small cut cells, and that the method leads to the optimal finite element convergence order. These theoretical results are confirmed with 2D and 3D numerical experiments.

  15. Managing a work-life balance: the experiences of midwives working in a group practice setting.

    PubMed

    Fereday, Jennifer; Oster, Candice

    2010-06-01

    To explore how a group of midwives achieved a work-life balance working within a caseload model of care with flexible work hours and on-call work. In-depth interviews were conducted and the data were analysed using a data-driven thematic analysis technique. The setting was the Children, Youth and Women's Health Service (CYWHS) (previously Women's and Children's Hospital), Adelaide, where a midwifery service known as Midwifery Group Practice (MGP) offers a caseload model of care to women within a midwife-managed unit. Participants were 17 midwives who were currently working, or had previously worked, in MGP. Analysis of the midwives' individual experiences provided insight into how midwives managed the flexible hours and on-call work to achieve a sustainable work-life balance within a caseload model of care. It is important for midwives working in MGP to actively manage the flexibility of their role with time on call. Organisational, team and individual structure influenced how flexibility of hours was managed; however, a period of adjustment was required to achieve this balance. The study findings offer a description of effective, sustainable strategies to manage flexible hours and on-call work that may assist other midwives working in a similar role or considering this type of work setting.

  16. Real-time generation of infrared ocean scene based on GPU

    NASA Astrophysics Data System (ADS)

    Jiang, Zhaoyi; Wang, Xun; Lin, Yun; Jin, Jianqiu

    2007-12-01

    Infrared (IR) image synthesis for ocean scenes has become more and more important, especially for remote sensing and military applications. Although a number of works present ready-to-use simulations, those techniques cover only a few of the possible ways water interacts with the environment, and the detailed calculation of ocean temperature has rarely been considered by previous investigators. With the advance of the programmable features of graphics cards, many algorithms previously limited to offline processing have become feasible for real-time usage. In this paper, we propose an efficient algorithm for real-time rendering of infrared ocean scenes using the newest features of programmable graphics processors (GPUs). It differs from previous works in three aspects: adaptive GPU-based ocean surface tessellation, a sophisticated thermal balance equation for the ocean surface, and GPU-based rendering of the infrared ocean scene. Finally, some resulting infrared images are shown, which are in good accordance with real images.

  17. Using Multi-Spacecraft Technique to Identify the Structure of Magnetic Field in CMEs

    NASA Astrophysics Data System (ADS)

    Al-haddad, N. A.; Jacobs, C.; Poedts, S.; Moestl, C.; Farrugia, C. J.; Lugaz, N.

    2013-12-01

    In order to understand the magnetic field structure of coronal mass ejections (CMEs), it is often necessary to investigate the local configuration at different positions within the CME. While this can be very challenging to implement observationally, it is readily applicable when using numerical simulations. In this work, we study the properties of a simulated CME using a multi-spacecraft technique. We have shown previously how the reconstruction of magnetic fields from a single spacecraft may yield misleading results. Here, we look into the reconstruction of the magnetic field using sets of two and three spacecraft at different longitudes, and discuss the effectiveness of this technique. This type of work can pave the way for future out-of-the-ecliptic missions such as Solar Probe or Solar Orbiter. Grad-Shafranov reconstruction of simulated satellite measurements of a CME containing writhed field lines.

  18. The development of additive manufacturing technique for nickel-base alloys: A review

    NASA Astrophysics Data System (ADS)

    Zadi-Maad, Ahmad; Basuki, Arif

    2018-04-01

    Nickel-base alloys are attractive due to their excellent mechanical properties, including a high resistance to creep deformation, corrosion, and oxidation. However, it is hard to control their performance when casting or forging this material. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and more flexible manufacturing process, AM is considered a substitute for existing techniques. This paper provides a comprehensive review of previous work related to AM techniques for Ni-base alloys, highlighting current challenges and methods of solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of AM-fabricated alloys. The mechanical properties obtained from tension, hardness, and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.

  19. Digital mapping techniques '06 - Workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/).
The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  20. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    NASA Astrophysics Data System (ADS)

    Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto

    2016-06-01

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum and nuclear industries. One such developing technique is image processing. This technique is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it captures direct visual information about the flow that is difficult to obtain with other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, building on a preceding algorithm for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.
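    The core of such a film-thickness measurement can be previewed with a minimal, hypothetical sketch (not the authors' algorithm, which also handles noise and wave geometry): scan each image column upward from the pipe bottom and count dark liquid-phase pixels.

    ```python
    # Minimal illustration only: per-column film thickness by intensity
    # thresholding. The threshold value and dark-liquid assumption are
    # arbitrary choices for this sketch.

    def film_thickness_px(column, threshold=128):
        """Return film thickness in pixels for one vertical pixel column.

        `column` is listed bottom-to-top; the liquid film is assumed darker
        (values below `threshold`) and to start at the pipe bottom (index 0).
        """
        thickness = 0
        for value in column:
            if value < threshold:
                thickness += 1
            else:
                break  # reached the brighter gas phase above the interface
        return thickness

    # Example column: 4 dark (liquid) pixels below bright (gas) pixels.
    col = [30, 42, 55, 90, 200, 210, 220]
    print(film_thickness_px(col))  # 4
    ```

    Repeating this over every column of every frame yields the interfacial wave profile hL(x, t); calibrating pixels to millimetres then gives physical film thickness.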

  1. Comparison of digital and conventional impression techniques: evaluation of patients' perception, treatment comfort, effectiveness and clinical outcomes.

    PubMed

    Yuzbasioglu, Emir; Kurt, Hanefi; Turunc, Rana; Bilir, Halenur

    2014-01-30

    The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort. Twenty-four subjects (12 male, 12 female) who had no previous experience with either conventional or digital impressions participated in this study. Conventional impressions of the maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with a polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects' attitudes, preferences and perceptions of the impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time, etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon rank test, and p < 0.05 was considered significant. There were significant differences among the groups (p < 0.05) in total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques. Digital impressions were more time-efficient than conventional impressions, and patients preferred the digital impression technique over conventional techniques.
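    The paired comparison named above rests on the Wilcoxon signed-rank statistic. A sketch of that statistic (illustrative only, not the authors' analysis code; the toy working times are invented, and a real analysis would also look up the significance of W):

    ```python
    # Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples.
    # Zero differences are dropped; tied absolute differences get the
    # average of the ranks they span.

    def wilcoxon_w(x, y):
        diffs = [a - b for a, b in zip(x, y) if a != b]   # drop zero diffs
        by_abs = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
        ranks = [0.0] * len(diffs)
        i = 0
        while i < len(by_abs):                            # average tied ranks
            j = i
            while (j + 1 < len(by_abs)
                   and abs(diffs[by_abs[j + 1]]) == abs(diffs[by_abs[i]])):
                j += 1
            avg = (i + j) / 2 + 1                         # ranks are 1-based
            for k in range(i, j + 1):
                ranks[by_abs[k]] = avg
            i = j + 1
        w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
        w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
        return min(w_plus, w_minus)

    # Toy paired working times (minutes): conventional vs. digital.
    print(wilcoxon_w([10, 12, 9, 14], [8, 11, 13, 9]))  # 3.0
    ```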

  2. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum and nuclear industries. One such developing technique is image processing. This technique is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it captures direct visual information about the flow that is difficult to obtain with other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, building on a preceding algorithm for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.

  3. Refinement of horizontal resolution in dynamical downscaling of climate information using WRF: Costs, benefits, and lessons learned

    EPA Science Inventory

    Dynamical downscaling techniques have previously been developed by the U.S. Environmental Protection Agency (EPA) using a nested WRF at 108- and 36-km. Subsequent work extended one-way nesting down to 12-km resolution. Recently, the EPA Office of Research and Development used com...

  4. Preliminary Experiments with a Triple-Layer Phoswich Detector for Radioxenon Detection

    DTIC Science & Technology

    2008-09-01

    Figure 7b; with a significant attenuation which was predicted by our MCNP modeling (Farsoni et al., 2007). The 81 keV peak in the NaI spectrum has a...analysis technique and confirmed our previous MCNP modeling. Our future work includes use of commercially available radioxenon gas (133Xe) to test

  5. Controls on interannual variation in evapotranspiration and water use efficiency in a mature, furrow-irrigated peach orchard

    USDA-ARS?s Scientific Manuscript database

    Evapotranspiration (ET) and water use efficiency (WUE) in peach orchards has previously been observed in young (less than 5-8 years old), drip irrigated orchards using micrometeorological techniques such as Eddy Covariance or large-weighing lysimeters. However, no work has been reported on ET and W...

  6. Optical digital chaos cryptography

    NASA Astrophysics Data System (ADS)

    Arenas-Pingarrón, Álvaro; González-Marcos, Ana P.; Rivas-Moscoso, José M.; Martín-Pereda, José A.

    2007-10-01

    In this work we present a new way to mask the data in a one-user communication system when direct sequence - code division multiple access (DS-CDMA) techniques are used. The code is generated by a digital chaotic generator, originally proposed by us and previously reported for a chaos cryptographic system. It is demonstrated that if the user's data signal is encoded with a bipolar phase-shift keying (BPSK) technique, usual in DS-CDMA, it can be easily recovered from a time-frequency domain representation. To avoid this situation, a new system is presented in which a previous dispersive stage is applied to the data signal. A time-frequency domain analysis is performed, and the devices required at the transmitter and receiver end, both user-independent, are presented for the optical domain.
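    The chaotic DS-CDMA spreading idea can be sketched with a hypothetical example: a chaotic map is quantised to a +/-1 chip sequence, which spreads BPSK data symbols and recovers them by correlation at the receiver. The logistic-map generator, its parameter, and the seed below are generic stand-ins, not the authors' digital chaotic generator, and the dispersive stage is omitted.

    ```python
    # Chaotic chips from the logistic map x <- r*x*(1-x), quantised to +/-1.

    def chaotic_chips(seed, n, r=3.99):
        x, chips = seed, []
        for _ in range(n):
            x = r * x * (1.0 - x)
            chips.append(1 if x >= 0.5 else -1)
        return chips

    def spread(bits, chips_per_bit, seed=0.3141):
        """Multiply each BPSK symbol (+/-1) by the chip sequence."""
        code = chaotic_chips(seed, chips_per_bit)
        return [b * c for b in bits for c in code]

    def despread(signal, chips_per_bit, seed=0.3141):
        """Correlate each chip block with the same code to recover symbols."""
        code = chaotic_chips(seed, chips_per_bit)
        bits = []
        for i in range(0, len(signal), chips_per_bit):
            corr = sum(s * c for s, c in zip(signal[i:i + chips_per_bit], code))
            bits.append(1 if corr > 0 else -1)
        return bits

    data = [1, -1, -1, 1]            # BPSK symbols
    tx = spread(data, chips_per_bit=16)
    print(despread(tx, 16) == data)  # True
    ```

    A receiver without the generator seed cannot reproduce the code, which is the masking property the paper strengthens with the extra dispersive stage.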

  7. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.; Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  8. Respiratory Artefact Removal in Forced Oscillation Measurements: A Machine Learning Approach.

    PubMed

    Pham, Thuy T; Thamrin, Cindy; Robinson, Paul D; McEwan, Alistair L; Leong, Philip H W

    2017-08-01

    Respiratory artefact removal for the forced oscillation technique can be treated as an anomaly detection problem. Manual removal is currently considered the gold standard, but this approach is laborious and subjective. Most existing automated techniques used simple statistics and/or rejected anomalous data points. Unfortunately, simple statistics are insensitive to numerous artefacts, leading to low reproducibility of results. Furthermore, rejecting anomalous data points causes an imbalance between the inspiratory and expiratory contributions. From a machine learning perspective, such methods are unsupervised and can be considered simple feature extraction. We hypothesize that supervised techniques can be used to find improved features that are more discriminative and more highly correlated with the desired output. Features thus found are then used for anomaly detection by applying quartile thresholding, which rejects a complete breath if any of its features is out of range. The thresholds are determined by both saliency and performance metrics rather than qualitative assumptions as in previous works. Feature ranking indicates that our new landmark features are among the highest scoring candidates regardless of age across saliency criteria. F1-scores, receiver operating characteristic, and variability of the mean resistance metrics show that the proposed scheme outperforms previous simple feature extraction approaches. Our subject-independent detector, 1IQR-SU, demonstrated approval rates of 80.6% for adults and 98% for children, higher than existing methods. Our new features are more relevant. Our removal is objective and comparable to the manual method. This work is a critical step toward automating forced oscillation technique quality control.
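    The quartile-thresholding step can be sketched as follows. This is a loose illustration of the "1 IQR" idea only: the feature name, the k = 1 multiplier, and the toy values are assumptions, not the paper's tuned detector.

    ```python
    # Reject a whole breath if any of its features falls outside
    # [Q1 - k*IQR, Q3 + k*IQR], computed across all breaths.

    def quartiles(values):
        """Q1 and Q3 by linear interpolation of the sorted values."""
        s = sorted(values)
        def pct(p):
            idx = p * (len(s) - 1)
            lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
            return s[lo] + (s[hi] - s[lo]) * (idx - lo)
        return pct(0.25), pct(0.75)

    def accept_breaths(breaths, k=1.0):
        """breaths: list of dicts feature -> value. Returns accept flags."""
        flags = [True] * len(breaths)
        for feat in breaths[0]:
            vals = [b[feat] for b in breaths]
            q1, q3 = quartiles(vals)
            lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
            for i, v in enumerate(vals):
                if not (lo <= v <= hi):
                    flags[i] = False      # reject the complete breath
        return flags

    breaths = [{"resistance": r} for r in [5.0, 5.2, 4.9, 5.1, 12.0]]
    print(accept_breaths(breaths))  # [True, True, True, True, False]
    ```

    Rejecting whole breaths, rather than individual data points, is what preserves the balance between inspiratory and expiratory contributions.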

  9. Automatic 2D and 3D segmentation of liver from Computerised Tomography

    NASA Astrophysics Data System (ADS)

    Evans, Alun

    As part of the diagnosis of liver disease, a Computerised Tomography (CT) scan is taken of the patient, which the clinician then uses for assistance in determining the presence and extent of the disease. This thesis presents the background, methodology, results and future work of a project that employs automated methods to segment liver tissue. The clinical motivation behind this work is the desire to facilitate the diagnosis of liver disease such as cirrhosis or cancer, assist in volume determination for liver transplantation, and possibly assist in measuring the effect of any treatment given to the liver. Previous attempts at automatic segmentation of liver tissue have relied on 2D, low-level segmentation techniques, such as thresholding and mathematical morphology, to obtain the basic liver structure. The derived boundary can then be smoothed or refined using more advanced methods. The 2D results presented in this thesis improve greatly on this previous work by using a topology adaptive active contour model to accurately segment liver tissue from CT images. The use of conventional snakes for liver segmentation is difficult due to the presence of other organs closely surrounding the liver; this new technique avoids this problem by adding an inflationary force to the basic snake equation, and initialising the snake inside the liver. The concepts underlying the 2D technique are extended to 3D, and results of full 3D segmentation of the liver are presented. The 3D technique makes use of an inflationary active surface model which is adaptively reparameterised, according to its size and local curvature, in order that it may more accurately segment the organ. Statistical analysis of the accuracy of the segmentation is presented for 18 healthy liver datasets, and results of the segmentation of unhealthy livers are also shown. 
The novel work developed during the course of this project has possibilities for use in other areas of medical imaging research, for example the segmentation of internal liver structures, and the segmentation and classification of unhealthy tissue. The possibilities of this future work are discussed towards the end of the report.

  10. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Technical Reports Server (NTRS)

    Wolf, M.; Newhouse, M.

    1986-01-01

    Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data of numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology to analyze and compare existing dynamic measurement techniques.
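    The sample-then-fit idea (sweep many recombination-velocity/lifetime pairs, keep the least-squares best, optionally refine with a simplex) can be sketched with a toy example. The decay model below is a placeholder, not the cell physics, and the grids are arbitrary.

    ```python
    import math

    def model(t, s, tau):
        # Toy stand-in response for a (velocity S, lifetime tau) pair;
        # the real device model is more involved.
        return math.exp(-t / tau) / (1.0 + s * tau)

    def grid_fit(times, data, s_grid, tau_grid):
        """Sample (S, tau) pairs on a grid; return the least-squares best."""
        best = None
        for s in s_grid:
            for tau in tau_grid:
                err = sum((model(t, s, tau) - d) ** 2
                          for t, d in zip(times, data))
                if best is None or err < best[0]:
                    best = (err, s, tau)
        return best[1], best[2]

    times = [i * 0.1 for i in range(20)]
    data = [model(t, 0.5, 1.0) for t in times]   # synthetic "measurement"
    s_grid = [k / 10 for k in range(1, 11)]      # S candidates 0.1 .. 1.0
    tau_grid = [k / 2 for k in range(1, 7)]      # tau candidates 0.5 .. 3.0
    print(grid_fit(times, data, s_grid, tau_grid))  # (0.5, 1.0)
    ```

    The coarse grid plays the role of "completely sampling" the parameter space; a Nelder-Mead simplex search started from the grid winner would then polish the estimate, as the abstract describes.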

  11. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Astrophysics Data System (ADS)

    Wolf, M.; Newhouse, M.

    Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data of numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology to analyze and compare existing dynamic measurement techniques.

  12. Technologies for Nondestructive Evaluation of Surfaces and Thin Coatings

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This project included several related activities encompassing basic understanding, technological development, customer identification and commercial transfer of several methodologies for nondestructive evaluation of surfaces and thin surface coatings. Consistent with the academic environment, students were involved in the effort, working with established investigators to further their training, provide a nucleus of experienced practitioners in the new technologies during their industrial introduction, and utilize their talents for project goals. As will be seen in various portions of the report, some of the effort has led to commercialization. This process has spawned other efforts related to this project which are supported from outside sources. These activities are occupying the efforts of some of the people who were previously supported within this grant and its predecessors. The most advanced of the supported technologies is thermography, for which the previous joint efforts of the investigators and NASA researchers have developed several techniques for extending the utility of straight thermographic inspection by producing methods of interpretation and analysis accessible to automatic image processing with computer data analysis. The effort reported for this technology has been to introduce the techniques to new user communities, who are then able to add to the effective uses of existing products with only slight development work. In a related development, analysis of a thermal measurement situation in past efforts led to a new insight into the behavior of simple temperature probes. This insight, previously reported to the narrow community in which the particular measurement was made, was reported to the community of generic temperature measurement experts this year. In addition to the propagation of mature thermographic techniques, the development of a thermoelastic imaging system has been an important related development. 
Part of the work carried out in the effort reported here has been to prepare reports introducing the newly commercially available thermoelastic measurements to the appropriate user communities.

  13. Hexagonalization of correlation functions II: two-particle contributions

    NASA Astrophysics Data System (ADS)

    Fleury, Thiago; Komatsu, Shota

    2018-02-01

    In this work, we compute one-loop planar five-point functions in N=4 super-Yang-Mills using integrability. As in the previous work, we decompose the correlation functions into hexagon form factors and glue them using the weight factors which depend on the cross-ratios. The main new ingredient in the computation, as compared to the four-point functions studied in the previous paper, is the two-particle mirror contribution. We develop techniques to evaluate it and find agreement with the perturbative results in all the cases we analyzed. In addition, we consider next-to-extremal four-point functions, which are known to be protected, and show that the sum of one-particle and two-particle contributions at one loop adds up to zero as expected. The tools developed in this work would be useful for computing higher-particle contributions which would be relevant for more complicated quantities such as higher-loop corrections and non-planar correlators.

  14. Warping an atlas derived from serial histology to 5 high-resolution MRIs.

    PubMed

    Tullo, Stephanie; Devenyi, Gabriel A; Patel, Raihaan; Park, Min Tae M; Collins, D Louis; Chakravarty, M Mallar

    2018-06-19

    Previous work from our group demonstrated the use of multiple input atlases with a modified multi-atlas framework (MAGeT-Brain) to improve subject-based segmentation accuracy. Currently, segmentations of the striatum, globus pallidus and thalamus are generated from a single high-resolution, high-contrast MRI atlas derived from annotated serial histological sections. Here, we warp this atlas to five high-resolution MRI templates to create five de novo atlases. The overall goal of this work is to use these newly warped atlases as input to MAGeT-Brain in an effort to consolidate and improve the workflow presented in previous manuscripts from our group, allowing for simultaneous multi-structure segmentation. The work presented details the methodology used for the creation of the atlases using a technique previously proposed, where atlas labels are modified to mimic the intensity and contrast profile of MRI to facilitate atlas-to-template nonlinear transformation estimation. Dice's Kappa metric was used to demonstrate high-quality registration and segmentation accuracy of the atlases. The final atlases are available at https://github.com/CobraLab/atlases/tree/master/5-atlas-subcortical.
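    The Dice overlap used for validation is a standard set-overlap measure, Dice = 2|A ∩ B| / (|A| + |B|). A minimal sketch (the voxel coordinates are toy stand-ins for real segmentation masks):

    ```python
    # Dice overlap between two label masks represented as sets of voxels.

    def dice(a, b):
        a, b = set(a), set(b)
        if not a and not b:
            return 1.0            # two empty masks agree perfectly
        return 2.0 * len(a & b) / (len(a) + len(b))

    auto   = {(0, 0), (0, 1), (1, 0), (1, 1)}   # automatic label voxels
    manual = {(0, 1), (1, 0), (1, 1), (2, 1)}   # manual "gold" voxels
    print(dice(auto, manual))  # 0.75
    ```

    Values near 1 indicate that the warped-atlas segmentation closely matches the reference labels.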

  15. Application of nonlinear ultrasonics to inspection of stainless steel for dry storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulrich, Timothy James II; Anderson, Brain E.; Remillieux, Marcel C.

    This report summarizes technical work conducted by LANL staff and international collaborators in support of the UFD Storage Experimentation effort. The focus of the current technical work is on the detection and imaging of a failure mechanism known as stress corrosion cracking (SCC) in stainless steel using the nonlinear ultrasonic technique known as TREND. One of the difficulties faced in previous work is in finding samples that contain realistically sized SCC. This year such samples were obtained from EPRI. Reported here are measurements made on these samples. One of the key findings is the ability to detect subsurface changes to the direction in which a crack is penetrating into the sample. This result follows from last year's report that demonstrated the ability of TREND techniques to image features below the sample surface. A new collaboration was established with AGH University of Science and Technology, Krakow, Poland.

  16. Investigation of Active Interrogation Techniques to Detect Special Nuclear Material in Maritime Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Thomas Martin; Patton, Bruce W

    The detection and interdiction of special nuclear material (SNM) is still a high-priority focus area for many organizations around the world. One method that is commonly considered a leading candidate in the detection of SNM is active interrogation (AI). AI is different from its close relative, passive interrogation, in that an active source is used to enhance or create a detectable signal (usually fission) from SNM, particularly in shielded scenarios or scenarios where the SNM has a low activity. The use of AI thus makes the detection of SNM easier or, in some scenarios, even enables previously impossible detection. In this work the signal from prompt neutrons and photons as well as delayed neutrons and photons will be combined, as is typically done in AI. In previous work AI has been evaluated experimentally and computationally. However, for the purposes of this work, past scenarios are considered lightly shielded and tightly coupled spatially. At most, the previous work interrogated the contents of one standard cargo container (2.44 x 2.60 x 6.10 m) and the source and detector were both within a few meters of the object being interrogated. A few examples of this type of previous work can be found in references 1 and 2. Obviously, more heavily shielded AI scenarios will require larger source intensities, larger detector surface areas (larger detectors or more detectors), greater detector efficiencies, longer count times, or some combination of these.

  17. Effects of geometric nonlinearity in an adhered microbeam for measuring the work of adhesion

    NASA Astrophysics Data System (ADS)

    Fang, Wenqiang; Mok, Joyce; Kesari, Haneesh

    2018-03-01

    Design against adhesion in microelectromechanical devices is predicated on the ability to quantify this phenomenon in microsystems. Previous research related the work of adhesion for an adhered microbeam to the beam's unadhered length, and as such, interferometric techniques were developed to measure that length. We propose a new vibration-based technique that can be easily implemented with existing atomic force microscopy tools or similar metrology systems. To make such a technique feasible, we analysed a model of the adhered microbeam using the nonlinear beam theory put forth by Woinowsky-Krieger. We found a new relation between the work of adhesion and the unadhered length; this relation is more accurate than the one by Mastrangelo & Hsu (Mastrangelo & Hsu 1993 J. Microelectromech. S., 2, 44-55. (doi:10.1109/84.232594)) which is commonly used. Then, we derived a closed-form approximate relationship between the microbeam's natural frequency and its unadhered length. Results obtained from this analytical formulation are in good agreement with numerical results from three-dimensional nonlinear finite-element analysis.
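    The link between natural frequency and unadhered length can be previewed with the standard linear Euler-Bernoulli scaling for a uniform beam segment of unadhered length L_u (illustrative only; the paper derives a more accurate, geometrically nonlinear relation):

    ```latex
    % Linear Euler-Bernoulli mode frequencies of a uniform beam segment of
    % unadhered length L_u (E: Young's modulus, I: area moment, \rho A: mass
    % per unit length, \lambda_n: mode constant set by the boundary conditions):
    \omega_n = \left(\frac{\lambda_n}{L_u}\right)^{2}\sqrt{\frac{EI}{\rho A}},
    \qquad \text{so } L_u \propto \omega_n^{-1/2}.
    ```

    Measuring a natural frequency therefore determines the unadhered length, and through the length-adhesion relation, the work of adhesion; the nonlinear analysis in the paper corrects this linear estimate.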

  18. Rotational temperatures of Venus upper atmosphere as measured by SOIR on board Venus Express

    NASA Astrophysics Data System (ADS)

    Mahieux, A.; Vandaele, A. C.; Robert, S.; Wilquet, V.; Drummond, R.; López Valverde, M. A.; López Puertas, M.; Funke, B.; Bertaux, J. L.

    2015-08-01

    SOIR is a powerful infrared spectrometer flying on board the Venus Express spacecraft since mid-2006. It sounds the Venus atmosphere above the cloud layer using the solar occultation technique. In the recorded spectra, absorption structures from many species are observed, among them carbon dioxide, the main constituent of the Venus atmosphere. Previously, temperature vertical profiles were derived from the carbon dioxide density retrieved from the SOIR spectra by assuming hydrostatic equilibrium. These profiles show a permanent cold layer at 125 km with temperatures of ~100 K, surrounded by two warmer layers at 90 and 140 km, reaching temperatures of ~200 K and 250-300 K, respectively. In this work, temperature profiles are derived from the SOIR spectra using another technique based on the ro-vibrational structure of carbon dioxide observed in the spectra. The error budget is extensively investigated. Temperature profiles obtained by both techniques are comparable within their respective uncertainties and they confirm the vertical structure previously determined from SOIR spectra.

  19. Microfabrication of through holes in polydimethylsiloxane (PDMS) sheets using a laser plasma EUV source (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Makimura, Tetsuya; Urai, Hikari; Niino, Hiroyuki

    2017-03-01

    Polydimethylsiloxane (PDMS) is a material used for cell culture substrates / bio-chips and micro total analysis systems / lab-on-chips due to its flexibility, chemical / thermodynamic stability, bio-compatibility, transparency and moldability. For further development, it is essential to develop a technique to fabricate precise three-dimensional micrometer-scale structures with high aspect ratios. In previous works, we reported a technique for high-quality micromachining of PDMS without chemical modification, by means of photo direct machining using laser plasma EUV sources. In the present work, we have investigated the fabrication of through holes. EUV radiation around 10 nm was generated by irradiating Ta targets with Nd:YAG laser light (10 ns, 500 mJ/pulse). The generated radiation was focused using an ellipsoidal mirror with a narrower incident angle than in the previous works, in order to form an EUV beam with higher directivity so that higher-aspect structures can be fabricated. The focused EUV beam was incident on PDMS sheets with a thickness of 15 micrometers, through holes in a contact mask placed on top of them. Using a contact mask with holes three micrometers in diameter, complete through holes two micrometers in diameter were fabricated in the PDMS sheet. Using a contact mask with two-micrometer holes, however, the ablation holes only nearly reached the back side of the PDMS sheet. The fabricated structures can be explained in terms of geometrical optics. Thus, we have developed a technique for micromachining PDMS sheets at high aspect ratios.

  20. What works with worked examples: Extending self-explanation and analogical comparison to synthesis problems

    NASA Astrophysics Data System (ADS)

    Badeau, Ryan; White, Daniel R.; Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.

    2017-12-01

    The ability to solve physics problems that require multiple concepts from across the physics curriculum—"synthesis" problems—is often a goal of physics instruction. Three experiments were designed to evaluate the effectiveness of two instructional methods employing worked examples on student performance with synthesis problems; these instructional techniques, analogical comparison and self-explanation, have previously been studied primarily in the context of single-concept problems. Across three experiments with students from introductory calculus-based physics courses, both self-explanation and certain kinds of analogical comparison of worked examples significantly improved student performance on a target synthesis problem, with distinct improvements in recognition of the relevant concepts. More specifically, analogical comparison significantly improved student performance when the comparisons were invoked between worked synthesis examples. In contrast, similar comparisons between corresponding pairs of worked single-concept examples did not significantly improve performance. On a more complicated synthesis problem, self-explanation was significantly more effective than analogical comparison, potentially due to differences in how successfully students encoded the full structure of the worked examples. Finally, we find that the two techniques can be combined for additional benefit, with the trade-off of slightly more time on task.

  1. Evaluation of control laws and actuator locations for control systems applicable to deformable astronomical telescope mirrors

    NASA Technical Reports Server (NTRS)

    Ostroff, A. J.

    1973-01-01

    Some of the major difficulties associated with large orbiting astronomical telescopes are the cost of manufacturing the primary mirror to precise tolerances and the maintaining of diffraction-limited tolerances while in orbit. One successfully demonstrated approach for minimizing these problem areas is the technique of actively deforming the primary mirror by applying discrete forces to the rear of the mirror. A modal control technique, as applied to active optics, has previously been developed and analyzed. The modal control technique represents the plant to be controlled in terms of its eigenvalues and eigenfunctions which are estimated via numerical approximation techniques. The report includes an extension of previous work using the modal control technique and also describes an optimal feedback controller. The equations for both control laws are developed in state-space differential form and include such considerations as stability, controllability, and observability. These equations are general and allow the incorporation of various mode-analyzer designs; two design approaches are presented. The report also includes a technique for placing actuator and sensor locations at points on the mirror based upon the flexibility matrix of the uncontrolled or unobserved modes of the structure. The locations selected by this technique are used in the computer runs which are described. The results are based upon three different initial error distributions, two mode-analyzer designs, and both the modal and optimal control laws.
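    The state-space feedback idea behind both control laws can be previewed with a minimal single-mode sketch: each controlled mirror mode obeys x' = Ax + Bu, and the controller applies u = -Kx to drive the figure error to zero. The scalar dynamics, gains, and time step below are arbitrary illustrations, not the report's design.

    ```python
    # Euler-integrate a scalar mode x' = a*x + b*u under feedback u = -k*x.

    def simulate(a, b, k, x0, steps, dt=0.01):
        x = x0
        for _ in range(steps):
            u = -k * x            # feedback control force
            x = x + dt * (a * x + b * u)
        return x

    # An unstable open-loop mode (a > 0) is stabilised by feedback:
    x_open   = simulate(a=1.0, b=1.0, k=0.0, x0=1.0, steps=500)  # grows
    x_closed = simulate(a=1.0, b=1.0, k=5.0, x0=1.0, steps=500)  # decays
    print(abs(x_closed) < 1e-3 < x_open)  # True
    ```

    In the modal approach the state is a vector of mirror mode amplitudes estimated by the mode analyzer, and K is chosen (modal pole placement or the optimal law) subject to the stability, controllability, and observability considerations the report develops.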

  2. Predictive Cache Modeling and Analysis

    DTIC Science & Technology

    2011-11-01

    metaheuristic /bin-packing algorithm to optimize task placement based on task communication characterization. Our previous work on task allocation showed...Cache Miss Minimization Technology To efficiently explore combinations and discover nearly-optimal task-assignment algorithms , we extended to our...it was possible to use our algorithmic techniques to decrease network bandwidth consumption by ~25%. In this effort, we adapted these existing

  3. A Double Take: The Practical and Ethical Dilemmas of Teaching the Visual Method of Photo Elicitation

    ERIC Educational Resources Information Center

    Wakefield, Caroline; Watt, Sal

    2014-01-01

    This paper advocates the teaching of photo elicitation in higher education as a valuable data collection technique and draws on our experience of teaching this visual method across two consecutive postgraduate cohorts. Building on previous work (Watt & Wakefield, 2014) and based on a former concern regarding student duty of care, a…

  4. Experimental study of catalytic hydrogenation by using an in-situ hydrogen measuring technique. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, S.H.; Klinzing, G.E.; Cheng, Y.S.

    1984-12-01

    An in-situ technique for measuring hydrogen concentration (partial pressure) had been previously used to measure static properties (hydrogen solubilities, vapor pressures of hydrocarbons, etc.). Because of its good precision (2% relative error) and relatively short response time (9.7 to 2.0 seconds at 589 to 728 K), the technique was successfully applied to a dynamic study of hydrogenation reactions in this work. Furthermore, the technique is to be tested for industrial uses. The hydrogen/1-methylnaphthalene system was experimentally investigated in a one-liter autoclave equipped with a magnetically driven stirrer and temperature-controlling devices. Catalytic hydrogenation of 1-methylnaphthalene was studied in the presence of sulfided Co-Mo-Al2O3 catalyst. In addition, the vapor/liquid equilibrium relationship was determined by using this technique. Hydrogenation reaction runs were performed at temperatures of 644.1, 658.0, and 672.0 K and pressures up to 9.0 MPa. The ring hydrogenation, resulting in 1- and 5-methyltetralin, was found to be the dominant reaction, in agreement with the cited literature. The effects of hydrogen partial pressure, operating temperature, and presulfided catalyst are also investigated and discussed in this work. The vapor pressure of 1-methylnaphthalene was measured over a temperature range of 555.2 to 672.0 K. The results are in good agreement with literature data. Measurements of hydrogen solubility in 1-methylnaphthalene were conducted over temperature and pressure ranges of 598 to 670 K and 5.2 to 8.8 MPa, respectively. Similar to previously reported results, the hydrogen solubility increases with increasing temperature when total pressure is held constant. A linear relation is found between the hydrogen solubility and hydrogen partial pressure. 21 refs., 13 figs., 10 tabs.

  5. Models of performance of evolutionary program induction algorithms based on indicators of problem difficulty.

    PubMed

    Graff, Mario; Poli, Riccardo; Flores, Juan J

    2013-01-01

    Modeling the behavior of algorithms is the realm of evolutionary algorithm theory. From a practitioner's point of view, theory must provide some guidelines regarding which algorithm and parameters to use in order to solve a particular problem. Unfortunately, most theoretical models of evolutionary algorithms are difficult to apply to realistic situations. However, in recent work (Graff and Poli, 2008, 2010), where we developed a method to practically estimate the performance of evolutionary program-induction algorithms (EPAs), we started addressing this issue. The method was quite general; however, it suffered from some limitations: it required the identification of a set of reference problems, it required hand-picking a distance measure in each particular domain, and the resulting models were opaque, typically being linear combinations of 100 features or more. In this paper, we propose a significant improvement of this technique that overcomes all three limitations of our previous method. We achieve this through the use of a novel set of features for assessing problem difficulty for EPAs which are very general, essentially based on the notion of finite difference. To show the capabilities of our technique and to compare it with our previous performance models, we create models for the same two important classes of problems used in our previous work: symbolic regression on rational functions and Boolean function induction. We model a variety of EPAs. The comparison showed that, for the majority of the algorithms and problem classes, the new method produced much simpler and more accurate models than before. To further illustrate the practicality of the technique and its generality (beyond EPAs), we have also used it to predict the performance of both autoregressive models and EPAs on the problem of wind speed forecasting, in all cases obtaining simpler and more accurate models than our previous performance models.

  6. Modeling and prototyping of biometric systems using dataflow programming

    NASA Astrophysics Data System (ADS)

    Minakova, N.; Petrov, I.

    2018-01-01

    Developing biometric systems is a labor-intensive process, so the creation and analysis of supporting approaches and techniques is a pressing task. This article presents a technique for modeling and prototyping biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph, and the generation of a prototype. A specially developed software modeling environment that implements this technique is described. As an example of its use, the implementation of an iris localization subsystem is demonstrated. A modification of dataflow programming is suggested to solve the problem of the undefined order of block activation. The main advantages of the presented technique are the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype, and the reuse of previously developed functional blocks.
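The three stages described above can be sketched in miniature. This is an assumed design, not the authors' modeling environment: blocks are plain functions, the dataflow graph is a predecessor map, and the "prototype" is generated by executing blocks in topological order, which also resolves the undefined block-activation order the abstract mentions. The `normalize`/`localize_iris` blocks are hypothetical stand-ins.

```python
# Toy dataflow runner: functional blocks + dataflow graph -> executable prototype.
from graphlib import TopologicalSorter

def run_dataflow(blocks, edges, inputs):
    # blocks: name -> function taking a dict of upstream outputs
    # edges:  name -> list of upstream node names (predecessors)
    order = TopologicalSorter(edges).static_order()  # fixed activation order
    values = dict(inputs)
    for name in order:
        if name in blocks:
            values[name] = blocks[name]({u: values[u] for u in edges.get(name, [])})
    return values

blocks = {
    "normalize": lambda ins: [x / 255 for x in ins["image"]],
    "localize_iris": lambda ins: max(ins["normalize"]),  # stand-in for real localization
}
edges = {"normalize": ["image"], "localize_iris": ["normalize"]}
out = run_dataflow(blocks, edges, {"image": [0, 128, 255]})
assert out["localize_iris"] == 1.0
```

Because blocks only see their declared inputs, a previously developed block can be rewired into a new graph unchanged, which is the reuse benefit the abstract highlights.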

  7. Radiation Hardening by Software Techniques on FPGAs: Flight Experiment Evaluation and Results

    NASA Technical Reports Server (NTRS)

    Schmidt, Andrew G.; Flatley, Thomas

    2017-01-01

    We present our work on implementing Radiation Hardening by Software (RHBSW) techniques on the Xilinx Virtex5 FPGAs' PowerPC 440 processors on the SpaceCube 2.0 platform. The techniques have been matured and tested through simulation modeling, fault emulation, laser fault injection, and now a flight experiment, as part of the Space Test Program-Houston 4-ISS SpaceCube Experiment 2.0 (STP-H4-ISE 2.0). This work leverages concepts such as heartbeat monitoring, control-flow assertions, and checkpointing, commonly used in the high-performance computing industry, and adapts them for use in remote-sensing embedded systems. These techniques have extremely low overhead (typically <1.3%), enabling a 3.3x gain in processing performance compared to the equivalent traditionally radiation-hardened processor. The recently concluded STP-H4 flight experiment was an opportunity to upgrade the RHBSW techniques for the Virtex5 FPGA and demonstrate them on board the ISS to achieve TRL 7. This work details the implementation of the RHBSW techniques, previously developed for the Virtex4-based SpaceCube 1.0 platform, on the Virtex5-based SpaceCube 2.0 flight platform. The evaluation spans development and integration with flight software, remotely uploading the new experiment to the ISS SpaceCube 2.0 platform, and running the experiment continuously for 16 days before the platform was decommissioned. The experiment ran on two PowerPCs embedded within the Virtex5 FPGA devices and collected 19,400 checkpoints, processed 253,482 status messages, and incurred 0 faults. These results are highly encouraging, and future work is looking into longer-duration testing as part of the STP-H5 flight experiment.

  8. Comparison of digital and conventional impression techniques: evaluation of patients’ perception, treatment comfort, effectiveness and clinical outcomes

    PubMed Central

    2014-01-01

    Background The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort. Methods Twenty-four (12 male, 12 female) subjects who had no previous experience with either conventional or digital impressions participated in this study. Conventional impressions of the maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects’ attitudes, preferences and perceptions towards the impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time, etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon rank test, and p < 0.05 was considered significant. Results There were significant differences among the groups (p < 0.05) in terms of total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques. Conclusions Digital impressions were more time-efficient than conventional impressions. Patients preferred the digital impression technique over conventional techniques. PMID:24479892

  9. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are covered, including signal statistics, power-spectrum and autocovariance analysis, and outlier-removal techniques.

  10. Investigation on laser-assisted tissue repair with NIR millisecond-long light pulses and Indocyanine Green-biopolymeric patches

    NASA Astrophysics Data System (ADS)

    Matteini, Paolo; Banchelli, Martina; Cottat, Maximilien; Osticioli, Iacopo; de Angelis, Marella; Rossi, Francesca; Pini, Roberto

    2016-03-01

    In previous work, a minimally invasive laser-assisted technique for vascular repair was presented. The technique rests on the photothermal adhesion of a biocompatible, bioresorbable patch containing Indocyanine Green (ICG) that is brought into contact with the site to be repaired. NIR millisecond-long light pulses then generate a strong welding effect between the patch and the underlying tissue, and in turn repair the wound. This technique was shown to be effective in an animal model and provides several advantages over conventional suturing methods. Here we investigate and discuss the optical stability of the ICG-biopolymeric patches and the photothermal effects induced in the irradiated tissue.

  11. Torque Transient of Magnetically Driven Flow for Viscosity Measurement

    NASA Technical Reports Server (NTRS)

    Ban, Heng; Li, Chao; Su, Ching-Hua; Lin, Bochuan; Scripa, Rosalia N.; Lehoczky, Sandor L.

    2004-01-01

    Viscosity is a good indicator of structural changes in complex liquids, such as semiconductor melts with chain or ring structures. This paper discusses the theoretical and experimental results of the transient torque technique for non-intrusive viscosity measurement. Such a technique is essential for high-temperature viscosity measurement of high-pressure and toxic semiconductor melts. In this paper, our previous work on the oscillating-cup technique is extended to the transient process of a magnetically driven melt flow in a damped oscillation system. Based on the analytical solution for the fluid flow and cup oscillation, a semi-empirical model was established to extract the fluid viscosity. The analytical and experimental results indicate that the technique has the advantages of short measurement time and straightforward data-analysis procedures.

  12. What about the Misgav-Ladach surgical technique in patients with previous cesarean sections?

    PubMed

    Bolze, Pierre-Adrien; Massoud, Mona; Gaucherand, Pascal; Doret, Muriel

    2013-03-01

    The Misgav-Ladach technique is recommended worldwide to perform cesarean sections but there is no consensus about the appropriate technique to use in patients with previous cesarean sections. This study evaluated the feasibility of the Misgav-Ladach technique in patients with previous cesarean sections. This prospective cohort study included all women undergoing cesarean section after 36 weeks of gestation over a 5-month period, with the Misgav-Ladach technique as first choice, whatever the previous number of cesarean sections. Among the 204 patients included, the Misgav-Ladach technique was successful in 100%, 80%, and 65.6% of patients with no, one, and multiple previous cesarean sections, respectively. When successful, the Misgav-Ladach technique was associated with a shorter incision to birth interval in patients with no previous cesarean section compared with patients with one or multiple previous cesarean sections. Anterior rectus aponeurosis fibrosis and severe peritoneal adherences were the two main reasons explaining the Misgav-Ladach technique failure. The Misgav-Ladach technique is possible in over three-fourths of patients with previous cesarean sections with a slight increase in incision to birth interval compared with patients without previous cesarean section. Further studies comparing the Misgav-Ladach and the Pfannenstiel techniques in women with previous cesarean should be done.

  13. Towards parameter-free classification of sound effects in movies

    NASA Astrophysics Data System (ADS)

    Chu, Selina; Narayanan, Shrikanth; Kuo, C.-C. J.

    2005-08-01

    The problem of identifying intense events via multimedia data mining in films is investigated in this work. Movies are mainly characterized by dialog, music, and sound effects. We begin our investigation by detecting interesting events through sound effects. Sound effects are neither speech nor music, but are closely associated with interesting events such as car chases and gun shots. In this work, we utilize low-level audio features, including MFCCs and energy, to identify sound effects. Previous work has shown that the hidden Markov model (HMM) works well for speech/audio signals. However, this technique requires care in designing the model and choosing the correct parameters. In this work, we introduce a framework that avoids this necessity and works well with semi- and non-parametric learning algorithms.

  14. Use of One Time Pad Algorithm for Bit Plane Security Improvement

    NASA Astrophysics Data System (ADS)

    Suhardi; Suwilo, Saib; Budhiarti Nababan, Erna

    2017-12-01

    Bit-Plane Complexity Segmentation (BPCS) is a steganography technique that exploits a characteristic of human vision: changes in complex, noise-like binary patterns in an image are imperceptible. The technique inserts a message by replacing high-complexity bit planes, or noise-like regions, with the bits of a secret message. Because the message bits are stored verbatim, the message can be extracted easily by reassembling the previously stored characters from the noise-like regions of the image, so the secret message is easily discovered by others. In this research, the process of replacing bit-plane bits with message bits is modified using the One Time Pad cryptographic technique, which aims to increase the security of the bit plane. In the tests performed, combining the One Time Pad cryptographic algorithm with the BPCS steganography technique works well for inserting messages into the vessel image, although insertion into low-dimensional images performs poorly. The original image and the stego-image look identical, and the method produces good-quality images, with mean PSNR values above 30 dB when a large-dimensional image is used as the cover.
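The cryptographic half of the scheme can be sketched as follows (a minimal sketch of the One Time Pad step only; the BPCS bit-plane embedding itself is omitted, and the message and key here are illustrative). Encrypting the message bits before embedding means that bits extracted from the noise-like regions are useless without the key:

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    # One Time Pad: XOR each message byte with a key byte; the key must be
    # as long as the message and used only once.
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

message = b"secret bits"
key = secrets.token_bytes(len(message))  # truly random, single-use key
cipher = otp_encrypt(message, key)       # these bytes would be embedded via BPCS
# XOR with the same key is its own inverse, recovering the plaintext.
assert otp_encrypt(cipher, key) == message
```

An extractor who recovers `cipher` from the stego-image but lacks `key` learns nothing about `message`, which is the security improvement the abstract describes.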

  15. Microrheometric upconversion-based techniques for intracellular viscosity measurements

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sevilla, Paloma; Zhang, Yuhai; de Sousa, Nuno; Marqués, Manuel I.; Sanz-Rodríguez, Francisco; Jaque, Daniel; Liu, Xiaogang; Haro-González, Patricia

    2017-08-01

    Rheological parameters (viscosity, creep compliance and elasticity) play an important role in cell function and viability. For this reason different strategies have been developed for their study. In this work, two new microrheometric techniques are presented. Both methods take advantage of the analysis of the polarized emission of an upconverting particle to determine its orientation inside the optical trap. Upconverting particles are optical materials that are able to convert infrared radiation into visible light. Their usefulness has been further boosted by the recent demonstration of their three-dimensional control and tracking by single beam infrared optical traps. In this work it is demonstrated that optical torques are responsible of the stable orientation of the upconverting particle inside the trap. Moreover, numerical calculations and experimental data allowed to use the rotation dynamics of the optically trapped upconverting particle for environmental sensing. In particular, the cytoplasm viscosity could be measured by using the rotation time and thermal fluctuations of an intracellular optically trapped upconverting particle, by means of the two previously mentioned microrheometric techniques.

  16. Sci-Thur PM - Colourful Interactions: Highlights 08: ARC TBI using Single-Step Optimized VMAT Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, Alana; Gordon, Deborah; Moore, Roseanne

    Purpose: This work outlines a new TBI delivery technique to replace a lateral POP full-bolus technique. The new technique is delivered with VMAT arcs, without bolus, treating the patient prone and supine. The benefits of the arc technique include improved patient experience and safety, better dose conformity, better organ-at-risk sparing, decreased therapist time, and a reduction in therapist injuries. Methods: In this work we build on a technique developed by Jahnke et al. We use standard arc fields with gantry speeds corrected for the varying distance to the patient, followed by a single-step VMAT optimization on a patient CT to reduce dose inhomogeneity and the dose to the lungs (vs. blocks). To compare the arc TBI technique with our full-bolus technique, we produced plans on patient CTs for both techniques and evaluated several dosimetric parameters using an ANOVA test. Results and Conclusions: The arc technique is able to reduce both the hot areas in the body (D2% reduced from 122.2% to 111.8%, p<0.01) and the lungs (mean lung dose reduced from 107.5% to 99.1%, p<0.01), both statistically significant, while maintaining coverage (D98% = 97.8% vs. 94.6%, p=0.313, not statistically significant). We developed a more patient- and therapist-friendly TBI treatment technique that utilizes single-step optimized VMAT plans. This technique was found to be dosimetrically equivalent to our previous lateral technique in terms of coverage and statistically superior in terms of reduced lung dose.

  17. MR-Based Assessment of Bone Marrow Fat in Osteoporosis, Diabetes, and Obesity

    PubMed Central

    Cordes, Christian; Baum, Thomas; Dieckmeyer, Michael; Ruschke, Stefan; Diefenbach, Maximilian N.; Hauner, Hans; Kirschke, Jan S.; Karampinos, Dimitrios C.

    2016-01-01

    Bone consists of the mineralized component (i.e., cortex and trabeculae) and the non-mineralized component (i.e., bone marrow). Most of the routine clinical bone imaging uses X-ray-based techniques and focuses on the mineralized component. However, bone marrow adiposity has been also shown to have a strong linkage with bone health. Specifically, multiple previous studies have demonstrated a negative association between bone marrow fat fraction (BMFF) and bone mineral density. Magnetic resonance imaging (MRI) and magnetic resonance spectroscopy (MRS) are ideal imaging techniques for non-invasively investigating the properties of bone marrow fat. In the present work, we first review the most important MRI and MRS methods for assessing properties of bone marrow fat, including methodologies for measuring BMFF and bone marrow fatty acid composition parameters. Previous MRI and MRS studies measuring BMFF and fat unsaturation in the context of osteoporosis are then reviewed. Finally, previous studies investigating the relationship between bone marrow fat, other fat depots, and bone health in patients with obesity and type 2 diabetes are presented. In summary, MRI and MRS are powerful non-invasive techniques for measuring properties of bone marrow fat in osteoporosis, obesity, and type 2 diabetes and can assist in future studies investigating the pathophysiology of bone changes in the above clinical scenarios. PMID:27445977

  18. Absolute rate of the reaction of Cl(2P) with methane from 200-500 K

    NASA Technical Reports Server (NTRS)

    Whytock, D. A.; Lee, J. H.; Michael, J. V.; Payne, W. A.; Stief, L. J.

    1976-01-01

    Rate constants for the reaction of atomic chlorine with methane have been measured from 200 to 500 K using the flash photolysis-resonance fluorescence technique. When the results from fourteen equally spaced experimental determinations are plotted in Arrhenius form, a definite curvature is noted. The results are compared with previous work and discussed theoretically.
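The curvature noted above has a simple signature: in an Arrhenius plot of ln k versus 1/T, a temperature-independent activation energy gives a straight line, while a temperature-dependent one gives a changing slope. The sketch below uses a modified Arrhenius form with made-up parameter values (A, n, Ea are illustrative, not the paper's fitted constants) to show how the local slope varies:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def modified_arrhenius(T, A=6.6e-12, n=2.0, Ea=9000.0):
    # k = A * (T/298)^n * exp(-Ea / (R*T)); the T^n factor curves the
    # Arrhenius plot, since ln k is no longer linear in 1/T.
    return A * (T / 298.0) ** n * math.exp(-Ea / (R * T))

# Points of the Arrhenius plot (1/T, ln k) over 200-500 K.
points = [(1.0 / T, math.log(modified_arrhenius(T))) for T in range(200, 501, 50)]
# Local slope between consecutive points: -(Ea/R + n*T) for this form,
# so it changes with temperature instead of staying constant.
slopes = [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in zip(points, points[1:])]
assert max(slopes) != min(slopes)  # curvature: the slope is not constant
```

A pure Arrhenius law (n = 0) would make every entry of `slopes` identical at -Ea/R; the spread among them is exactly the "definite curvature" seen when data are plotted this way.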

  19. Simulation into Reality: Some Effects of Simulation Techniques on Organizational Communication Students.

    ERIC Educational Resources Information Center

    Allen, Richard K.

    In an attempt to discover improved classroom teaching methods, a class was turned into a business organization as a way of bringing life to the previously covered lectures and textual materials. The simulated games were an attempt to get people to work toward a common goal with all of the power plays, secret meetings, brainstorming, anger, and…

  20. A Re-examination of the Black English Copula. Working Papers in Sociolinguistics, No. 66.

    ERIC Educational Resources Information Center

    Baugh, John

    A corpus of Black English (BEV) data is re-examined with exclusive attention to the "is" form of the copula. This analysis differs from previous examinations in that more constraints have been introduced, and the Cedergren/Sankoff computer program for multivariant analysis has been employed. The analytic techniques that are used allow for a finer…

  1. The Priority Inversion Problem and Real-Time Symbolic Model Checking

    DTIC Science & Technology

    1993-04-23

    real time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties

  2. Sparse Matrix for ECG Identification with Two-Lead Features.

    PubMed

    Tseng, Kuo-Kun; Luo, Jiao; Hegarty, Robert; Wang, Wenmin; Haiting, Dong

    2015-01-01

    Electrocardiograph (ECG) human identification has the potential to improve biometric security. However, improvements in ECG identification and feature extraction are required. Previous work has focused on single-lead ECG signals. Our work proposes a new algorithm for human identification that maps two-lead ECG signals onto a two-dimensional matrix and then employs a sparse matrix method to process the matrix. This is the first application of sparse matrix techniques to ECG identification. Moreover, the results of our experiments demonstrate the benefits of our approach over existing methods.
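One way to picture the two-lead-to-matrix mapping is shown below. This is an illustrative sketch, not the authors' exact algorithm: the two leads are quantized and each simultaneous sample pair indexes a cell of a 2D co-occurrence matrix, which is naturally sparse because most cells stay empty (hence a sparse representation pays off):

```python
def two_lead_matrix(lead1, lead2, bins=8, lo=-1.0, hi=1.0):
    """Map paired samples from two ECG leads onto a sparse 2D count matrix."""
    def q(v):
        # Clamp to [lo, hi] and quantize into `bins` levels.
        v = min(max(v, lo), hi)
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)

    sparse = {}  # dictionary-of-keys sparse matrix: (row, col) -> count
    for a, b in zip(lead1, lead2):
        key = (q(a), q(b))
        sparse[key] = sparse.get(key, 0) + 1
    return sparse

# Three sample pairs; the repeated (0.1, 0.2) pair lands in the same cell twice.
m = two_lead_matrix([0.1, 0.1, -0.5], [0.2, 0.2, 0.9])
assert m[(4, 4)] == 2
```

The resulting sparse matrices could then be compared between an enrolled template and a probe recording; only occupied cells need storage or comparison.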

  3. Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005

    USGS Publications Warehouse

    Soller, David R.

    2005-01-01

    Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public.
At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.

  4. Many-body-theory study of lithium photoionization

    NASA Technical Reports Server (NTRS)

    Chang, T. N.; Poe, R. T.

    1975-01-01

    A detailed theoretical calculation is carried out for the photoionization of lithium at low energies within the framework of Brueckner-Goldstone perturbational approach. In this calculation extensive use is made of the recently developed multiple-basis-set technique. Through this technique all second-order perturbation terms, plus a number of important classes of terms to infinite order, have been taken into account. Analysis of the results enables one to resolve the discrepancies between two previous works on this subject. The detailed calculation also serves as a test on the convergence of the many-body perturbation-expansion approach.

  5. Linear and nonlinear stability of the Blasius boundary layer

    NASA Technical Reports Server (NTRS)

    Bertolotti, F. P.; Herbert, TH.; Spalart, P. R.

    1992-01-01

    Two new techniques for the study of the linear and nonlinear instability in growing boundary layers are presented. The first technique employs partial differential equations of parabolic type exploiting the slow change of the mean flow, disturbance velocity profiles, wavelengths, and growth rates in the streamwise direction. The second technique solves the Navier-Stokes equation for spatially evolving disturbances using buffer zones adjacent to the inflow and outflow boundaries. Results of both techniques are in excellent agreement. The linear and nonlinear development of Tollmien-Schlichting (TS) waves in the Blasius boundary layer is investigated with both techniques and with a local procedure based on a system of ordinary differential equations. The results are compared with previous work and the effects of nonparallelism and nonlinearity are clarified. The effect of nonparallelism is confirmed to be weak and, consequently, not responsible for the discrepancies between measurements and theoretical results for parallel flow.

  6. Evaluation of ultrasonics and optimized radiography for 2219-T87 aluminum weldments

    NASA Technical Reports Server (NTRS)

    Clotfelter, W. N.; Hoop, J. M.; Duren, P. C.

    1975-01-01

    Ultrasonic studies are described which are specifically directed toward the quantitative measurement of randomly located defects previously found in aluminum welds with radiography or with dye penetrants. Experimental radiographic studies were also made to optimize techniques for welds of the thickness range to be used in fabricating the External Tank of the Space Shuttle. Conventional and innovative ultrasonic techniques were applied to the flaw size measurement problem. Advantages and disadvantages of each method are discussed. Flaw size data obtained ultrasonically were compared to radiographic data and to real flaw sizes determined by destructive measurements. Considerable success was achieved with pulse echo techniques and with 'pitch and catch' techniques. The radiographic work described demonstrates that careful selection of film exposure parameters for a particular application must be made to obtain optimized flaw detectability. Thus, film exposure techniques can be improved even though radiography is an old weld inspection method.

  7. Chemical speciation of individual airborne particles by the combined use of quantitative energy-dispersive electron probe X-ray microanalysis and attenuated total reflection Fourier transform-infrared imaging techniques.

    PubMed

    Song, Young-Chul; Ryu, JiYeon; Malek, Md Abdul; Jung, Hae-Jin; Ro, Chul-Un

    2010-10-01

    In our previous work, it was demonstrated that the combined use of attenuated total reflectance (ATR) FT-IR imaging and quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA), named low-Z particle EPMA, had the potential for characterization of individual aerosol particles. Additionally, the speciation of individual mineral particles was performed on a single particle level by the combined use of the two techniques, demonstrating that simultaneous use of the two single particle analytical techniques is powerful for the detailed characterization of externally heterogeneous mineral particle samples and has great potential for characterization of atmospheric mineral dust aerosols. These single particle analytical techniques provide complementary information on the physicochemical characteristics of the same individual particles: low-Z particle EPMA provides morphology and elemental concentrations, and ATR-FT-IR imaging provides molecular species, crystal structures, functional groups, and physical states. In this work, this analytical methodology was applied to characterize an atmospheric aerosol sample collected in Incheon, Korea. Overall, 118 individual particles were observed to be primarily NaNO(3)-containing, Ca- and/or Mg-containing, silicate, and carbonaceous particles, although the internal mixing states of the individual particles proved complicated. This work demonstrates that more detailed physicochemical properties of individual airborne particles can be obtained using this approach than when either the low-Z particle EPMA or ATR-FT-IR imaging technique is used alone.

  8. The interaction of pulsed eddy current with metal surface crack for various coils

    NASA Astrophysics Data System (ADS)

    Yang, Hung-Chi; Tai, Cheng-Chi

    2002-05-01

We study the interaction of pulsed eddy current (PEC) with metal surface cracks using various coils of different geometric sizes. In previous work, we showed that the PEC technique can be used to inspect electrical-discharge-machined (EDM) notches with depths from 0.5 mm to 9 mm, and that the relationship between PEC signals and crack depth is clear. In this work, we further examine a series of coils with different radii, heights, numbers of turns, and shapes, and discuss the effects of these coil parameters on the PEC signal. Other critical problems of PEC measurement, such as signal drift caused by the heating effect of coil currents, are also studied. We also present additional experiments on fatigue cracks to demonstrate the capability of the PEC technique for crack inspection.

  9. AYUSH: A Technique for Extending Lifetime of SRAM-NVM Hybrid Caches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Sparsh; Vetter, Jeffrey S

    2014-01-01

Recently, researchers have explored way-based hybrid SRAM-NVM (non-volatile memory) last level caches (LLCs) to bring the best of SRAM and NVM together. However, the limited write endurance of NVMs restricts the lifetime of these hybrid caches. We present AYUSH, a technique to enhance the lifetime of hybrid caches, which works by using data migration to preferentially use SRAM for storing frequently-reused data. Microarchitectural simulations confirm that AYUSH achieves a larger improvement in lifetime than a previous technique and also maintains performance and energy efficiency. For single, dual and quad-core workloads, the average increase in cache lifetime with AYUSH is 6.90X, 24.06X and 47.62X, respectively.
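The data-migration idea can be sketched as a toy, single-set model. The way counts, the NVM-first placement policy, and the reuse threshold below are illustrative assumptions, not details taken from the paper:

```python
# Toy sketch of reuse-based data migration in one set of a hybrid
# SRAM-NVM cache. All policy parameters here are made-up assumptions.
class HybridCacheSet:
    def __init__(self, sram_ways=2, nvm_ways=6, reuse_threshold=2):
        self.sram = {}                      # tag -> reuse count (SRAM ways)
        self.nvm = {}                       # tag -> reuse count (NVM ways)
        self.sram_ways = sram_ways
        self.nvm_ways = nvm_ways
        self.reuse_threshold = reuse_threshold
        self.nvm_writes = 0                 # proxy for NVM wear

    def access(self, tag):
        if tag in self.sram:
            self.sram[tag] += 1
            return "sram hit"
        if tag in self.nvm:
            self.nvm[tag] += 1
            # frequently-reused data migrates to SRAM, sparing future
            # NVM writes and hence extending lifetime
            if self.nvm[tag] >= self.reuse_threshold:
                self._migrate(tag)
            return "nvm hit"
        # miss: new blocks are first placed in NVM (illustrative policy)
        if len(self.nvm) >= self.nvm_ways:
            victim = min(self.nvm, key=self.nvm.get)
            del self.nvm[victim]
        self.nvm[tag] = 1
        self.nvm_writes += 1
        return "miss"

    def _migrate(self, tag):
        if len(self.sram) >= self.sram_ways:
            victim = min(self.sram, key=self.sram.get)
            del self.sram[victim]           # demotion path omitted for brevity
        self.sram[tag] = self.nvm.pop(tag)
```

Accessing the same tag repeatedly first misses into NVM, then hits there until the reuse threshold triggers migration, after which all further hits are served from SRAM.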

  10. Preparing Colorful Astronomical Images II

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2002-12-01

    We present additional techniques for using mainstream graphics software (Adobe Photoshop and Illustrator) to produce composite color images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope to produce photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to present more detail and additional techniques, taking advantage of new or improved features available in the latest software versions. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels.

  11. Single-particle mineralogy of Chinese soil particles by the combined use of low-Z particle electron probe X-ray microanalysis and attenuated total reflectance-FT-IR imaging techniques.

    PubMed

    Malek, Md Abdul; Kim, Bowha; Jung, Hae-Jin; Song, Young-Chul; Ro, Chul-Un

    2011-10-15

Our previous work on the speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis technique, low-Z particle EPMA, demonstrated that the combined use of these two techniques is a powerful approach for examining the single-particle mineralogy of externally heterogeneous minerals. In this work, this analytical methodology was applied to characterize six soil samples collected in arid areas of China, in order to identify the mineral types present in the samples. The six soil samples were collected from two types of soil, i.e., loess and desert soils, and overall 665 particles were analyzed on a single-particle basis. The six soil samples have different mineralogical characteristics, which were clearly differentiated in this work. Because this analytical methodology provides complementary information on the same individual particles (ATR-FT-IR imaging on mineral types, and low-Z particle EPMA on morphology and elemental concentrations), more detailed information can be obtained using this approach than when either technique is used alone, giving it great potential for the characterization of Asian dust and mineral dust particles. © 2011 American Chemical Society

  12. Motion-compensated compressed sensing for dynamic imaging

    NASA Astrophysics Data System (ADS)

    Sundaresan, Rajagopalan; Kim, Yookyung; Nadar, Mariappan S.; Bilgin, Ali

    2010-08-01

The recently introduced Compressed Sensing (CS) theory explains how sparse or compressible signals can be reconstructed from far fewer samples than previously believed possible. CS theory has attracted significant attention for applications such as Magnetic Resonance Imaging (MRI), where long acquisition times have been problematic. This is especially true for dynamic MRI applications, where high spatio-temporal resolution is needed. For example, in cardiac cine MRI it is desirable to acquire the whole cardiac volume within a single breath-hold in order to avoid artifacts due to respiratory motion. Conventional MRI techniques do not allow reconstruction of high-resolution image sequences from such a limited amount of data. Vaswani et al. recently proposed an extension of the CS framework to problems with partially known support (i.e., sparsity pattern). In their work, the problem of recursive reconstruction of time sequences of sparse signals was considered. Under the assumption that the support of the signal changes slowly over time, they proposed using the support of the previous frame as the "known" part of the support for the current frame. While this approach works well for image sequences with little or no motion, motion causes significant change in support between adjacent frames. In this paper, we illustrate how motion estimation and compensation techniques can be used to reconstruct more accurate estimates of support for image sequences with substantial motion (such as cardiac MRI). Experimental results using phantoms as well as real MRI data sets illustrate the improved performance of the proposed technique.
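The partially-known-support idea can be illustrated with a small solver. The sketch below is an ISTA-style proximal-gradient variant (not Vaswani et al.'s actual algorithm) that simply exempts the support carried over from the "previous frame" from the l1 penalty; all problem sizes and parameters are made up:

```python
import numpy as np

def modified_ista(A, y, known_support, lam=0.05, n_iter=500):
    """ISTA variant: soft-threshold only entries OUTSIDE the known support,
    so entries believed active (e.g. from the previous frame) are unpenalized."""
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L with L = ||A||_2^2
    x = np.zeros(n)
    free = np.ones(n, dtype=bool)
    free[list(known_support)] = False           # known entries: no l1 penalty
    for _ in range(n_iter):
        x = x + step * A.T @ (y - A @ x)        # gradient step on 0.5||y-Ax||^2
        x[free] = np.sign(x[free]) * np.maximum(
            np.abs(x[free]) - step * lam, 0.0)  # soft threshold (free entries)
    return x

rng = np.random.default_rng(0)
n, m, k = 60, 30, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k) + 2.0
y = A @ x_true
# pretend the previous frame supplied 4 of the 5 true support indices
x_hat = modified_ista(A, y, known_support=support[:4])
```

With four of the five true indices carried over, the solver effectively searches for a single extra nonzero, which is the intuition for why far fewer measurements suffice when the support changes slowly.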

  13. Strong Langmuir Turbulence and Four-Wave Mixing

    NASA Astrophysics Data System (ADS)

    Glanz, James

    1991-02-01

The staircase expansion is a new mathematical technique for deriving reduced, nonlinear-PDE descriptions from the plasma-moment equations. Such descriptions incorporate only the most significant linear and nonlinear terms of more complex systems. The technique is used to derive a set of Dawson-Zakharov or "master" equations, which unify and generalize previous work and show the limitations of models commonly used to describe nonlinear plasma waves. Fundamentally new wave-evolution equations are derived that admit exact nonlinear solutions (solitary waves). Analytic calculations illustrate the competition between well-known effects of self-focusing, which require coupling to ion motion, and pure-electron nonlinearities, which are shown to be especially important in curved geometries. Also presented is an N-moment hydrodynamic model derived from the Vlasov equation. In this connection, the staircase expansion is shown to remain useful for all values of N >= 3. The relevance of the present work to nonlocally truncated hierarchies, which more accurately model dissipation, is briefly discussed. Finally, the general formalism is applied to the problem of electromagnetic emission from counterpropagating Langmuir pumps. It is found that previous treatments have neglected order-unity effects that increase the emission significantly. Detailed numerical results are presented to support these conclusions. The staircase expansion, so called because of its appearance when written out, should be effective whenever the largest contribution to the nonlinear wave remains "close" to some given frequency. Thus the technique should have application to studies of wake-field acceleration schemes and anomalous damping of plasma waves.

  14. A Coherent VLSI Environment

    DTIC Science & Technology

    1987-03-31

processors. The symmetry-breaking algorithms give efficient ways to convert probabilistic algorithms to deterministic algorithms. Some of the... techniques have been applied to construct several efficient linear-processor algorithms for graph problems, including an O(lg* n)-time algorithm for (Δ + 1)... On n-node graphs, the algorithm works in O(log² n) time using only n processors, in contrast to the previous best algorithm, which used about n³

  15. Special Course on Interaction of Propagation and Digital Transmission Techniques

    DTIC Science & Technology

    1986-10-01

    military roles. Many working systems have been demonstrated, and there are a number of fully operational civil systems, see for example (Western Union ...provided by the previous analogue systems, impose increased bandwidth demands which are difficult to fulfill in the spectrally-congested European...AGARD, 1984, "Propagation influences on digital transmission systems - problems and solutions", AGARD CP No.363. 2. Western Union

  16. Techniques and Measurements. Seychelles Integrated Science. [Teacher and Pupil Booklets]. Unit 1.

    ERIC Educational Resources Information Center

    Brophy, M.; Fryars, M.

    Seychelles Integrated Science (SIS), a 3-year laboratory-based science program for students (ages 11-15) in upper primary grades 7, 8, and 9, was developed from an extensive evaluation and modification of previous P7-P9 materials. This P7 SIS unit is designed to: (1) introduce students to and familiarize them with working in the school laboratory;…

  17. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively inexpensive auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique, multi-level Gaussian process regression, on the fly; this was demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations.
More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
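As a minimal illustration of the statistical-learning fill-in step, the sketch below uses plain single-level Gaussian process regression with an RBF kernel to reconstruct a missing sample of a smooth 1D field from surviving values; the multi-level, multi-fidelity machinery of the paper is omitted, and all values are made up:

```python
import numpy as np

def gp_fill(x_obs, y_obs, x_query, length=1.0, noise=1e-6):
    """Single-level GP regression with an RBF kernel -- a simplified
    stand-in for the multi-level GP fill-in described in the abstract."""
    def rbf(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))  # jitter for stability
    k_star = rbf(x_query, x_obs)
    return k_star @ np.linalg.solve(K, y_obs)           # posterior mean

# surviving processors report u(x) = sin(x) at a few points;
# a failed processor's value at x = 1.5 is filled in on the fly
x_obs = np.array([0.0, 0.5, 1.0, 2.0, 2.5, 3.0])
u_obs = np.sin(x_obs)
u_fill = gp_fill(x_obs, u_obs, np.array([1.5]))
```

The posterior mean interpolates the smooth field closely, which is the property the resilience scheme relies on when a processor's patch of data is lost.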

  18. Best available techniques (BATs) for oil spill response in the Mediterranean Sea: calm sea and presence of economic activities.

    PubMed

    Guidi, Giambattista; Sliskovic, Merica; Violante, Anna Carmela; Vukic, Luka

    2016-01-01

An oil spill is the accidental or intentional discharge of petroleum products into the environment as a result of human activities. Although oil spills account for only a small percentage of the total world oil pollution problem, they represent its most visible form. The impact on ecosystems can be severe, as can the impact on economic activities. Oil spill cleanup is a very difficult and expensive activity, and many techniques are available for it. In previous works, a methodology based on different kinds of criteria for selecting the most satisfactory technique was proposed, and the relative importance of each impact criterion was evaluated on the basis of Saaty's Analytic Hierarchy Process (AHP). After a review of the best available techniques (BATs) for oil spill response, this work suggests criteria for BAT selection when oil spills occur in the Mediterranean Sea under well-defined circumstances: calm sea and the presence of economic activities in the affected area. A group of experts with different specializations evaluated the alternative BATs by means of the AHP method, taking into account their respective advantages and disadvantages.
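Saaty's AHP reduces a pairwise comparison matrix to priority weights via its principal eigenvector. A minimal sketch, with a purely hypothetical comparison matrix rather than the paper's criteria or expert judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a Saaty pairwise comparison matrix:
    the normalized principal eigenvector."""
    vals, vecs = np.linalg.eig(pairwise)
    i = np.argmax(vals.real)            # principal (Perron) eigenvalue
    w = np.abs(vecs[:, i].real)         # its eigenvector, sign-normalized
    return w / w.sum()

# hypothetical comparison of three response techniques on one impact
# criterion, using Saaty's 1-9 scale (entry [i][j] = preference of i over j)
M = np.array([[1.0,   3.0,  5.0],
              [1/3.,  1.0,  3.0],
              [1/5., 1/3.,  1.0]])
w = ahp_weights(M)
```

The reciprocal structure (M[j][i] = 1/M[i][j]) guarantees a positive principal eigenvector; here the first alternative dominates, as its row of pairwise preferences suggests.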

  19. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation

    PubMed Central

    2018-01-01

Early detection of power transformer faults is important because it can reduce the maintenance cost of the transformer and it can ensure continuous electricity supply in power systems. The Dissolved Gas Analysis (DGA) technique is commonly used to identify oil-filled power transformer fault types, but the utilisation of artificial intelligence methods combined with optimisation methods has shown convincing results. In this work, a hybrid support vector machine (SVM) with a modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest percentage of correct fault identification in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site. PMID:29370230

  20. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation.

    PubMed

    Illias, Hazlee Azil; Zhao Liang, Wee

    2018-01-01

Early detection of power transformer faults is important because it can reduce the maintenance cost of the transformer and it can ensure continuous electricity supply in power systems. The Dissolved Gas Analysis (DGA) technique is commonly used to identify oil-filled power transformer fault types, but the utilisation of artificial intelligence methods combined with optimisation methods has shown convincing results. In this work, a hybrid support vector machine (SVM) with a modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest percentage of correct fault identification in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site.
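The TVAC idea referred to above varies the PSO acceleration coefficients over the run: the cognitive term shrinks while the social term grows, trading exploration for exploitation. A minimal sketch on a stand-in quadratic objective; in the paper the objective would be the SVM's cross-validated DGA classification error, which is not reproduced here, and the coefficient schedules are common textbook choices rather than the paper's exact values:

```python
import numpy as np

def pso_tvac(f, bounds, n_particles=20, n_iter=100, seed=1):
    """Minimal PSO with Time Varying Acceleration Coefficients (TVAC):
    c1 decreases 2.5 -> 0.5 and c2 increases 0.5 -> 2.5 over the run."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for t in range(n_iter):
        frac = t / n_iter
        c1 = 2.5 - 2.0 * frac            # cognitive: high early (exploration)
        c2 = 0.5 + 2.0 * frac            # social: high late (exploitation)
        inertia = 0.9 - 0.5 * frac
        r1, r2 = rng.random((2, n_particles, dim))
        v = inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# toy objective with known minimum at (3, -2); stands in for SVM CV error
best, best_f = pso_tvac(lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2,
                        bounds=[(-5, 5), (-5, 5)])
```

For hyperparameter tuning, each particle position would encode, e.g., the SVM's (C, gamma) pair and `f` would train and score the classifier.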

  1. Tone and Broadband Noise Separation from Acoustic Data of a Scale-Model Counter-Rotating Open Rotor

    NASA Technical Reports Server (NTRS)

    Sree, David; Stephens, David B.

    2014-01-01

    Renewed interest in contra-rotating open rotor technology for aircraft propulsion application has prompted the development of advanced diagnostic tools for better design and improved acoustical performance. In particular, the determination of tonal and broadband components of open rotor acoustic spectra is essential for properly assessing the noise control parameters and also for validating the open rotor noise simulation codes. The technique of phase averaging has been employed to separate the tone and broadband components from a single rotor, but this method does not work for the two-shaft contra-rotating open rotor. A new signal processing technique was recently developed to process the contra-rotating open rotor acoustic data. The technique was first tested using acoustic data taken of a hobby aircraft open rotor propeller, and reported previously. The intent of the present work is to verify and validate the applicability of the new technique to a realistic one-fifth scale open rotor model which has 12 forward and 10 aft contra-rotating blades operating at realistic forward flight Mach numbers and tip speeds. The results and discussions of that study are presented in this paper.

  2. Tone and Broadband Noise Separation from Acoustic Data of a Scale-Model Contra-Rotating Open Rotor

    NASA Technical Reports Server (NTRS)

    Sree, Dave; Stephens, David B.

    2014-01-01

    Renewed interest in contra-rotating open rotor technology for aircraft propulsion application has prompted the development of advanced diagnostic tools for better design and improved acoustical performance. In particular, the determination of tonal and broadband components of open rotor acoustic spectra is essential for properly assessing the noise control parameters and also for validating the open rotor noise simulation codes. The technique of phase averaging has been employed to separate the tone and broadband components from a single rotor, but this method does not work for the two-shaft contra-rotating open rotor. A new signal processing technique was recently developed to process the contra-rotating open rotor acoustic data. The technique was first tested using acoustic data taken of a hobby aircraft open rotor propeller, and reported previously. The intent of the present work is to verify and validate the applicability of the new technique to a realistic one-fifth scale open rotor model which has 12 forward and 10 aft contra-rotating blades operating at realistic forward flight Mach numbers and tip speeds. The results and discussions of that study are presented in this paper.
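For the single-rotor case mentioned in these abstracts, phase (synchronous) averaging can be sketched in a few lines: averaging the signal over many revolutions retains the shaft-locked tones and averages away the broadband part. All signal parameters below are made up for illustration:

```python
import numpy as np

# Synchronous (phase) averaging for a single rotor: fold the signal into
# one-revolution frames and average. The periodic (tonal) part survives;
# the residual is the broadband estimate.
rng = np.random.default_rng(0)
samples_per_rev, n_revs = 128, 200
t = np.arange(samples_per_rev * n_revs)
tone = np.sin(2 * np.pi * 4 * t / samples_per_rev)   # 4th shaft harmonic
signal = tone + 0.5 * rng.standard_normal(t.size)    # tone + broadband noise

frames = signal.reshape(n_revs, samples_per_rev)
tone_est = frames.mean(axis=0)                       # phase average: tones
broadband_est = frames - tone_est                    # residual: broadband
```

Averaging N revolutions suppresses the broadband contribution to the tone estimate by a factor of sqrt(N). For two independently rotating shafts this folding is ambiguous (no single common period), which is why the contra-rotating case needed the new separation technique.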

  3. Code Modernization of VPIC

    NASA Astrophysics Data System (ADS)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms, the problem remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorisation to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  4. [Neuroplasticity as a basis for early rehabilitation of stroke patients].

    PubMed

    Putilina, M V

    2011-01-01

The review is devoted to the current state of the problem of early rehabilitation of stroke patients. The rate of primary disability after stroke is 3.2 per 10,000 population, and only 20% of previously working patients return to work. Early rehabilitation comprises treatment actions undertaken in the period immediately following a stroke. Adequate treatment during this period may decrease the extent of brain damage and improve disease outcome. The complexity of rehabilitation lies in combining several complementary pharmacological and non-pharmacological rehabilitation measures. The appearance of new rehabilitation techniques aimed at stimulating neuroplasticity increases the treatment potential of rehabilitative technologies.

5. OSCILLATOR STRENGTHS OF VIBRONIC EXCITATIONS OF NITROGEN DETERMINED BY THE DIPOLE (γ, γ) METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ya-Wei; Kang, Xu; Xu, Long-Quan

    2016-03-10

The oscillator strengths of the valence-shell excitations of molecular nitrogen are of significant value in studies of the Earth's atmosphere and interstellar gases. In this work, the absolute oscillator strengths of the valence-shell excitations of molecular nitrogen in the 12.3-13.4 eV range were measured by the novel dipole (γ, γ) method, in which high-resolution inelastic X-ray scattering is operated at a negligibly small momentum transfer and can simulate the photoabsorption process. Because the experimental technique used in the present work is distinctly different from those used previously, the present experimental results provide an independent cross-check of previous experimental and theoretical data. The excellent agreement of the present results with the dipole (e, e) data and the extrapolated values indicates that the present oscillator strengths can serve as benchmark data.

  6. Organic materials in the wall paintings in Pompei: a case study of Insula del Centenario

    PubMed Central

    2012-01-01

Background The present research concerns the Roman wall paintings preserved at Insula del Centenario (IX, 8), the important Pompeian block situated in Regio IX, along Via di Nola. Results The aims of this research are twofold: to verify whether lipidic or proteinaceous materials were used to spread the pigments, and to identify organic matter in the painting materials deriving from previous restoration works. The samples collected from the wall paintings of different rooms were investigated by Fourier Transform Infrared Spectroscopy (FT-IR) and Gas Chromatography/Mass Spectrometry (GC/MS). Conclusions The analytical results show that these Roman wall paintings were realized without the use of lipidic or proteinaceous materials, presumably in the fresco technique. Moreover, it was found that wax, egg, and animal glue had been used in previous restoration works for protective purposes and to restore the wall paintings to their original brilliant colours. PMID:23006771

  7. Multiclass feature selection for improved pediatric brain tumor segmentation

    NASA Astrophysics Data System (ADS)

    Ahmed, Shaheen; Iftekharuddin, Khan M.

    2012-03-01

In our previous work, we showed that fractal-based texture features are effective in the detection, segmentation and classification of posterior-fossa (PF) pediatric brain tumors in multimodality MRI. We exploited an information-theoretic approach, Kullback-Leibler Divergence (KLD), for selecting and ranking different texture features. We further combined the feature selection technique with a segmentation method, Expectation Maximization (EM), for the segmentation of tumor (T) and non-tumor (NT) tissues. In this work, we extend the two-class KLD technique to the multiclass case to effectively select the best features for brain tumor (T), cyst (C) and non-tumor (NT) tissues. We further assess segmentation robustness for each tissue type by computing Bayes posterior probabilities and the corresponding number of pixels for each tissue segment in MRI patient images. We evaluate the improved tumor segmentation robustness using different similarity metrics for five patients in T1, T2 and FLAIR modalities.
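A minimal sketch of extending two-class KLD feature ranking to three tissue classes (T, C, NT): score each feature by the summed pairwise symmetric KL divergence between its class-conditional histograms. The synthetic data, bin count, and smoothing below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def kld(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p||q) for count vectors."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def multiclass_kld_score(feature_by_class, bins=16):
    """Rank a feature by the summed pairwise symmetric KLD between its
    class-conditional histograms (classes here: T, C, NT)."""
    lo = min(v.min() for v in feature_by_class)
    hi = max(v.max() for v in feature_by_class)
    hists = [np.histogram(v, bins=bins, range=(lo, hi))[0] + 1.0
             for v in feature_by_class]          # +1: Laplace smoothing
    score = 0.0
    for i in range(len(hists)):
        for j in range(i + 1, len(hists)):
            score += kld(hists[i], hists[j]) + kld(hists[j], hists[i])
    return score

rng = np.random.default_rng(0)
# a discriminative feature (well-separated class means) vs. a noisy one
good = [rng.normal(m, 1.0, 500) for m in (0.0, 3.0, 6.0)]
poor = [rng.normal(0.0, 1.0, 500) for _ in range(3)]
score_good = multiclass_kld_score(good)
score_poor = multiclass_kld_score(poor)
```

Features whose class-conditional distributions overlap heavily score near zero and can be dropped before the EM segmentation step.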

  8. Electromagnetic radiation screening of semiconductor devices for long life applications

    NASA Technical Reports Server (NTRS)

    Hall, T. C.; Brammer, W. G.

    1972-01-01

A review is presented of the mechanisms of interaction of electromagnetic radiation, in various spectral ranges, with various semiconductor device defects. Previous work conducted in this area was analyzed for its pertinence to the current problem. The task of implementing electromagnetic screening methods in the wavelength region determined to be most effective was then studied. Both scanning and flooding-type stimulation techniques are discussed. While the scanning technique offers a considerably higher yield of useful information, a preliminary investigation utilizing the flooding approach is recommended first because of its ease of implementation, lower cost, and ability to provide go/no-go information in semiconductor screening.

  9. Characterization of hydrogel printer for direct cell-laden scaffolds

    NASA Astrophysics Data System (ADS)

    Whulanza, Yudan; Arsyan, Rendria; Saragih, Agung Shamsuddin

    2018-02-01

Additive manufacturing technology has developed massively over the last decade. The technology was previously known as rapid prototyping, a set of techniques aimed at producing prototype products quickly and economically. Currently, this technique is also applied to fabricate microstructures used in tissue engineering. Here, we introduce a 3D printer that uses gelatin hydrogel to realize cell-laden scaffolds with dimensions of around 50-100 µm. However, in order to fabricate such precise dimensions, optimal working parameters are required to control the physical properties of the gelatin. At the end of our study, we formulated the best parameters to produce the desired product.

  10. Laser balancing system for high material removal rates

    NASA Technical Reports Server (NTRS)

    Jones, M. G.; Georgalas, G.; Ortiz, A. L.

    1984-01-01

A laser technique to remove material in excess of 10 mg/sec from a spinning rotor is described. This material removal rate is 20 times greater than previously reported for a surface speed of 30 m/sec. Material removal enhancement was achieved by steering a focused laser beam with moving optics to increase the time of laser energy interaction with a particular location on the circumferential surface of a spinning rotor. A neodymium:yttrium aluminum garnet (Nd:YAG) pulse laser was used in this work to evaluate material removal for carbon steel, 347 stainless steel, Inconel 718, and titanium 6-4. This technique is applicable to dynamic laser balancing.

  11. Curveslam: Utilizing Higher Level Structure In Stereo Vision-Based Navigation

    DTIC Science & Technology

    2012-01-01

consider their application to SLAM. The work of [31] [32] develops a spline-based SLAM framework, but this is only for application to LIDAR-based SLAM... Existing approaches to visual Simultaneous Localization and Mapping (SLAM) typically utilize points as visual feature primitives to represent landmarks... regions of interest. Further, previous SLAM techniques that propose the use of higher level structures often place constraints on the environment, such as

  12. Long-term residual dry matter mapping for monitoring California hardwood rangelands

    Treesearch

    Norman R. Harris; William E. Frost; Neil K. McDougald; Melvin R. George; Donald L. Nielsen

    2002-01-01

    Long-term residual dry matter mapping on the San Joaquin Experimental Range provides a working example of this monitoring technique for grazing management and research. Residual dry matter (RDM) is the amount of old plant material left on the ground at the beginning of a new growing season. RDM indicates the previous season’s use and can be used to describe the health...

  13. Integrated Formulation of Beacon-Based Exception Analysis for Multimissions

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail

    2003-01-01

Further work on beacon-based exception analysis for multimissions (BEAM), a method for real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability for application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM builds upon previous advanced techniques for the analysis of signal data, utilizing mathematical modeling of the system physics and expert-system reasoning.

  14. Improving national-scale invasion maps: Tamarisk in the western United States

    USGS Publications Warehouse

    Jarnevich, C.S.; Evangelista, P.; Stohlgren, T.J.; Morisette, J.

    2011-01-01

New invasions, better field data, and novel spatial-modeling techniques often drive the need to revisit previous maps and models of invasive species. Such is the case with the at least 10 species of Tamarix, which are invading riparian systems in the western United States and expanding their range throughout North America. In 2006, we developed a National Tamarisk Map by using a compilation of presence and absence locations with remotely sensed data and statistical modeling techniques. Since the publication of that work, our database of Tamarix distributions has grown significantly. Using the updated database of species occurrence, new predictor variables, and the maximum entropy (Maxent) model, we have revised our potential Tamarix distribution map for the western United States. Distance-to-water was the strongest predictor in the model (58.1%), while mean temperature of the warmest quarter was the second best predictor (18.4%). Model validation, averaged from 25 model iterations, indicated that our analysis had strong predictive performance (AUC = 0.93) and that the extent of Tamarix distributions is much greater than previously thought. The southwestern United States had the greatest suitable habitat, and this result differed from the 2006 model. Our work highlights the utility of iterative modeling for invasive species habitat modeling as new information becomes available. © 2011.

  15. GPU-accelerated two dimensional synthetic aperture focusing for photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Liu, Siyu; Feng, Xiaohua; Gao, Fei; Jin, Haoran; Zhang, Ruochong; Luo, Yunqi; Zheng, Yuanjin

    2018-02-01

Acoustic resolution photoacoustic microscopy (AR-PAM) generally suffers from a limited depth of focus, which has been extended by synthetic aperture focusing techniques (SAFTs). However, for three-dimensional AR-PAM, current one-dimensional (1D) SAFT and improved variants such as cross-shaped SAFT do not provide isotropic resolution in the lateral direction; the full potential of the SAFT remains to be tapped. To this end, a two-dimensional (2D) SAFT with a fast computing architecture is proposed in this work. As explained by geometric modeling and Fourier acoustics theory, 2D-SAFT provides the narrowest post-focusing capability and thus achieves the best lateral resolution. Compared with previous 1D-SAFT techniques, the proposed 2D-SAFT improved the lateral resolution by at least 1.7 times and the signal-to-noise ratio (SNR) by about 10 dB in both simulation and experiments. Moreover, the improved 2D-SAFT algorithm is accelerated by a graphics processing unit, which reduces the long reconstruction time to only a few seconds. The proposed 2D-SAFT is demonstrated to outperform previously reported 1D SAFTs in terms of depth of focus, imaging resolution, and SNR, with fast computational efficiency. This work facilitates future studies on in vivo deeper and high-resolution photoacoustic microscopy beyond several centimeters.

  16. Semantic segmentation of mFISH images using convolutional networks.

    PubMed

    Pardo, Esteban; Morgado, José Mário T; Malpica, Norberto

    2018-04-30

    Multicolor in situ hybridization (mFISH) is a karyotyping technique used to detect major chromosomal alterations using fluorescent probes and imaging techniques. Manual interpretation of mFISH images is a time-consuming step that can be automated using machine learning; previous works employed pixel- or patch-wise classification, overlooking spatial information that can help identify chromosomes. In this work, we propose a fully convolutional semantic segmentation network for the interpretation of mFISH images, which uses both spatial and spectral information to classify each pixel in an end-to-end fashion. The semantic segmentation network developed was tested on samples extracted from a public dataset using cross validation. Despite having no labeling information from the image it was tested on, our algorithm yielded an average correct classification ratio (CCR) of 87.41%. Previously, this level of accuracy was achieved only by state-of-the-art algorithms classifying pixels from the same image on which the classifier had been trained. These results provide evidence that fully convolutional semantic segmentation networks may be employed in the computer-aided diagnosis of genetic diseases with improved performance over current image analysis methods. © 2018 International Society for Advancement of Cytometry.
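
    The figure of merit reported above, the correct classification ratio, is pixel-level accuracy of the predicted label map against the ground truth. The network itself requires a deep-learning framework and trained weights, so only the metric is sketched here (label values are illustrative):

```python
def ccr(pred, truth):
    """Correct classification ratio: the fraction of pixels whose
    predicted class label matches the ground-truth label map."""
    total = correct = 0
    for prow, trow in zip(pred, truth):
        for p, t in zip(prow, trow):
            total += 1
            correct += (p == t)
    return correct / total
```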

  17. Examining the Gender Gap in Introductory Physics

    NASA Astrophysics Data System (ADS)

    Kost, Lauren; Pollock, Steven; Finkelstein, Noah

    2009-05-01

    Our previous research[1] showed that despite the use of interactive engagement techniques in the introductory physics course, the gap in performance between males and females on a mechanics conceptual learning survey persisted from pre- to post-test at our institution. Such findings were counter to previously published work[2]. Follow-up studies[3] identified correlations between student performance on the conceptual learning survey and students' prior physics and math knowledge and their incoming attitudes and beliefs about physics and learning physics. The results indicate that the gender gap at our institution is predominantly associated with differences in males' and females' previous physics and math knowledge, and attitudes and beliefs. Our current work extends these results in two ways: 1) we look at the gender gap in the second semester of the introductory sequence and find results similar to those in the first-semester course, and 2) we identify ways in which males and females differentially experience several aspects of the introductory course. [1] Pollock, et al, Phys Rev: ST: PER 3, 010107. [2] Lorenzo, et al, Am J Phys 74, 118. [3] Kost, et al, PERC Proceedings 2008.

  18. Digital mammography: physical principles and future applications.

    PubMed

    Gambaccini, Mauro; Baldelli, Paola

    2003-01-01

    Mammography is currently considered the best tool for the detection of breast cancer, a pathology whose incidence is constantly increasing. To produce the radiological image, a screen-film combination is conventionally used. One inherent limitation of the screen-film combination is that the detection, display and storage processes are one and the same, making it impossible to optimize each stage separately. These limitations can be overcome with digital systems. In this work we evaluate the main characteristics of digital detectors available on the market and compare the performance of digital and conventional systems. Because its images can be digitally processed, digital mammography offers many potential advantages, among them the dual-energy technique, which combines two digital images obtained at two different energies to enhance the inherent contrast of pathologies by removing the uniform background. This technique was previously tested using a synchrotron monochromatic beam and a digital detector, and then with the Senographe 2000D full-field digital system manufactured by GE Medical Systems. In this work we present preliminary results and future applications of this technique.
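
    The dual-energy idea can be sketched with a Beer-Lambert toy model: a weighted subtraction of the two log images, with the weight chosen to cancel the background-tissue term, leaves a signal that depends only on lesion thickness. The attenuation coefficients below are illustrative numbers, not measured values:

```python
import math

# Hypothetical linear attenuation coefficients (1/cm) at the two energies.
MU_BG = {"low": 0.80, "high": 0.50}   # background tissue
MU_LE = {"low": 1.40, "high": 0.70}   # lesion material

def intensity(t_bg, t_le, energy, i0=1.0):
    """Beer-Lambert transmitted intensity through t_bg cm of background
    and t_le cm of lesion."""
    mu_b, mu_l = MU_BG[energy], MU_LE[energy]
    return i0 * math.exp(-mu_b * t_bg - mu_l * t_le)

def dual_energy(i_low, i_high):
    """Weighted log subtraction; the weight w cancels the background
    term, so the result depends only on lesion thickness."""
    w = MU_BG["high"] / MU_BG["low"]
    return -math.log(i_high) + w * math.log(i_low)
```

With these coefficients the dual-energy signal is exactly -0.175 per cm of lesion, independent of background thickness.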

  19. Photo- and thermally stimulated luminescence of polyminerals extracted from herbs and spices

    NASA Astrophysics Data System (ADS)

    Cruz-Zaragoza, E.; Marcazzó, J.; Chernov, V.

    2012-08-01

    Ionizing radiation processing is a widely employed method for preservative treatment of foodstuffs. Usually it is possible to detect irradiated herbs and spices by resorting to luminescence techniques, in particular photo- and thermostimulated luminescence. For these techniques to be useful, it is necessary to characterize the response to radiation of each particular herb or spice. In this work, the thermoluminescence (TL) and photostimulated luminescence (PSL) properties of inorganic polymineral fractions extracted from commercial herbs and spices previously irradiated for disinfestation purposes have been analyzed. Samples of mint, cinnamon, chamomile, paprika, black pepper, coriander and Jamaica flower were irradiated from 50 to 400 Gy by using a beta source. The X-ray diffraction (XRD) analysis has shown that the mineral fractions consist mainly of quartz and feldspars. The PSL and TL response as a function of the absorbed dose, and their fading at room temperature have been determined. The TL glow curves have been deconvolved in order to obtain characteristic kinetics parameters in each case. The results of this work show that PSL and TL are reliable techniques for detection and analysis of irradiated foodstuffs.
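
    The glow-curve deconvolution mentioned above fits each TL peak with a kinetics model; the standard building block is the first-order (Randall-Wilkins) peak. A sketch that evaluates such a peak for a linear heating ramp, using illustrative trap parameters rather than values fitted in the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def glow_curve(E, s, beta, n0=1.0, t_start=300.0, t_end=500.0, dt=0.1):
    """First-order (Randall-Wilkins) TL intensity versus temperature:
    I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * integral exp(-E/kT') dT'),
    for activation energy E (eV), frequency factor s (1/s), and linear
    heating rate beta (K/s). Uses a crude rectangle rule for the integral."""
    temps, intens = [], []
    integral = 0.0
    T = t_start
    while T <= t_end:
        rate = s * math.exp(-E / (K_B * T))
        integral += rate * dt / beta
        intens.append(n0 * rate * math.exp(-integral))
        temps.append(T)
        T += dt
    return temps, intens
```

The temperature of maximum intensity shifts with E and s, which is what a deconvolution exploits to recover the kinetic parameters.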

  20. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    NASA Astrophysics Data System (ADS)

    James, P.

    2011-12-01

    With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface parameters. Cavities are hence a limitation to certain redevelopment, and their detection is an ever more important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new robust investigation standard for the detection of cavities is sought, and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been utilised successfully in the detection of cavities in various geologies, but still have an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are inclined to rely on well-known methods rather than utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, and that the surveyor has the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies, and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (written in Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near-surface cavity shapes are modelled, including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters.
Techniques can be compared and the limits of detection distance assessed. The density of survey points required to achieve a required probability of detection can be calculated. The software aids discriminating choice of technique, improves survey design, and increases the likelihood of survey success; all factors sought in the engineering industry. As a simple example, the response from magnetometry, gravimetry, and gravity gradient techniques above an example 3 m deep, 1 m cube air cavity in limestone across a 15 m grid was calculated. The maximum responses above the cavity are small (amplitudes of 0.018 nT, 0.0013 mGal, and 8.3 eotvos respectively), but at typical site noise levels the detection reliability is over 50% for the gravity gradient method on a single survey line. Increasing the number of survey points across the site increases the reliability of detection of the anomaly by the addition of probabilities. We can calculate the probability of detection at different profile spacings to assess the best possible survey design. At 1 m spacing the overall probability of detection by the gravity gradient method is over 90%, and over 60% for magnetometry (at 3 m spacing the probability drops to 32%). The use of modelling in near-surface surveys is a useful tool for assessing the feasibility of a range of techniques to detect subtle signals. Future work will integrate this work with borehole-measured parameters.
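
    The "addition of probabilities" across survey points can be sketched as the complement of every point missing the target. Treating point detections as independent is an assumption of this sketch, not something stated in the abstract:

```python
def combined_detection_probability(p_single, n_points):
    """Probability that at least one of n independent survey points
    detects the anomaly: 1 - (probability every point misses it)."""
    return 1.0 - (1.0 - p_single) ** n_points
```

For example, a per-point reliability just over 50% already exceeds 90% combined probability with four independent points over the anomaly, which is the qualitative effect of tightening profile spacing.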

  1. A Dataset and a Technique for Generalized Nuclear Segmentation for Computational Pathology.

    PubMed

    Kumar, Neeraj; Verma, Ruchika; Sharma, Sanuj; Bhargava, Surabhi; Vahadane, Abhishek; Sethi, Amit

    2017-07-01

    Nuclear segmentation in digital microscopic tissue images can enable extraction of high-quality features for nuclear morphometrics and other analysis in computational pathology. Conventional image processing techniques, such as Otsu thresholding and watershed segmentation, do not work effectively on challenging cases, such as chromatin-sparse and crowded nuclei. In contrast, machine learning-based segmentation can generalize across various nuclear appearances. However, training machine learning algorithms requires data sets of images, in which a vast number of nuclei have been annotated. Publicly accessible and annotated data sets, along with widely agreed upon metrics to compare techniques, have catalyzed tremendous innovation and progress on other image classification problems, particularly in object recognition. Inspired by their success, we introduce a large publicly accessible data set of hematoxylin and eosin (H&E)-stained tissue images with more than 21000 painstakingly annotated nuclear boundaries, whose quality was validated by a medical doctor. Because our data set is taken from multiple hospitals and includes a diversity of nuclear appearances from several patients, disease states, and organs, techniques trained on it are likely to generalize well and work right out-of-the-box on other H&E-stained images. We also propose a new metric to evaluate nuclear segmentation results that penalizes object- and pixel-level errors in a unified manner, unlike previous metrics that penalize only one type of error. We also propose a segmentation technique based on deep learning that lays a special emphasis on identifying the nuclear boundaries, including those between the touching or overlapping nuclei, and works well on a diverse set of test images.
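
    The unified metric described above penalizes object- and pixel-level errors together. A simplified sketch in the spirit of an aggregated-Jaccard-style score, with objects represented as sets of pixel coordinates (the authors' exact definition may differ in detail):

```python
def aggregated_jaccard(truth_objs, pred_objs):
    """For each ground-truth object, take the best-matching prediction
    by IoU and accumulate intersections and unions; every unmatched
    predicted object's area is added to the union, so both missed
    pixels and spurious objects lower the score."""
    used = set()
    inter_sum = union_sum = 0
    for g in truth_objs:
        best, best_iou, best_j = None, -1.0, None
        for j, p in enumerate(pred_objs):
            u = len(g | p)
            iou = len(g & p) / u if u else 0.0
            if iou > best_iou:
                best, best_iou, best_j = p, iou, j
        if best is not None:
            inter_sum += len(g & best)
            union_sum += len(g | best)
            used.add(best_j)
    for j, p in enumerate(pred_objs):
        if j not in used:
            union_sum += len(p)          # penalize false-positive objects
    return inter_sum / union_sum if union_sum else 0.0
```

A perfect segmentation scores 1.0; either splitting a nucleus's pixels or hallucinating an extra nucleus pulls the score down through the same ratio.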

  2. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
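
    The property checking described above can be illustrated at toy scale. The actual technique encodes a trace's orderings as SMT constraints; as a hedged stand-in, the sketch below brute-forces message delivery orders and reports those violating a receiver's assumption (the message names and property are invented for illustration):

```python
from itertools import permutations

def violating_orders(messages, property_ok):
    """Enumerate all delivery orders and keep those where the safety
    property fails; a satisfying SMT assignment plays the same role of
    witnessing the violating schedule."""
    return [order for order in permutations(messages)
            if not property_ok(order)]

def init_before_data(order):
    # Receiver's (unchecked) assumption: "init" always arrives first.
    return order.index("init") < order.index("data")

bad = violating_orders(["init", "data"], init_before_data)
```

Here the single violating order ("data", "init") is exactly the witness an SMT solver would return as the conditions leading to the property violation.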

  3. Anti-forensics of chromatic aberration

    NASA Astrophysics Data System (ADS)

    Mayer, Owen; Stamm, Matthew C.

    2015-03-01

    Over the past decade, a number of information forensic techniques have been developed to identify digital image manipulation and falsification. Recent research has shown, however, that an intelligent forger can use anti-forensic countermeasures to disguise their forgeries. In this paper, an anti-forensic technique is proposed to falsify the lateral chromatic aberration present in a digital image. Lateral chromatic aberration corresponds to the relative contraction or expansion between an image's color channels that occurs due to a lens's inability to focus all wavelengths of light on the same point. Previous work has used localized inconsistencies in an image's chromatic aberration to expose cut-and-paste image forgeries. The anti-forensic technique presented in this paper operates by estimating the expected lateral chromatic aberration at an image location, then removing deviations from this estimate caused by tampering or falsification. Experimental results are presented that demonstrate that our anti-forensic technique can be used to effectively disguise evidence of an image forgery.
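
    Lateral chromatic aberration is commonly modeled as a radial scaling of one color channel about the optical center. A sketch of the estimation step only: a least-squares fit of the scale factor from matched feature positions in two channels (the anti-forensic step would then resample the channel to enforce this global model, removing localized deviations):

```python
def estimate_ca_scale(green_pts, red_pts, center=(0.0, 0.0)):
    """Least-squares estimate of alpha in the lateral chromatic
    aberration model  red ≈ center + alpha * (green - center)."""
    num = den = 0.0
    cx, cy = center
    for (gx, gy), (rx, ry) in zip(green_pts, red_pts):
        gx, gy = gx - cx, gy - cy
        rx, ry = rx - cx, ry - cy
        num += gx * rx + gy * ry
        den += gx * gx + gy * gy
    return num / den
```

Points whose local displacement disagrees with the fitted global alpha are the inconsistencies that cut-and-paste detectors look for, and that the anti-forensic method removes.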

  4. Detecting and disentangling nonlinear structure from solar flux time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.

    1992-01-01

    Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques for decoupling additive and multiplicative white noise from deterministic dynamics are presented, and the falloff of the power spectra at high frequencies is examined as a possible means of distinguishing deterministic chaos from noise that is spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.
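
    The spectral-falloff idea can be sketched directly: fit the slope of log-power versus log-frequency. White noise has a flat spectrum (slope near zero), while a smooth deterministic signal concentrates its power at low frequencies and falls off steeply. A naive DFT-based sketch (not the paper's method; the slope fit is ordinary least squares):

```python
import cmath
import math
import random

def power_spectrum_slope(signal):
    """Least-squares slope of log-power vs log-frequency from a direct
    (O(n^2)) DFT; near zero for white noise, strongly negative for
    signals whose power falls off at high frequency."""
    n = len(signal)
    xs, ys = [], []
    for k in range(1, n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        xs.append(math.log(k))
        ys.append(math.log(abs(s) ** 2 + 1e-30))  # guard against log(0)
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

This is only a heuristic discriminator; as the abstract notes, it trades rigor for being far cheaper than Lyapunov-exponent or attractor-dimension estimates.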

  5. Faster the better: a reliable technique to sample anopluran lice in large hosts.

    PubMed

    Leonardi, María Soledad

    2014-06-01

    Among Anoplura, the family Echinophthiriidae includes those species that infest mainly the pinnipeds. Working with large hosts implies methodological considerations such as the time spent in sampling and the way in which the animal is restrained. Previous works on echinophthiriids combined a diverse array of analyses, including field counts of lice and in vitro observations. To collect lice, the authors used forceps, and each louse was collected individually. This implied a long manipulation time, i.e., ≈60 min, and the need to physically and/or chemically immobilize the animal. The present work describes and discusses for the first time a sampling technique that minimizes manipulation time and avoids the use of anesthesia. The method involves combing the host's pelage with a fine-toothed plastic comb, as used in the treatment of human pediculosis, and keeping the comb with the retained lice in a Ziploc® bag with ethanol. This technique has been used successfully in studies of population dynamics, habitat selection, and transmission patterns, proving to be a reliable methodology. Lice are collected whole and in good condition for mounting and study under light or scanning electron microscopy. Moreover, the plastic comb protects taxonomically important structures such as spines from damage, so the technique is also recommended for taxonomic or morphological work.

  6. HEAVY AND THERMAL OIL RECOVERY PRODUCTION MECHANISMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony R. Kovscek

    2003-04-01

    This technical progress report describes work performed from January 1 through March 31, 2003 for the project ''Heavy and Thermal Oil Recovery Production Mechanisms,'' DE-FC26-00BC15311. In this project, a broad spectrum of research is undertaken related to thermal and heavy-oil recovery. The research tools and techniques span from pore-level imaging of multiphase fluid flow to definition of reservoir-scale features through streamline-based history matching techniques. During this period, previous analysis of experimental data regarding multidimensional imbibition to obtain shape factors appropriate for dual-porosity simulation was verified by comparison among analytic, dual-porosity simulation, and fine-grid simulation. We continued to study the mechanisms by which oil is produced from fractured porous media at high pressure and high temperature. Temperature has a beneficial effect on recovery and reduces residual oil saturation. A new experiment was conducted on diatomite core. Significantly, we show that elevated temperature induces fines release in sandstone cores and this behavior may be linked to wettability. Our work in the area of primary production of heavy oil continues with field cores and crude oil. On the topic of reservoir definition, work continued on developing techniques that integrate production history into reservoir models using streamline-based properties.

  7. Machine Learning Techniques for Stellar Light Curve Classification

    NASA Astrophysics Data System (ADS)

    Hinners, Trisha A.; Tat, Kevin; Thorp, Rachel

    2018-07-01

    We apply machine learning techniques in an attempt to predict and classify stellar properties from noisy and sparse time-series data. We preprocessed over 94 GB of Kepler light curves from the Mikulski Archive for Space Telescopes (MAST) to classify according to 10 distinct physical properties using both representation learning and feature engineering approaches. Studies using machine learning in the field have been primarily done on simulated data, making our study one of the first to use real light-curve data for machine learning approaches. We tuned our data using previous work with simulated data as a template and achieved mixed results between the two approaches. Representation learning using a long short-term memory recurrent neural network produced no successful predictions, but our work with feature engineering was successful for both classification and regression. In particular, we were able to achieve values for stellar density, stellar radius, and effective temperature with low error (∼2%–4%) and good accuracy (∼75%) for classifying the number of transits for a given star. The results show promise for improvement for both approaches upon using larger data sets with a larger minority class. This work has the potential to provide a foundation for future tools and techniques to aid in the analysis of astrophysical data.
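
    The feature-engineering approach mentioned above amounts to summarizing each light curve with a handful of statistics before classification or regression. A minimal sketch with a few plausible features (scatter, skewness, and depth of the strongest dip); these particular features are illustrative, not the paper's exact feature set:

```python
import math

def light_curve_features(flux):
    """A few simple engineered features of a normalized light curve:
    standard deviation, skewness, and depth of the deepest dip below
    the mean (a crude transit-depth proxy)."""
    n = len(flux)
    mean = sum(flux) / n
    var = sum((f - mean) ** 2 for f in flux) / n
    std = math.sqrt(var)
    skew = (sum((f - mean) ** 3 for f in flux) / n) / (std ** 3 + 1e-30)
    return {"std": std,
            "skew": skew,
            "transit_depth": mean - min(flux)}
```

A transiting star shows a negatively skewed flux distribution and a nonzero dip depth, which is the kind of signal a feature-based classifier can exploit even in sparse, noisy data.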

  8. Ultrafast chirped optical waveform recording using referenced heterodyning and a time microscope

    DOEpatents

    Bennett, Corey Vincent

    2010-06-15

    A new technique for capturing both the amplitude and phase of an optical waveform is presented. This technique can capture signals with many THz of bandwidth in a single shot (e.g., temporal resolution of about 44 fs), or be operated repetitively at a high rate. That is, each temporal window (or frame) is captured single-shot, in real time, but the process may be run repeatedly or single-shot. This invention expands upon previous work in temporal imaging by adding heterodyning, which can be self-referenced for improved precision and stability, to convert frequency chirp (the second derivative of phase with respect to time) into a time-varying intensity modulation. By also including a variety of possible demultiplexing techniques, this process is scalable to recording continuous signals.

  9. Ultrafast chirped optical waveform recorder using referenced heterodyning and a time microscope

    DOEpatents

    Bennett, Corey Vincent [Livermore, CA

    2011-11-22

    A new technique for capturing both the amplitude and phase of an optical waveform is presented. This technique can capture signals with many THz of bandwidth in a single shot (e.g., temporal resolution of about 44 fs), or be operated repetitively at a high rate. That is, each temporal window (or frame) is captured single-shot, in real time, but the process may be run repeatedly or single-shot. This invention expands upon previous work in temporal imaging by adding heterodyning, which can be self-referenced for improved precision and stability, to convert frequency chirp (the second derivative of phase with respect to time) into a time-varying intensity modulation. By also including a variety of possible demultiplexing techniques, this process is scalable to recording continuous signals.

  10. How to mathematically optimize drug regimens using optimal control.

    PubMed

    Moore, Helen

    2018-02-01

    This article gives an overview of a technique called optimal control, which is used to optimize real-world quantities represented by mathematical models. I include background information about the historical development of the technique and applications in a variety of fields. The main focus here is the application to diseases and therapies, particularly the optimization of combination therapies, and I highlight several such examples. I also describe the basic theory of optimal control, and illustrate each of the steps with an example that optimizes the doses in a combination regimen for leukemia. References are provided for more complex cases. The article is aimed at modelers working in drug development, who have not used optimal control previously. My goal is to make this technique more accessible in the biopharma community.
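
    The article's optimal-control machinery is continuous-time (Pontryagin-style conditions on ODE models). Purely as a hedged illustration of the underlying idea of optimizing a control under dynamics, here is the discrete-time scalar linear-quadratic regulator solved by a backward Riccati recursion; the system numbers are made up and have nothing to do with the leukemia regimen in the article:

```python
def lqr_gains(A, B, Q, R, horizon):
    """Finite-horizon discrete-time LQR for the scalar system
    x[k+1] = A*x[k] + B*u[k], minimizing sum of Q*x^2 + R*u^2,
    via the backward Riccati recursion. Returns feedback gains
    so that the optimal control is u[k] = -gains[k] * x[k]."""
    P = Q                      # terminal cost-to-go
    gains = []
    for _ in range(horizon):
        K = (B * P * A) / (R + B * P * B)
        P = Q + A * P * A - A * P * B * K
        gains.append(K)
    gains.reverse()            # gains[k] applies at step k
    return gains
```

For an unstable plant (A = 1.2) the early gains converge to a stationary value that makes the closed loop stable, mirroring how an optimal dosing schedule settles into a steady regimen away from the end of treatment.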

  11. The application analysis of the multi-angle polarization technique for ocean color remote sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli

    2017-02-01

    The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for Earth observation. This method provides not only multi-angle light intensity data but also multi-angle information on polarized radiation, so the technique may solve problems that cannot be addressed with traditional remote sensing methods. Nowadays, the multi-angle polarization technique has become one of the hot topics in international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique; then the state of basic research and engineering applications is summarized and analysed for 1) the removal of sun glitter based on polarization, 2) ocean color remote sensing based on polarization, 3) oil spill detection using the polarization technique, and 4) ocean aerosol monitoring based on polarization. Finally, based on previous work, we briefly present the problems and prospects of the multi-angle polarization technique in China's ocean color remote sensing.

  12. Examples of challenges and opportunities in visual analysis in the digital humanities

    NASA Astrophysics Data System (ADS)

    Rushmeier, Holly; Pintus, Ruggero; Yang, Ying; Wong, Christiana; Li, David

    2015-03-01

    The massive digitization of books and manuscripts has converted millions of works that were once only physical into electronic documents. This conversion has made it possible for scholars to study large bodies of work, rather than just individual texts. This has offered new opportunities for scholarship in the humanities. Much previous work on digital collections has relied on optical character recognition and focused on the textual content of books. New work is emerging that is analyzing the visual layout and content of books and manuscripts. We present two different digital humanities projects in progress that present new opportunities for extracting data about the past, with new challenges for designing systems for scholars to interact with this data. The first project we consider is the layout and spectral content of thousands of pages from medieval manuscripts. We present the techniques used to study content variations in sets of similar manuscripts, and to study material variations that may indicate the location of manuscript production. The second project is the analysis of representations in the complete archive of Vogue magazine over 120 years. We present samples of applying computer vision techniques to understanding the changes in representation of women over time.

  13. Prediction of Return-to-original-work after an Industrial Accident Using Machine Learning and Comparison of Techniques

    PubMed Central

    2018-01-01

    Background Many studies have tried to develop predictors for return-to-work (RTW). However, since complex factors have been demonstrated to predict RTW, it is difficult to use them practically. This study investigated whether factors used in previous studies could predict whether an individual had returned to his/her original work by four years after termination of the worker's recovery period. Methods An initial logistic regression analysis of 1,567 participants of the fourth Panel Study of Worker's Compensation Insurance yielded odds ratios. The participants were divided into two subsets, a training dataset and a test dataset. Using the training dataset, logistic regression, decision tree, random forest, and support vector machine models were established, and important variables of each model were identified. The predictive abilities of the different models were compared. Results The analysis showed that only earned income and company-related factors significantly affected return-to-original-work (RTOW). The random forest model showed the best accuracy among the tested machine learning models; however, the difference was not prominent. Conclusion It is possible to predict a worker's probability of RTOW using machine learning techniques with moderate accuracy. PMID:29736160

  14. Characterization of the Saccharomyces cerevisiae ATP-Interactome using the iTRAQ-SPROX Technique

    NASA Astrophysics Data System (ADS)

    Geer, M. Ariel; Fitzgerald, Michael C.

    2016-02-01

    The stability of proteins from rates of oxidation (SPROX) technique was used in combination with an isobaric mass tagging strategy to identify adenosine triphosphate (ATP) interacting proteins in the Saccharomyces cerevisiae proteome. The SPROX methodology utilized in this work enabled 373 proteins in a yeast cell lysate to be assayed for ATP interactions (both direct and indirect) using the non-hydrolyzable ATP analog, adenylyl imidodiphosphate (AMP-PNP). A total of 28 proteins were identified with AMP-PNP-induced thermodynamic stability changes. These protein hits included 14 proteins that were previously annotated as ATP-binding proteins in the Saccharomyces Genome Database (SGD). The 14 non-annotated ATP-binding proteins included nine proteins that were previously found to be ATP-sensitive in an earlier SPROX study using a stable isotope labeling with amino acids in cell culture (SILAC)-based approach. A bioinformatics analysis of the protein hits identified here and in the earlier SILAC-SPROX experiments revealed that many of the previously annotated ATP-binding protein hits were kinases, ligases, and chaperones. In contrast, many of the newly discovered ATP-sensitive proteins were not from these protein classes, but rather were hydrolases, oxidoreductases, and nucleic acid-binding proteins.

  15. Partial discharge characteristics of polymer nanocomposite materials in electrical insulation: a review of sample preparation techniques, analysis methods, potential applications, and future trends.

    PubMed

    Izzati, Wan Akmal; Arief, Yanuar Z; Adzis, Zuraimy; Shafanizam, Mohd

    2014-01-01

    Polymer nanocomposites have recently been attracting attention among researchers in electrical insulating applications from energy storage to power delivery. However, partial discharge has always been a precursor to major faults and problems in this field. In addition, there is a lot more to explore, as neither the partial discharge characteristics of nanocomposites nor their electrical properties are clearly understood. By adding a small weight percentage (wt%) of nanofillers, the physical, mechanical, and electrical properties of polymers can be greatly enhanced. For instance, nanofillers such as silica (SiO2), alumina (Al2O3) and titania (TiO2) play a big role in providing a good approach to increasing the dielectric breakdown strength and partial discharge resistance of nanocomposites. Such polymer nanocomposites will be reviewed thoroughly in this paper, with the different experimental and analytical techniques used in previous studies. This paper also provides an academic review of partial discharge in polymer nanocomposites used as electrical insulating material, covering aspects of preparation, characteristics of the nanocomposites based on experimental works, application in power systems, methods and techniques of experiment and analysis, and future trends.

  16. Protocols for Late Maxillary Protraction in Cleft Lip and Palate Patients at Childrens Hospital Los Angeles

    PubMed Central

    Yen, Stephen L-K

    2011-01-01

    This paper describes the protocols used at Childrens Hospital Los Angeles (CHLA) to protract the maxilla during early adolescence. It is a modification of techniques introduced by Eric Liou with his Alternate Rapid Maxillary Expansion and Constriction (ALT-RAMEC) technique. The main differences between the CHLA protocol and previous maxillary protraction protocols are the age the protraction is attempted, the sutural loosening by alternating weekly expansion with constriction and the use of Class III elastics to support and redirect the protraction by nightly facemask wear. The CHLA protocol entirely depends on patient compliance and must be carefully taught and monitored. In a cooperative patient, the technique can correct a Class III malocclusion that previously would have been treated with LeFort 1 maxillary advancement surgery. Thus, it is not appropriate for patients requiring 2 jaw surgeries to correct mandibular prognathism, occlusal cants or facial asymmetry. The maxillary protraction appears to work by a combination of skeletal advancement, dental compensation and rotation of the occlusal planes. Microscrew/microimplant/temporary anchorage devices have been used with these maxillary protraction protocols to assist in expanding the maxilla, increasing skeletal anchorage during protraction, limiting dental compensations and reducing skeletal relapse. PMID:21765629

  17. A robust approach for a filter-based monocular simultaneous localization and mapping (SLAM) system.

    PubMed

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-07-03

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which is freely moving through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special feature-initialization techniques are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is to present a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique, which is intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes.
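
    Because a single bearing-only measurement carries no depth, initialization schemes like the one above must combine at least two angular observations before a landmark's range is defined. The following is a minimal, hypothetical 2-D geometry sketch of that idea (intersecting two bearing rays), not the paper's actual two-step algorithm:

```python
import math

def triangulate(p1, a1, p2, a2):
    """Intersect two 2-D bearing rays p_i + t_i * (cos a_i, sin a_i)."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        return None  # parallel rays: depth is unobservable from these two views
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# A landmark at (2, 1) observed from two camera positions:
x, y = triangulate((0.0, 0.0), math.atan2(1, 2), (1.0, 0.0), math.atan2(1, 1))
```

    The degenerate (parallel-ray) branch is the reason delayed or specially parameterized initialization is needed: until the camera translates enough to create parallax, the intersection is ill-conditioned.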

  18. Surgical management of first branchial cleft anomaly presenting as infected retroauricular mass using a microscopic dissection technique.

    PubMed

    Chan, Kai-Chieh; Chao, Wei-Chieh; Wu, Che-Ming

    2012-01-01

    This is a detailed description of the clinical and anatomical presentation of the first branchial cleft anomaly presenting as a retroauricular infected mass. Our experience with microscopic dissection with control of the sinus lumen from within the cyst is also described. Between 2001 and 2008, patients with a final histologic diagnosis of first branchial cleft anomaly in the retroauricular area were managed with a microscopic dissection technique with control of the sinus lumen from within the cyst. Classifications were done in accordance with Work, Olsen, and Chilla. The outcomes measured were disease recurrence and complications of the intervention, including facial nerve function. Eight patients with a mean age of 14.2 years were enrolled, including 4 females and 4 males. Four type 1 and 4 type 2 lesions per the Work and Chilla classifications were found, and there were 5 sinuses, 2 fistulae, and 1 cyst according to Olsen's classification. All patients presented to the department with acute infection at the time of diagnosis. Five of the 8 patients had previous surgical treatment; 2 of those had up to 3 previous operations. With more than 1 year of follow-up, no patient had disease recurrence or surgery-related complications (facial nerve paresis or paralysis, infection, canal stenosis) requiring reoperation. First branchial cleft anomaly presenting as a retroauricular infected mass can be effectively treated by adopting a microscopic dissection technique with control of the sinus lumen from within the cyst. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Methods and Piezoelectric Imbedded Sensors for Damage Detection in Composite Plates Under Ambient and Cryogenic Conditions

    NASA Technical Reports Server (NTRS)

    Engberg, Robert; Ooi, Teng K.

    2004-01-01

    New methods for structural health monitoring are being assessed, especially in high-performance, extreme environment, safety-critical applications. One such application is for composite cryogenic fuel tanks. The work presented here attempts to characterize and investigate the feasibility of using imbedded piezoelectric sensors to detect cracks and delaminations under cryogenic and ambient conditions. A variety of damage detection methods and different sensors are employed in the different composite plate samples to aid in determining an optimal algorithm, sensor placement strategy, and type of imbedded sensor to use. Variations of frequency, impedance measurements, and pulse-echo techniques of the sensors are employed and compared. Statistical and analytic techniques are then used to determine which method is most desirable for a specific type of damage. These results are furthermore compared with previous work using externally mounted sensors. Results and optimized methods from this work can then be incorporated into a larger composite structure to validate and assess its structural health. This could prove to be important in the development and qualification of any 2nd-generation reusable launch vehicle using composites as a structural element.

  20. High-speed time-reversed ultrasonically encoded (TRUE) optical focusing inside dynamic scattering media at 793 nm

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Lai, Puxiang; Ma, Cheng; Xu, Xiao; Suzuki, Yuta; Grabar, Alexander A.; Wang, Lihong V.

    2014-03-01

    Time-reversed ultrasonically encoded (TRUE) optical focusing is an emerging technique that focuses light deep into scattering media by phase-conjugating ultrasonically encoded diffuse light. In previous work, the speed of TRUE focusing was limited to no faster than 1 Hz by the response time of the photorefractive phase conjugate mirror, or the data acquisition and streaming speed of the digital camera; photorefractive-crystal-based TRUE focusing was also limited to the visible spectral range. These time-consuming schemes prevent this technique from being applied in vivo, since living biological tissue has a speckle decorrelation time on the order of a millisecond. In this work, using a Te-doped Sn2P2S6 photorefractive crystal at a near-infrared wavelength of 793 nm, we achieved TRUE focusing inside dynamic scattering media having a speckle decorrelation time as short as 7.7 ms. As the achieved speed approaches the tissue decorrelation rate, this work is an important step toward in vivo applications of TRUE focusing in deep tissue imaging, photodynamic therapy, and optical manipulation.

  1. Quantum dynamics calculations using symmetrized, orthogonal Weyl-Heisenberg wavelets with a phase space truncation scheme. III. Representations and calculations.

    PubMed

    Poirier, Bill; Salam, A

    2004-07-22

    In a previous paper [J. Theo. Comput. Chem. 2, 65 (2003)], one of the authors (B.P.) presented a method for solving the multidimensional Schrodinger equation, using modified Wilson-Daubechies wavelets, and a simple phase space truncation scheme. Unprecedented numerical efficiency was achieved, enabling a ten-dimensional calculation of nearly 600 eigenvalues to be performed using direct matrix diagonalization techniques. In a second paper [J. Chem. Phys. 121, 1690 (2004)], and in this paper, we extend and elaborate upon the previous work in several important ways. The second paper focuses on construction and optimization of the wavelet functions, from theoretical and numerical viewpoints, and also examines their localization. This paper deals with their use in representations and eigenproblem calculations, which are extended to 15-dimensional systems. Even higher dimensionalities are possible using more sophisticated linear algebra techniques. This approach is ideally suited to rovibrational spectroscopy applications, but can be used in any context where differential equations are involved.

  2. A Synthesis of Solar Cycle Prediction Techniques

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.; Wilson, Robert M.; Reichmann, Edwin J.

    1999-01-01

    A number of techniques currently in use for predicting solar activity on a solar cycle timescale are tested with historical data. Some techniques, e.g., regression and curve fitting, work well as solar activity approaches maximum and provide a month-by-month description of future activity, while others, e.g., geomagnetic precursors, work well near solar minimum but only provide an estimate of the amplitude of the cycle. A synthesis of different techniques is shown to provide a more accurate and useful forecast of solar cycle activity levels. A combination of two uncorrelated geomagnetic precursor techniques provides a more accurate prediction for the amplitude of a solar activity cycle at a time well before activity minimum. This combined precursor method gives a smoothed sunspot number maximum of 154 plus or minus 21 at the 95% level of confidence for the next cycle maximum. A mathematical function dependent on the time of cycle initiation and the cycle amplitude is used to describe the level of solar activity month by month for the next cycle. As the time of cycle maximum approaches, a better estimate of the cycle activity is obtained by including the fit between previous activity levels and this function. This Combined Solar Cycle Activity Forecast gives, as of January 1999, a smoothed sunspot maximum of 146 plus or minus 20 at the 95% level of confidence for the next cycle maximum.
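
    In the simplest case, combining two uncorrelated precursor estimates amounts to an inverse-variance weighted average, which yields a tighter confidence interval than either input alone. The sketch below uses illustrative numbers, not the paper's actual precursor values or its exact combination procedure:

```python
def combine(estimates):
    """Inverse-variance weighted mean of independent (value, sigma) estimates."""
    weights = [1.0 / s**2 for _, s in estimates]
    mean = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5  # combined one-sigma uncertainty
    return mean, sigma

# Two hypothetical, independent precursor predictions of cycle amplitude:
mean, sigma = combine([(150.0, 20.0), (158.0, 25.0)])
```

    The combined sigma is always smaller than the smallest input sigma, which is why merging independent precursors sharpens the amplitude forecast.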

  3. Multiscale Analysis of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C. A.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with a larger amount of higher-complexity data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods may be well suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by observers' eyes and brains. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
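
    One level of a 2-D wavelet decomposition can be sketched with the simplest wavelet, the Haar transform: averaging and differencing along rows and then columns splits an image into a coarse approximation plus detail coefficients at that scale. This is a minimal pure-Python illustration of the principle only; actual solar-image analysis would use an optimized wavelet library and a more sophisticated basis:

```python
def haar_step(row):
    """One Haar step on an even-length sequence: pairwise averages, then differences."""
    avg = [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    dif = [(row[2 * i] - row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    return avg + dif

def haar2d(image):
    """One level of the separable 2-D Haar transform (rows, then columns)."""
    rows = [haar_step(r) for r in image]
    cols = [haar_step([rows[i][j] for i in range(len(rows))])
            for j in range(len(rows[0]))]
    # Transpose back so out[i][j] indexes like the input image.
    return [[cols[j][i] for j in range(len(cols))] for i in range(len(rows))]

# A 2x2 "image" with a vertical brightness gradient:
out = haar2d([[4.0, 4.0], [2.0, 2.0]])
```

    Here the top-left coefficient is the mean intensity (coarse scale) and the nonzero detail coefficient isolates the vertical gradient, which is the kind of scale-separated, quantitative description the abstract argues for.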

  4. Venture Evaluation and Review Technique (VERT). Users’/Analysts’ Manual

    DTIC Science & Technology

    1979-10-01

    real world. Additionally, activity processing times could be entered as a normal, uniform or triangular distribution. Activity times can also be...work or tasks, or if the unit activities are such abstractions of the real world that the estimation of the time, cost and performance parameters for...utilized in that constraining capacity. 7444 The network being processed has passed all the previous error checks. It currently has a real time

  5. Absolute rate of the reaction of Cl(2P) with molecular hydrogen from 200-500 K

    NASA Technical Reports Server (NTRS)

    Whytock, D. A.; Lee, J. H.; Michael, J. V.; Payne, W. A.; Stief, L. J.

    1976-01-01

    Rate constants for the reaction of atomic chlorine with hydrogen are measured from 200 to 500 K using the flash photolysis-resonance fluorescence technique. The results are compared with previous work and are discussed with particular reference to the equilibrium constant for the reaction and to relative rate data for chlorine atom reactions. Theoretical calculations, using the BEBO method with tunneling, give excellent agreement with experiment.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). This software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The software implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  7. Do Expert Swimmers Have Expert Technique? Comment on "Arm Coordination and Performance Level in the 400-m Front Crawl" by Schnitzler, Seifert, and Chollet (2011)

    ERIC Educational Resources Information Center

    Havriluk, Rod

    2012-01-01

    In a recent article, Schnitzler, Seifert, and Chollet (2011) used an index of coordination (IdC) to quantify arm synchronization in swimming; the index has become a practical standard for measuring gaps (negative IdC) and overlaps (positive IdC) in arm propulsion. Their previous work supported an increase in IdC as swimming velocity and…

  8. Electro-pumped whispering gallery mode ZnO microlaser array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, G. Y.; State Key Laboratory of Bioelectronics, School of Electronic Science and Engineering, Southeast University, Nanjing 210096; Li, J. T.

    2015-01-12

    By employing a vapor-phase transport method, ZnO microrods are fabricated and directly assembled on a p-GaN substrate to form a heterostructural microlaser array, which avoids the relatively complicated etching process required in previous work. Under applied forward bias, whispering gallery mode ZnO ultraviolet lasing is obtained from the as-fabricated heterostructural microlaser array. The device's electroluminescence originates from three distinct electron-hole recombination processes in the heterojunction interface, and whispering gallery mode ultraviolet lasing is obtained when the applied voltage is beyond the lasing threshold. This work may represent a significant step toward a facile fabrication technique for future micro/nanolasers.

  9. Nanostructure studies of strongly correlated materials.

    PubMed

    Wei, Jiang; Natelson, Douglas

    2011-09-01

    Strongly correlated materials exhibit an amazing variety of phenomena, including metal-insulator transitions, colossal magnetoresistance, and high temperature superconductivity, as strong electron-electron and electron-phonon couplings lead to competing correlated ground states. Recently, researchers have begun to apply nanostructure-based techniques to this class of materials, examining electronic transport properties on previously inaccessible length scales, and applying perturbations to drive systems out of equilibrium. We review progress in this area, particularly emphasizing work in transition metal oxides (Fe(3)O(4), VO(2)), manganites, and high temperature cuprate superconductors. We conclude that such nanostructure-based studies have strong potential to reveal new information about the rich physics at work in these materials.

  10. Security in MANETs using reputation-adjusted routing

    NASA Astrophysics Data System (ADS)

    Ondi, Attila; Hoffman, Katherine; Perez, Carlos; Ford, Richard; Carvalho, Marco; Allen, William

    2009-04-01

    Mobile Ad-Hoc Networks enable communication in various dynamic environments, including military combat operations. Their open and shared communication medium enables new forms of attack that are not applicable to traditional wired networks. Traditional security mechanisms and defense techniques are not prepared to cope with the new attacks, and the lack of central authorities makes identity verification difficult. This work extends our previous work on the Biologically Inspired Tactical Security Infrastructure to provide a reputation-based weighing mechanism for link-state routing protocols to protect the network from attackers that are corrupting legitimate network traffic. Our results indicate that the approach is successful in routing network traffic around compromised computers.
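
    One simple way to fold reputation into link-state routing is to inflate each link's cost by the inverse of the next hop's reputation, so that ordinary shortest-path computation automatically detours around suspected nodes. The sketch below is a hypothetical illustration of that weighting idea using Dijkstra's algorithm, not the actual mechanism implemented in the paper:

```python
import heapq

def shortest_path(links, reputation, src, dst):
    """Dijkstra over links whose cost is divided by the next hop's reputation.

    links: {node: [(neighbor, base_cost), ...]}; reputation: {node: score in (0, 1]}.
    """
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost in links.get(u, []):
            nd = d + cost / reputation.get(v, 1.0)  # low reputation => high cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

links = {"A": [("B", 1), ("C", 1)], "B": [("D", 1)], "C": [("D", 1)]}
# With B suspected of corrupting traffic (reputation 0.2), routing prefers C:
route = shortest_path(links, {"B": 0.2, "C": 1.0, "D": 1.0}, "A", "D")
```

    With equal base costs, the low-reputation hop through B now costs 5x more, so traffic is routed around the suspected node without any change to the routing algorithm itself.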

  11. Muscle length changes during swimming in scup: sonomicrometry verifies the anatomical high-speed cine technique.

    PubMed

    Coughlin, D J; Valdes, L; Rome, L C

    1996-02-01

    Recent attempts to determine how fish muscles are used to power swimming have employed the work loop technique (driving isolated muscles using their in vivo strain and stimulation pattern). These muscle strains have in turn been determined from the anatomical high-speed cine technique. In this study, we used an independent technique, sonomicrometry, to attempt to verify these strain measurements and the conclusions based on them. We found that the strain records measured from sonomicrometry and the anatomical-cine techniques were very similar. The ratio of the strain measured from sonomicrometry to that from the anatomical-cine technique was remarkably close to unity (1.046 +/- 0.013, mean +/- S.E.M., N = 15, for transducers placed on the muscle surface and corrected for muscle depth, and 0.921 +/- 0.028, N = 8, in cases where the transducers were inserted to the average depth of the red muscle). These measurements also showed that red muscle shortening occurs simultaneously with local backbone curvature, unlike previous results which suggested that white muscle shortening during the escape response occurs prior to the change in local backbone curvature.

  12. Violence prevention in special education schools - an integrated practice?

    PubMed

    Pihl, Patricia; Grytnes, Regine; Andersen, Lars Peter S

    2018-06-01

    Research has shown that employees in special education settings are at high risk for work-related threats and violence. Previous research has not yet been able to identify the essential components of training programs that offer protection from work-related threats and violence. Therefore, the aim of this study was to explore how employees in special education schools deal with prevention of work-related threats and violence. Group interviews were conducted with 14 employees working at 5 special education schools. Results show that employees use a wide range of prevention strategies drawing on specific violence prevention techniques as well as professional pedagogical approaches. We propose that the prevention of threats and violence in special education schools can be understood as an integrated pedagogical practice operating on three interrelated levels. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. The use of genetic programming to develop a predictor of swash excursion on sandy beaches

    NASA Astrophysics Data System (ADS)

    Passarella, Marinella; Goldstein, Evan B.; De Muro, Sandro; Coco, Giovanni

    2018-02-01

    We use genetic programming (GP), a type of machine learning (ML) approach, to predict the total and infragravity swash excursion using previously published data sets that have been used extensively in swash prediction studies. Three previously published works with a range of new conditions are added to this data set to extend the range of measured swash conditions. Using this newly compiled data set, we demonstrate that a ML approach can reduce the prediction errors compared to well-established parameterizations and therefore may improve coastal hazards assessment (e.g. coastal inundation). Predictors obtained using GP can also be physically sound and replicate the functionality and dependencies of previously published formulas. Overall, we show that ML techniques are capable of both improving predictability (compared to classical regression approaches) and providing physical insight into coastal processes.

  14. Microaneurysm detection with radon transform-based classification on retina images.

    PubMed

    Giancardo, L; Meriaudeau, F; Karnowski, T P; Li, Y; Tobin, K W; Chaum, E

    2011-01-01

    The creation of an automatic diabetic retinopathy screening system using retina cameras is currently receiving considerable interest in the medical imaging community. The detection of microaneurysms is a key element in this effort. In this work, we propose a new microaneurysm segmentation technique based on a novel application of the radon transform, which is able to identify these lesions without any previous knowledge of the retina morphological features and with minimal image preprocessing. The algorithm has been evaluated on the Retinopathy Online Challenge public dataset, and its performance is comparable to that of the best current techniques. The performance is particularly good at low false positive ratios, which makes it an ideal candidate for diabetic retinopathy screening systems.
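
    The discrete radon transform sums image intensities along lines at a set of angles, so a compact, roughly round lesion such as a microaneurysm produces nearly identical projections at every angle, while elongated structures (e.g. vessels) do not. The toy sketch below shows only the two axis-aligned projections of that idea in pure Python; the paper's actual algorithm, angle sampling, and classification stage are more involved:

```python
def project(image, axis):
    """Sum a 2-D list of intensities along rows (axis=0) or columns (axis=1)."""
    if axis == 0:
        return [sum(image[i][j] for i in range(len(image)))
                for j in range(len(image[0]))]
    return [sum(row) for row in image]

# A bright 1-pixel "lesion" on a dark background:
img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
# For a round spot, horizontal and vertical projections match each other:
p0, p1 = project(img, 0), project(img, 1)
```

    Angle-invariance of the projection profile is the cue such a detector can exploit without any prior model of retinal morphology.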

  15. Prediction of light aircraft interior noise

    NASA Technical Reports Server (NTRS)

    Howlett, J. T.; Morales, D. A.

    1976-01-01

    At the present time, predictions of aircraft interior noise depend heavily on empirical correction factors derived from previous flight measurements. However, to design for acceptable interior noise levels and to optimize acoustic treatments, analytical techniques which do not depend on empirical data are needed. This paper describes a computerized interior noise prediction method for light aircraft. An existing analytical program (developed for commercial jets by Cockburn and Jolly in 1968) forms the basis of some modal analysis work which is described. The accuracy of this modal analysis technique for predicting low-frequency coupled acoustic-structural natural frequencies is discussed along with trends indicating the effects of varying parameters such as fuselage length and diameter, structural stiffness, and interior acoustic absorption.

  16. A Robust Absorbing Boundary Condition for Compressible Flows

    NASA Technical Reports Server (NTRS)

    Loh, Ching Y.; Jorgenson, Philip C. E.

    2005-01-01

    An absorbing non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented with theoretical proof. This paper is a continuation and improvement of a previous paper by the author. The absorbing NRBC technique is based on a first principle of nonreflection, which contains the essential physics that a plane wave solution of the Euler equations remains intact across the boundary. The technique is theoretically shown to work for a large class of finite volume approaches. When combined with the hyperbolic conservation laws, the NRBC is simple, robust and truly multi-dimensional; no additional implementation is needed except the prescribed physical boundary conditions. Several numerical examples in multi-dimensional spaces using two different finite volume schemes are illustrated to demonstrate its robustness in practical computations. Limitations and remedies of the technique are also discussed.

  17. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in causal reasoning has provided practical techniques for multiple-fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle: using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
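
    The test-input selection step can be caricatured as follows: given a probability distribution over fault hypotheses, pick the input whose predicted observations split the hypotheses most informatively (greatest expected entropy reduction). This is a hypothetical toy model of that information-theoretic idea, not the MPC implementation or its actual mathematics:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def best_input(hypotheses, predict, inputs):
    """Pick the test input with the greatest expected entropy reduction.

    hypotheses: {fault: prior prob}; predict(fault, x) -> predicted observation.
    """
    h0 = entropy(hypotheses.values())

    def expected_posterior_entropy(x):
        # Group hypotheses by the observation they predict for input x.
        groups = {}
        for fault, p in hypotheses.items():
            groups.setdefault(predict(fault, x), []).append(p)
        # Expected entropy = sum over outcomes of P(outcome) * H(posterior).
        return sum(sum(g) * entropy([p / sum(g) for p in g])
                   for g in groups.values())

    return max(inputs, key=lambda x: h0 - expected_posterior_entropy(x))

# Two equally likely faults; input 1 distinguishes them, input 0 does not:
predict = lambda fault, x: fault if x == 1 else 0
choice = best_input({"f1": 0.5, "f2": 0.5}, predict, [0, 1])
```

    An input under which all hypotheses predict the same observation yields zero expected information, so the selector always prefers inputs that force the competing faults to disagree.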

  18. Runtime support for parallelizing data mining algorithms

    NASA Astrophysics Data System (ADS)

    Jin, Ruoming; Agrawal, Gagan

    2002-03-01

    With recent technological advances, shared memory parallel machines have become more scalable, and offer large main memories and high bus bandwidths. They are emerging as good platforms for data warehousing and data mining. In this paper, we focus on shared memory parallelization of data mining algorithms. We have developed a series of techniques for parallelization of data mining algorithms, including full replication, full locking, fixed locking, optimized full locking, and cache-sensitive locking. Unlike previous work on shared memory parallelization of specific data mining algorithms, all of our techniques apply to a large number of common data mining algorithms. In addition, we propose a reduction-object based interface for specifying a data mining algorithm. We show how our runtime system can apply any of the techniques we have developed starting from a common specification of the algorithm.
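
    Of the techniques listed, full replication is the simplest to sketch: each thread accumulates into a private copy of the reduction object, so no locking is needed during the scan, and the copies are merged once at the end. A minimal illustration for a shared count table (a generic sketch of the technique, not the paper's runtime system):

```python
import threading
from collections import Counter

def parallel_count(chunks):
    """Full-replication reduction: one private Counter per thread, merged last."""
    locals_ = [Counter() for _ in chunks]

    def work(idx, chunk):
        for item in chunk:        # no lock needed: this Counter is thread-private
            locals_[idx][item] += 1

    threads = [threading.Thread(target=work, args=(i, c))
               for i, c in enumerate(chunks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    total = Counter()
    for c in locals_:             # sequential merge phase
        total += c
    return total

counts = parallel_count([["a", "b", "a"], ["b", "b"]])
```

    The trade-off the paper's other schemes address is visible here: replication eliminates lock contention but multiplies memory use by the thread count, which is why locking-based variants matter for large reduction objects.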

  19. Enantioresolution in electrokinetic chromatography-complete filling technique using sulfated gamma-cyclodextrin. Software-free topological anticipation.

    PubMed

    Escuder-Gilabert, Laura; Martín-Biosca, Yolanda; Medina-Hernández, María José; Sagrado, Salvador

    2016-10-07

    Few papers have tried to predict the resolution ability of chiral selectors in capillary electrophoresis for the separation of the enantiomers of chiral compounds. In a previous work, we used molecular information available on-line to establish enantioresolution levels of basic compounds using highly sulfated β-CD (HS-β-CD) as chiral selector in electrokinetic chromatography-complete filling technique (EKC-CFT). The present study is a continuation of this previous work, introducing some novelties. In this work, the ability of sulfated γ-cyclodextrin (S-γ-CD) as chiral selector in EKC-CFT is modelled for the first time. Thirty-three structurally unrelated cationic and neutral compounds (drugs and pesticides) are studied. Categorical enantioresolution levels (RsC, 0 or 1) are assigned from experimental enantioresolution values obtained at different S-γ-CD concentrations. Novel topological parameters connected to the chiral carbon (C*-parameters) are introduced. Four C*-parameters and a topological parameter of the whole molecule (aromatic atom count) are the most important variables according to a discriminant partial least squares-variable selection process. This suggests that the topology adjacent to the chiral carbon is the most important factor in anticipating the RsC levels. A software-free anticipation protocol for new molecules is proposed. Over the current set of molecules evaluated, 100% of correct anticipations (resolved and non-resolved compounds) are obtained, while anticipation of some compounds remains undetermined. A criterion is introduced to alert on compounds which should not be anticipated. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Parser Combinators: a Practical Application for Generating Parsers for NMR Data

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Ellis, Heidi JC; Gryk, Michael R.

    2013-01-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is a technique for acquiring protein data at atomic resolution and determining the three-dimensional structure of large protein molecules. A typical structure determination process results in the deposition of large data sets to the BMRB (Bio-Magnetic Resonance Data Bank). This data is stored and shared in a file format called NMR-Star. This format is syntactically and semantically complex, making it challenging to parse. Nevertheless, parsing these files is crucial to applying the vast amounts of biological information stored in NMR-Star files, allowing researchers to harness the results of previous studies to direct and validate future work. One powerful approach for parsing files is to apply a Backus-Naur Form (BNF) grammar, which is a high-level model of a file format. Translation of the grammatical model to an executable parser may be automatically accomplished. This paper will show how we applied a model BNF grammar of the NMR-Star format to create a free, open-source parser, using a method that originated in the functional programming world known as “parser combinators”. This paper demonstrates the effectiveness of a principled approach to file specification and parsing. This paper also builds upon our previous work [1], in that 1) it applies concepts from Functional Programming (which is relevant even though the implementation language, Java, is more mainstream than Functional Programming), and 2) all work and accomplishments from this project will be made available under standard open source licenses to provide the community with the opportunity to learn from our techniques and methods. PMID:24352525
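
    The core idea of parser combinators is that a parser is an ordinary function from input (and a position) to either failure or a (value, new position) pair, and small parsers compose into larger ones that mirror the grammar. A minimal sketch of the idea in Python (the paper's parser is in Java and covers the full NMR-Star grammar; the names here are illustrative):

```python
def char(c):
    """Parser matching one literal character."""
    def parse(s, i):
        return (c, i + 1) if i < len(s) and s[i] == c else None
    return parse

def seq(*parsers):
    """Run parsers in order, collecting their results into a list."""
    def parse(s, i):
        out = []
        for p in parsers:
            r = p(s, i)
            if r is None:
                return None  # any sub-parser failing fails the sequence
            v, i = r
            out.append(v)
        return out, i
    return parse

def many(p):
    """Zero or more repetitions of p."""
    def parse(s, i):
        out = []
        while (r := p(s, i)) is not None:
            v, i = r
            out.append(v)
        return out, i
    return parse

# Grammar: "a" then "b" then any number of "c":
parser = seq(char("a"), char("b"), many(char("c")))
result = parser("abccc", 0)
```

    Because each combinator returns a plain function, alternation, mapping, and lexing layers can be stacked the same way, which is what makes the approach track a BNF grammar so directly.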

  1. A case-crossover study of transient risk factors influence on occupational injuries: a study protocol based on a review of previous studies.

    PubMed

    Oesterlund, Anna H; Lander, Flemming; Lauritsen, Jens

    2016-10-01

    The occupational injury incidence rate remains relatively high in the European Union. The case-crossover study gives a unique opportunity to study transient risk factors that normally would be very difficult to approach. Studies like this have been carried out in both America and Asia, but so far no relevant research has been conducted in Europe. Case-crossover studies of occupational injuries were collected from PubMed and Embase and reviewed. Previous experiences concerning method, exposure and outcome, time-related measurements and construction of the questionnaire were taken into account in the preparation of a pilot study. Consequently, experiences from the pilot study were used to design the study protocol. Approximately 2000 patients with an occupational injury will be recruited from the emergency departments in Herning and Odense, Denmark. A standardised questionnaire will be used to collect basic demographic data and information on eight transient risk factors. Based on previous studies and knowledge on occupational injuries, the transient risk factors we chose to examine were: time pressure, performing a task with a different method/using an unaccustomed technique, change in working surroundings, using a phone, disagreement, feeling ill, being distracted and using malfunctioning machinery/tools or work material. Exposure time 'just before the injury' will be compared with two control periods, 'previous day at the same time of the injury' (pair match) and the previous work week (usual frequency). This study protocol describes a unique opportunity to calculate the effect of transient risk factors on occupational injuries in a European setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations

    PubMed Central

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

    Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identification of the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more different graphlets emerge and the time needed to find each graphlet also scales up. If it is not needed to find each instance of each graphlet, but knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations only existed to count graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first generates the needed equations automatically, eliminating the tedious work of deriving them manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
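
    The smallest instance of the counting-via-common-neighbors idea is the triangle: every common neighbor of an edge's endpoints closes one triangle, so per-vertex triangle counts follow from set intersections rather than explicit subgraph enumeration. A small sketch of that base case (illustrative only; the paper derives whole systems of such equations automatically for much larger graphlets):

```python
def triangles_per_vertex(adj):
    """adj: {vertex: set of neighbors}. Count triangles touching each vertex."""
    counts = {v: 0 for v in adj}
    for u in adj:
        for v in adj[u]:
            if u < v:
                # Every common neighbor of u and v closes one triangle.
                for w in adj[u] & adj[v]:
                    if w > v:  # enforce u < v < w: count each triangle once
                        for x in (u, v, w):
                            counts[x] += 1
    return counts

# A 4-cycle 0-1-2-3 plus the chord 0-2 contains exactly two triangles:
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
tri = triangles_per_vertex(adj)
```

    The imposed ordering u < v < w that prevents double counting is a tiny analogue of the unique graphlet ordering the paper's second technique provides for arbitrary sizes.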

  3. Non-Markovian near-infrared Q branch of HCl diluted in liquid Ar.

    PubMed

    Padilla, Antonio; Pérez, Justo

    2013-08-28

    By using a non-Markovian spectral theory based on the Kubo cumulant expansion technique, we have qualitatively studied the infrared Q branch observed in the fundamental absorption band of HCl diluted in liquid Ar. The statistical parameters of the anisotropic interaction appearing in this spectral theory were calculated by means of molecular dynamics techniques, and the values of the anisotropic correlation times were found to be significantly greater (by a factor of two) than those previously obtained by fitting procedures or microscopic cell models. This fact is decisive for the observation in the theoretical spectral band of a central Q resonance, which is absent in the many previous studies carried out with the usual theories based on Kubo cumulant expansion techniques. Although the theory used in this work only allows a qualitative study of the Q branch, we can employ it to study the unknown characteristics of the Q resonance, which are difficult to obtain with the recently developed quantum simulation techniques. For example, in this study we have found that the Q branch is basically a non-Markovian (or memory) effect produced by spectral line interferences, where the PR interferential profile basically determines the Q branch spectral shape. Furthermore, we have found that the Q resonance is principally generated by the first rotational states of the first two vibrational levels, those most affected by the action of the solvent.

  4. Guidance of Nonlinear Nonminimum-Phase Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Devasia, Santosh

    1996-01-01

    The research work has advanced the inversion-based guidance theory for: systems with non-hyperbolic internal dynamics; systems with parameter jumps; and systems where a redesign of the output trajectory is desired. A technique to achieve output tracking for nonminimum-phase linear systems with non-hyperbolic and near non-hyperbolic internal dynamics was developed. This approach integrated stable inversion techniques, which achieve exact tracking, with approximation techniques, which modify the internal dynamics to achieve desirable performance. Such modification of the internal dynamics was used (a) to remove non-hyperbolicity, which is an obstruction to applying stable inversion techniques, and (b) to reduce the large preactuation times needed to apply stable inversion in near non-hyperbolic cases. The method was applied to an example helicopter hover control problem with near non-hyperbolic internal dynamics to illustrate the trade-off between exact tracking and reduction of preactuation time. Future work will extend these results to guidance of nonlinear non-hyperbolic systems. The exact output tracking problem for systems with parameter jumps was also considered. Necessary and sufficient conditions were derived for the elimination of switching-introduced output transients. While previous works studied this problem by developing a regulator that maintains exact tracking through parameter jumps (switches), such techniques are only applicable to minimum-phase systems. In contrast, our approach is also applicable to nonminimum-phase systems and leads to bounded but possibly non-causal solutions. In addition, for the case when the reference trajectories are generated by an exosystem, we developed an exact-tracking controller that can be written in feedback form. As in standard regulator theory, we also obtained a linear map from the states of the exosystem to the desired system state, which was defined via a matrix differential equation.

  5. Measurement of fracture properties of concrete at high strain rates

    PubMed Central

    Cendón, D. A.; Sánchez-Gálvez, V.; Gálvez, F.

    2017-01-01

    An analysis of the spalling technique for concrete bars using the modified Hopkinson bar was carried out. A new experimental configuration is proposed, adding some variations to previous works. An increased length for the concrete specimens was chosen, and finite-element analysis was used to design a conic projectile that produces a suitable triangular impulse wave. The aim of this initial work is to establish an experimental framework that allows a simple and direct analysis of concrete subjected to high strain rates. The configuration of these primary tests, as well as the selected geometry and dimensions of the different elements, has focused on achieving a simple way of identifying the fracture position and thus the tensile strength of the tested specimens. This dynamic tensile strength can easily be compared with values previously published in the literature, giving an idea of the accuracy of the proposed method and technique and of the possibility of extending it in the near future to obtain other mechanical properties such as the fracture energy. The tests were instrumented with strain gauges, accelerometers and a high-speed camera in order to validate the results in different ways. Results for the dynamic tensile strength of the tested concrete are presented. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956510

  6. Active room compensation for sound reinforcement using sound field separation techniques.

    PubMed

    Heuchel, Franz M; Fernandez-Grande, Efren; Agerkvist, Finn T; Shabalina, Elena

    2018-03-01

    This work investigates how the sound field created by a sound reinforcement system can be controlled at low frequencies. An indoor control method is proposed which actively absorbs the sound incident on a reflecting boundary using an array of secondary sources. The sound field is separated into incident and reflected components by a microphone array close to the secondary sources, enabling the minimization of the reflected components by means of optimal signals for the secondary sources. The method is purely feed-forward and assumes constant room conditions. Three different sound field separation techniques for modeling the reflections are investigated, based on plane wave decomposition, equivalent sources, and the spatial Fourier transform. Simulations and an experimental validation are presented, showing that the control method performs similarly well at enhancing low-frequency responses with all three sound separation techniques. Resonances in the entire room are reduced, although the microphone array and secondary sources are confined to a small region close to the reflecting wall. Unlike previous control methods based on the creation of a plane wave sound field, the investigated method works for arbitrary room geometries and primary source positions.

  7. Laser-induced dissociation processes of protonated glucose: dehydration reactions vs cross-ring dissociation

    NASA Astrophysics Data System (ADS)

    Dyakov, Y. A.; Kazaryan, M. A.; Golubkov, M. G.; Gubanova, D. P.; Bulychev, N. A.; Kazaryan, S. M.

    2018-04-01

    Studying the processes that occur in biological systems under irradiation is critically important for understanding how such systems work. One of the main problems stimulating interest in photo-induced excitation and ionization of biomolecules is the need to identify them by various mass spectrometry (MS) methods. While simple analysis of small molecules became a standard MS technique a long time ago, recognition of large molecules, especially carbohydrates, is still a difficult problem, and requires sophisticated techniques and complicated computer analysis. Owing to the large variety of substances in the samples, as well as the complexity of the processes that follow excitation/ionization of the molecules, the recognition efficiency of MS techniques for carbohydrates is still not high enough. Additional theoretical and experimental analysis of ionization and dissociation processes in various kinds of polysaccharides, beginning with the simplest ones, is necessary. In this work, we extend previous theoretical and experimental studies of saccharides and concentrate on protonated glucose. We pay particular attention to cross-ring dissociation and water-loss reactions because of their importance for identifying various isomers of carbohydrate molecules (for example, distinguishing α- and β-glucose).

  8. Real-time vehicle noise cancellation techniques for gunshot acoustics

    NASA Astrophysics Data System (ADS)

    Ramos, Antonio L. L.; Holm, Sverre; Gudvangen, Sigmund; Otterlei, Ragnvald

    2012-06-01

    Acoustical sniper positioning systems rely on the detection and direction-of-arrival (DOA) estimation of the shockwave and the muzzle blast in order to provide an estimate of a potential sniper's location. Field tests have shown that detecting and estimating the DOA of the muzzle blast is a rather difficult task in the presence of background noise sources, e.g., vehicle noise, especially in long-range detection and absorbing terrains. In our previous work, presented in the 2011 edition of this conference, we highlighted the importance of improving the SNR of the gunshot signals prior to the detection and recognition stages, aiming at lowering the false alarm and miss-detection rates and thereby increasing the reliability of the system. This paper reports on real-time noise cancellation techniques, such as spectral subtraction and adaptive filtering, applied to gunshot signals. Our model assumes the background noise to be short-time stationary and uncorrelated with the impulsive gunshot signals. In practice, relatively long periods without signal occur and can be used to estimate the noise spectrum and its first- and second-order statistics, as required by the spectral subtraction and adaptive filtering techniques, respectively. The results presented in this work are supported by extensive simulations based on real data.
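
    Of the techniques mentioned, spectral subtraction is straightforward to sketch: estimate the average noise magnitude spectrum from signal-free frames, subtract it from the magnitude of each frame of the noisy recording, and resynthesize with the original phase. A minimal illustration (not the authors' implementation; the frame length and the zero floor are illustrative choices):

```python
import numpy as np

def spectral_subtraction(signal, noise_only, frame=256):
    """Subtract an average noise magnitude spectrum, estimated from a
    noise-only segment, from each frame of the noisy signal."""
    # average noise magnitude spectrum over noise-only frames
    n_frames = [noise_only[i:i + frame]
                for i in range(0, len(noise_only) - frame + 1, frame)]
    noise_mag = np.mean([np.abs(np.fft.rfft(f)) for f in n_frames], axis=0)
    out = np.zeros(len(signal))   # any trailing partial frame is dropped
    for i in range(0, len(signal) - frame + 1, frame):
        spec = np.fft.rfft(signal[i:i + frame])
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)   # floor at zero
        out[i:i + frame] = np.fft.irfft(mag * np.exp(1j * np.angle(spec)),
                                        n=frame)
    return out
```

    The noise-only segment plays the role of the signal-free periods mentioned above, from which the noise statistics are estimated.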

  9. High-resolution spatiotemporal mapping of PM2.5 concentrations at Mainland China using a combined BME-GWR technique

    NASA Astrophysics Data System (ADS)

    Xiao, Lu; Lang, Yichao; Christakos, George

    2018-01-01

    With rapid economic development, industrialization and urbanization, ambient PM2.5 has become a major air pollutant linked to respiratory, heart and lung diseases. In China, PM2.5 pollution constitutes an extreme environmental and social problem of widespread public concern. In this work we estimate ground-level PM2.5 from satellite-derived aerosol optical depth (AOD), topography data, meteorological data, and pollutant emissions using an integrative technique. In particular, Geographically Weighted Regression (GWR) analysis was combined with Bayesian Maximum Entropy (BME) theory to assess the spatiotemporal characteristics of PM2.5 exposure in a large region of China and to generate informative PM2.5 space-time predictions (estimates). It was found that, due to its integrative character, the combined BME-GWR method offers certain improvements in the space-time prediction of PM2.5 concentrations over China compared to previous techniques. The combined BME-GWR technique generated realistic maps of the space-time PM2.5 distribution, and its performance was superior, in terms of prediction accuracy, to that of seven previous studies of satellite-derived PM2.5 concentrations in China. The purely spatial GWR model can only be used at a fixed time, whereas the integrative BME-GWR approach accounts for cross space-time dependencies and can predict PM2.5 concentrations in the composite space-time domain. The 10-fold cross-validation results of the BME-GWR modeling (R2 = 0.883, RMSE = 11.39 μg/m3) demonstrated a high level of space-time PM2.5 prediction (estimation) accuracy over China, revealing a definite trend of severe PM2.5 levels from the northern coast toward inland China (Nov 2015-Feb 2016). Future work should focus on adding higher-resolution AOD data, developing better satellite-based prediction models, and incorporating related air pollutants for space-time PM2.5 prediction purposes.
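
    The GWR half of the combined technique can be sketched as a weighted least-squares fit at each target location, with observation weights that decay with distance. A simplified illustration (the Gaussian kernel, bandwidth, and names are assumptions for the sketch; the actual study additionally merges the GWR output with BME in the space-time domain):

```python
import numpy as np

def gwr_coefficients(coords, X, y, target, bandwidth):
    """Geographically weighted regression at one target location:
    weight observations by a Gaussian kernel of distance to `target`,
    then solve weighted least squares for the local coefficients."""
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)              # Gaussian distance decay
    Xd = np.column_stack([np.ones(len(X)), X])     # add an intercept column
    W = np.diag(w)
    # local intercept and slopes: (X'WX)^-1 X'Wy
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
```

    A local PM2.5 estimate at the target is then `beta[0] + beta[1:] @ x_target`, with the coefficients varying from place to place.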

  10. Synthetic Aperture Radar Interferometry Analysis of Ground Deformation within the Coso Geothermal Site, California

    NASA Astrophysics Data System (ADS)

    Brawner, Erik

    Earth's surface movement may pose a hazard to infrastructure and people, and associated earthquake hazards are a potential side effect of geothermal activity. The modern remote sensing technique known as Interferometric Synthetic Aperture Radar (InSAR) can measure surface change with a high degree of precision, down to mm-scale movements. Previous work has identified a deformation anomaly within the Coso Geothermal site in eastern California. Surface changes at the site had not been analyzed since the 1990s, leaving a decade of geothermal production impact unassessed. In this study, InSAR data acquired by the ENVISAT satellite between 2005 and 2010, in both ascending and descending modes, were analyzed. This provides a dataset independent of previous work, generated from a newer sensor and covering a more recent study period. Analysis of this period revealed a subsidence anomaly correlated with the extent of the geothermal production area currently in operation. Maximum subsidence rates in the region reached approximately 3.8 cm/yr, a rate similar to that assessed in previous work for the 1990s. The correlation of subsidence patterns across measurements spanning multiple decades suggests a steady, linear deformation source. Regions of subsidence branch out from the main anomaly to the north-northeast and to the south, where additional significant peaks of subsidence occur. The extent of the deformation anomaly correlates directly with the distribution of geothermal production well sites. Depressurization within the geothermal system, driven by extraction of hydrothermal fluids with minimal reinjection of production fluids, is the leading explanation for the surface subsidence.

  11. Habit Discontinuity, Self-Activation, and the Diminishing Influence of Context Change: Evidence from the UK Understanding Society Survey.

    PubMed

    Thomas, Gregory Owen; Poortinga, Wouter; Sautkina, Elena

    2016-01-01

    Repeated behaviours in stable contexts can become automatic habits. Habits are resistant to information-based techniques to change behaviour, but are contextually cued, so a change in behaviour context (e.g., location) weakens habit strength and can facilitate greater consideration of the behaviour. This idea was demonstrated in previous work, whereby people with strong environmental attitudes have lower car use, but only after recently moving home. We examine the habit discontinuity hypothesis by analysing the Understanding Society dataset, with 18,053 individuals representative of the UK population, measuring time since moving home, travel mode to work, and strength of environmental attitudes. Results support previous findings that car use is significantly lower among those with stronger environmental views (but only after recently moving home) and, in addition, demonstrate that this effect decays as the time since moving home increases. We discuss the results in light of moving into a new home being a potential 'window of opportunity' to promote pro-environmental behaviours.

  12. Using PEACE Data from the four CLUSTER Spacecraft to Measure Compressibility, Vorticity, and the Taylor Microscale in the Magnetosheath and Plasma Sheet

    NASA Technical Reports Server (NTRS)

    Goldstein, Melvyn L.; Parks, George; Gurgiolo, C.; Fazakerley, Andrew N.

    2008-01-01

    We present determinations of compressibility and vorticity in the magnetosheath and plasma sheet using moments from the four PEACE thermal electron instruments on CLUSTER. The methodology used assumes a linear variation of the moments throughout the volume defined by the four satellites, which allows spatially independent estimates of the divergence, curl, and gradient. Once the vorticity has been computed, it is possible to estimate directly the Taylor microscale. We have shown previously that the technique works well in the solar wind. Because the background flow speed in the magnetosheath and plasma sheet is usually less than the Alfven speed, the Taylor frozen-in-flow approximation cannot be used. Consequently, this four spacecraft approach is the only viable method for obtaining the wave number properties of the ambient fluctuations. Our results using electron velocity moments will be compared with previous work using magnetometer data from the FGM experiment on Cluster.
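
    The linear-variation assumption can be sketched directly: differences of the measured vector moments across the four spacecraft, taken against their separation vectors, give a least-squares estimate of the gradient tensor, whose antisymmetric part yields the vorticity. A simplified illustration with synthetic positions and values (the actual analysis uses the PEACE electron velocity moments):

```python
import numpy as np

def gradient_and_curl(positions, vectors):
    """Estimate the gradient tensor of a vector field sampled at four
    (non-coplanar) spacecraft, assuming the field varies linearly across
    the tetrahedron, then form the curl (vorticity)."""
    dr = positions - positions.mean(axis=0)   # 4x3, relative to barycentre
    dv = vectors - vectors.mean(axis=0)       # 4x3
    # linear model dv_i = G @ dr_i  ->  solve dr @ G.T = dv in least squares
    G = np.linalg.lstsq(dr, dv, rcond=None)[0].T   # G[i, j] = d v_i / d x_j
    curl = np.array([G[2, 1] - G[1, 2],
                     G[0, 2] - G[2, 0],
                     G[1, 0] - G[0, 1]])
    return G, curl
```

    Because the four sampling points fix the linear field uniquely, the divergence (trace of G) and gradient follow from the same tensor, which is why the four-spacecraft configuration needs no frozen-in-flow assumption.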

  13. Habit Discontinuity, Self-Activation, and the Diminishing Influence of Context Change: Evidence from the UK Understanding Society Survey

    PubMed Central

    Thomas, Gregory Owen; Poortinga, Wouter; Sautkina, Elena

    2016-01-01

    Repeated behaviours in stable contexts can become automatic habits. Habits are resistant to information-based techniques to change behaviour, but are contextually cued, so a change in behaviour context (e.g., location) weakens habit strength and can facilitate greater consideration of the behaviour. This idea was demonstrated in previous work, whereby people with strong environmental attitudes have lower car use, but only after recently moving home. We examine the habit discontinuity hypothesis by analysing the Understanding Society dataset, with 18,053 individuals representative of the UK population, measuring time since moving home, travel mode to work, and strength of environmental attitudes. Results support previous findings that car use is significantly lower among those with stronger environmental views (but only after recently moving home) and, in addition, demonstrate that this effect decays as the time since moving home increases. We discuss the results in light of moving into a new home being a potential ‘window of opportunity’ to promote pro-environmental behaviours. PMID:27120333

  14. Reordering of Nuclear Quantum States in Rare Isotopes

    NASA Astrophysics Data System (ADS)

    Flanagan, Kieran

    2010-02-01

    A key question in modern nuclear physics relates to the ordering of quantum states, and whether the predictions made by the shell model hold true far from stability. Recent innovations in technology and techniques at radioactive beam facilities have allowed access to rare isotopes previously inaccessible to experimentalists. Measurements performed in several regions of the nuclear chart have yielded surprising and dramatic changes in nuclear structure, where the level ordering is quite different from that expected from previous theoretical descriptions. In order to reconcile the difference between experiment and theory, new shell-model interactions have been proposed, which include the role of the tensor force as part of the monopole term in the expansion of the residual proton-neutron interaction. This has motivated a series of laser spectroscopy experiments studying the neutron-rich copper and gallium isotopes at the ISOLDE facility. This work has deduced, without nuclear-model dependence, the spins, moments and charge radii. The results of this work and their implications for nuclear structure near ^78Ni will be discussed.

  15. Local regression type methods applied to the study of geophysics and high frequency financial data

    NASA Astrophysics Data System (ADS)

    Mariani, M. C.; Basu, K.

    2014-09-01

    In this work we applied locally weighted scatterplot smoothing techniques (Lowess/Loess) to geophysical and high-frequency financial data. We first analyze and apply this technique to the California earthquake geological data. A spatial analysis was performed to show that the estimation of the earthquake magnitude at a fixed location is accurate to within a relative error of 0.01%. We also applied the same method to a high-frequency data set arising in the financial sector and obtained similarly satisfactory results. The application of this approach to the two different data sets demonstrates that the overall method is accurate and efficient, and that the Lowess approach is much more desirable than the Loess method. Previous works studied time series analysis; in this paper our local regression models perform a spatial analysis of the geophysical data, providing different information. For the high-frequency data, our models estimate the curve of best fit where the data depend on time.
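
    The Lowess idea can be sketched in one dimension as a locally weighted linear fit at each point, using tricube weights over the nearest fraction of the data (the bandwidth choice and names are illustrative; the study applies the method spatially):

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Locally weighted linear regression (Lowess): at each point, fit a
    line to the nearest `frac` of the data with tricube weights."""
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]              # k nearest neighbours
        h = d[idx].max()                     # local bandwidth
        if h == 0.0:
            h = 1.0                          # all neighbours coincide
        w = (1.0 - (d[idx] / h) ** 3) ** 3   # tricube weights
        Xd = np.column_stack([np.ones(k), x[idx]])
        W = np.diag(w)
        beta = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y[idx])
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted
```

    Loess differs mainly in fitting local polynomials of higher degree; the comparison in the paper favours the linear (Lowess) variant.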

  16. Recent approaches in sensitive enantioseparations by CE.

    PubMed

    Sánchez-Hernández, Laura; Castro-Puyana, María; Marina, María Luisa; Crego, Antonio L

    2012-01-01

    The latest strategies and instrumental improvements for enhancing the detection sensitivity in chiral analysis by CE are reviewed in this work. Following the previous reviews by García-Ruiz et al. (Electrophoresis 2006, 27, 195-212) and Sánchez-Hernández et al. (Electrophoresis 2008, 29, 237-251; Electrophoresis 2010, 31, 28-43), this review includes those papers that were published during the period from June 2009 to May 2011. These works describe the use of offline and online sample treatment techniques, online sample preconcentration techniques based on electrophoretic principles, and alternative detection systems to UV-Vis to increase the detection sensitivity. The application of the above-mentioned strategies, either alone or in combination, to the enantiomeric analysis of a broad range of samples, such as pharmaceutical, biological, food and environmental samples, makes it possible to decrease the limits of detection to 10⁻¹² M. The use of microchips to achieve sensitive chiral separations is also discussed. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. A 3D joint interpretation of magnetotelluric and seismic tomographic models: The case of the volcanic island of Tenerife

    NASA Astrophysics Data System (ADS)

    García-Yeguas, Araceli; Ledo, Juanjo; Piña-Varas, Perla; Prudencio, Janire; Queralt, Pilar; Marcuello, Alex; Ibañez, Jesús M.; Benjumea, Beatriz; Sánchez-Alzola, Alberto; Pérez, Nemesio

    2017-12-01

    In this work we present a 3D joint interpretation of magnetotelluric and seismic tomography models. We first describe the different techniques used to infer the inner structure of the Earth. We focus on volcanic regions, specifically on the island of Tenerife (Canary Islands, Spain). In this area, magnetotelluric and seismic tomography studies had previously been carried out separately; the novelty of the present work is the combination of both techniques on Tenerife Island. To this end we applied the fuzzy clustering method at different depths, obtaining several clusters or classes. From the results, a geothermal system is inferred below Teide volcano, in the center of Tenerife Island. A hydrothermally altered edifice full of fluids is situated below Teide, ending at 600 m below sea level; from this depth the resistivity and VP values increase downwards. We also observe a clay-cap structure, a typical feature of geothermal systems, associated with low resistivity and low VP values.
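
    The clustering step can be sketched with the standard fuzzy c-means iteration, applied to generic feature vectors such as (resistivity, VP) pairs per grid cell. An illustrative sketch of the method family the abstract names (the number of clusters, fuzziness exponent m, and initialization are assumptions):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=200, seed=0):
    """Fuzzy c-means: soft memberships u[i, k] of sample i in cluster k,
    obtained by alternating centroid and membership updates."""
    rng = np.random.default_rng(seed)
    # initialize centres at c distinct random samples
    centers = X[rng.choice(len(X), size=c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)             # avoid division by zero
        # u[i, k] = 1 / sum_j (d[i, k] / d[i, j])^(2 / (m - 1))
        u = 1.0 / (d ** (2.0 / (m - 1.0)) *
                   np.sum(d ** (-2.0 / (m - 1.0)), axis=1, keepdims=True))
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return centers, u
```

    Unlike hard clustering, each cell retains a membership grade in every class, which is convenient when resistivity and velocity boundaries do not coincide exactly.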

  18. Poly(imide-siloxane) segmented copolymer structural adhesives prepared by bulk and solution thermal imidization

    NASA Technical Reports Server (NTRS)

    Bott, R. H.; Summers, J. D.; Arnold, C. A.; Blankenship, C. P., Jr.; Taylor, L. T.

    1988-01-01

    The improved properties that have been demonstrated through thermal solution imidization in the case of polyimides and poly(imide-siloxane) segmented copolymers suggest significant potential for application of these new materials. Specifically, the enhancement in solubility, moisture reduction, and processability observed through this solution technique is quite dramatic. Previous work has shown that the presence of low amounts of siloxane does not detract significantly from the lap shear strength of these materials to titanium in the case of bulk thermal imidization synthesis. In addition, the siloxane incorporation results in the added advantage of resistance to hot, wet environments. This added durability is presumably due to the hydrophobic siloxane segments preventing the uptake of water at the critical interphase between the adhesive and the adherend. This paper discusses the extension of this work to the solution imidization synthesis technique recently developed in our laboratory. Results dealing with the absolute bond strengths as well as durability and failure surface analysis will be presented.

  19. Concept of contrast transfer function for edge illumination x-ray phase-contrast imaging and its comparison with the free-space propagation technique.

    PubMed

    Diemoz, Paul C; Vittoria, Fabio A; Olivo, Alessandro

    2016-05-16

    Previous studies on edge illumination (EI) X-ray phase-contrast imaging (XPCi) have investigated the nature and amplitude of the signal provided by this technique. However, the response of the imaging system to different object spatial frequencies had never been explicitly considered and studied. This is required in order to predict the performance of a given EI setup for different classes of objects. To this end, in the present work we derive analytical expressions for the contrast transfer function of an EI imaging system, under the near-field approximation, and study its dependence upon the main experimental parameters. We then exploit these results to compare the frequency response of an EI system with that of a free-space propagation XPCi one. The results achieved in this work can be useful for predicting the signals obtainable for different types of objects and also as a basis for new retrieval methods.

  20. A Robust Approach for a Filter-Based Monocular Simultaneous Localization and Mapping (SLAM) System

    PubMed Central

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-01-01

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which is freely moving through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently, because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special feature-initialization techniques are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is to present a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique, which is intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes. PMID:23823972

  1. Work and Family Characteristics as Predictors of Early Retirement in Married Men and Women

    PubMed Central

    Kubicek, Bettina; Korunka, Christian; Hoonakker, Peter; Raymo, James M.

    2010-01-01

    This study presents an integrative model of early retirement using data from the Wisconsin Longitudinal Study. The model extends prior work by incorporating work-family conflict to capture the interaction between the work and family domains and by assuming proximal and distal predictors of early retirement. More precisely, the model suggests that family and job demands and resources predict family-to-work and work-to-family conflict, respectively. All of these factors are presumed to have only indirect effects on retirement timing via the intervening effect of quality of life measures, that is, marital satisfaction, job satisfaction and health. The authors assume that these three factors constitute predictors of early retirement in addition to socioeconomic status and the availability of a pension plan and health insurance. The model was tested with structural equation modeling techniques, and the results were supportive. Therefore, the proposed model offers a general framework for the integration of previous research findings. PMID:21430790

  2. The challenges that employees who abuse substances experience when returning to work after completion of employee assistance programme (EAP).

    PubMed

    Soeker, Shaheed; Matimba, Tandokazi; Machingura, Last; Msimango, Henry; Moswaane, Bobo; Tom, Sinazo

    2015-01-01

    Employee assistance programmes (EAPs) are responsible for helping employees cope with problems such as mental distress, alcoholism and other drug dependencies, and marital and financial difficulties--in short, the whole host of personal and family troubles endemic to the human condition. The study explored the challenges that employees who abuse substances experience when returning to work after the completion of an employee assistance programme. The study used a qualitative exploratory descriptive research design. Three male participants and two key informants participated in the study. One semi-structured interview was conducted with each of the participants and one semi-structured interview with the key informants. Four themes emerged: 1) Loss of one's worker role identity, 2) Negative influences of the community continue to affect the success of the EAP, 3) The EAP as a vehicle for change and, 4) Healthy occupations strengthen the EAP. This study portrayed the following: how substance abuse affects the worker role of individuals employed in the open labour market, the challenges and facilitators experienced by employees who abuse substances when returning to their previous work roles, and how occupation-based interventions can be incorporated into EAPs. Occupational therapists could use the health promotion approach, work simplification, energy conservation techniques and ergonomic analysis techniques.

  3. Comparison of spectroscopically measured finger and forearm tissue ethanol concentration to blood and breath ethanol measurements

    NASA Astrophysics Data System (ADS)

    Ridder, Trent D.; Hull, Edward L.; Ver Steeg, Benjamin J.; Laaksonen, Bentley D.

    2011-02-01

    Previous works investigated a spectroscopic technique that offered a promising alternative to blood and breath assays for determining in vivo alcohol concentration. Although these prior works measured the dorsal forearm, we report the results of a 26-subject clinical study designed to evaluate the spectroscopic technique at a finger measurement site through comparison to contemporaneous forearm spectroscopic, venous blood, and breath measurements. Through both Monte Carlo simulation and experimental data, it is shown that tissue optical probe design has a substantial impact on the effective path length of photons through the skin and on the signal-to-noise ratio of the spectroscopic measurements. Comparison of the breath, blood, and tissue assays demonstrated significant differences in alcohol concentration that are attributable to both assay accuracy and alcohol pharmacokinetics. As in past works, a first-order kinetic model is used to estimate the fraction of concentration variance explained by alcohol pharmacokinetics (72.6-86.7%). An important outcome of this work was the significantly improved pharmacokinetic agreement with breath (arterial) alcohol of the finger measurement (mean kArt-Fin = 0.111 min⁻¹) relative to the forearm measurement (mean kArt-For = 0.019 min⁻¹), which is likely due to the increased blood perfusion of the finger.
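
    The first-order kinetic relationship between arterial and tissue alcohol can be sketched as a one-compartment model, dC_tis/dt = k (C_art - C_tis): the tissue concentration relaxes toward the arterial concentration at rate k. An illustration using the reported rate constants with a hypothetical arterial profile, not the study's data:

```python
import numpy as np

def tissue_concentration(c_art, k, dt):
    """One-compartment first-order model: tissue concentration relaxes
    toward the arterial concentration at rate k (per minute),
    dC_tis/dt = k * (C_art - C_tis), integrated with forward Euler."""
    c_tis = np.zeros(len(c_art))
    for i in range(1, len(c_art)):
        c_tis[i] = c_tis[i - 1] + dt * k * (c_art[i - 1] - c_tis[i - 1])
    return c_tis
```

    With the larger finger rate constant the tissue trace lags the arterial concentration far less than with the forearm's smaller one, which is the pharmacokinetic agreement the study reports.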

  4. DEER Sensitivity between Iron Centers and Nitroxides in Heme-Containing Proteins Improves Dramatically Using Broadband, High-Field EPR

    PubMed Central

    2016-01-01

    This work demonstrates the feasibility of making sensitive nanometer distance measurements between Fe(III) heme centers and nitroxide spin labels in proteins using the double electron–electron resonance (DEER) pulsed EPR technique at 94 GHz. Techniques to accurately measure long distances in many classes of heme proteins using DEER are currently strongly limited by sensitivity. In this paper we demonstrate sensitivity gains of more than 30 times compared with previous lower frequency (X-band) DEER measurements on both human neuroglobin and sperm whale myoglobin. This is achieved by taking advantage of recent instrumental advances, employing wideband excitation techniques based on composite pulses and exploiting more favorable relaxation properties of low-spin Fe(III) in high magnetic fields. This gain in sensitivity potentially allows the DEER technique to be routinely used as a sensitive probe of structure and conformation in a large number of heme proteins and many other metalloproteins. PMID:27035368

  5. Three-dimensional tracking for efficient fire fighting in complex situations

    NASA Astrophysics Data System (ADS)

    Akhloufi, Moulay; Rossi, Lucile

    2009-05-01

    Each year, hundreds of millions of hectares of forest burn, causing human and economic losses. For efficient fire fighting, personnel on the ground need tools for predicting fire front propagation. In this work, we present a new technique for automatically tracking fire spread in three-dimensional space. The proposed approach uses a stereo system to extract a 3D shape from fire images. A new segmentation technique is proposed that permits the extraction of fire regions in complex unstructured scenes. It works in the visible spectrum and combines information extracted from the YUV and RGB color spaces. Unlike other techniques, our algorithm does not require previous knowledge about the scene. The resulting fire regions are classified into homogeneous zones using clustering techniques. Contours are then extracted, and a feature detection algorithm is used to detect interest points such as local maxima and corners. Points extracted from the stereo images are then used to compute the 3D shape of the fire front, from which the fire volume is built. The final model is used to compute important spatial and temporal fire characteristics such as spread dynamics, local orientation and heading direction. Tests conducted on the ground show the efficiency of the proposed scheme, which is being integrated with a fire spread mathematical model in order to predict and anticipate fire behaviour during fire fighting. Also of interest to fire-fighters is the proposed automatic segmentation technique, which can be used for early detection of fire in complex scenes.
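
    A rule combining RGB and YUV information can be sketched as follows. The specific rule and thresholds below are illustrative assumptions, not the criteria used in the paper:

```python
import numpy as np

def fire_mask(rgb):
    """Flag candidate fire pixels in an RGB image (floats in 0..255).

    Illustrative rule: fire pixels typically satisfy R > G > B and
    carry a strong positive V (red-difference) chrominance in YUV space.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # BT.601 RGB -> YUV conversion
    y = 0.299 * r + 0.587 * g + 0.114 * b
    v = 0.877 * (r - y)                      # red-difference chrominance
    return (r > g) & (g > b) & (v > 40) & (y > 60)

img = np.zeros((2, 2, 3))
img[0, 0] = [230, 120, 30]    # flame-like orange pixel
img[0, 1] = [40, 150, 60]     # vegetation-green pixel
mask = fire_mask(img)
```

    A full detector of the kind described would then cluster the resulting mask into homogeneous zones before contour and interest-point extraction.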

  6. A processing architecture for associative short-term memory in electronic noses

    NASA Astrophysics Data System (ADS)

    Pioggia, G.; Ferro, M.; Di Francesco, F.; DeRossi, D.

    2006-11-01

    Electronic nose (e-nose) architectures usually consist of several modules that process various tasks such as control, data acquisition, data filtering, feature selection and pattern analysis. Heterogeneous techniques derived from chemometrics, neural networks, and fuzzy rules used to implement such tasks may lead to issues concerning module interconnection and cooperation. Moreover, a new learning phase is mandatory once new measurements have been added to the dataset, thus causing changes in the previously derived model. Consequently, if a loss in the previous learning occurs (catastrophic interference), real-time applications of e-noses are limited. To overcome these problems this paper presents an architecture for dynamic and efficient management of multi-transducer data processing techniques and for saving an associative short-term memory of the previously learned model. The architecture implements an artificial model of a hippocampus-based working memory, enabling the system to be ready for real-time applications. Starting from the base models available in the architecture core, dedicated models for neurons, maps and connections were tailored to an artificial olfactory system devoted to analysing olive oil. In order to verify the ability of the processing architecture in associative and short-term memory, a paired-associate learning test was applied. The avoidance of catastrophic interference was observed.

  7. Dry transfer of graphene to dielectrics and flexible substrates using polyimide as a transparent and stable intermediate layer

    NASA Astrophysics Data System (ADS)

    Marchena, Miriam; Wagner, Frederic; Arliguie, Therese; Zhu, Bin; Johnson, Benedict; Fernández, Manuel; Lai Chen, Tong; Chang, Theresa; Lee, Robert; Pruneri, Valerio; Mazumder, Prantik

    2018-07-01

    We demonstrate the direct transfer of graphene from Cu foil to rigid and flexible substrates, such as glass and PET, using as an intermediate layer a thin film of polyimide (PI) mixed with an aminosilane (3-aminopropyltrimethoxysilane) or PI alone, respectively. While the dry removal of graphene by an adhesive has been demonstrated previously (from graphite by scotch tape, or from a Cu foil by a thick, ~20 µm epoxy on Si), our work is the first step towards making a substrate ready for device fabrication using a polymer-free technique. Our approach yields a transferred structure that is transparent, thermally stable up to 350 °C, and free of polymer residues on the device side of the graphene, in contrast to the standard wet-transfer process using PMMA. Moreover, our technique is fast and straightforward, relying on current industrial technology (a hot press and a laminator) with Cu recycling via mechanical peel-off; it provides high interfacial stability in aqueous media and is not restricted to a specific material, since both polyimides and polyamic acids can be used. Together, these advantages demonstrate a feasible process that enables device fabrication.

  8. Neural mechanisms of cue-approach training

    PubMed Central

    Bakkour, Akram; Lewis-Peacock, Jarrod A.; Poldrack, Russell A.; Schonberg, Tom

    2016-01-01

    Biasing choices may prove a useful way to implement behavior change. Previous work has shown that a simple training task (the cue-approach task), which does not rely on external reinforcement, can robustly influence choice behavior by biasing choice toward items that were targeted during training. In the current study, we replicate previous behavioral findings and explore the neural mechanisms underlying the shift in preferences following cue-approach training. Given recent successes in the development and application of machine learning techniques to task-based fMRI data, which have advanced understanding of the neural substrates of cognition, we sought to leverage the power of these techniques to better understand neural changes during cue-approach training that subsequently led to a shift in choice behavior. Contrary to our expectations, we found that machine learning techniques applied to fMRI data during non-reinforced training were unsuccessful in elucidating the neural mechanism underlying the behavioral effect. However, univariate analyses during training revealed that the relationship between BOLD and choices for Go items increases as training progresses compared to choices of NoGo items primarily in lateral prefrontal cortical areas. This new imaging finding suggests that preferences are shifted via differential engagement of task control networks that interact with value networks during cue-approach training. PMID:27677231

  9. Partial Discharge Characteristics of Polymer Nanocomposite Materials in Electrical Insulation: A Review of Sample Preparation Techniques, Analysis Methods, Potential Applications, and Future Trends

    PubMed Central

    Izzati, Wan Akmal; Adzis, Zuraimy; Shafanizam, Mohd

    2014-01-01

    Polymer nanocomposites have recently been attracting attention among researchers in electrical insulating applications from energy storage to power delivery. However, partial discharge has always been a precursor to major faults and problems in this field. In addition, there is much more to explore, as neither the partial discharge characteristics of nanocomposites nor their electrical properties are clearly understood. By adding a small weight percentage (wt%) of nanofillers, the physical, mechanical, and electrical properties of polymers can be greatly enhanced. For instance, nanofillers such as silica (SiO2), alumina (Al2O3) and titania (TiO2) play a significant role in increasing the dielectric breakdown strength and partial discharge resistance of nanocomposites. Such polymer nanocomposites are reviewed thoroughly in this paper, along with the different experimental and analytical techniques used in previous studies. This paper also provides an academic review of partial discharge in polymer nanocomposites used as electrical insulating material, covering aspects of preparation, characteristics of the nanocomposites based on experimental works, application in power systems, methods and techniques of experiment and analysis, and future trends. PMID:24558326

  10. Conceptual recurrence plots: revealing patterns in human discourse.

    PubMed

    Angus, Daniel; Smith, Andrew; Wiles, Janet

    2012-06-01

    Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.
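
    The core computation behind a conceptual recurrence plot can be sketched as follows. The toy two-dimensional "concept space" and the mean-vector utterance representation are illustrative assumptions, not the language model used by the authors:

```python
import numpy as np

def recurrence_matrix(utterances, concept_vectors):
    """Pairwise conceptual similarity of utterances.

    utterances      : list of token lists
    concept_vectors : dict mapping a term to its concept-space vector
    Each utterance is represented by the mean of its term vectors and
    compared to every other utterance with cosine similarity.
    """
    reps = []
    for u in utterances:
        vecs = [concept_vectors[w] for w in u if w in concept_vectors]
        reps.append(np.mean(vecs, axis=0))
    reps = np.array(reps)
    unit = reps / np.linalg.norm(reps, axis=1, keepdims=True)
    return unit @ unit.T      # entry (i, j): similarity of utterances i, j

# Toy concept space spanning two latent concepts, "medical" and "legal".
cv = {"doctor": np.array([1.0, 0.0]), "nurse": np.array([0.9, 0.1]),
      "judge":  np.array([0.0, 1.0]), "court": np.array([0.1, 0.9])}
utts = [["doctor", "nurse"], ["judge", "court"], ["nurse"]]
R = recurrence_matrix(utts, cv)
```

    Rendering R as an image gives the recurrence plot: bright off-diagonal cells mark utterances that return to concepts raised earlier in the conversation.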

  11. Advanced imaging technologies for mapping cadaveric lymphatic anatomy: magnetic resonance and computed tomography lymphangiography.

    PubMed

    Pan, W R; Rozen, W M; Stretch, J; Thierry, B; Ashton, M W; Corlett, R J

    2008-09-01

    Lymphatic anatomy has become increasingly important clinically as surgical techniques evolve for investigating and treating cancer metastases. However, because few anatomical techniques are available, research in this field has been insufficient. The techniques of computed tomography (CT) and magnetic resonance (MR) lymphangiography have not been described previously in the imaging of cadaveric lymphatic anatomy. This preliminary work describes the feasibility of these advanced imaging technologies for imaging lymphatic anatomy. A single fresh cadaveric lower limb underwent lymphatic dissection and cannulation utilizing microsurgical techniques. Contrast materials for both CT and MR studies were chosen based on their suitability for subsequent clinical use, and imaging was undertaken with a view to mapping lymphatic anatomy. Microdissection studies were compared with imaging findings in each case. Both MR-based and CT-based contrast media in current clinical use were found to be suitable for demonstrating cadaveric lymphatic anatomy upon direct intralymphatic injection. MR lymphangiography and CT lymphangiography are feasible modalities for cadaveric anatomical research into lymphatic anatomy. Future studies, including refinements in scanning techniques, may bring these technologies to the clinical setting.

  12. Development efforts to improve curved-channel microchannel plates

    NASA Technical Reports Server (NTRS)

    Corbett, M. B.; Feller, W. B.; Laprade, B. N.; Cochran, R.; Bybee, R.; Danks, A.; Joseph, C.

    1993-01-01

    Curved-channel microchannel plate (C-plate) improvements resulting from an ongoing NASA STIS microchannel plate (MCP) development program are described. Performance limitations of previous C-plates led to a development program in support of the STIS MAMA UV photon counter, a second generation instrument on the Hubble Space Telescope. C-plate gain, quantum detection efficiency, dark noise, and imaging distortion, which are influenced by channel curvature non-uniformities, have all been improved through use of a new centrifuge fabrication technique. This technique will be described, along with efforts to improve older, more conventional shearing methods. Process optimization methods used to attain targeted C-plate performance goals will be briefly characterized. Newly developed diagnostic measurement techniques to study image distortion, gain uniformity, input bias angle, channel curvature, and ion feedback, will be described. Performance characteristics and initial test results of the improved C-plates will be reported. Future work and applications will also be discussed.

  13. Solar flare ionization in the mesosphere observed by coherent-scatter radar

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Bowhill, S. A.

    1986-01-01

    The coherent-scatter technique, as used with the Urbana radar, is able to measure relative changes in electron density at one altitude during the progress of a solar flare when that altitude contains a statistically steady turbulent layer. This work describes the analysis of Urbana coherent-scatter data from the times of 13 solar flares in the period from 1978 to 1983. Previous methods of measuring electron density changes in the D-region are summarized. Models of X-ray spectra, photoionization rates, and ion-recombination reaction schemes are reviewed. The coherent-scatter technique is briefly described, and a model is developed which relates changes in scattered power to changes in electron density. An analysis technique is developed using X-ray flux data from geostationary satellites and coherent scatter data from the Urbana radar which empirically distinguishes between proposed D-region ion-chemical schemes, and estimates the nonflare ion-pair production rate.

  14. Buried structure for increasing fabrication performance of micromaterial by electromigration

    NASA Astrophysics Data System (ADS)

    Kimura, Yasuhiro; Saka, Masumi

    2016-06-01

    The electromigration (EM) technique is a physical synthesis method for growing micro/nanomaterials. EM causes atomic diffusion in a metal line under high-density electron flow, and intentional control of the accumulation and relaxation of atoms by EM can be used to fabricate a micro/nanomaterial. TiN passivation has been utilized as a component of the sample in the EM technique. Although TiN passivation simplifies the otherwise cumbersome sample preparation, current leakage naturally occurs because TiN is conductive, and this side effect degrades the performance of micro/nanomaterial fabrication. In the present work, we propose a buried structure that significantly decreases the current required to fabricate an Al micromaterial by confining the current flow in the EM technique. The fabrication performance was evaluated by comparing the threshold current for fabricating an Al micromaterial using the buried structure and the previous structure, which suffers from current leakage.

  15. Correlation techniques to determine model form in robust nonlinear system realization/identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1991-01-01

    The fundamental challenge in identification of nonlinear dynamic systems is determining the appropriate form of the model. A robust technique is presented which essentially eliminates this problem for many applications. The technique is based on the Minimum Model Error (MME) optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature is the ability to identify nonlinear dynamic systems without prior assumption regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions of the nonlinearities. Model form is determined via statistical correlation of the MME optimal state estimates with the MME optimal model error estimates. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.
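
    The correlation step can be illustrated with a small sketch. The MME estimates themselves are stood in for by fabricated signals here; only the idea of correlating the model-error estimate against candidate model forms is taken from the text:

```python
import numpy as np

# MME produces time histories of optimal state estimates x(t) and model
# error estimates d(t).  Here we fabricate them: the "true" unmodeled
# term is cubic in the state.
t = np.linspace(0.0, 10.0, 500)
x = np.sin(t)                  # stand-in for the MME state estimate
d = 0.7 * x**3                 # stand-in for the MME model-error estimate

# Candidate model forms for the unmodeled dynamics.
candidates = {"x": x, "x^2": x**2, "x^3": x**3, "sign(x)": np.sign(x)}

# Select the candidate whose correlation with d is strongest.
scores = {name: abs(np.corrcoef(d, f)[0, 1]) for name, f in candidates.items()}
best = max(scores, key=scores.get)
```

    Because the fabricated error term is exactly proportional to the cubic candidate, that candidate correlates perfectly and is selected; with real data the highest-correlation candidates indicate the model form to adopt.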

  16. Measurement of absolute regional lung air volumes from near-field x-ray speckles.

    PubMed

    Leong, Andrew F T; Paganin, David M; Hooper, Stuart B; Siew, Melissa L; Kitchen, Marcus J

    2013-11-18

    Propagation-based phase contrast x-ray (PBX) imaging yields high contrast images of the lung where airways that overlap in projection coherently scatter the x-rays, giving rise to a speckled intensity due to interference effects. Our previous work has shown that total and regional changes in lung air volumes can be accurately measured from two-dimensional (2D) absorption or phase contrast images when the subject is immersed in a water-filled container. In this paper we demonstrate how the phase contrast speckle patterns can be used to directly measure absolute regional lung air volumes from 2D PBX images without the need for a water-filled container. We justify this technique analytically and via simulation using the transport-of-intensity equation and calibrate the technique using our existing methods for measuring lung air volume. Finally, we show the full capabilities of this technique for measuring regional differences in lung aeration.
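
    The earlier water-immersion absorption method referred to above can be sketched as follows. The attenuation coefficient, pixel size and intensities are illustrative assumptions, and the sketch omits the phase-contrast speckle analysis that is the paper's contribution:

```python
import numpy as np

MU_WATER = 0.35           # attenuation coefficient of water/soft tissue, 1/cm (illustrative)
PIXEL_AREA = 0.01 ** 2    # 100 um pixels, in cm^2

def air_volume(I, I_ref):
    """Regional air volume from a 2D transmission image.

    With the subject immersed in water, soft tissue and water attenuate
    almost identically, so any extra transmission relative to the
    air-free reference I_ref is attributable to displaced water:
        t_air = ln(I / I_ref) / mu_water   (per-pixel air path, cm)
    Summing the air paths over pixels gives the regional volume.
    """
    t_air = np.log(I / I_ref) / MU_WATER
    return t_air.sum() * PIXEL_AREA

# Synthetic check: 2 cm of air over a 50x50-pixel region.
I_ref = np.full((50, 50), 0.10)
I = I_ref * np.exp(MU_WATER * 2.0)
V = air_volume(I, I_ref)   # expect 2 cm * 2500 px * 1e-4 cm^2 = 0.5 cm^3
```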

  17. Magnetic induction tomography of objects for security applications

    NASA Astrophysics Data System (ADS)

    Ward, Rob; Joseph, Max; Langley, Abbi; Taylor, Stuart; Watson, Joe C.

    2017-10-01

    A coil array imaging system has been further developed from previous investigations, with a focus on its application to fast screening of small bags or parcels and a view to the production of a compact instrument for security applications. In addition to reducing image acquisition times, work was directed toward exploring potentially cost-effective manufacturing routes. Based on magnetic induction tomography and eddy-current principles, the instrument captured images of conductive targets using a lock-in amplifier, individually multiplexing signals between a primary driver coil and a 20 by 21 imaging array of secondary passive coils constructed using a reproducible multiple-tile design. The design was based on additive manufacturing techniques and provided 2 orthogonal imaging planes with the ability to reconstruct images in less than 10 seconds. An assessment of one of the imaging planes is presented. This technique potentially provides a cost-effective threat evaluation method that may complement conventional radiographic approaches.

  18. Synthesis Methods for Robust Passification and Control

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.; Joshi, Suresh M. (Technical Monitor)

    2000-01-01

    The research effort under this cooperative agreement has essentially been a continuation of the work from previous grants. The ongoing work has primarily focused on developing passivity-based control techniques for Linear Time-Invariant (LTI) systems. During this period, significant progress has been made in the area of passivity-based control of LTI systems, and some preliminary results have also been obtained for nonlinear systems. The prior work addressed optimal control design for inherently passive as well as non-passive linear systems. To exploit the robustness characteristics of passivity-based controllers, a passification methodology was developed for LTI systems that are not inherently passive. Various methods of passification were first proposed and subsequently further developed. The robustness of passification was addressed for multi-input multi-output (MIMO) systems for certain classes of uncertainties using frequency-domain methods. For MIMO systems, a state-space approach using a Linear Matrix Inequality (LMI)-based formulation was presented for passification of non-passive LTI systems. An LMI-based robust passification technique was presented for systems with redundant actuators and sensors; the redundancy was used effectively for robust passification using the LMI formulation. The passification was designed to be robust to interval-type uncertainties in the system parameters. The passification techniques were used to design a robust controller for the Benchmark Active Control Technology wing under parametric uncertainties. The results on passive nonlinear systems, however, are very limited to date. Our recent work in this area was presented, wherein some stability results were obtained for passive nonlinear systems that are affine in control.
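
    As a minimal illustration of the passivity concepts involved, the sketch below checks the positive-real condition Re G(jω) ≥ 0 for a SISO transfer function by frequency sweep. This is only a sampled necessary condition on example systems of our own choosing, not the LMI-based passification machinery described above:

```python
import numpy as np

def is_passive_siso(num, den, n=2000):
    """Frequency-sweep check of positive-realness for a stable SISO
    transfer function G(s) = num(s)/den(s): passivity requires
    Re G(jw) >= 0 for all w (checked here on sampled frequencies)."""
    w = np.logspace(-3, 3, n)
    s = 1j * w
    G = np.polyval(num, s) / np.polyval(den, s)
    return bool(np.all(G.real >= -1e-12))

passive = is_passive_siso([1.0], [1.0, 1.0])            # G = 1/(s + 1)
not_passive = is_passive_siso([1.0], [1.0, 0.1, 1.0])   # G = 1/(s^2 + 0.1 s + 1)
```

    The first-order lag passes, while the lightly damped second-order system fails because Re G(jω) turns negative above its resonance; a passification step would add feedforward or feedback to restore positive-realness.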

  19. Evolution Nonlinear Diffusion-Convection PDE Models for Spectrogram Enhancement

    NASA Astrophysics Data System (ADS)

    Dugnol, B.; Fernández, C.; Galiano, G.; Velasco, J.

    2008-09-01

    In previous works we studied the application of PDE-based image processing techniques to the spectrogram of audio signals in order to improve the readability of the signal. In particular, we considered the implementation of the nonlinear diffusive model proposed by Álvarez, Lions and Morel [1] (ALM) combined with a convective term inspired by the differential reassignment proposed by Chassande-Mottin, Daubechies, Auger and Flandrin [2]-[3]. In this work we consider the possibility of replacing the diffusive model of ALM by diffusive terms in divergence form. In particular, we implement finite element approximations of nonlinear diffusive terms studied by Chen, Levine, Rao [4] and Antontsev, Shmarev [5]-[8] with a convective term.
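
    A single explicit step of an edge-preserving nonlinear diffusion can be sketched in a few lines. The Perona-Malik-type conductivity used here is a simple stand-in for the ALM and divergence-form models discussed above, and all parameters are illustrative:

```python
import numpy as np

def diffuse_step(u, dt=0.15, K=1.0):
    """One explicit step of edge-preserving nonlinear diffusion,
    u_t = div(g(|grad u|) grad u), with the Perona-Malik-type
    conductivity g(s) = 1 / (1 + (s/K)^2).  Diffusion is strong in
    flat noisy regions and weak across strong spectrogram ridges.
    """
    gy, gx = np.gradient(u)
    g = 1.0 / (1.0 + (gx**2 + gy**2) / K**2)
    # Divergence of the conductivity-weighted gradient field
    div = np.gradient(g * gy, axis=0) + np.gradient(g * gx, axis=1)
    return u + dt * div

rng = np.random.default_rng(0)
spec = rng.random((64, 64))          # noisy stand-in for a spectrogram
spec[30, :] += 3.0                   # one strong horizontal ridge (a partial)
out = diffuse_step(spec)
```

    Iterating such steps smooths the background noise while largely preserving the ridges that carry the signal content, which is the readability improvement sought in the text.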

  20. Explosively produced fracture of oil shale

    NASA Astrophysics Data System (ADS)

    Morris, W. A.

    1982-05-01

    Rock fragmentation research in oil shale to develop the blasting technologies and designs required to prepare a rubble bed for a modified in situ retort is reported. Experimental work is outlined, proposed studies in explosive characterization are detailed and progress in numerical calculation techniques to predict fracture of the shale is described. A detailed geologic characterization of two Anvil Points experiment sites is related to previous work at Colony Mine. The second section focuses on computer modeling and theory. The latest generation of the stress wave code SHALE, its three dimensional potential, and the slide line package for it are described. A general stress rate equation that takes energy dependence into account is discussed.

  1. Additive manufacturing for steels: a review

    NASA Astrophysics Data System (ADS)

    Zadi-Maad, A.; Rohib, R.; Irawan, A.

    2018-01-01

    Additive manufacturing (AM) of steels involves the layer-by-layer consolidation of powder or wire feedstock using a heating beam to form near-net-shape products. Over the past decades, the AM technique has matured in both research and commercial production thanks to significant research work from academic, government and industrial research organizations worldwide. AM processes have been implemented to replace conventional steel fabrication processes due to their potentially lower cost and manufacturing flexibility. This paper provides a review of previous research related to AM methods, followed by current challenges. The relationship between microstructure, mechanical properties, and process parameters is discussed. Future trends and recommendations for further work are also provided.

  2. Study of grid independence of finite element method on MHD free convective casson fluid flow with slip effect

    NASA Astrophysics Data System (ADS)

    Raju, R. Srinivasa; Ramesh, K.

    2018-05-01

    The purpose of this work is to study the grid independence of the finite element method for MHD Casson fluid flow past a vertically inclined plate embedded in a porous medium, in the presence of chemical reaction, heat absorption, an external magnetic field and a slip effect. For this grid-independence study, a mathematical model is developed and analyzed using an appropriate numerical technique, the finite element method. The grid study is presented through numerical values of the velocity, temperature and concentration profiles in tabular form. Favourable comparisons with previously published work on various special cases of the problem are obtained.
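
    The idea of a grid-independence study can be sketched on a much simpler model problem. The 1D Poisson problem below, solved by finite differences, is an illustrative stand-in for the MHD boundary-value problem, not the authors' formulation:

```python
import numpy as np

def solve_poisson(n):
    """FD solution of -u'' = pi^2 sin(pi x) on (0, 1) with u(0) = u(1) = 0
    on n interior nodes; the exact solution is u = sin(pi x).
    Returns the value at the probe point x = 0.5 (n must be odd)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    return u[(n - 1) // 2]          # middle node sits exactly at x = 0.5

# Successive refinement: the probe value should settle as h -> 0.
u9, u19, u39 = solve_poisson(9), solve_poisson(19), solve_poisson(39)
```

    The probe value converges toward the exact value 1 as the grid is refined; in practice, grid independence is declared once a further refinement changes the monitored profiles by less than a chosen tolerance.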

  3. Z2Pack: Numerical implementation of hybrid Wannier centers for identifying topological materials

    NASA Astrophysics Data System (ADS)

    Gresch, Dominik; Autès, Gabriel; Yazyev, Oleg V.; Troyer, Matthias; Vanderbilt, David; Bernevig, B. Andrei; Soluyanov, Alexey A.

    2017-02-01

    The intense theoretical and experimental interest in topological insulators and semimetals has established band structure topology as a fundamental material property. Consequently, identifying band topologies has become an important, but often challenging, problem, with no exhaustive solution at the present time. In this work we compile a series of techniques, some previously known, that allow for a solution to this problem for a large set of the possible band topologies. The method is based on tracking hybrid Wannier charge centers computed for relevant Bloch states, and it works at all levels of materials modeling: continuous k·p models, tight-binding models, and ab initio calculations. We apply the method to compute and identify Chern, Z2, and crystalline topological insulators, as well as topological semimetal phases, using real material examples. Moreover, we provide a numerical implementation of this technique (the Z2Pack software package) that is ideally suited for high-throughput screening of materials databases for compounds with nontrivial topologies. We expect that our work will allow researchers to (a) identify topological materials optimal for experimental probes, (b) classify existing compounds, and (c) reveal materials that host novel, not yet described, topological states.
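
    The hybrid-Wannier-center tracking that underlies this approach can be illustrated on a two-band toy model. The sketch below uses the Qi-Wu-Zhang Chern-insulator model and a discretized Wilson loop; it is a minimal illustration of the technique, not the Z2Pack implementation:

```python
import numpy as np

def qwz_h(kx, ky, u):
    """Qi-Wu-Zhang two-band Chern insulator model."""
    sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], complex)
    return np.sin(kx) * sx + np.sin(ky) * sy + (u + np.cos(kx) + np.cos(ky)) * sz

def wcc(ky, u, nk=60):
    """Hybrid Wannier charge center of the occupied band at fixed ky:
    the phase of the discretized Wilson loop along kx."""
    ks = np.linspace(0, 2 * np.pi, nk, endpoint=False)
    states = []
    for kx in ks:
        _, v = np.linalg.eigh(qwz_h(kx, ky, u))
        states.append(v[:, 0])            # lower (occupied) band
    states.append(states[0])              # close the loop periodically
    W = 1.0 + 0j
    for a, b in zip(states[:-1], states[1:]):
        W *= np.vdot(a, b)                # gauge-invariant product of overlaps
    return -np.angle(W) / (2 * np.pi)     # center in [-0.5, 0.5)

def chern(u, ny=60):
    """Chern number: winding of the WCC as ky sweeps the Brillouin zone."""
    kys = np.linspace(0, 2 * np.pi, ny + 1)
    x = [wcc(ky, u) for ky in kys]
    d = np.diff(x)
    d -= np.round(d)                      # follow the smooth branch
    return int(round(d.sum()))

c_topo = chern(-1.0)    # topological phase of the model
c_triv = chern(-3.0)    # trivial phase of the model
```

    The charge center winds exactly once across the Brillouin zone in the topological phase (|C| = 1) and not at all in the trivial phase (C = 0), which is the signature the tracking method detects.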

  4. The role of collaborative ontology development in the knowledge negotiation process

    NASA Astrophysics Data System (ADS)

    Rivera, Norma

    Interdisciplinary research (IDR) collaboration can be defined as the process of integrating experts' knowledge, perspectives, and resources to advance scientific discovery. The flourishing of more complex research problems, together with the growth of scientific and technical knowledge has resulted in the need for researchers from diverse fields to provide different expertise and points of view to tackle these problems. These collaborations, however, introduce a new set of "culture" barriers as participating experts are trained to communicate in discipline-specific languages, theories, and research practices. We propose that building a common knowledge base for research using ontology development techniques can provide a starting point for interdisciplinary knowledge exchange, negotiation, and integration. The goal of this work is to extend ontology development techniques to support the knowledge negotiation process in IDR groups. Towards this goal, this work presents a methodology that extends previous work in collaborative ontology development and integrates learning strategies and tools to enhance interdisciplinary research practices. We evaluate the effectiveness of applying such methodology in three different scenarios that cover educational and research settings. The results of this evaluation confirm that integrating learning strategies can, in fact, be advantageous to overall collaborative practices in IDR groups.

  5. Finite-temperature Gutzwiller approximation from the time-dependent variational principle

    NASA Astrophysics Data System (ADS)

    Lanatà, Nicola; Deng, Xiaoyu; Kotliar, Gabriel

    2015-08-01

    We develop an extension of the Gutzwiller approximation to finite temperatures based on the Dirac-Frenkel variational principle. Our method does not rely on any entropy inequality, and is substantially more accurate than the approaches proposed in previous works. We apply our theory to the single-band Hubbard model at different fillings, and show that our results compare quantitatively well with dynamical mean field theory in the metallic phase. We discuss potential applications of our technique within the framework of first-principle calculations.

  6. Adjustment of Jacobs' formulation to the case of Mercury

    NASA Astrophysics Data System (ADS)

    Chiappini, M.; de Santis, A.

    1991-04-01

    Magnetic investigations play an important role in studies of the constitution of planetary interiors. One such technique (the so-called Jacobs' formulation), appropriately modified, has been applied to the case of Mercury. According to the results, the planet, assumed to be internally divided like the Earth (crust-mantle-core), would have a core/planet volume ratio of 28 percent, much greater than the Earth's core percentage (16 percent). This result is in agreement with previous work that used other independent methods.

  7. Optical Properties of Zinc Selenide Grown Using Molecular Beam Deposition Techniques

    DTIC Science & Technology

    1989-06-01

    studied were grown using a standard MBE machine with in situ diagnostics. The ZnSe material used for growing the samples is highly pure polycrystalline...width of the interference maxima n can be found from equation (1). Beyond 550 nm the absorption varies rapidly, and this will cause Tmax to vary...nonlinearity is utilized - such as in an optically bistable switch. It is known from previous work on ZnSe grown on GaAs [13] that the material begins growing

  8. Ideal-Magnetohydrodynamic-Stable Tilting in Field-Reversed Configurations

    NASA Astrophysics Data System (ADS)

    Kanno, Ryutaro; Ishida, Akio; Steinhauer, Loren

    1995-02-01

    The tilting mode in field-reversed configurations (FRC) is examined using ideal-magnetohydrodynamic stability theory. Tilting, a global mode, is the greatest threat for disruption of FRC confinement. Previous studies uniformly found tilting to be unstable in ideal theory; the objective here is to ascertain whether stable equilibria were overlooked in past work. Solving the variational problem with the Rayleigh-Ritz technique, tilting-stable equilibria are found for a sufficiently hollow current profile and sufficient racetrackness of the separatrix shape. Because these equilibria were not examined previously, the present conclusion is quite surprising; consequently, checks of the method are offered. Even so, it cannot yet be claimed with complete certainty that stability has been proved: absolute confirmation of ideal-stable tilting awaits the application of more complete methods.

  9. Research to improve the accuracy of determining the stroke volume of an artificial ventricle using the wavelet transform

    NASA Astrophysics Data System (ADS)

    Grad, Leszek; Murawski, Krzysztof; Sulej, Wojciech

    2017-08-01

    In this article we present results obtained during research that continues our work on the use of artificial neural networks to determine the relationship between the view of the membrane and the stroke volume of the blood chamber of a mechanical prosthetic heart. The purpose of the research was to increase the accuracy of determining the blood chamber volume. The study therefore focused on the technique used to extract features from the membrane image. During the research we used the wavelet transform. The achieved results were compared to results obtained with previous methods. Tests were conducted on the same mechanical prosthetic heart model used in previous experiments.

  10. Absolute rate constant for the reaction of atomic chlorine with hydrogen peroxide vapor over the temperature range 265-400 K

    NASA Technical Reports Server (NTRS)

    Michael, J. V.; Whytock, D. A.; Lee, J. H.; Payne, W. A.; Stief, L. J.

    1977-01-01

    Rate constants for the reaction of atomic chlorine with hydrogen peroxide were measured from 265-400 K using the flash photolysis-resonance fluorescence technique. Analytical techniques were developed to measure H2O2 under reaction conditions. Due to ambiguity in the interpretation of the analytical results, the data combine to give two equally acceptable representations of the temperature dependence. The results are compared to previous work at 298 K and are theoretically discussed in terms of the mechanism of the reaction. Additional experiments on the H + H2O2 reaction at 298 and 359 K are compared with earlier results from this laboratory and give a slightly revised bimolecular rate constant.
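The "representations of the temperature dependence" mentioned above are conventionally Arrhenius fits. As a hedged illustration (entirely synthetic numbers, not the measured Cl + H2O2 values), the reduction of rate-constant data to Arrhenius parameters can be sketched as a linear least-squares fit of ln k against 1/T:

```python
import numpy as np

# Hypothetical illustration: fitting an Arrhenius expression
# k(T) = A * exp(-Ea / (R * T)) to rate-constant data by linear
# least squares on ln k vs 1/T. The numbers are synthetic.
R = 8.314  # J mol^-1 K^-1

T = np.array([265.0, 298.0, 333.0, 400.0])   # K
A_true, Ea_true = 1.1e-11, 8.0e3             # cm^3 s^-1, J mol^-1
k = A_true * np.exp(-Ea_true / (R * T))

# Linearize: ln k = ln A - (Ea/R) * (1/T)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea_fit = -slope * R       # activation energy recovered from the slope
A_fit = np.exp(intercept) # pre-exponential factor from the intercept
```

When, as reported here, two representations fit equally well, the two (A, Ea) pairs simply correspond to two such lines that are indistinguishable within the scatter of the data.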

  11. Symmetrical Taylor impact of glass bars

    NASA Astrophysics Data System (ADS)

    Murray, N. H.; Bourne, N. K.; Field, J. E.; Rosenberg, Z.

    1998-07-01

    Brar and Bless pioneered the use of plate impact upon bars as a technique for investigating the 1D stress loading of glass but limited their studies to relatively modest stresses (1). We wish to extend this technique by applying VISAR and embedded stress gauge measurements to a symmetrical version of the test in which two rods impact one upon the other. Previous work in the laboratory has characterised the glass types (soda-lime and borosilicate)(2). These experiments identify the failure mechanisms from high-speed photography and the stress and particle velocity histories are interpreted in the light of these results. The differences in response of the glasses and the relation of the fracture to the failure wave in uniaxial strain are discussed.

  12. Characterization of waviness in wind turbine blades using air coupled ultrasonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakrapani, Sunil Kishore; Dayal, Vinay; Hsu, David K.

    2011-06-23

    Waviness in glass fiber reinforced composite is of great interest in composite research, since it results in the loss of stiffness. Several NDE techniques have been used previously to detect waviness. This work is concerned with waves normal to the plies in a composite. Air-coupled ultrasonics was used to detect waviness in thick composites used in the manufacturing of wind turbine blades. Composite samples with different wave aspect ratios were studied. Different wavy samples were characterized, and a three-step process was developed to make sure the technique is field implementable. This gives us a better understanding of the effect of waviness in thick composites, and how it affects the life and performance of the composite.

  13. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

    Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on the analysis of 2D image sequences obtained from a double-station video observation system. Precise localization of the meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of the image intensity using the Box-Cox transform, as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. The resulting local patterns are processed by a data whitening technique, and the obtained vectors are classified via cluster analysis and a Self-Organizing Map (SOM).
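The preprocessing chain described above (Box-Cox transform of positive intensities, then whitening of the local-pattern vectors before clustering) can be sketched as follows; this is a minimal illustration on synthetic data, not the authors' implementation:

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform of positive data x with parameter lam."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def whiten(vectors):
    """PCA-whiten rows of `vectors` to (approximately) unit covariance."""
    centered = vectors - vectors.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return centered @ eigvec / np.sqrt(eigval + 1e-12)

# Synthetic skewed "image background" intensities, 4 features per pattern.
rng = np.random.default_rng(0)
intensities = rng.gamma(2.0, 50.0, size=(1000, 4)) + 1.0
patterns = whiten(box_cox(intensities, 0.25))  # inputs for cluster analysis / SOM
```

The whitened vectors would then be fed to the clustering stage (e.g. a SOM); the lambda value 0.25 is an arbitrary example, not the paper's choice.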

  14. Wavefront sensing with all-digital Stokes measurements

    NASA Astrophysics Data System (ADS)

    Dudley, Angela; Milione, Giovanni; Alfano, Robert R.; Forbes, Andrew

    2014-09-01

    A long-standing question in optics has been how to efficiently measure the phase (or wavefront) of an optical field. This has led to numerous publications and commercial devices based on techniques such as phase-shift interferometry, wavefront reconstruction via modal decomposition, and Shack-Hartmann wavefront sensors. In this work we develop a new technique to extract the phase which, in contrast to the previously mentioned methods, is based on polarization (or Stokes) measurements. We outline a simple, all-digital approach using only a spatial light modulator and a polarization grating to exploit the amplitude and phase relationship between the orthogonal states of polarization to determine the phase of an optical field. We implement this technique to reconstruct the phase of static and propagating optical vortices.
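The phase information carried by Stokes measurements rests on the textbook relations S2 = 2 ax ay cos(delta) and S3 = 2 ax ay sin(delta) for a field with orthogonal components of amplitudes ax, ay and relative phase delta. A minimal sketch of recovering that intercomponent phase (this is the standard relation, not the authors' SLM pipeline):

```python
import numpy as np

# Example amplitudes and relative phase of the two polarization components.
ax, ay, delta = 1.0, 0.7, 0.9

# Stokes parameters S2 and S3 built from the component amplitudes and phase.
S2 = 2 * ax * ay * np.cos(delta)
S3 = 2 * ax * ay * np.sin(delta)

# The intercomponent phase is recovered from the measured Stokes values.
phase = np.arctan2(S3, S2)
```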

  15. [The infertile couple: between biology and psyche].

    PubMed

    Dudkiewicz-Sibony, Charlotte

    2012-09-01

    In this paper, we tried to show through some clinical examples that the body is not on one side and the psyche on the other. Assisted Reproductive Technology (ART), because of the field it covers, namely procreation, is a magnificent observatory of the interactions between body and psyche, desire and inhibition, request and mental block, hatred and love… Assisted reproduction techniques are often indicative of the links between these terms, and of the symbols at stake in the couples concerned. It is interesting to see how these sophisticated techniques work not only as a solution to infertility problems, which are plural and diverse, but as a revealer of human problems that we previously thought we could escape. Copyright © 2012. Published by Elsevier SAS.

  16. Limited receptive area neural classifier for recognition of swallowing sounds using continuous wavelet transform.

    PubMed

    Makeyev, Oleksandr; Sazonov, Edward; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Ed; Neuman, Michael

    2007-01-01

    In this paper we propose a sound recognition technique based on the limited receptive area (LIRA) neural classifier and the continuous wavelet transform (CWT). The LIRA neural classifier was developed as a multipurpose image recognition system. Previous tests of LIRA demonstrated good results in different image recognition tasks, including handwritten digit recognition, face recognition, metal surface texture recognition, and micro-workpiece shape recognition. We propose a sound recognition technique in which scalograms of sound instances serve as inputs to the LIRA neural classifier. The methodology was tested on the recognition of swallowing sounds. Swallowing sound recognition may be employed in systems for automated swallowing assessment and diagnosis of swallowing disorders. The experimental results suggest high efficiency and reliability of the proposed approach.
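The scalogram-as-input idea can be sketched with a hand-rolled continuous wavelet transform; this is a simplified illustration (real Morlet-like wavelet, synthetic tone instead of a swallowing sound), not the LIRA system itself:

```python
import numpy as np

def morlet(n, scale, omega0=5.0):
    """Real Morlet-like wavelet of length n at the given scale."""
    t = (np.arange(n) - n / 2.0) / scale
    return np.cos(omega0 * t) * np.exp(-0.5 * t**2) / np.sqrt(scale)

def scalogram(signal, scales, wavelet_len=128):
    """CWT magnitude: one row of wavelet responses per scale."""
    rows = [np.abs(np.convolve(signal, morlet(wavelet_len, s), mode="same"))
            for s in scales]
    return np.array(rows)  # shape: (len(scales), len(signal))

fs = 8000
t = np.arange(0, 0.1, 1.0 / fs)
frame = np.sin(2 * np.pi * 440 * t)  # stand-in for a sound instance
features = scalogram(frame, scales=np.geomspace(2, 64, 16)).ravel()
```

The flattened scalogram (`features`) would play the role of the image fed to the classifier.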

  17. Oil Spill Detection in Terma-Side-Looking Airborne Radar Images Using Image Features and Region Segmentation

    PubMed Central

    Alacid, Beatriz

    2018-01-01

    This work presents a method for oil-spill detection on Spanish coasts using aerial Side-Looking Airborne Radar (SLAR) images, which are captured using a Terma sensor. The proposed method uses grayscale image processing techniques to identify the dark spots that represent oil slicks on the sea. The approach is based on two steps. First, the noise regions caused by aircraft movements are detected and labeled in order to avoid the detection of false positives. Second, a segmentation process guided by a saliency-map technique is used to detect image regions that represent oil slicks. The results show that the proposed method is an improvement over previous approaches to this task when employing SLAR images. PMID:29316716
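The core dark-spot idea (oil slicks appear as low-backscatter, dark regions against brighter sea clutter) can be illustrated with a simple percentile threshold on a synthetic grayscale image; this toy sketch is our illustration, not the paper's saliency-guided pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic SLAR-like scene: bright sea clutter with one dark slick.
image = rng.normal(0.6, 0.05, size=(64, 64))
image[20:30, 10:40] = rng.normal(0.2, 0.02, size=(10, 30))

# Flag the darkest 10% of pixels as slick candidates; a real system
# would follow this with noise-region labeling and segmentation.
threshold = np.percentile(image, 10)
candidates = image < threshold
```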

  18. Arc-Welding Spectroscopic Monitoring based on Feature Selection and Neural Networks.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2008-10-21

    A new spectral processing technique designed for application in the on-line detection and classification of arc-welding defects is presented in this paper. A noninvasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld-defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in previous works, showing an improvement in the performance of the monitoring system.

  19. An overview of groundwater chemistry studies in Malaysia.

    PubMed

    Kura, Nura Umar; Ramli, Mohammad Firuz; Sulaiman, Wan Nor Azmin; Ibrahim, Shaharin; Aris, Ahmad Zaharin

    2018-03-01

    In this paper, numerous studies on groundwater in Malaysia were reviewed with the aim of evaluating past trends and the current status for discerning the sustainability of the water resources in the country. It was found that most of the previous groundwater studies (44 %) focused on the islands and mostly concentrated on qualitative assessment, with more emphasis placed on seawater intrusion studies. These were followed by inland-based studies, with Selangor state leading, reflecting the current water challenges facing the state. From a methodological perspective, geophysics, graphical methods, and statistical analysis are the dominant techniques (38 %, 25 %, and 25 %, respectively). The geophysical methods, especially the 2D resistivity method, cut across many subjects such as seawater intrusion studies, quantitative assessment, and hydraulic parameter estimation. The statistical techniques used include multivariate statistical analysis and ANOVA, among others, mostly in quality-related studies using major ions, in situ parameters, and heavy metals. Conversely, numerical techniques like MODFLOW were considerably less used, likely due to their inherent complexity and high data demand. This work will facilitate researchers in identifying the specific areas which need improvement and focus, while, at the same time, providing policymakers and managers with an executive summary and knowledge of the current situation in groundwater studies and where more work needs to be done for sustainable development.

  20. Vector Doppler: spatial sampling analysis and presentation techniques for real-time systems

    NASA Astrophysics Data System (ADS)

    Capineri, Lorenzo; Scabia, Marco; Masotti, Leonardo F.

    2001-05-01

    The aim of the vector Doppler (VD) technique is the quantitative reconstruction of a velocity field independently of the ultrasonic probe axis to flow angle. In particular, vector Doppler is interesting for studying vascular pathologies related to complex blood flow conditions. Clinical applications require a real-time operating mode and the capability to perform Doppler measurements over a defined volume. The combination of these two characteristics produces a real-time vector velocity map. In previous works the authors investigated the theory of pulsed wave (PW) vector Doppler and developed an experimental system capable of producing off-line 3D vector velocity maps. Afterwards, for producing dynamic velocity vector maps, we built a new 2D vector Doppler system based on a modified commercial echograph. The measurement and presentation of a vector velocity field require a correct spatial sampling that must satisfy the Shannon criterion. In this work we tackled this problem, establishing a relationship between sampling steps and scanning system characteristics. Another problem posed by the vector Doppler technique is the real-time presentation of the data, which should be easy for the physician to interpret. With this in mind, we attempted a multimedia solution that uses both interpolated images and sound to represent the information of the measured vector velocity map. These presentation techniques were tested in real-time scanning on flow phantoms and in preliminary measurements in vivo on a human carotid artery.
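The Shannon criterion invoked above constrains both the scan-grid spacing and the pulse timing. One simple reading (our hedged sketch with made-up numbers, not the paper's derived relationship) is that a spatial feature of size L can only be reconstructed if adjacent sample volumes are no more than L/2 apart, and that the pulse-repetition frequency must be at least twice the highest Doppler frequency present:

```python
def max_scan_step(smallest_feature_mm):
    """Spatial Shannon criterion: step between adjacent sample
    volumes must not exceed half the smallest feature of interest."""
    return smallest_feature_mm / 2.0

def min_prf(max_doppler_hz):
    """Temporal Shannon criterion: pulse-repetition frequency must
    be at least twice the highest Doppler frequency."""
    return 2.0 * max_doppler_hz

step = max_scan_step(1.2)   # e.g. resolving 1.2 mm flow structures
prf = min_prf(2000.0)       # e.g. 2 kHz peak Doppler shift
```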

  1. Total amino acid stabilization during cell-free protein synthesis reactions.

    PubMed

    Calhoun, Kara A; Swartz, James R

    2006-05-17

    Limitations in amino acid supply have been recognized as a substantial problem in cell-free protein synthesis reactions. Although enzymatic inhibitors and fed-batch techniques have been beneficial, the most robust way to stabilize amino acids is to remove the responsible enzymatic activities by genetically modifying the source strain used for cell extract preparation. Previous work showed this was possible for arginine, serine, and tryptophan, but cysteine degradation remained a major limitation in obtaining high protein synthesis yields. Through radiolabel techniques, we confirmed that cysteine degradation was caused by the activity of glutamate-cysteine ligase (gene gshA) in the cell extract. Next, we created Escherichia coli strain KC6 that combines a gshA deletion with previously described deletions for arginine, serine, and tryptophan stabilization. Strain KC6 grows well, and active cell extract can be produced from it for cell-free protein synthesis reactions. The extract from strain KC6 maintains stable amino acid concentrations of all 20 amino acids in a 3-h batch reaction. Yields for three different proteins improved 75-250% relative to cell-free expression using the control extract.

  2. Neural and Decision Theoretic Approaches for the Automated Segmentation of Radiodense Tissue in Digitized Mammograms

    NASA Astrophysics Data System (ADS)

    Eckert, R.; Neyhart, J. T.; Burd, L.; Polikar, R.; Mandayam, S. A.; Tseng, M.

    2003-03-01

    Mammography is the best non-invasive technique available for the early detection of breast cancer. The radiographic appearance of the female breast consists of radiolucent (dark) regions due to fat and radiodense (light) regions due to connective and epithelial tissue. The amount of radiodense tissue can be used as a marker for predicting breast cancer risk. Previously, we have shown that the use of statistical models is a reliable technique for segmenting radiodense tissue. This paper presents improvements in the model that allow for further development of an automated system for segmentation of radiodense tissue. The segmentation algorithm employs a two-step process. In the first step, tissue and non-tissue regions of a digitized X-ray mammogram image are identified using a radial basis function neural network. The second step uses a constrained Neyman-Pearson algorithm, developed especially for this research work, to determine the amount of radiodense tissue. Results obtained using the algorithm have been validated by comparison with estimates provided by a radiologist employing previously established methods.

  3. Great Basin vegetation response to groundwater fluctuation, climate variability, and previous land cultivation: The application of multitemporal measurements from remote sensing data to regional vegetation dynamics

    NASA Astrophysics Data System (ADS)

    Elmore, Andrew James

    The conversion of large natural basins to managed watersheds for the purpose of providing water to urban centers has had a negative impact on semiarid ecosystems worldwide. We view semiarid plant communities as being adapted to short, regular periods of drought. However, human-induced changes in the water balance often remove these systems from the historically established range of natural variability. This thesis explores vegetation changes over a 13-yr period for Owens Valley, in eastern California. Using remotely sensed measurements of vegetation cover, an extensive vegetation survey, field data and observations, precipitation records, and data on water table depth, I identify the key modes of response of xeric, phreatophytic, and exotic Great Basin plant communities. Three specific advancements were reached as a result of this work. (1) A change classification technique was developed to separate regions of land cover that were dependent on precipitation from regions dependent on groundwater. This technique utilized Spectral Mixture Analysis of annually acquired Landsat Thematic Mapper remote sensing data to retrieve regional estimates of percent vegetation cover. (2) A threshold response related to depth-to-water dependence was identified for phreatophytic Alkali Meadow communities. Plant communities that were subject to groundwater depths below this threshold exhibited greater invasion by precipitation-sensitive plants. (3) The floristic differences between previously cultivated and uncultivated land were found to account for an increased sensitivity of plant communities to precipitation variability. Through (2) and (3), two human influences (groundwater decline and previous land cultivation) were shown to alter land cover such that the land became more sensitive to precipitation change.
Climate change predictions include a component of increased climate variability for the western United States; therefore, these results place serious doubt on the sustainability of human activities in this region. The results from this work broadly cover topics from remote sensing techniques to the ecology of Great Basin plant communities and are applicable wherever large regions of land are being managed in an era of changing environmental conditions.
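The Spectral Mixture Analysis step used above models each observed pixel spectrum as a linear combination of endmember spectra, with the fitted fractions estimating cover. A toy linear-unmixing sketch (our illustration with made-up two-endmember, three-band spectra, not the thesis code):

```python
import numpy as np

# Endmember reflectance spectra, shape (bands, endmembers):
# column 0 = green vegetation, column 1 = bare soil (made-up values).
endmembers = np.array([[0.05, 0.30, 0.45],
                       [0.20, 0.25, 0.30]]).T

# A pixel that is 40% vegetation and 60% soil.
true_frac = np.array([0.4, 0.6])
pixel = endmembers @ true_frac

# Least-squares unmixing recovers the cover fractions.
frac, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
```

Real SMA additionally constrains fractions to be non-negative and sum to one; this sketch omits those constraints for brevity.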

  4. Discovering functional interdependence relationship in PPI networks for protein complex identification.

    PubMed

    Lam, Winnie W M; Chan, Keith C C

    2012-04-01

    Protein molecules interact with each other in protein complexes to perform many vital functions, and different computational techniques have been developed to identify protein complexes in protein-protein interaction (PPI) networks. These techniques search for subgraphs of high connectivity in PPI networks under the assumption that the proteins in a protein complex are highly interconnected. While these techniques have been shown to be quite effective, the matching rate between the protein complexes they discover and those previously determined experimentally can be relatively low, and the "false-alarm" rate can be relatively high. This is especially the case when the assumption that proteins in protein complexes are more highly interconnected does not hold. To increase the matching rate and reduce the false-alarm rate, we have developed a technique that can work effectively without making this assumption. The technique, called protein complex identification by discovering functional interdependence (PCIFI), searches for protein complexes in PPI networks by taking into consideration both the functional interdependence relationship between protein molecules and the topology of the network. PCIFI works in several steps. The first step is to construct a multiple-function protein network graph by labeling each vertex with one or more of the molecular functions it performs. The second step is to filter out protein interactions between protein pairs that are not functionally interdependent of each other in the statistical sense. The third step is to make use of an information-theoretic measure to determine the strength of the functional interdependence between all remaining interacting protein pairs. Finally, the last step is to form protein complexes based on the measured strength of functional interdependence and the connectivity between proteins. 
For performance evaluation, PCIFI was used to identify protein complexes in real PPI network data and the protein complexes it found were matched against those that were previously known in MIPS. The results show that PCIFI can be an effective technique for the identification of protein complexes. The protein complexes it found can match more known protein complexes with a smaller false-alarm rate and can provide useful insights into the understanding of the functional interdependence relationships between proteins in protein complexes.
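One common information-theoretic measure for the kind of function co-occurrence scoring described in the PCIFI steps is pointwise mutual information (PMI). The sketch below is our loose simplification of that idea on a toy edge list, not the published algorithm:

```python
import math
from collections import Counter

# Toy interacting pairs, each labeled by the molecular function of
# the two endpoints (hypothetical labels for illustration only).
edges = [("kinase", "substrate"), ("kinase", "substrate"),
         ("kinase", "scaffold"), ("transport", "scaffold")]

pair_counts = Counter(edges)
left = Counter(a for a, _ in edges)
right = Counter(b for _, b in edges)
n = len(edges)

def pmi(a, b):
    """Pointwise mutual information of function pair (a, b):
    log2 of observed pair frequency over the independence baseline."""
    return math.log2((pair_counts[(a, b)] / n) /
                     ((left[a] / n) * (right[b] / n)))

# Keep only edges whose function pair co-occurs more than chance.
strong_edges = [e for e in set(edges) if pmi(*e) > 0]
```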

  5. Interventions to improve work outcomes in work-related PTSD: a systematic review

    PubMed Central

    2011-01-01

    Background Posttraumatic stress disorder acquired at work can be debilitating both for workers and their employers. The disorder can result in increased sick leave, reduced productivity, and even unemployment. Furthermore, workers are especially unlikely to return to their previous place of employment after a traumatic incident at work because of the traumatic memories and symptoms of avoidance that typically accompany the disorder. Therefore, intervening in work-related PTSD becomes especially important in order to get workers back to the workplace. Methods A systematic literature search was conducted using Medline, PsycINFO, Embase, and Web of Science. The articles were independently screened based on inclusion and exclusion criteria, followed by a quality assessment of all included articles. Results The systematic search identified seven articles for inclusion in the review. These consisted of six research articles and one systematic review. The review focused specifically on interventions using real exposure techniques for anxiety disorders in the workplace. In the research articles addressed in the current review, study populations included police officers, public transportation workers, and employees injured at work. The studies examined the effectiveness of EMDR, cognitive-behavioural techniques, and an integrative therapy approach called brief eclectic psychotherapy. Interestingly, 2 of the 6 research articles addressed add-on treatments for workplace PTSD, which were designed to treat workers with PTSD who failed to respond to traditional evidence-based psychotherapy. Conclusions Results of the current review suggest that work-related interventions show promise as effective strategies for promoting return to work in employees who acquired PTSD in the workplace. Further research is needed in this area to determine how different occupational groups with specific types of traumatic exposure might respond differently to work-tailored treatments. 
PMID:22040066

  6. Towards the Verification of Human-Robot Teams

    NASA Technical Reports Server (NTRS)

    Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.

    2005-01-01

    Human-agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices also fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.

  7. Reviewing the connection between speech and obstructive sleep apnea.

    PubMed

    Espinoza-Cuadros, Fernando; Fernández-Pozo, Rubén; Toledano, Doroteo T; Alcázar-Ramírez, José D; López-Gonzalo, Eduardo; Hernández-Gómez, Luis A

    2016-02-20

    Obstructive sleep apnea (OSA) is a common sleep disorder characterized by recurring breathing pauses during sleep caused by a blockage of the upper airway (UA). The altered UA structure or function in OSA speakers has led to the hypothesis that speech can be analyzed automatically for OSA assessment. In this paper we critically review several approaches using speech analysis and machine learning techniques for OSA detection, and discuss the limitations that can arise when using machine learning techniques for diagnostic applications. A large speech database including 426 male Spanish speakers suspected of suffering from OSA and referred to a sleep disorders unit was used to study the clinical validity of several proposals using machine learning techniques to predict the apnea-hypopnea index (AHI) or classify individuals according to their OSA severity. The AHI describes the severity of the patient's condition. We first evaluate AHI prediction using state-of-the-art speaker recognition technologies: speech spectral information is modelled using supervector or i-vector techniques, and the AHI is predicted through support vector regression (SVR). Using the same database we then critically review several OSA classification approaches previously proposed. The influence and possible interference of other clinical variables or characteristics available for our OSA population (age, height, weight, body mass index, and cervical perimeter) are also studied. The poor results obtained when estimating AHI using supervectors or i-vectors followed by SVR contrast with the positive results reported by previous research. This fact prompted us to a careful review of these approaches, also testing some reported results over our database. Several methodological limitations and deficiencies were detected that may have led to overoptimistic results. 
The methodological deficiencies observed after critically reviewing previous research can be relevant examples of potential pitfalls when using machine learning techniques for diagnostic applications. We have found two common limitations that can explain the likelihood of false discovery in previous research: (1) the use of prediction models derived from sources, such as speech, which are also correlated with other patient characteristics (age, height, sex,…) that act as confounding factors; and (2) overfitting of feature selection and validation methods when working with a high number of variables compared to the number of cases. We hope this study will not only serve as a useful example of relevant issues when using machine learning for medical diagnosis, but will also help guide further research on the connection between speech and OSA.
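The second pitfall, overfitting of feature selection when variables vastly outnumber cases, can be made concrete in a few lines. The sketch below (entirely synthetic data, not the paper's experiments) contrasts the leaky practice of selecting features on the full data set with the honest practice of selecting on the training split only:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(60, 500))  # many noise features, few cases
y = rng.normal(size=60)         # outcome genuinely unrelated to X

train, test = np.arange(40), np.arange(40, 60)

def top_corr_features(Xs, ys, k=10):
    """Indices of the k features most correlated with the outcome."""
    corr = np.abs([np.corrcoef(Xs[:, j], ys)[0, 1]
                   for j in range(Xs.shape[1])])
    return np.argsort(corr)[-k:]

# Wrong: selecting on all data lets test labels leak into the model,
# so downstream validation scores are optimistically biased.
leaky = top_corr_features(X, y)

# Right: select on the training split only; the test split stays unseen.
honest = top_corr_features(X[train], y[train])
```

With pure noise data, the "leaky" features still look predictive on the held-out cases they were partly chosen from, which is exactly the false-discovery mechanism described above.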

  8. Demographic management in a federated healthcare environment.

    PubMed

    Román, I; Roa, L M; Reina-Tosina, J; Madinabeitia, G

    2006-09-01

    The purpose of this paper is to provide a further step toward the decentralization of identification and demographic information about persons by solving issues related to the integration of demographic agents in a federated healthcare environment. The aim is to identify a particular person in every system of a federation and to obtain a unified view of his/her demographic information stored in different locations. This work is based on semantic models and techniques, and pursues the reconciliation of several current standardization efforts, including ITU-T's Open Distributed Processing, CEN's prEN 12967, OpenEHR's dual and reference models, CEN's General Purpose Information Components, and CORBAmed's PID service. We propose a new paradigm for the management of person identification and demographic data, based on the development of an open architecture of specialized distributed components together with the incorporation of techniques for the efficient management of domain ontologies, in order to provide a federated demographic service. This new service enhances previous correlation solutions, drawing on ideas from different standards and domains such as semantic techniques and database systems. The federation philosophy forces us to devise solutions to the semantic, functional, and instance incompatibilities in our approach. Although this work is based on several models and standards, we have improved on them by combining their contributions and developing a federated architecture that does not require the centralization of demographic information. The solution is thus a good approach for facing integration problems, and the applied methodology can be easily extended to other tasks involved in the healthcare organization.

  9. Mitigation of PID in commercial PV modules using current interruption method

    NASA Astrophysics Data System (ADS)

    Bora, Birinchi; Oh, Jaewon; Tatapudi, Sai; Sastry, Oruganty S.; Kumar, Rajesh; Prasad, Basudev; Tamizhmani, Govindasamy

    2017-08-01

    Potential-induced degradation (PID) is known to have a very severe effect on the reliability of PV modules. PID is caused by the leakage of current from the cell circuit to the grounded frame under the humid conditions of high-voltage photovoltaic (PV) systems. There are multiple paths for the current leakage. The most dominant leakage path is from the cell to the frame through the encapsulant, the glass bulk, and the glass surface. This dominant path can be prevented by interrupting the electrical conductivity at the glass surface. In our previous works on this topic, we demonstrated the effectiveness of the glass surface conductivity interruption technique using one-cell PV coupons. In this work, we demonstrate the effectiveness of this technique using a full-size commercial module susceptible to PID. The interruption of the surface conductivity of the commercial module was achieved by attaching narrow, thin, flexible glass strips from Corning, called Willow Glass, to the glass surface along the inner edges of the frame. The flexible glass strip was attached to the module glass surface by heating the glass strip with an ionomer adhesive underneath using a handheld heat gun. The PID stress test was performed at 60°C and 85% RH for 96 hours at -600 V. Pre- and post-PID characterizations, including I-V and electroluminescence, were carried out to determine the performance loss and affected cell areas. This work demonstrates that the PID issue can be effectively addressed by using this current interruption technique. An important benefit of this approach is that the interruption technique can be applied after manufacturing the modules, and even after installing the modules in the field.

  10. Quantitative chemical tagging, stellar ages and the chemo-dynamical evolution of the Galactic disc

    NASA Astrophysics Data System (ADS)

    Mitschang, A. W.; De Silva, G.; Zucker, D. B.; Anguiano, B.; Bensby, T.; Feltzing, S.

    2014-03-01

    The early science results from the new generation of high-resolution stellar spectroscopic surveys, such as Galactic Archaeology with HERMES (GALAH) and the Gaia European Southern Observatory survey (Gaia-ESO), will represent major milestones in the quest to chemically tag the Galaxy. Yet this technique to reconstruct dispersed coeval stellar groups has remained largely untested until recently. We build on previous work that developed an empirical chemical tagging probability function, which describes the likelihood that two field stars are conatal, that is, they were formed in the same cluster environment. In this work, we perform the first ever blind chemical tagging experiment, i.e. tagging stars with no known or otherwise discernible associations, on a sample of 714 disc field stars with a number of high-quality high-resolution homogeneous metal abundance measurements. We present evidence that chemical tagging of field stars does identify coeval groups of stars, yet these groups may not represent distinct formation sites, e.g. as in dissolved open clusters, as previously thought. Our results point to several important conclusions, among them that group finding will be limited strictly to chemical abundance space, e.g. stellar ages, kinematics, colours, temperature and surface gravity do not enhance the detectability of groups. We also demonstrate that in addition to its role in probing the chemical enrichment and kinematic history of the Galactic disc, chemical tagging represents a powerful new stellar age determination technique.
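Chemical tagging groups stars by proximity in abundance space. As a hedged sketch of that underlying idea (a plain root-mean-square abundance distance with made-up values, not the paper's calibrated empirical probability function):

```python
import numpy as np

def chem_distance(star_a, star_b):
    """RMS distance between two stars over shared abundance
    measurements [X/Fe]; smaller values suggest candidate
    conatal pairs in a chemical tagging search."""
    a, b = np.asarray(star_a), np.asarray(star_b)
    return np.sqrt(np.mean((a - b) ** 2))

# Hypothetical [Mg/Fe], [Si/Fe], [Ni/Fe] values (dex) for two stars.
star_1 = [0.00, 0.02, -0.01]
star_2 = [0.01, 0.00, 0.00]
d = chem_distance(star_1, star_2)
```

A group finder would then cluster stars whose pairwise distances fall below a threshold set by the abundance measurement uncertainties.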

  11. Perception of the importance of human-animal interactions on cattle flow and worker safety on Minnesota dairy farms.

    PubMed

    Sorge, U S; Cherry, C; Bender, J B

    2014-07-01

    Proper cattle-handling techniques (stockmanship) are important to ensure calm animals and a safe work environment for dairy workers on farm. The objectives of this study were to (1) assess Minnesota dairy herd owners' attitudes toward stockmanship, its perceived importance for cow comfort and worker health, and the establishment of calm cattle movement; and (2) identify current resources and methods of stockmanship training on Minnesota dairy farms. A stratified-random sample of Minnesota dairy farmers was contacted via mail to participate in a 28-question survey. One hundred eight bovine dairy producers participated. Most commonly, respondents learned their cattle-handling skills from family members (42.6%), and 29.9% of producers had participated in previous stockmanship training. Producers thought that the skill of the human handler was the most important factor in establishing good cattle flow. Cattle-handling techniques were the third most common topic for new-employee orientation, after training in milking parlor protocols and milking parlor disinfection. Time limitations and language barriers were considered serious challenges for worker training. Work-related injuries were responsible for lost work days in the previous year in 13.3% of dairy herds, and 73.3% of those injuries occurred while working with cattle. Producers perceived that cattle-related injuries were predominantly the handler's fault: either because of not paying enough attention to the animal or due to poor cattle-handling skills. Facility design was considered the least important factor in the occurrence of worker injuries. Although no causal inference can be made, herds with workers who had previously participated in stockmanship training had an 810 ± 378 kg (mean ± standard error of the mean) higher rolling herd average than those that did not, even after adjusting for herd size and bulk tank somatic cell count. However, 50% of respondents were not interested in attending future stockmanship training sessions. In conclusion, cattle-handling skills are considered important by Minnesota dairy producers to ensure worker safety and cow flow. Limited availability of time, language barriers, and a perceived lack of training materials were considered challenges during the training of workers on farms. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. Characterization of tabique walls nails of the Alto Douro Wine Region

    NASA Astrophysics Data System (ADS)

    Cardoso, Rui; Pinto, Jorge; Paiva, Anabela; Lanzinha, João Carlos

    2016-11-01

    Tabique is one of the main Portuguese traditional building techniques, which uses raw materials such as stone, earth, and wood. In general, a tabique building component such as a wall consists of a wooden structure made up of vertical boards connected to laths by metal nails and covered on both sides by an earth-based material. This traditional building technology has a marked incidence in the Alto Douro Wine Region, located in the interior of Northern Portugal and added to UNESCO's World Heritage Sites List in December 2001 as an 'evolved continuing cultural landscape'. Furthermore, previous research works have shown that the existing tabique construction in this region reveals a certain lack of maintenance, partially explained by the loss of knowledge of this technique; consequently, much of this construction presents an advanced stage of deterioration. This aspect, together with the fact that there is still a lack of scientific studies in this field, motivated this paper, whose main objectives are to identify and characterize the nails used in the timber connections. The nail samples were collected from tabique walls of buildings located in Lamego Municipality, near the Douro River, in the Alto Douro Wine Region. This work also intends to give guidelines for the rehabilitation and preservation of this important legacy.

  13. Model-independent limits and constraints on extended theories of gravity from cosmic reconstruction techniques

    NASA Astrophysics Data System (ADS)

    de la Cruz-Dombriz, Álvaro; Dunsby, Peter K. S.; Luongo, Orlando; Reverberi, Lorenzo

    2016-12-01

    The onset of dark energy domination depends on the particular gravitational theory driving the cosmic evolution. Model-independent techniques are crucial for testing both the present ΛCDM cosmological paradigm and alternative theories, while making the least possible number of assumptions about the Universe. In this paper we investigate whether cosmography is able to distinguish between different gravitational theories, by determining bounds on model parameters for three different extensions of General Relativity, namely quintessence, F(𝒯) and f(R) gravitational theories. We expand each class of theories in powers of redshift z around the present time, making no additional assumptions. This procedure is an extension of previous work and can be seen as the most general approach for testing extended theories of gravity through the use of cosmography. In the case of F(𝒯) and f(R) theories, we show that some assumptions on model parameters often made in previous works are superfluous or even unjustified. We use data from the Union 2.1 supernovae catalogue, baryonic acoustic oscillation data and H(z) differential age compilations, which probe cosmology on different scales of the cosmological evolution. We perform a Monte Carlo analysis using a Metropolis-Hastings algorithm with a Gelman-Rubin convergence criterion, reporting 1-σ and 2-σ confidence levels. To do so, we perform two distinct fits, first assuming only data within z < 1 and then without limitations in redshift. We obtain the corresponding numerical intervals in which the coefficients span, and find that the data are compatible with the ΛCDM limit of all three theories at the 1-σ level, while still compatible with quite a large portion of parameter space. We compare our results to the truncated ΛCDM paradigm, showing that our bounds deviate from the expectations of previous works: the permitted regions of the coefficients are significantly modified, and in general widened, with respect to values usually reported in the existing literature. Finally, we test the extended theories through the Bayesian selection criteria AIC and BIC.
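The sampling machinery mentioned here, Metropolis-Hastings chains checked with a Gelman-Rubin statistic, can be sketched on a toy one-parameter posterior. The Gaussian "likelihood" and its centre below are placeholders, not the paper's actual cosmographic fit:

```python
import math
import random
import statistics

def log_post(q0):
    # Toy Gaussian posterior for a single cosmographic coefficient,
    # centred on an invented best-fit value -0.55 with sigma = 0.1.
    return -0.5 * ((q0 + 0.55) / 0.1) ** 2

def mh_chain(start, n, step=0.05, seed=0):
    """Random-walk Metropolis-Hastings chain of length n."""
    rng = random.Random(seed)
    x, lp = start, log_post(start)
    samples = []
    for _ in range(n):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject step
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat; values near 1 indicate
    that the between-chain spread matches the within-chain variance."""
    n = len(chains[0])
    means = [statistics.fmean(c) for c in chains]
    within = statistics.fmean(statistics.variance(c) for c in chains)
    between = n * statistics.variance(means)
    var_hat = (n - 1) / n * within + between / n
    return math.sqrt(var_hat / within)

# Three over-dispersed starting points, burn-in discarded before the check.
chains = [mh_chain(s, 5000, seed=i)[1000:] for i, s in enumerate((-1.0, 0.0, -0.5))]
```

Confidence intervals at 1-σ and 2-σ would then be read off the pooled post-burn-in samples once R-hat is close to 1.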

  14. Model-independent limits and constraints on extended theories of gravity from cosmic reconstruction techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cruz-Dombriz, Álvaro de la; Dunsby, Peter K.S.; Luongo, Orlando

    The onset of dark energy domination depends on the particular gravitational theory driving the cosmic evolution. Model-independent techniques are crucial for testing both the present ΛCDM cosmological paradigm and alternative theories, while making the least possible number of assumptions about the Universe. In this paper we investigate whether cosmography is able to distinguish between different gravitational theories, by determining bounds on model parameters for three different extensions of General Relativity, namely quintessence, F(Τ) and f(R) gravitational theories. We expand each class of theories in powers of redshift z around the present time, making no additional assumptions. This procedure is an extension of previous work and can be seen as the most general approach for testing extended theories of gravity through the use of cosmography. In the case of F(Τ) and f(R) theories, we show that some assumptions on model parameters often made in previous works are superfluous or even unjustified. We use data from the Union 2.1 supernovae catalogue, baryonic acoustic oscillation data and H(z) differential age compilations, which probe cosmology on different scales of the cosmological evolution. We perform a Monte Carlo analysis using a Metropolis-Hastings algorithm with a Gelman-Rubin convergence criterion, reporting 1-σ and 2-σ confidence levels. To do so, we perform two distinct fits, first assuming only data within z < 1 and then without limitations in redshift. We obtain the corresponding numerical intervals in which the coefficients span, and find that the data are compatible with the ΛCDM limit of all three theories at the 1-σ level, while still compatible with quite a large portion of parameter space. We compare our results to the truncated ΛCDM paradigm, showing that our bounds deviate from the expectations of previous works: the permitted regions of the coefficients are significantly modified, and in general widened, with respect to values usually reported in the existing literature. Finally, we test the extended theories through the Bayesian selection criteria AIC and BIC.

  15. Energy-based dosimetry of low-energy, photon-emitting brachytherapy sources

    NASA Astrophysics Data System (ADS)

    Malin, Martha J.

    Model-based dose calculation algorithms (MBDCAs) for low-energy, photon-emitting brachytherapy sources have advanced to the point where the algorithms may be used in clinical practice. Before these algorithms can be used, a methodology must be established to verify the accuracy of the source models used by the algorithms. Additionally, the source strength metric for these algorithms must be established. This work explored the feasibility of verifying the source models used by MBDCAs by measuring the differential photon fluence emitted from the encapsulation of the source. The measured fluence could be compared to that modeled by the algorithm to validate the source model. This work examined how the differential photon fluence varied with position and angle of emission from the source, and the resolution that these measurements would require for dose computations to be accurate to within 1.5%. Both the spatial and angular resolution requirements were determined. The techniques used to determine the resolution required for measurements of the differential photon fluence were applied to determine why dose-rate constants determined using a spectroscopic technique disagreed with those computed using Monte Carlo techniques. The discrepancy between the two techniques had been previously published, but the cause of the discrepancy was not known. This work determined the impact that some of the assumptions used by the spectroscopic technique had on the accuracy of the calculation. The assumption of isotropic emission was found to cause the largest discrepancy in the spectroscopic dose-rate constant. Finally, this work improved the instrumentation used to measure the rate at which energy leaves the encapsulation of a brachytherapy source. This quantity is called emitted power (EP), and is presented as a possible source strength metric for MBDCAs. A calorimeter that measured EP was designed and built. 
The theoretical framework that the calorimeter relied upon to measure EP was established. Four clinically relevant 125I brachytherapy sources were measured with the instrument. The accuracy of the measured EP was compared to an air-kerma strength-derived EP to test the accuracy of the instrument. The instrument was accurate to within 10%, with three out of the four source measurements accurate to within 4%.

  16. A rapid Orthopoxvirus purification protocol suitable for high-containment laboratories.

    PubMed

    Hughes, Laura; Wilkins, Kimberly; Goldsmith, Cynthia S; Smith, Scott; Hudson, Paul; Patel, Nishi; Karem, Kevin; Damon, Inger; Li, Yu; Olson, Victoria A; Satheshkumar, P S

    2017-05-01

    Virus purification in a high-containment setting provides unique challenges due to barrier precautions and operational safety approaches that are not necessary in lower biosafety level (BSL) 2 environments. The need for high risk group pathogen diagnostic assay development, anti-viral research, pathogenesis and vaccine efficacy research necessitates work in BSL-3 and BSL-4 labs with infectious agents. When this work is performed in accordance with BSL-4 practices, modifications are often required in standard protocols. Classical virus purification techniques are difficult to execute in a BSL-3 or BSL-4 laboratory because of the work practices used in these environments. Orthopoxviruses are a family of viruses that, in some cases, require work in a high-containment laboratory and, due to their size, do not lend themselves to simpler purification methods. The current CDC purification technique for orthopoxviruses uses 1,1,2-trichlorotrifluoroethane, commonly known as Genetron®. Genetron® is a chlorofluorocarbon (CFC) that has been shown to be damaging to the ozone layer and has been phased out; the limited remaining supply makes it no longer a feasible option for poxvirus purification. Here we demonstrate a new Orthopoxvirus purification method that is suitable for high-containment laboratories and produces virus that is not only comparable to previous purification methods, but improves on purity and yield. Published by Elsevier B.V.

  17. Three-dimensional rendering of segmented object using matlab - biomed 2010.

    PubMed

    Anderson, Jeffrey R; Barrett, Steven F

    2010-01-01

    The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work described a semi-automatic segmentation process for fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux, which limited the usefulness of the segmentation and rendering applications to those who have both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and to merge these capabilities with the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.
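The slice-reassembly step described above can be sketched outside Matlab as well. The following toy (synthetic masks, hypothetical helper names) stacks 2D binary slice images into a 3D volume and lists the object's voxels, the raw input a surface-rendering step would then work from:

```python
def stack_slices(slices):
    """Reassemble a sequence of 2D binary masks (one per confocal slice)
    into a 3D volume, here a nested list indexed [z][y][x]."""
    return [list(map(list, s)) for s in slices]

def object_voxels(volume):
    """Coordinates of every voxel belonging to the segmented object."""
    return [(z, y, x)
            for z, sl in enumerate(volume)
            for y, row in enumerate(sl)
            for x, v in enumerate(row) if v]

# Three 4x4 slices of a small synthetic object.
masks = [
    [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]],
    [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]],
    [[0, 0, 0, 0], [0, 0, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]],
]
vox = object_voxels(stack_slices(masks))
print(len(vox))  # 10 voxels across the three slices
```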

  18. Generalized Robertson-Walker Space-Time Admitting Evolving Null Horizons Related to a Black Hole Event Horizon.

    PubMed

    Duggal, K L

    2016-01-01

    A new technique is used to study a family of time-dependent null horizons, called "Evolving Null Horizons" (ENHs), of generalized Robertson-Walker (GRW) space-time [Formula: see text] such that the metric [Formula: see text] satisfies a kinematic condition. This work differs from our earlier papers on the same issue, where we used (1 + n)-splitting space-time, but only some special subcases of GRW space-time have this formalism. Also, in contrast to previous work, we have proved that each member of the ENH family is totally umbilical in [Formula: see text]. Finally, we show that there exists an ENH which is always a null horizon evolving into a black hole event horizon and suggest some open problems.
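For reference, a generalized Robertson-Walker space-time is the warped product of an interval of cosmological time with a Riemannian manifold. In the standard notation of the GRW literature (the symbols below are conventional, not taken from the abstract's suppressed formulas):

```latex
% GRW space-time \bar{M} = I \times_f M^*: an interval I of cosmological
% time warped over an n-dimensional Riemannian manifold (M^*, g^*)
% by a smooth warping function f(t) > 0.
ds^2 \;=\; -\,dt^2 \;+\; f(t)^2\, g^*_{ab}\, dx^a\, dx^b
```

The Robertson-Walker metrics of cosmology are the special case where $(M^*, g^*)$ has constant curvature.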

  19. A comparison of design variables for control theory based airfoil optimization

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony

    1995-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
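Of the two parameterizations compared, the Hicks-Henne "sine bump" functions have a simple closed form: each bump peaks at a chosen chord location and vanishes at the leading and trailing edges, so a design is a linear combination of bumps added to a baseline profile. A minimal sketch (amplitudes, widths and the flat baseline are illustrative choices, not values from the paper):

```python
import math

def hicks_henne(x, loc, width=3.0, amp=1.0):
    """Hicks-Henne sine bump on [0, 1]: peaks at x = loc, zero at both ends."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    m = math.log(0.5) / math.log(loc)  # exponent placing the peak at loc
    return amp * math.sin(math.pi * x ** m) ** width

def perturb_airfoil(xs, base_ys, amps, locs):
    """Add a linear combination of bumps to a baseline surface."""
    return [y + sum(hicks_henne(x, t, amp=a) for a, t in zip(amps, locs))
            for x, y in zip(xs, base_ys)]

# Two bumps (design variables) applied to a flat baseline of 51 points.
xs = [i / 50 for i in range(51)]
ys = perturb_airfoil(xs, [0.0] * 51, amps=[0.01, -0.005], locs=[0.3, 0.7])
```

In a control-theory optimization, the bump amplitudes (or, alternatively, B-spline control point positions) are the design variables updated from the adjoint gradient.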

  20. Only Image Based for the 3d Metric Survey of Gothic Structures by Using Frame Cameras and Panoramic Cameras

    NASA Astrophysics Data System (ADS)

    Pérez Ramos, A.; Robleda Prieto, G.

    2016-06-01

    Indoor Gothic apses provide a complex environment for virtualization using imaging techniques due to their light conditions and architecture. Light entering through large windows, in combination with the apse shape, makes it difficult to find proper conditions for photo capture for reconstruction purposes. Thus, documentation techniques based on images are usually replaced by scanning techniques inside churches. Nevertheless, the need to use Terrestrial Laser Scanning (TLS) for indoor virtualization means a significant increase in the final surveying cost. So, in most cases, scanning techniques are used to generate dense point clouds. However, many Terrestrial Laser Scanner (TLS) internal cameras are not able to provide colour images or cannot reach the image quality that can be obtained using an external camera. Therefore, external quality images are often used to build high-resolution textures for these models. This paper aims to solve the problem posed by virtualizing indoor Gothic churches, making the task more affordable by using exclusively image-based techniques. It reviews a previously proposed methodology using a DSLR camera with an 18-135 mm lens, commonly used for close-range photogrammetry, and adds another using an HDR 360° camera with four lenses that makes the task easier and faster in comparison with the previous one. Fieldwork and office work are simplified. The proposed methodology provides photographs in good enough conditions for building point clouds and textured meshes. Furthermore, the same imaging resources can be used to generate more deliverables without extra time spent in the field, for instance, immersive virtual tours. In order to verify the usefulness of the method, it was applied to the apse, since it is considered one of the most complex elements of Gothic churches, and it could be extended to the whole building.

  1. Stimulated Brillouin scattering continuous wave phase conjugation in step-index fiber optics.

    PubMed

    Massey, Steven M; Spring, Justin B; Russell, Timothy H

    2008-07-21

    Continuous wave (CW) stimulated Brillouin scattering (SBS) phase conjugation in step-index optical fibers was studied experimentally and modeled as a function of fiber length. A phase conjugate fidelity over 80% was measured from SBS in a 40 m fiber using a pinhole technique. Fidelity decreases with fiber length, and a fiber with a numerical aperture (NA) of 0.06 was found to generate good phase conjugation fidelity over longer lengths than a fiber with 0.13 NA. Modeling and experiment support previous work showing that the maximum interaction length which yields a high-fidelity phase conjugate beam is inversely proportional to the square of the fiber NA, but find that fidelity remains high over much longer fiber lengths than previous models calculated. Conditions for SBS beam cleanup in step-index fibers are discussed.
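The quoted scaling, maximum high-fidelity interaction length inversely proportional to the square of the NA, implies that the 0.06 NA fiber should sustain fidelity over roughly (0.13/0.06)² ≈ 4.7 times the length of the 0.13 NA fiber. A one-liner makes the ratio explicit (the proportionality constant k is a placeholder, not a measured value):

```python
def max_interaction_length(na, k=0.15):
    """Hypothetical maximum high-fidelity length in metres, L_max = k / NA^2.
    Only the 1/NA^2 scaling is taken from the abstract; k is illustrative."""
    return k / na ** 2

ratio = max_interaction_length(0.06) / max_interaction_length(0.13)
print(ratio)  # independent of k: (0.13 / 0.06)^2
```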

  2. Production of oscillatory flow in wind tunnels

    NASA Astrophysics Data System (ADS)

    Al-Asmi, K.; Castro, I. P.

    1993-06-01

    A method for producing oscillatory flow in open-circuit wind tunnels driven by centrifugal fans is described. Performance characteristics of a new device installed on two such tunnels of greatly differing size are presented. It is shown that sinusoidal variations of the working section flow, having peak-to-peak amplitudes up to at least 30 percent of the mean flow speed and frequencies up to, typically, that corresponding to the acoustic quarter-wave-length frequency determined by the tunnel size, can be obtained with negligible harmonic distortion or acoustic noise difficulties. A brief review of the various methods that have been used previously is included, and the advantages and disadvantages of these different techniques are highlighted. The present technique seems to represent a significant improvement over many of them.
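The acoustic quarter-wavelength frequency mentioned above follows from the standing-wave condition for a duct effectively open at one end and closed at the other, f = c/(4L). A quick sketch (assuming the usual speed of sound in air; the 10 m length is an illustrative tunnel dimension, not one from the paper):

```python
def quarter_wave_frequency(duct_length_m, c=343.0):
    """Acoustic quarter-wavelength resonance f = c / (4 L) for a duct of
    length L, with c the speed of sound (343 m/s in air at ~20 degC)."""
    return c / (4.0 * duct_length_m)

print(quarter_wave_frequency(10.0))  # ~8.6 Hz for a 10 m duct
```

Larger tunnels therefore hit this resonance at lower frequencies, consistent with the upper forcing frequency being set by tunnel size.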

  3. Taylor impact of glass bars

    NASA Astrophysics Data System (ADS)

    Murray, Natalie; Bourne, Neil; Field, John

    1997-07-01

    Brar and Bless pioneered the use of plate impact upon bars as a technique for investigating the 1D stress loading of glass. We wish to extend this technique by applying VISAR and embedded stress gauge measurements to a symmetrical version of the test. In this configuration two rods impact one upon the other in a symmetrical version of the Taylor test geometry in which the impact is perfectly rigid in the centre of mass frame. Previous work in the laboratory has characterised the three glass types (float, borosilicate and a high density lead glass). These experiments will identify the 1D stress failure mechanisms from high-speed photography and the stress and particle velocity histories will be interpreted in the light of these results. The differences in response of the three glasses will be highlighted.

  4. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets

    PubMed Central

    2012-01-01

    Background While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. 
While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
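The first stage of such a pipeline, classifying 2D image patches, begins by tiling each slice into fixed-size windows that a classifier then labels as mitochondria-like or not. A minimal, library-free sketch of that patch-extraction step (window size and stride are arbitrary choices here):

```python
def extract_patches(image, size, stride):
    """Slide a size x size window over a 2D image (list of lists of pixel
    values), returning (row, col, patch) tuples for a patch classifier."""
    h, w = len(image), len(image[0])
    patches = []
    for r in range(0, h - size + 1, stride):
        for c in range(0, w - size + 1, stride):
            patch = [row[c:c + size] for row in image[r:r + size]]
            patches.append((r, c, patch))
    return patches

# A synthetic 8x8 "slice" with a repeating texture.
img = [[(r * 7 + c) % 5 for c in range(8)] for r in range(8)]
pts = extract_patches(img, size=4, stride=2)
print(len(pts))  # 3 x 3 = 9 overlapping windows
```

In the Cytoseg-style pipeline, each extracted patch would be fed to a random-forest classifier, with the resulting per-pixel probabilities passed on to the contour-pair and level-set stages.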

  5. New techniques for positron emission tomography in the study of human neurological disorders: Progress report, 15 June 1992--31 October 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-01-01

    During the past six months, we have continued work on the fronts of kinetic modeling of radioligands for studying neurotransmitter/receptor systems, iterative reconstruction techniques, and methodology for PET cerebral blood flow activation studies. Initial human PET studies have been performed and analyzed with many different kinetic model formulations to determine the quantitative potential of the neurotransmitter/receptor ligand ({sup 11}C)N-methyl piperidyl benzilate (NMPB), a muscarinic cholinergic antagonist. In addition, initial human studies were performed using ({sup 11}C)tetrabenazine (TBZ), a marker for monoamine nerve terminal density. Results of the NMPB studies have indicated that this new agent yields better estimates of receptor density than previous muscarinic ligands developed at our facility, ({sup 11}C)-TRB and ({sup 11}C)scopolamine. TRB and scopolamine have previously been shown to be only partially successful ligands due to sub-optimal values of the individual rate constants, causing varying degrees of flow limitation. This is found to be much less of a problem for NMPB due to the 2.0--2.5 fold increase in ligand transport observed in the human studies ({approximately}60% first pass extraction). A 2-parameter 2-compartment simplification had previously been implemented for the benzodiazepine ligand, (C-11)FMZ, and a similar model appears to be suitable for TBZ based on the preliminary human data.

  6. New techniques for positron emission tomography in the study of human neurological disorders: Progress report, 15 June 1992--31 October 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-10-01

    During the past six months, we have continued work on the fronts of kinetic modeling of radioligands for studying neurotransmitter/receptor systems, iterative reconstruction techniques, and methodology for PET cerebral blood flow activation studies. Initial human PET studies have been performed and analyzed with many different kinetic model formulations to determine the quantitative potential of the neurotransmitter/receptor ligand [{sup 11}C]N-methyl piperidyl benzilate (NMPB), a muscarinic cholinergic antagonist. In addition, initial human studies were performed using [{sup 11}C]tetrabenazine (TBZ), a marker for monoamine nerve terminal density. Results of the NMPB studies have indicated that this new agent yields better estimates of receptor density than previous muscarinic ligands developed at our facility, [{sup 11}C]-TRB and [{sup 11}C]scopolamine. TRB and scopolamine have previously been shown to be only partially successful ligands due to sub-optimal values of the individual rate constants, causing varying degrees of flow limitation. This is found to be much less of a problem for NMPB due to the 2.0--2.5 fold increase in ligand transport observed in the human studies ({approximately}60% first pass extraction). A 2-parameter 2-compartment simplification had previously been implemented for the benzodiazepine ligand, [C-11]FMZ, and a similar model appears to be suitable for TBZ based on the preliminary human data.

  7. Mapping Topological Magnetization and Magnetic Skyrmions

    NASA Astrophysics Data System (ADS)

    Chess, Jordan J.

    A 2014 study by the US Department of Energy conducted at Lawrence Berkeley National Laboratory estimated that U.S. data centers consumed 70 billion kWh of electricity. This represents about 1.8% of the total U.S. electricity consumption. Putting this in perspective, 70 billion kWh of electricity is equivalent to the output of roughly 8 large nuclear reactors, or around double the nation's solar panel output. Developing new memory technologies capable of reducing this power consumption would be greatly beneficial as our demand for connectivity increases in the future. One newly emerging candidate for an information carrier in low-power memory devices is the magnetic skyrmion. This magnetic texture is characterized by its specific non-trivial topology, giving it particle-like characteristics. Recent experimental work has shown that these skyrmions can be stabilized at room temperature and moved with extremely low electrical current densities. This rapidly developing field requires new measurement techniques capable of determining the topology of these textures at greater speed than previous approaches. In this dissertation, I give a brief introduction to the magnetic structures found in Fe/Gd multilayered systems. I then present newly developed techniques that streamline the analysis of Lorentz Transmission Electron Microscopy (LTEM) data. These techniques are then applied to further the understanding of the magnetic properties of these Fe/Gd based multilayered systems. This dissertation includes previously published and unpublished co-authored material.
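The "specific non-trivial topology" of a skyrmion is quantified by its topological charge, Q = (1/4π) ∫ m · (∂ₓm × ∂ᵧm) dx dy, which is ±1 for a single skyrmion. A finite-difference sketch on a synthetic Belavin-Polyakov texture (the analytic profile and grid are illustrative, unrelated to the Fe/Gd data in the dissertation):

```python
import math

def bp_skyrmion(x, y, lam=1.0):
    """Belavin-Polyakov profile: m points down at the core, up far away."""
    u, v = x / lam, y / lam
    r2 = u * u + v * v
    d = r2 + 1.0
    return (2 * u / d, 2 * v / d, (r2 - 1.0) / d)

def topological_charge(field, n, h):
    """Q = (1 / 4 pi) * sum of m . (dm/dx x dm/dy) * h^2, with central
    differences on an n x n grid of unit vectors, spacing h."""
    def cross_dot(m, a, b):
        cx = a[1] * b[2] - a[2] * b[1]
        cy = a[2] * b[0] - a[0] * b[2]
        cz = a[0] * b[1] - a[1] * b[0]
        return m[0] * cx + m[1] * cy + m[2] * cz
    q = 0.0
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            m = field[i][j]
            dmdx = [(field[i + 1][j][k] - field[i - 1][j][k]) / (2 * h) for k in range(3)]
            dmdy = [(field[i][j + 1][k] - field[i][j - 1][k]) / (2 * h) for k in range(3)]
            q += cross_dot(m, dmdx, dmdy) * h * h
    return q / (4 * math.pi)

n, L = 81, 16.0
h = L / (n - 1)
grid = [[bp_skyrmion(-L / 2 + i * h, -L / 2 + j * h) for j in range(n)] for i in range(n)]
Q = topological_charge(grid, n, h)
```

|Q| comes out close to 1 (slightly below, since the finite grid truncates the texture's tails); a trivial, untwisted texture would give Q ≈ 0.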

  8. Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Parsani, Matteo; Fisher, Travis C.; Nielsen, Eric J.

    2016-01-01

    Entropy stable (SS) discontinuous spectral collocation formulations of any order are developed for the compressible Navier-Stokes equations on hexahedral elements. Recent progress on two complementary efforts is presented. The first effort is a generalization of previous SS spectral collocation work to extend the applicable set of points from tensor product, Legendre-Gauss-Lobatto (LGL) to tensor product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Although being more costly to implement, it is shown that the LG operators are significantly more accurate on comparable grids. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort generalizes previous SS work to include the possibility of p-refinement at non-conforming interfaces. A generalization of existing entropy stability machinery is developed to accommodate the nuances of fully multi-dimensional summation-by-parts (SBP) operators. The entropy stability of the compressible Euler equations on non-conforming interfaces is demonstrated using the newly developed LG operators and multi-dimensional interface interpolation operators.

  9. Adversarial Feature Selection Against Evasion Attacks.

    PubMed

    Zhang, Fei; Chan, Patrick P K; Biggio, Battista; Yeung, Daniel S; Roli, Fabio

    2016-03-01

    Pattern recognition and machine learning techniques have been increasingly adopted in adversarial settings such as spam, intrusion, and malware detection, although their security against well-crafted attacks that aim to evade detection by manipulating data at test time has not yet been thoroughly assessed. While previous work has been mainly focused on devising adversary-aware classification algorithms to counter evasion attempts, only a few authors have considered the impact of using reduced feature sets on classifier security against the same attacks. An interesting, preliminary result is that classifier security to evasion may even be worsened by the application of feature selection. In this paper, we provide a more detailed investigation of this aspect, shedding some light on the security properties of feature selection against evasion attacks. Inspired by previous work on adversary-aware classifiers, we propose a novel adversary-aware feature selection model that can improve classifier security against evasion attacks, by incorporating specific assumptions on the adversary's data manipulation strategy. We focus on an efficient, wrapper-based implementation of our approach, and experimentally validate its soundness on different application examples, including spam and malware detection.
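Wrapper-based, adversary-aware selection can be caricatured as maximizing a combined objective: a generalization term plus a security term weighted by a trade-off parameter. The feature names, per-feature scores, and additive objective below are entirely made up for illustration; the actual model scores whole classifiers against a modelled attack strategy:

```python
from itertools import combinations

# Hypothetical per-feature accuracy contributions and "robustness" scores
# (how costly each feature is for an attacker to manipulate at test time).
ACCURACY = {"len": 0.30, "caps": 0.25, "url": 0.20, "digits": 0.15}
ROBUSTNESS = {"len": 0.10, "caps": 0.05, "url": 0.40, "digits": 0.30}

def subset_score(features, trade_off=1.0):
    """Toy wrapper objective: accuracy term plus a weighted security term."""
    acc = sum(ACCURACY[f] for f in features)
    rob = sum(ROBUSTNESS[f] for f in features)
    return acc + trade_off * rob

def select(k, trade_off=1.0):
    """Exhaustive wrapper search over all k-feature subsets."""
    return max(combinations(ACCURACY, k), key=lambda s: subset_score(s, trade_off))

print(select(2, trade_off=0.0))  # accuracy-only selection
print(select(2, trade_off=2.0))  # security-weighted selection differs
```

The point of the sketch: with the security term switched off the selector picks the most predictive features, while weighting robustness shifts the choice toward features that are harder to evade.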

  10. Study of ( α , p ) and ( α , n ) reactions with a Multi-Sampling Ionization Chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avila, M. L.; Rehm, K. E.; Almaraz-Calderon, S.

    Here, a large number of (α,p) and (α, n) reactions are known to play a fundamental role in nuclear astrophysics. This work presents a novel technique to study these reactions with the active target system MUSIC, whose segmented anode allows the investigation of a large energy range of the excitation function with a single beam energy. In order to verify the method, we performed direct measurements of the previously measured reactions 17O (α, n) 20Ne, 23Na (α,p) 26Mg, and 23Na (α, n) 26Al. These reactions were investigated in inverse kinematics using 4He gas in the detector to study the excitation functions in the energy range of about 2–6 MeV in the center of mass. We found good agreement between the cross sections of the 17O (α, n) 20Ne reaction measured in this work and previous measurements. Furthermore, we have successfully performed a simultaneous measurement of the 23Na (α,p) 26Mg and 23Na (α, n) 26Al reactions.

  11. Study of (α , p) and (α , n) reactions with a Multi-Sampling Ionization Chamber

    NASA Astrophysics Data System (ADS)

    Avila, M. L.; Rehm, K. E.; Almaraz-Calderon, S.; Ayangeakaa, A. D.; Dickerson, C.; Hoffman, C. R.; Jiang, C. L.; Kay, B. P.; Lai, J.; Nusair, O.; Pardo, R. C.; Santiago-Gonzalez, D.; Talwar, R.; Ugalde, C.

    2017-07-01

    A large number of (α , p) and (α , n) reactions are known to play a fundamental role in nuclear astrophysics. This work presents a novel technique to study these reactions with the active target system MUSIC whose segmented anode allows the investigation of a large energy range of the excitation function with a single beam energy. In order to verify the method, we performed direct measurements of the previously measured reactions 17O (α , n) 20Ne, 23Na (α , p) 26Mg, and 23Na (α , n) 26Al. These reactions were investigated in inverse kinematics using 4He gas in the detector to study the excitation functions in the energy range of about 2-6 MeV in the center of mass. We found good agreement between the cross sections of the 17O (α , n) 20Ne reaction measured in this work and previous measurements. Furthermore we have successfully performed a simultaneous measurement of the 23Na (α , p) 26Mg and 23Na (α , n) 26Al reactions.

  12. Study of ( α , p ) and ( α , n ) reactions with a Multi-Sampling Ionization Chamber

    DOE PAGES

    Avila, M. L.; Rehm, K. E.; Almaraz-Calderon, S.; ...

    2017-04-03

    Here, a large number of (α,p) and (α, n) reactions are known to play a fundamental role in nuclear astrophysics. This work presents a novel technique to study these reactions with the active target system MUSIC, whose segmented anode allows the investigation of a large energy range of the excitation function with a single beam energy. In order to verify the method, we performed direct measurements of the previously measured reactions 17O (α, n) 20Ne, 23Na (α,p) 26Mg, and 23Na (α, n) 26Al. These reactions were investigated in inverse kinematics using 4He gas in the detector to study the excitation functions in the energy range of about 2–6 MeV in the center of mass. We found good agreement between the cross sections of the 17O (α, n) 20Ne reaction measured in this work and previous measurements. Furthermore, we have successfully performed a simultaneous measurement of the 23Na (α,p) 26Mg and 23Na (α, n) 26Al reactions.

  13. Line Narrowing Parameter Measurement by Modulation Spectroscopy

    NASA Technical Reports Server (NTRS)

    Dharamsi, Amin N.

    1998-01-01

    Accurate characterization of Oxygen A-band line parameters by wavelength modulation spectroscopy with tunable diode lasers is ongoing research at Old Dominion University, under sponsorship from NASA Langley Research Center. The work proposed here will be undertaken under the guidance of Dr. William Chu and Dr. Lamont Poole of the Aerosol Research Branch at NASA Langley Research Center in Hampton, Virginia. The research was started about two years ago and utilizes wavelength modulation absorption spectroscopy with higher harmonic detection, a technique that we developed at Old Dominion University, to obtain the absorption line characteristics of the Oxygen A-band rovibronic lines. Accurate characterization of this absorption band is needed for processing of data that will be obtained in experiments such as the NASA Stratospheric Aerosol and Gas Experiment III (SAGE III) as part of the US Mission to Planet Earth. The research work for the Summer Fellowship undertook a measurement of the Dicke line-narrowing parameters of the Oxygen A-band lines by using wavelength modulation spectroscopy. Our previous theoretical results had indicated that such a measurement could be done sensitively and in a convenient fashion by using this type of spectroscopy. In particular, theoretical results had indicated that the signal magnitude would depend on pressure in a manner that was very sensitive to the narrowing parameter. One of the major tasks undertaken during the summer of 1998 was to establish experimentally that these theoretical predictions were correct. This was done successfully and the results of the work are being prepared for publication. Experimental results were obtained in which the magnitude of the signal was measured as a function of pressure, for various harmonic detection orders (N = 1, 2, 3, 4, 5). A comparison with theoretical results was made, and it was shown that the agreement between theory and experiment was very good.
More importantly, however, it was shown that the measurement yielded a very sensitive technique for obtaining the narrowing parameter that describes the deviation of Oxygen A-band lines from the Voigt profile. In particular, it was seen that the best fits were obtained consistently when the narrowing parameter value used was 0.022 cm-1/atm. Previous work, upon which the current work was based, has resulted in several accurate measurements of properties of particular lines of the Oxygen A band. For example, this work has resulted in the measurement of the collision cross sections of several lines including the RQ(13,14) and the RR(15,15) lines. A major achievement of the work was also the demonstration that the technique we have developed can accurately probe the structure of the absorption lineshape function. In particular, the method we have developed is very well suited for experimentally probing the characteristics of lines in their wings. This work was accepted for publication in the Journal of Applied Physics, and is scheduled to appear in the December 15, 1998 issue.
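    The harmonic-detection scheme described above can be sketched numerically: scan a line profile with a sinusoidally modulated laser frequency and project onto the N-th harmonic of the modulation. This is a minimal illustration with assumed parameters (a Lorentzian of unit half-width, arbitrary modulation depth), not the authors' apparatus model:

```python
import numpy as np

def harmonic_signal(center_detuning, mod_depth, N, gamma=1.0, samples=4096):
    """N-th harmonic component (lock-in style) of an absorption profile
    scanned by a sinusoidally modulated laser frequency."""
    theta = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    nu = center_detuning + mod_depth * np.cos(theta)   # instantaneous detuning
    absorption = 1.0 / (1.0 + (nu / gamma) ** 2)       # Lorentzian lineshape
    # Fourier projection onto cos(N * theta), as a lock-in amplifier would do.
    return 2.0 * np.mean(absorption * np.cos(N * theta))
```

    At line center the lineshape is symmetric in the modulation, so odd harmonics vanish and even harmonics dominate; off center the odd harmonics switch on. The pressure dependence of such harmonic signals is what carries the sensitivity to the narrowing parameter in the work above.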

  14. Development of a direct three-dimensional biomicrofabrication concept based on electrospraying a custom made siloxane sol.

    PubMed

    Sullivan, Alice C; Jayasinghe, Suwan N

    2007-07-19

    We demonstrate here the discovery of a unique and direct three-dimensional biomicrofabrication concept possessing the ability to revolutionize the jet-based fabrication arena. Previous work carried out on similar jet-based approaches has been successful in fabricating only vertical wall/pillar structures by the controlled deposition of stacked droplets. However, these advanced jet techniques have not been able to directly fabricate self-supporting arches/links (without molds or reaction methods) between adjacent structures (walls or pillars). Our work reported here gives birth to a unique type of jet determined by high-intensity electric fields, which is derived from a specially formulated siloxane sol. The sol studied here has been chosen for its attractive properties (such as its excellent cross-linking nature, its ability to polymerize via polycondensation on deposition, and its biocompatibility), which promote direct forming of biostructures with nanometer (<50 nm) sized droplets in three dimensions. We foresee that this direct three-dimensional biomicrofabrication jet technique coupled with a variety of formulated sols having focused and enhanced functionality will be explored throughout the physical and life sciences.

  15. Paint and Click: Unified Interactions for Image Boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summa, B.; Gooch, A. A.; Scorzelli, G.

    Image boundaries are a fundamental component of many interactive digital photography techniques, enabling applications such as segmentation, panoramas, and seamless image composition. Interactions for image boundaries often rely on two complementary but separate approaches: editing via painting or clicking constraints. In this work, we provide a novel, unified approach for interactive editing of pairwise image boundaries that combines the ease of painting with the direct control of constraints. Rather than a sequential coupling, this new formulation allows full use of both interactions simultaneously, giving users unprecedented flexibility for fast boundary editing. To enable this new approach, we provide technical advancements. In particular, we detail a reformulation of image boundaries as a problem of finding cycles, expanding and correcting limitations of the previous work. Our new formulation provides boundary solutions for painted regions with performance on par with state-of-the-art specialized, paint-only techniques. In addition, we provide instantaneous exploration of the boundary solution space with user constraints. Finally, we provide examples of common graphics applications impacted by our new approach.

  16. 1983 James B. Macelwane Awards: Donald J. DePaolo

    NASA Astrophysics Data System (ADS)

    Wasson, John T.; DePaolo, Donald J.

    We honor Don DePaolo with the Macelwane Award for several key contributions to the earth sciences. While a graduate student at Caltech, Don recognized that the study of an isotopic parent-daughter pair having the same volatility and the same host phases could eliminate many of the ambiguities that had plagued previous attempts to apply isotopic data to the study of mantle reservoirs, and that the 147Sm-143Nd system was such a geochemically coherent pair. Following Gunter Lugmair's pioneering work on meteorites, Don was one of the first to work out laboratory techniques for the study of Sm and Nd; these techniques and the high-precision mass spectrometers of the Wasserburg lab allowed Don and Jerry to demonstrate in several key papers that there were at least two major mantle reservoirs and to examine earth structural models that could account for the observations. Don continues to contribute new ideas to this area; for example, in a paper in press he discusses the use of 176Lu-176Hf results to place constraints on the rate of crustal recycling.

  17. Radiological risks of neutron interrogation of food.

    PubMed

    Albright, S; Seviour, R

    2015-09-01

    In recent years there has been growing interest in the use of neutron scanning techniques for security. Neutron techniques with a range of energy spectra, including thermal, white and fast neutrons, have been shown to work in different scenarios. As international interest in neutron scanning increases, the risk of activating cargo, especially foodstuffs, must be considered. There has been a limited amount of research into the activation of foods by neutron beams and we have sought to improve the amount of information available. In this paper we show that for three important metrics (activity, ingestion dose, and time to background) there is a strong dependence on the food being irradiated and a weak dependence on the energy of irradiation. Previous studies into activation used results based on irradiation of pharmaceuticals as the basis for research into activation of food. The earlier work reports that (24)Na production is the dominant threat, which motivated the search for (23)Na(n,γ)(24)Na in highly salted foods. We show that (42)K can be more significant than (24)Na in low-sodium foods such as bananas and potatoes.
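    The activity and time-to-background metrics mentioned above follow from simple exponential decay once an initial induced activity is assumed; a minimal sketch with hypothetical numbers (not the paper's irradiation model):

```python
import math

def decay_constant(half_life_h):
    """Decay constant (1/h) from the half-life in hours."""
    return math.log(2.0) / half_life_h

def activity(a0, half_life_h, t_h):
    """Activity after t_h hours of free decay, given initial activity a0."""
    return a0 * math.exp(-decay_constant(half_life_h) * t_h)

def time_to_background(a0, background, half_life_h):
    """Hours until the activity falls to the given background level."""
    return math.log(a0 / background) / decay_constant(half_life_h)
```

    Because nuclides such as (24)Na and (42)K have half-lives of comparable order (hours), which nuclide dominates the metrics depends mainly on how much of each the food's composition lets the beam produce, consistent with the strong food dependence reported above.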

  18. Microfabricated FSCV-Compatible Microelectrode Array for Real-time Monitoring of Heterogeneous Dopamine Release

    PubMed Central

    Zachek, Matthew K.; Park, Jinwoo; Takmakov, Pavel; Wightman, R. Mark; McCarty, Gregory S.

    2010-01-01

    Fast scan cyclic voltammetry (FSCV) has been used previously to detect neurotransmitter release and reuptake in vivo. An advantage that FSCV has over other electrochemical techniques is its ability to distinguish neurotransmitters of interest (i.e. monoamines) from their metabolites using their respective characteristic cyclic voltammograms. While much has been learned with this technique, it has generally only been used in a single working electrode arrangement. Additionally, traditional electrode fabrication techniques tend to be difficult and somewhat irreproducible. Described in this report is a fabrication method for a FSCV-compatible microelectrode array (FSCV-MEA) that is capable of functioning in vivo. The microfabrication techniques employed here allow for better reproducibility than traditional fabrication methods of carbon fiber microelectrodes, and enable batch fabrication of electrode arrays. The reproducibility and electrochemical qualities of the probes were assessed along with cross talk in vitro. Heterogeneous release of electrically stimulated dopamine was observed in real-time in the striatum of an anesthetized rat using the FSCV-MEA. The heterogeneous effects of pharmacology on the striatum were also observed and shown to be consistent across multiple animals. PMID:20464031

  19. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
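    The (monofractal) detrended fluctuation analysis underlying MFDFA can be sketched as follows; the multifractal variant replaces the plain RMS average with q-th-order moments of the per-window fluctuations. This is a minimal numpy sketch, not the authors' tool:

```python
import numpy as np

def dfa_fluctuation(x, scale, order=1):
    """RMS fluctuation F(scale) of the integrated, locally detrended series.

    The scaling of F(scale) with the window size reveals long-range
    correlation structure hidden in noise-like signals."""
    y = np.cumsum(x - np.mean(x))             # integrated profile
    n_windows = len(y) // scale               # non-overlapping windows
    f2 = []
    for i in range(n_windows):
        seg = y[i * scale:(i + 1) * scale]
        t = np.arange(scale)
        coeffs = np.polyfit(t, seg, order)    # local polynomial trend
        f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
    return np.sqrt(np.mean(f2))
```

    Fitting log F(scale) versus log scale yields the scaling exponent; in MFDFA this is done for a family of moment orders q, and the spread of exponents is the multifractality that the work above mines for instability precursors.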

  20. To deliver or not to deliver cognitive behavioral therapy for eating disorders: Replication and extension of our understanding of why therapists fail to do what they should do.

    PubMed

    Mulkens, Sandra; de Vos, Chloé; de Graaff, Anastacia; Waller, Glenn

    2018-07-01

    This study investigated the extent to which therapists fail to apply empirically supported treatments in a sample of clinicians in The Netherlands, delivering cognitive behavioral therapy for eating disorders (CBT-ED). It aimed to replicate previous findings, and to extend them by examining other potential intra-individual factors associated with the level of (non-)use of core CBT-ED techniques. Participants were 139 clinicians (127 women; mean age 41.4 years, range = 24-64) who completed an online survey about the level of use of specific techniques, their beliefs (e.g., about the importance of the alliance and use of pretreatment motivational techniques), anxiety (Intolerance of Uncertainty Scale), and personality (Ten Item Personality Inventory). Despite some differences with Waller's (2012) findings, the present results continue to indicate that therapists are not reliably delivering the CBT-ED techniques that would be expected to provide the best treatment to their patients. This 'non-delivery' appears to be related to clinician anxiety, temporal factors, and clinicians' beliefs about the power of the therapeutic alliance in driving therapy outcomes. Improving treatment delivery will involve working with clinicians' levels of anxiety, clarifying the lack of benefit of pre-therapy motivational enhancement work, and reminding clinicians that the therapeutic alliance is enhanced by behavioral change in CBT-ED, rather than the other way around. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Hydrogeology from 10,000 ft below: lessons learned in applying pulse testing for leakage detection in a carbon sequestration formation

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Lu, J.; Hovorka, S. D.; Freifeld, B. M.; Islam, A.

    2015-12-01

    Monitoring techniques capable of deep subsurface detection are desirable for early warning and leakage pathway identification in geologic carbon storage formations. This work investigates the feasibility of a leakage detection technique based on pulse testing, which is a traditional hydrogeological characterization tool. In pulse testing, the monitored reservoir is stimulated at a fixed frequency and the acquired pressure perturbation signals are analyzed in the frequency domain to detect potential deviations in the reservoir's frequency domain response function. Unlike traditional time-domain analyses, the frequency-domain analysis aims to minimize the interference of reservoir noise by imposing coded injection patterns such that the reservoir responses to injection can be uniquely determined. We established the theoretical basis of the approach in previous work. Recently, field validation of this pressure-based leakage detection technique was conducted at a CO2-EOR site located in Mississippi, USA. During the demonstration, two sets of experiments were performed using 90-min and 150-min pulsing periods, for scenarios both with and without a leak. Because of the lack of pre-existing leakage pathways, artificial CO2 leakage was simulated by rate-controlled venting from one of the monitoring wells. Our results show that leakage events caused a significant deviation in the amplitude of the frequency response function, indicating that pulse testing may be used as a cost-effective monitoring technique with a strong potential for automation.
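    The frequency-domain analysis can be sketched as: transform the injection-rate and pressure records, and read the empirical response amplitude off the spectral bin at the pulsing frequency. A minimal numpy sketch with hypothetical signal names, not the study's actual processing chain:

```python
import numpy as np

def gain_at_frequency(injection, pressure, fs, f0):
    """|P(f0)/Q(f0)|: amplitude of the empirical frequency response at the
    pulsing frequency f0, given records sampled at fs (same units of time)."""
    freqs = np.fft.rfftfreq(len(injection), d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f0))        # bin nearest the pulsing frequency
    q = np.fft.rfft(injection)[k]
    p = np.fft.rfft(pressure)[k]
    return np.abs(p / q)
```

    Tracking this gain across repeated pulse tests is the detection idea: a leak alters the reservoir's transfer function, so a significant drop (or shift) in the gain at f0 flags a deviation, while broadband reservoir noise away from f0 is largely ignored.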

  2. Machine Learning Techniques for Global Sensitivity Analysis in Climate Models

    NASA Astrophysics Data System (ADS)

    Safta, C.; Sargsyan, K.; Ricciuto, D. M.

    2017-12-01

    Climate model studies are challenged not only by the compute-intensive nature of these models but also by the high dimensionality of the input parameter space. In our previous work with the land model components (Sargsyan et al., 2014) we identified subsets of 10 to 20 parameters relevant for each QoI via Bayesian compressive sensing and variance-based decomposition. Nevertheless, the algorithms were challenged by the nonlinear input-output dependencies for some of the relevant QoIs. In this work we will explore a combination of techniques to extract relevant parameters for each QoI and subsequently construct surrogate models with quantified uncertainty, necessary for future developments, e.g. model calibration and prediction studies. In the first step, we will compare the skill of machine-learning models (e.g. neural networks, support vector machines) to identify the optimal number of classes in selected QoIs and construct robust multi-class classifiers that will partition the parameter space into regions with smooth input-output dependencies. These classifiers will be coupled with techniques aimed at building sparse and/or low-rank surrogate models tailored to each class. Specifically, we will explore and compare sparse learning techniques with low-rank tensor decompositions. These models will be used to identify parameters that are important for each QoI. Surrogate accuracy requirements are higher for subsequent model calibration studies and we will ascertain the performance of this workflow for multi-site ALM simulation ensembles.
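    Variance-based decomposition, as referenced above, can be illustrated with a generic Saltelli-style estimator of first-order Sobol indices (a textbook sketch, not the authors' Bayesian compressive sensing workflow; all names are ours):

```python
import numpy as np

def first_order_sobol(f, d, n=10000, seed=0):
    """Monte Carlo estimate of first-order Sobol indices S_i = V_i / V for a
    model f acting on n-by-d arrays of inputs uniform on [0, 1]^d."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]                 # resample only column i
        # Saltelli (2010) estimator of the partial variance V_i.
        S[i] = np.mean(fB * (f(AB_i) - fA)) / var
    return S
```

    In a screening step like the one described above, parameters with indices near zero are dropped and the surrogate is built only over the 10-20 that carry most of the output variance.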

  3. Cooper pair tunnelling and quasiparticle poisoning in a galvanically isolated superconducting double dot

    NASA Astrophysics Data System (ADS)

    Esmail, A. A.; Ferguson, A. J.; Lambert, N. J.

    2017-12-01

    We increase the isolation of a superconducting double dot from its environment by galvanically isolating it from any electrodes. We probe it using high frequency reflectometry techniques, find 2e-periodic behaviour, and characterise the energy structure of its charge states. By modelling the response of the device, we determine the time averaged probability that the device is poisoned by quasiparticles, and by comparing this with previous work, we conclude that quasiparticle exchange between the dots and the leads is an important relaxation mechanism.

  4. The Make 2D-DB II package: conversion of federated two-dimensional gel electrophoresis databases into a relational format and interconnection of distributed databases.

    PubMed

    Mostaguir, Khaled; Hoogland, Christine; Binz, Pierre-Alain; Appel, Ron D

    2003-08-01

    The Make 2D-DB tool has been previously developed to help build federated two-dimensional gel electrophoresis (2-DE) databases on one's own web site. The purpose of our work is to extend the strength of the first package and to build a more efficient environment. Such an environment should be able to fulfill the different needs and requirements arising from both the growing use of 2-DE techniques and the increasing amount of distributed experimental data.

  5. Hybrid Discrete-Continuous Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Feng, Zhengzhu; Dearden, Richard; Meuleau, Nicholas; Washington, Rich

    2003-01-01

    This paper proposes a Markov decision process (MDP) model that features both discrete and continuous state variables. We extend previous work by Boyan and Littman on the mono-dimensional time-dependent MDP to multiple dimensions. We present the principle of lazy discretization, and piecewise constant and linear approximations of the model. Having to deal with several continuous dimensions raises several new problems that require new solutions. In the (piecewise) linear case, we use techniques from partially observable MDPs (POMDPs) to represent value functions as sets of linear functions attached to different partitions of the state space.
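    The piecewise-constant approximation can be caricatured by binning the continuous variable and running value iteration over (mode, bin) cells; this is a toy sketch with made-up deterministic dynamics, not the paper's lazy-discretization algorithm:

```python
import numpy as np

def hybrid_value_iteration(reward, transitions, n_bins, gamma=0.95, iters=200):
    """Piecewise-constant approximation of V(mode, t): the continuous
    variable t is binned, and each (mode, bin) cell holds one value.

    reward[m] is the per-step reward in mode m; transitions[m] lists the
    modes reachable from m (toy deterministic dynamics: t advances one bin)."""
    n_modes = len(reward)
    V = np.zeros((n_modes, n_bins))
    for _ in range(iters):
        Vn = np.empty_like(V)
        for m in range(n_modes):
            for b in range(n_bins):
                nb = min(b + 1, n_bins - 1)   # continuous variable advances
                # Bellman backup: best successor mode at the next bin.
                Vn[m, b] = max(reward[m] + gamma * V[m2, nb]
                               for m2 in transitions[m])
        V = Vn
    return V
```

    The piecewise-linear variant of the paper replaces each cell's scalar with a set of linear functions over the continuous variables, POMDP-style, instead of a single constant.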

  6. Electroform replication used for multiple X-ray mirror production

    NASA Technical Reports Server (NTRS)

    Kowalski, M. P.; Ulmer, M. P.; Purcell, W. R., Jr.; Loughlin, J. E. A.

    1984-01-01

    The electroforming technique for producing X-ray mirrors is described, and results of X-ray tests performed on copies made from a simple conical mandrel are reported. The design of the mandrel is depicted and the total reflectivity as well as the full-wave half modulation resolution are shown as a function of energy. The reported work has improved on previous studies by providing smaller grazing angles, making measurements at higher energies, producing about four times as many replicas from one mandrel, and obtaining better angular resolution.

  7. Foot-mounted inertial measurement unit for activity classification.

    PubMed

    Ghobadi, Mostafa; Esfahani, Ehsan T

    2014-01-01

    This paper proposes a classification technique for daily activity recognition for human monitoring during physical therapy at home. The proposed method estimates foot motion using a single inertial measurement unit, segments the motion into steps, and classifies them by template matching as walking, stairs-up, or stairs-down steps. The results show high accuracy of activity recognition. Unlike previous works, which are limited to activity recognition, the proposed approach is more qualitative, providing a similarity index of any activity to its desired template that can be used to assess subjects' improvement.
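    The template-matching idea, together with a similarity index, can be sketched with normalized cross-correlation on equal-length (resampled) step segments; the names and the choice of similarity measure are our assumptions, not the paper's:

```python
import numpy as np

def classify_step(segment, templates):
    """Assign a step segment to the most similar template, returning the
    label and a similarity index (normalized cross-correlation in [-1, 1])."""
    def ncc(a, b):
        # Zero-mean, unit-variance normalization makes the score invariant
        # to offset and amplitude of the motion signal.
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))
    scores = {label: ncc(segment, tpl) for label, tpl in templates.items()}
    label = max(scores, key=scores.get)
    return label, scores[label]
```

    Returning the score alongside the label is what makes the method "more qualitative": a declining similarity to the healthy-gait template over successive sessions can itself be tracked as a progress measure.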

  8. Higher-Order Binding Corrections to the Lamb Shift

    NASA Astrophysics Data System (ADS)

    Pachucki, K.

    1993-08-01

    In this work a new analytical method for calculating the one-loop self-energy correction to the Lamb shift is presented in detail. The technique relies on division into the low and the high energy parts. The low energy part is calculated using the multipole expansion and the high energy part is calculated by expanding the Dirac-Coulomb propagator in powers of the Coulomb field. The obtained results are in agreement with those previously known, but are more accurate. A new theoretical value of the Lamb shift is also given.

  9. Resuscitation of the newly born infant: an advisory statement from the Pediatric Working Group of the International Liaison Committee on Resuscitation.

    PubMed

    Kattwinkel, J; Niermeyer, S; Nadkarni, V; Tibballs, J; Phillips, B; Zideman, D; Van Reempts, P; Osmond, M

    1999-04-01

    The International Liaison Committee on Resuscitation (ILCOR), with representation from North America, Europe, Australia, New Zealand, Africa, and South America, was formed in 1992 to provide a forum for liaison between resuscitation organizations in the developed world. This consensus document on resuscitation extends previously published ILCOR advisory statements on resuscitation to address the unique and changing physiology of the newly born infant within the first few hours following birth and the techniques for providing advanced life support.

  10. Nuclear techniques for the on-line bulk analysis of carbon in coal-fired power stations.

    PubMed

    Sowerby, B D

    2009-09-01

    Carbon trading schemes usually require large emitters of CO(2), such as coal-fired power stations, to monitor, report and be audited on their CO(2) emissions. The emission price provides a significant additional incentive for power stations to improve efficiency. In the present paper, previous work on the bulk determination of carbon in coal is reviewed and assessed. The most favourable method is that based on neutron inelastic scattering. The potential role of on-line carbon analysers in improving boiler efficiency and in carbon accounting is discussed.

  11. Recommendations report for the platanares geothermal site, Department of Copan, Honduras. Reporte de recomendaciones para el sitio geotermico de platanares, Departamento de Copan, Honduras (in EN;SP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-11-01

    A geothermal assessment of six previously identified sites in Honduras has been conducted by a team comprised of staff from the Los Alamos National Laboratory, the US Geological Survey, and the Empresa Nacional de Energia Electrica. The application of both reconnaissance and detailed scale techniques led to the selection of Platanares in the Department of Copan as the highest potential site. Additional work resulted in the completion of a prefeasibility study at Platanares. We present here a tabulation of the work completed and short summaries of the results from these technical studies. We also present a brief model of the geothermal system and recommendations for additional feasibility work. Both English and Spanish versions of this report are provided in the same document. 18 figs., 5 tabs.

  12. DREAMING THE ANALYTIC SESSION: A CLINICAL ESSAY.

    PubMed

    Ogden, Thomas H

    2017-01-01

    This is a clinical paper in which the author describes analytic work in which he dreams the analytic session with three of his patients. He begins with a brief discussion of aspects of analytic theory that make up a good deal of the context for his clinical work. Central among these concepts are (1) the idea that the role of the analyst is to help the patient dream his previously "undreamt" and "interrupted" dreams; and (2) dreaming the analytic session involves engaging in the experience of dreaming the session with the patient and, at the same time, unconsciously (and at times consciously) understanding the dream. The author offers no "technique" for dreaming the analytic session. Each analyst must find his or her own way of dreaming each session with each patient. Dreaming the session is not something one works at; rather, one tries not to get in its way. © 2017 The Psychoanalytic Quarterly, Inc.

  13. Application of synchrotron radiation phase-contrast microtomography with iodine staining to Rhodnius prolixus head during ecdysis period

    NASA Astrophysics Data System (ADS)

    Sena, G.; Nogueira, L. P.; Braz, D.; Colaço, M. V.; Azambuja, P.; Gonzalez, M. S.; Tromba, G.; Mantuano, A.; Costa, F. N.; Barroso, R. C.

    2018-05-01

    Synchrotron radiation phase-contrast microtomography (SR-PHC-CT) has become an important tool in studies of insects, mainly Rhodnius prolixus, the insect vector of Chagas disease. A previous work has shown that SR-PHC-CT is an excellent technique for studying the ecdysis process of the R. prolixus head. The term ecdysis refers to the set of behaviors by which an insect extracts itself from an old exoskeleton. Exoskeleton formation is indispensable for the evolutionary success of insect species, so failure to complete ecdysis will, in most cases, result in death, making this process an excellent target in the search for new insect pest management strategies. Understanding the behavior of the ecdysis process is fundamental for the non-proliferation of Chagas disease. Although the moulting process could be identified in the first work, the main structures of the R. prolixus head could not be identified. In this work, an iodine staining protocol was developed at the SYRMEP beamline of ELETTRA that enabled the identification of these important structures. In the 3D images, it was possible to segment essential structures in the process of ecdysis. These structures have never been presented previously in the moulting period with SR-PHC-CT.

  14. Towards predictive diagnosis and management of rotator cuff disease: using curvelet transform for edge detection and segmentation of tissue

    NASA Astrophysics Data System (ADS)

    Pai Raikar, Vipul; Kwartowitz, David M.

    2016-04-01

    Degradation and injury of the rotator cuff are among the most common diseases of the shoulder in the general population. Among orthopedic injuries, rotator cuff disease is second only to back pain in terms of overall reduced quality of life for patients. Clinically, this disease is managed via pain and activity assessment and diagnostic imaging using ultrasound and MRI. Ultrasound has been shown to have good accuracy for identification and measurement of rotator cuff tears. In our previous work, we have developed novel, real-time techniques to biomechanically assess the condition of the rotator cuff based on musculoskeletal ultrasound. Of the rotator cuff tissues, the supraspinatus is the first to degrade and is the most commonly affected. In our work, one of the challenges lies in effectively segmenting and characterizing the supraspinatus. We are exploring the possibility of using the curvelet transform to improve techniques for segmenting tissue in ultrasound. Curvelets have been shown to give an optimal multi-scale representation of edges in images. They are designed to represent edges and singularities along curves, which makes them an attractive proposition for use in ultrasound segmentation. In this work, we present a novel approach to using curvelet transforms for automatic edge and feature extraction for the supraspinatus.

  15. Amplitude modulation of steady-state visual evoked potentials by event-related potentials in a working memory task

    PubMed Central

    Yao, Dezhong; Tang, Yu; Huang, Yilan; Su, Sheng

    2009-01-01

    Previous studies have shown that the amplitude and phase of the steady-state visual evoked potential (SSVEP) can be influenced by a cognitive task, yet the mechanism of this influence is not well understood. As the event-related potential (ERP) is the direct neural electric response to a cognitive task, studying the relationship between the SSVEP and ERP is a meaningful route to understanding this underlying mechanism. In this work, the traditional average method was applied to extract the ERP directly following the stimulus of a working memory task, while a technique named steady-state probe topography was used to estimate the SSVEP under the simultaneous stimulus of an 8.3-Hz flicker and a working memory task; the ERP and SSVEP were then compared. The results show that the ERP can modulate the SSVEP amplitude, and that for regions where both the SSVEP and ERP are strong, the modulation depth is large. PMID:19960240
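    The "traditional average method" referenced above extracts the ERP by averaging stimulus-locked epochs, so that activity not phase-locked to the stimulus (noise, and the SSVEP at varying phases) cancels out. Below is a minimal single-channel sketch in NumPy; the window limits, baseline interval, and sampling rate are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

def extract_erp(eeg, events, sfreq, tmin=-0.1, tmax=0.6):
    """Average stimulus-locked epochs to estimate the ERP.

    eeg    : 1-D array of samples for one channel
    events : sample indices of stimulus onsets
    sfreq  : sampling rate in Hz
    tmin/tmax : epoch window relative to stimulus, in seconds
    """
    n_pre = int(round(-tmin * sfreq))
    n_post = int(round(tmax * sfreq))
    epochs = []
    for onset in events:
        if onset - n_pre >= 0 and onset + n_post <= len(eeg):
            seg = eeg[onset - n_pre : onset + n_post]
            # baseline-correct using the pre-stimulus interval
            epochs.append(seg - seg[:n_pre].mean())
    # non-phase-locked activity averages toward zero across trials
    return np.mean(epochs, axis=0)
```

With enough trials, the residual noise in the average shrinks roughly as one over the square root of the trial count, which is why the ERP can be separated from the ongoing SSVEP.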

  16. Final Report: Subcontract B623868 Algebraic Multigrid solvers for coupled PDE systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brannick, J.

    The Pennsylvania State University (“Subcontractor”) continued to work on the design of algebraic multigrid solvers for coupled systems of partial differential equations (PDEs) arising in numerical modeling of various applications, with a main focus on solving the Dirac equation arising in Quantum Chromodynamics (QCD). The goal of the proposed work was to develop combined geometric and algebraic multilevel solvers that are robust and lend themselves to efficient implementation on massively parallel heterogeneous computers for these QCD systems. The research built on previous works, focusing on three topics: (1) the development of parallel full-multigrid (PFMG) and non-Galerkin coarsening techniques in this framework for solving the Wilson Dirac system; (2) the use of these same Wilson MG solvers for preconditioning the Overlap and Domain Wall formulations of the Dirac equation; and (3) the design and analysis of algebraic coarsening algorithms for coupled PDE systems including the Stokes equation, Maxwell equation, and linear elasticity.
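    As a toy illustration of the multilevel idea behind such solvers, the sketch below runs a geometric two-grid cycle (damped-Jacobi smoothing plus an exact coarse-grid correction) on a 1D Poisson problem. This is only a minimal analogue of the full-multigrid machinery discussed above, not the Wilson-Dirac solver itself; the problem size and sweep counts are arbitrary choices:

```python
import numpy as np

def two_grid_poisson(n=63, cycles=20):
    """Minimal geometric two-grid V-cycle for -u'' = f on (0,1) with
    zero boundary values: smoothing kills oscillatory error, the coarse
    solve kills smooth error."""
    h = 1.0 / (n + 1)
    f = np.ones(n)
    u = np.zeros(n)

    def apply_A(v):
        # tridiagonal (-1, 2, -1) / h^2 stencil
        Av = 2.0 * v
        Av[:-1] -= v[1:]
        Av[1:] -= v[:-1]
        return Av / h**2

    def smooth(v, rhs, sweeps=3, w=2.0 / 3.0):
        for _ in range(sweeps):
            v = v + w * (h**2 / 2.0) * (rhs - apply_A(v))  # D^{-1} = h^2/2
        return v

    nc = (n - 1) // 2                       # coarse grid size
    hc = 1.0 / (nc + 1)
    Ac = (2.0 * np.eye(nc) - np.eye(nc, k=1) - np.eye(nc, k=-1)) / hc**2

    for _ in range(cycles):
        u = smooth(u, f)                    # pre-smooth
        r = f - apply_A(u)
        # full-weighting restriction to the coarse grid
        rc = 0.25 * r[0:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]
        ec = np.linalg.solve(Ac, rc)        # exact coarse solve
        # linear-interpolation prolongation of the coarse error
        e = np.zeros(n)
        e[1::2] = ec
        e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
        e[0], e[-1] = 0.5 * ec[0], 0.5 * ec[-1]
        u = smooth(u + e, f)                # correct and post-smooth
    return u, np.linalg.norm(f - apply_A(u))
```

Algebraic multigrid replaces the fixed geometric restriction and prolongation with operators built from the matrix itself, which is what makes the approach applicable to the coupled systems named above.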

  17. 3He Lung Morphometry Technique: Accuracy Analysis and Pulse Sequence Optimization

    PubMed Central

    Sukstanskii, A.L.; Conradi, M.S.; Yablonskiy, D.A.

    2010-01-01

    The 3He lung morphometry technique (Yablonskiy et al., JAP, 2009), based on MRI measurements of hyperpolarized gas diffusion in lung airspaces, provides unique information on lung microstructure at the alveolar level. 3D tomographic images of standard morphological parameters (mean airspace chord length, lung parenchyma surface-to-volume ratio, and the number of alveoli per unit lung volume) can be created from a rather short (several seconds) MRI scan. These parameters are among the most commonly used to characterize lung morphometry but were not previously available from in vivo studies. The technique is based on a previously proposed model of lung acinar airways, treated as cylindrical passages of external radius R covered by alveolar sleeves of depth h, and on a theory of gas diffusion in these airways. The initial works approximated the acinar airways as very long cylinders, all with the same R and h. The present work analyzes the effects of realistic acinar airway structure, incorporating airway branching, physiological airway lengths, a physiological ratio of airway ducts and sacs, and distributions of R and h. By means of Monte Carlo computer simulations, we demonstrate that the technique allows rather accurate measurement of the geometrical and morphological parameters of acinar airways; in particular, the error in determining one of the most important physiological parameters of acinar airways, the surface-to-volume ratio, does not exceed several percent. Second, we analyze the effect of the susceptibility-induced inhomogeneous magnetic field on the parameter estimates and demonstrate that this effect is negligible at B0 ≤ 3 T and becomes substantial only at higher B0. Third, we theoretically derive an optimal choice of MR pulse sequence parameters for acquiring a series of diffusion-attenuated MR signals, allowing a substantial decrease in acquisition time and an improvement in the accuracy of the results. The optimal choice is three non-equidistant b-values: b1 = 0, b2 ~ 2 s/cm2, b3 ~ 8 s/cm2. PMID:20937564
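    To see why a few well-chosen b-values suffice, consider a simplified mono-exponential signal model S(b) = S0·exp(-b·D): the quoted optimal b-values already determine the apparent diffusion coefficient by a log-linear fit. The sketch below is only an illustration under that simplified model; the actual technique uses an anisotropic diffusion model of the acinar airways, and D_true here is a hypothetical value:

```python
import numpy as np

# The optimized protocol acquires three non-equidistant b-values.
b = np.array([0.0, 2.0, 8.0])       # s/cm^2, from the abstract
D_true = 0.2                         # cm^2/s, hypothetical diffusivity
S0 = 1.0

# Simplified mono-exponential diffusion attenuation (the real model
# in the 3He technique is anisotropic, not mono-exponential).
signal = S0 * np.exp(-b * D_true)

# Log-linear least-squares recovery of D from the three measurements.
slope, intercept = np.polyfit(b, np.log(signal), 1)
D_est = -slope
```

Because the fit is over-determined even with three points, spreading the b-values non-equidistantly (rather than uniformly) improves the conditioning of the slope estimate in the presence of noise.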

  18. Characterization of osseointegrative phosphatidylserine and cholesterol orthopaedic implant coatings

    NASA Astrophysics Data System (ADS)

    Rodgers, William Paul, III

    Total joint arthroplasties are among the most successful surgeries available today for improving patients' quality of life. Increasing demand is driven largely by an ageing population and an increased occurrence of obesity, yet current patient options have significant shortcomings: nearly a third of patients require a revision surgery before the implant is 15 years old, and those who have revision surgeries are at increased risk of requiring additional reoperations. A recent implant technology shown to be effective at improving bone-to-implant integration is the use of phosphatidylserine (DOPS) coatings. These coatings are challenging to analyze and measure owing to their highly dynamic, soft, rough, thick, and optically diffractive properties. Previous work had difficulty investigating parameters pertinent to these coatings' development, due in large part to a lack of available analytical techniques and a limited understanding of the micro- and nano-structural configuration of the coatings. This work addresses that lack of techniques through the development of original measurement methods, including the use of scanning white light interferometry and nanoindentation. These techniques were then applied to characterize DOPS coatings and to study the effects of several factors: (1) the influence of adding calcium and cholesterol to the coatings, (2) the effects of composition and roughness on aqueous contact angles, and (3) the impact of ageing and storage environment on the coatings. Using these newly developed, highly repeatable quantitative analysis methods, this study sheds light on the microstructural configuration of DOPS coatings and elucidates previously unexplained phenomena. Cholesterol was found to supersaturate in the coatings at high concentration and phase-separate into an anhydrous crystalline form, while lower concentrations significantly hardened the coatings. Morphological and microstructural changes, dependent on the storage environment, were detected in the coatings over as little as two weeks. The understanding gained paves the path for focused future research. Additionally, the methods and techniques developed for the analysis of DOPS coatings have broader application to the analysis of other problematic biological materials and surfaces.

  19. Evaluation of nursing students' work technique after proficiency training in patient transfer methods during undergraduate education.

    PubMed

    Johnsson, A Christina E; Kjellberg, Anders; Lagerström, Monica I

    2006-05-01

    The aim of this study was to investigate whether nursing students improved their work technique when assisting a simulated patient from bed to wheelchair after proficiency training, and whether there was a correlation between the nursing students' work technique and the simulated patients' perceptions of the transfer. Seventy-one students participated in the study: 35 in the intervention group and 36 in the comparison group. The students assisted a simulated patient to move from a bed to a wheelchair. In the intervention group the students made one transfer before and one after training; in the comparison group they made two transfers before training. Six variables were evaluated: work technique score; the nursing students' ratings of comfort, work technique, and exertion; and the simulated patients' perceptions of comfort and safety during the transfer. The results showed that nursing students improved their work technique, and that there was a correlation between the work technique and the simulated patients' subjective ratings of the transfer. In conclusion, nursing students improved their work technique after training in patient transfer methods, and the work technique affected the simulated patients' perceptions of the transfer.

  20. Correcting for deformation in skin-based marker systems.

    PubMed

    Alexander, E J; Andriacchi, T P

    2001-03-01

    A new technique is described that reduces error due to skin movement artifact in the opto-electronic measurement of in vivo skeletal motion. This work builds on a previously described point cluster technique marker set and estimation algorithm by extending the transformation equations to the general deformation case using a set of activity-dependent deformation models. Skin deformation during activities of daily living is modeled as consisting of a functional form defined over the observation interval (the deformation model) plus additive noise (modeling error). The method is described as an interval deformation technique. The method was tested using simulation trials with systematic and random components of deformation error introduced into marker position vectors. The technique was found to substantially outperform methods that require rigid-body assumptions. The method was also tested in vivo on a patient fitted with an external fixation device (Ilizarov). Simultaneous measurements from markers placed on the Ilizarov device (fixed to bone) were compared to measurements derived from skin-based markers. The interval deformation technique reduced the errors in limb segment pose estimate by 33% and 25% compared to the classic rigid-body technique for position and orientation, respectively. This newly developed method demonstrates that by accounting for the changing shape of the limb segment, a substantial improvement in the estimates of in vivo skeletal movement can be achieved.

  1. A short review on a complication of lumbar spine surgery: CSF leak.

    PubMed

    Menon, Sajesh K; Onyia, Chiazor U

    2015-12-01

    Cerebrospinal fluid (CSF) leak is a common complication of surgery involving the lumbar spine. Over the past decades, there has been significant advancement in understanding the basis, management, and techniques of treatment for post-operative CSF leak following lumbar spine surgery. In this article, we review previous work in the literature on the various factors and technical errors during or after lumbar spine surgery that may lead to this feared complication; discuss its presentation, the basic concepts and practical aspects of management, and the outcomes of the different treatment techniques; and highlight current trends. Different outcomes following the various techniques for managing post-operative CSF leak after lumbar spine surgery are well described in the literature, but there is currently no single ideal technique among the available options. The choice of technique in each case depends on the surgeon's cumulative experience, a clear understanding of the contributory underlying factors in each patient, the nature and site of the leak, and the available facilities and equipment.

  2. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
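    The idea can be sketched as follows: if a set B with analytically known probability P(B) is guaranteed to contain the failure set, then P(fail) = P(B) · P(fail | B), and the conditional factor can be estimated by sampling only inside B, where failures are no longer rare. The failure criterion and bounding box below are hypothetical illustrations, not the authors' test systems:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical system with two uniform(0,1) parameters; failure when
# x + y > 1.9, a rare event (true probability 0.005).
def fails(x, y):
    return x + y > 1.9

# Bounding set B = [0.9,1] x [0.9,1] contains the failure set, since
# x + y > 1.9 requires both x > 0.9 and y > 0.9; P(B) = 0.01 is known
# analytically.
p_B = 0.01

# Conditional sampling: draw points uniformly inside B only, where
# roughly half the samples fail instead of one in two hundred.
n = 10_000
x = rng.uniform(0.9, 1.0, n)
y = rng.uniform(0.9, 1.0, n)
p_fail = p_B * np.mean(fails(x, y))   # P(fail) = P(B) * P(fail | B)
```

A naive Monte Carlo run of the same size would see only about 50 failures in 10,000 samples; sampling inside B concentrates every draw where it is informative, which is the source of the tighter confidence intervals the abstract describes.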

  3. Acoustic resonance in MEMS scale cylindrical tubes with side branches

    NASA Astrophysics Data System (ADS)

    Schill, John F.; Holthoff, Ellen L.; Pellegrino, Paul M.; Marcus, Logan S.

    2014-05-01

    Photoacoustic spectroscopy (PAS) is a useful monitoring technique that is well suited for trace gas detection. This method routinely exhibits detection limits at the parts-per-million (ppm) or parts-per-billion (ppb) level for gaseous samples. PAS also possesses favorable detection characteristics when the system dimensions are scaled to a microelectromechanical system (MEMS) design. One of the central issues in sensor miniaturization is optimization of the photoacoustic cell geometry, especially with respect to high acoustical amplification and reduced system noise. Previous work relied on a multiphysics approach to analyze the resonance structures of the MEMS-scale photoacoustic cell, but this approach was unable to provide an accurate model of the acoustic structure. In this paper we describe a method, drawing on techniques from musical instrument theory and electronic transmission-line matrix methods, that describes cylindrical acoustic resonant cells with side branches of various configurations. Experimental results are presented that demonstrate the ease and accuracy of this method; all experimental results were within 2% of the values predicted by the theory.
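    In the lossless transmission-line picture, a rigidly closed side branch of length L behaves like a stub with input impedance proportional to cot(kL), which vanishes at the quarter-wave frequencies f = (2m - 1)·c / (4L). A minimal sketch with a hypothetical MEMS branch length (the paper's actual cell geometry and loss terms are not reproduced here):

```python
import numpy as np

c = 343.0      # speed of sound in air, m/s (room temperature)
L = 8.5e-3     # hypothetical MEMS side-branch length, m

def branch_resonances(L, c, m_max=3):
    """Quarter-wave resonances of a closed side branch.

    The stub's input impedance ~ cot(k*L) goes to zero when
    k*L = (2m - 1)*pi/2, i.e. f = (2m - 1)*c / (4*L)."""
    return np.array([(2 * m - 1) * c / (4 * L) for m in range(1, m_max + 1)])
```

Real cells need end corrections and viscous/thermal losses, but the odd-harmonic spacing above is the baseline behavior a transmission-line matrix model reproduces.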

  4. Constraint-based integration of planning and scheduling for space-based observatory management

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Steven F.

    1994-01-01

    Progress toward the development of effective, practical solutions to space-based observatory scheduling problems within the HSTS scheduling framework is reported. HSTS was developed and originally applied in the context of the Hubble Space Telescope (HST) short-term observation scheduling problem. The work was motivated by the limitations of the current solution and, more generally, by the insufficiency of classical planning and scheduling approaches in this problem context. HSTS has subsequently been used to develop improved heuristic solution techniques in related scheduling domains and is currently being applied to develop a scheduling tool for the upcoming Submillimeter Wave Astronomy Satellite (SWAS) mission. The salient architectural characteristics of HSTS and their relationship to previous scheduling and AI planning research are summarized. Then, some key problem decomposition techniques underlying the integrated planning and scheduling approach to the HST problem are described; research results indicate that these techniques provide leverage in solving space-based observatory scheduling problems. Finally, more recently developed constraint-posting scheduling procedures and the current SWAS application focus are summarized.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandler, D.P.; Welt, M.; Leung, F.C.

    An efficient one-step injection technique for gene insertion into fertilized rainbow trout (Oncorhynchus mykiss) eggs is described, and basic parameters affecting egg survival are reported. Freshly fertilized rainbow trout eggs were injected in the perivitelline space with a recombinant mouse metallothionein-genomic bovine growth hormone (bGH) DNA construct using a 30-gauge hypodermic needle and a standard microinjection system. Relative to controls, the site of injection and DNA concentration did not affect egg survival, but injections later than 3-4 hours post fertilization were detrimental. The injection technique permitted treatment of 100 eggs/hr with survival up to 100%, resulting in a 4% DNA uptake rate as indicated by DNA dot blot analysis. Positive dot blot results also indicated that the injected DNA is able to cross the vitelline membrane and persist for 50-60 days post hatching, obviating the need for direct injection into the germinal disk. The results are consistent with previous transgenic fish work, underscoring the usefulness of the technique for generating transgenic trout and salmonids. 24 refs., 6 figs., 3 tabs.

  6. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    NASA Astrophysics Data System (ADS)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.

  7. Glancing-incidence X-ray diffraction of Ag nanoparticles in gold lustre decoration of Italian Renaissance pottery

    NASA Astrophysics Data System (ADS)

    Bontempi, E.; Colombi, P.; Depero, L. E.; Cartechini, L.; Presciutti, F.; Brunetti, B. G.; Sgamellotti, A.

    2006-06-01

    Lustre is known as one of the most significant decorative techniques of Medieval and Renaissance pottery in the Mediterranean basin, characterized by brilliant gold and red metallic reflections and iridescence effects. Previous studies by various techniques (SEM-EDS and TEM, UV-VIS, XRF, RBS and EXAFS) demonstrated that lustre consists of a heterogeneous metal-glass composite film, formed by Cu and Ag nanoparticles dispersed within the outer layer of a tin-opacified lead glaze. In the present work, an original gold lustre sample from Deruta was investigated by means of glancing-incidence X-ray diffraction (GIXRD). The study aimed to provide information on the structure and depth distribution of the Ag nanoparticles. Exploiting the ability to control X-ray penetration into the glaze by changing the incidence angle, we used GIXRD measurements to estimate, non-destructively, the thickness and depth of the silver particles present in the upper layers of the glaze.
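    The depth control exploited here follows from simple attenuation geometry: well above the critical angle, the beam path through the glaze scales as 1/sin(α), so the 1/e probing depth scales roughly as sin(α)/μ. A minimal sketch with a hypothetical attenuation coefficient for a lead glaze (a real analysis must also account for refraction near the critical angle):

```python
import numpy as np

# Hypothetical linear attenuation coefficient for a lead glaze at the
# working X-ray energy (value assumed for illustration only).
mu = 500.0e2   # 1/m, i.e. 500 cm^-1

def penetration_depth(alpha_deg, mu):
    """Approximate 1/e penetration depth at grazing incidence,
    valid well above the critical angle: depth ~ sin(alpha) / mu."""
    return np.sin(np.radians(alpha_deg)) / mu
```

Scanning the incidence angle therefore sweeps the probed depth from tens of nanometres at grazing angles toward the full attenuation length, which is how GIXRD localizes the nanoparticle layer.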

  8. A review on pesticide removal through different processes.

    PubMed

    Marican, Adolfo; Durán-Lara, Esteban F

    2018-01-01

    The main organic pollutants worldwide are pesticides, persistent chemicals of concern owing to their prevalence in various ecosystems. In nature, pesticide residues are subject to chemical, physical, and biochemical degradation processes, but because of their high stability and, in some cases, water solubility, they persist in the ecosystem. Pesticide removal has been performed through several techniques, classified as biological, chemical, physical, and physicochemical remediation processes, applied to different matrices such as water and soil. This review describes older and newer techniques and materials developed to remove specific pesticides according to this classification, ranging from bioremediation with microorganisms, clay, activated carbon, and polymer materials to chemical treatments based on oxidation processes. Pesticides that have been removed successfully at large and small scales include organophosphorus compounds, carbamates, organochlorines, chlorophenols, and synthetic pyrethroids, among others. The most important characteristics, advantages, and disadvantages of the techniques and materials for removing pesticides are described in this work.

  9. Temperature and leakage aware techniques to improve cache reliability

    NASA Astrophysics Data System (ADS)

    Akaaboune, Adil

    Decreasing power consumption in small devices such as handhelds and cell phones, as well as in high-performance processors, is now one of the most critical design concerns. On-chip cache memories dominate the chip area in microprocessors, so the need arises for power-efficient cache memories. Cache is the simplest cost-effective method of attaining a high-speed memory hierarchy, and its performance is extremely critical for high-speed computers. The cache bridges the performance gap between the processor and main memory (RAM); hence memory bandwidth is frequently a bottleneck that can affect peak throughput significantly. In the design of any cache system, the trade-offs of area/cost, performance, power consumption, and thermal management must be taken into consideration. Previous work has mainly concentrated on performance and area/cost constraints. More recent work has focused on low-power design, especially for portable devices and media-processing systems; however, less research has been done on the relationship between heat management, leakage power, and cost per die. Lately, the focus of power dissipation in new generations of microprocessors has shifted from dynamic power to idle power, a previously underestimated form of power loss that drains battery charge and forces shutdown too early through wasted energy. The problem has been aggravated by aggressive process scaling, a device-level method originally used by designers to enhance performance, limit dissipation, and reduce the size of digital circuits, which are increasingly densely packed. This dissertation studies the impact of hotspots in the cache memory on leakage consumption and on microprocessor reliability and durability. The work first shows that by eliminating hotspots in the cache memory, leakage power is reduced and reliability is therefore improved.
    The second technique studied is data quality management, which improves the quality of the data stored in the cache to reduce power consumption. The initial work on this subject focuses on the types of data that increase leakage consumption and on ways to manage them without impacting the performance of the microprocessor. The second phase of the project focuses on managing data storage across different blocks of the cache to smooth both leakage and dynamic power consumption. The last technique is a voltage-controlled cache that reduces leakage consumption while the cache is in execution and even in the idle state. Two blocks of the 4-way set-associative cache are powered through a voltage regulator before reaching the voltage well, while the other two are connected directly to the voltage well. The idea behind this technique is to use the replacement algorithm's information to increase or decrease the voltage of the two regulated blocks depending on the need for the information stored in them.

  10. Low frequency radio synthesis imaging of the galactic center region

    NASA Astrophysics Data System (ADS)

    Nord, Michael Evans

    2005-11-01

    The Very Large Array radio interferometer has been equipped with new receivers to allow observations at 330 and 74 MHz, frequencies much lower than were previously possible with this instrument. Though the VLA dishes are not optimal at these frequencies, the system is successful and regular observations are now taken. New data analysis techniques are required, however. The technique of self-calibration, used to remove small atmospheric effects at higher frequencies, has been adapted to compensate for ionospheric turbulence in much the same way that adaptive optics is used in the optical regime. Faceted imaging techniques are required to compensate for the non-coplanar image distortion caused by the wide fields of view at these frequencies (~2.3° at 330 MHz and ~11° at 74 MHz). Furthermore, radio frequency interference is a much larger problem than at higher frequencies, and novel approaches to its mitigation are required. These new techniques and the new system allow imaging of the radio sky at sensitivities and resolutions orders of magnitude better than were possible with the low-frequency systems of decades past. In this work I discuss the advancements in low-frequency data techniques required to make high-resolution, high-sensitivity, large-field-of-view measurements with the new Very Large Array low-frequency system, and then detail the results of turning this new system and these techniques on the center of our Milky Way Galaxy. At 330 MHz I image the Galactic center region with roughly 10-arcsecond resolution and 1.6 mJy beam-1 sensitivity. The results include new Galactic center nonthermal filaments, new pulsar candidates, and the lowest-frequency detection to date of the radio source associated with our Galaxy's central massive black hole. At 74 MHz I image a region of the sky roughly 40° x 6° with ~10-arcminute resolution, and use the high opacity of H II regions at 74 MHz to extract three-dimensional information on the distribution of Galactic cosmic ray emissivity, a measurement possible only at low radio frequencies.

  11. Talking Physics: Two Case Studies on Short Answers and Self-explanation in Learning Physics

    NASA Astrophysics Data System (ADS)

    Badeau, Ryan C.

    This thesis explores two case studies into the use of short answers and self-explanation to improve student learning in physics. The first set of experiments focuses on the role of short answer questions in the context of computer-based instruction. Through a series of six experiments, we compare and evaluate the performance of computer-assessed short answer questions versus multiple choice for training conceptual topics in physics, controlling for feedback between the two formats. In addition to finding overall similar improvements on subsequent student performance and retention, we identify unique differences in how students interact with the treatments in terms of time spent on feedback and performance on follow-up short answer assessment. In addition, we identify interactions between the level of interactivity of the training, question format, and student attitudinal ratings of each respective training. The second case study focuses on the use of worked examples in the context of multi-concept physics problems - which we call "synthesis problems." For this part of the thesis, four experiments were designed to evaluate the effectiveness of two instructional methods employing worked examples on student performance with synthesis problems; these instructional techniques, analogical comparison and self-explanation, have previously been studied primarily in the context of single-concept problems. As such, the work presented here represents a novel focus on extending these two techniques to this class of more complicated physics problem. Across the four experiments, both self-explanation and certain kinds of analogical comparison of worked examples significantly improved student performance on a target synthesis problem, with distinct improvements in recognition of the relevant concepts. More specifically, analogical comparison significantly improved student performance when the comparisons were invoked between worked synthesis examples. 
In contrast, similar comparisons between corresponding pairs of worked single-concept examples did not significantly improve performance. On a more complicated synthesis problem, self-explanation was significantly more effective than analogical comparison, potentially due to differences in how successfully students encoded the full structure of the worked examples. Finally, we find that the two techniques can be combined for additional benefit, with the trade-off of slightly more time-on-task.

  12. One-step direct-laser metal writing of sub-100 nm 3D silver nanostructures in a gelatin matrix

    NASA Astrophysics Data System (ADS)

    Kang, SeungYeon; Vora, Kevin; Mazur, Eric

    2015-03-01

    Developing the ability to fabricate high-resolution 3D metal nanostructures in a stretchable 3D matrix is a critical step toward novel optoelectronic devices, such as tunable bulk metal-dielectric optical devices and THz metamaterial devices, that are not feasible with alternative techniques. We report a new chemical method to fabricate high-resolution 3D silver nanostructures using a femtosecond-laser direct metal writing technique. Previously, only the fabrication of 3D polymeric structures or single-/few-layer metal structures was possible. Our method takes advantage of unique gelatin properties to overcome previous limitations such as limited freedom in 3D material design and short sample lifetime. We fabricate more than 15 layers of 3D silver nanostructures with a resolution of less than 100 nm in a stable dielectric matrix that is flexible and highly transparent, well matched to potential applications in the optical and THz metamaterial regimes. This is a single-step process that requires no further processing. This work will be of interest to researchers exploring fabrication methods that exploit nonlinear light-matter interactions and the realization of future metamaterials.

  13. Real-time, ultrasound-based control of a virtual hand by a trans-radial amputee.

    PubMed

    Baker, Clayton A; Akhlaghi, Nima; Rangwala, Huzefa; Kosecka, Jana; Sikdar, Siddhartha

    2016-08-01

    Advancements in multiarticulate upper-limb prosthetics have outpaced the development of intuitive, non-invasive control mechanisms for implementing them. Surface electromyography is currently the most popular non-invasive control method, but presents a number of drawbacks, including poor deep-muscle specificity. Previous research established the viability of ultrasound imaging as an alternative means of decoding movement intent, and demonstrated the ability to distinguish between complex grasps in able-bodied subjects via imaging of the anterior forearm musculature. Able-bodied testing alone, however, is insufficient for translating this work to clinical viability: amputation-induced changes in muscular geometry, dynamics, and imaging characteristics are all likely to influence the effectiveness of our existing techniques. In this work, we conducted preliminary trials with a transradial amputee participant to assess these effects and to elucidate any necessary refinements to our approach. Two trials were performed, the first using a set of three motion types and the second using four. After a brief training period in each trial, the participant was able to control a virtual prosthetic hand in real time; attempted grasps were successfully classified at a rate of 77% in trial 1 and 71% in trial 2. While these results are sub-optimal compared to our previous able-bodied testing, they are a promising step forward. More importantly, the data collected during these trials provide valuable information for refining our image processing methods, especially via comparison to previously acquired data from able-bodied individuals. Ultimately, further work with amputees is a necessity for translation toward clinical application.

  14. Validation and Improvement of SRTM Performance over Rugged Terrain

    NASA Technical Reports Server (NTRS)

    Zebker, Howard A.

    2004-01-01

    We have previously reported work related to basic technique development in phase unwrapping and generation of digital elevation models (DEMs). In the final year of this work we applied these techniques to the improvement of DEMs produced by SRTM. In particular, we developed a rigorous mathematical algorithm and means to fill in missing data over rough terrain from other data sets. We illustrate this method by using a higher-resolution, but globally less accurate, DEM produced by the TOPSAR airborne instrument over the Galapagos Islands to augment the SRTM data set in this area. We combine this data set with SRTM so that each set fills in holes left by the other imaging system. The infilling is done by first interpolating each data set using a prediction error filter that reproduces, within the interpolated region, the same statistical characterization exhibited by the entire data set. After this procedure is applied to each data set, the two are combined on a point-by-point basis with weights that reflect the accuracy of each data point in its original image. In areas that are better covered by SRTM, TOPSAR data are weighted down but still retain TOPSAR statistics; the reverse is true for regions better covered by TOPSAR. The resulting DEM passes statistical tests and appears reasonable to the eye, but as this DEM is the best available for the region we cannot fully verify its accuracy. Spot checks with GPS points show that locally the technique yields a more comprehensive and accurate map than either data set alone.
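
    The point-by-point weighted combination described above can be sketched as follows (the prediction-error-filter interpolation is omitted; NaNs mark data holes, and the per-dataset accuracy values `sigma_a`/`sigma_b` are hypothetical, not the actual SRTM/TOPSAR figures):

```python
import numpy as np

def merge_dems(dem_a, dem_b, sigma_a, sigma_b):
    """Combine two DEMs point by point with inverse-variance weights.
    Where one DEM has a hole (NaN), the other fills it in."""
    w_a = 1.0 / sigma_a**2
    w_b = 1.0 / sigma_b**2
    merged = np.where(
        np.isnan(dem_a), dem_b,
        np.where(np.isnan(dem_b), dem_a,
                 (w_a * dem_a + w_b * dem_b) / (w_a + w_b)))
    return merged

# Tiny illustrative grids (meters); each has one hole
dem_srtm = np.array([[100.0, np.nan], [102.0, 103.0]])
dem_topsar = np.array([[101.0, 105.0], [np.nan, 104.0]])
merged = merge_dems(dem_srtm, dem_topsar, sigma_a=5.0, sigma_b=10.0)
```

    Where both sets have data, the more accurate one dominates; where only one does, its value is taken unchanged.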

  15. The Whole AMS Matrix: Using the Owens Lake, Ardath Slump, and Gaviota Slide cores to explore classification of ellipsoid shapes

    NASA Astrophysics Data System (ADS)

    Schwehr, K.; Driscoll, N.; Tauxe, L.

    2004-12-01

    Categorizing sediment history using Anisotropy of Magnetic Susceptibility (AMS) has been a long-standing challenge for the paleomagnetic community. The goal is a robust test of shape fabrics that allows workers to classify sediments as having primary depositional fabric, current-influenced depositional fabric, or altered fabric. Additionally, it is important to be able to distinguish altered fabrics into classes such as slumps, crypto-slumps, and drilling deformation (such as fluidization from drilling mud and flow-in). To try to bring a unified test scheme to AMS interpretation, we use three test cases. The first is the Owens Lake OL92 core, which has provided previous workers with a long-core example in a lacustrine environment. OL92 was classified into five zones based on visual observations of the core photographs. Using these groupings, Rosenbaum et al. (2000) were able to use the deflection of the minimum eigenvector from vertical to classify each individual AMS sample. The second is the Ardath Shale locality, which provides a clear case of a lithified outcrop-scale problem for which the bootstrap eigenvalue test was successful. The third is the Gaviota Slide in the Santa Barbara Basin, which involves 1-2 meter gravity cores. Previous work has focused on Flinn, Jelinek, and bootstrap plots of eigenvalues. In support of the shape characterization we have also used a 95% confidence F-test based on Hext's statistical work. We have extended the F-test into a promising new plot of the F12 and F23 confidence values, which shows good clustering in early tests. We have applied all of the available techniques to the three test cases and will present how each technique succeeds or fails. Since each method has its own strengths and weaknesses, it is clear that the community needs to carefully evaluate which technique should be applied to any particular problem.
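
    The eigenvalue analysis common to all these AMS tests starts from the symmetric susceptibility tensor of a specimen; a minimal sketch computing the principal susceptibilities and Jelinek's shape parameter T (the tensor values are invented for illustration):

```python
import numpy as np

# Hypothetical symmetric susceptibility tensor for one AMS specimen
chi = np.array([[1.02, 0.01, 0.00],
                [0.01, 1.00, 0.00],
                [0.00, 0.00, 0.95]])

# Principal susceptibilities tau1 >= tau2 >= tau3 and their axes
tau, v = np.linalg.eigh(chi)
tau, v = tau[::-1], v[:, ::-1]

L = tau[0] / tau[1]          # lineation
F = tau[1] / tau[2]          # foliation
P = tau[0] / tau[2]          # anisotropy degree
# Jelinek shape parameter: T > 0 oblate, T < 0 prolate
T = (2 * np.log(tau[1]) - np.log(tau[0]) - np.log(tau[2])) \
    / (np.log(tau[0]) - np.log(tau[2]))
```

    Plots such as Flinn (F vs. L) and Jelinek (P vs. T) are then built from these quantities per specimen.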

  16. Ultrasonic guided wave tomography of pipes: A development of new techniques for the nondestructive evaluation of cylindrical geometries and guided wave multi-mode analysis

    NASA Astrophysics Data System (ADS)

    Leonard, Kevin Raymond

    This dissertation concentrates on the development of two new tomographic techniques that enable wide-area inspection of pipe-like structures. By envisioning a pipe as a plate wrapped around upon itself, the previous Lamb Wave Tomography (LWT) techniques are adapted to cylindrical structures. Helical Ultrasound Tomography (HUT) uses Lamb-like guided wave modes transmitted and received by two circumferential arrays in a single crosshole geometry. Meridional Ultrasound Tomography (MUT) creates the same crosshole geometry with a linear array of transducers along the axis of the cylinder. However, even though these new scanning geometries are similar to plates, additional complexities arise because the structures are cylindrical. First, because each is a single crosshole geometry, the wave vector coverage is poorer than in the full LWT system. Second, since waves can travel in both directions around the circumference of the pipe, modes can constructively and destructively interfere with each other. These complexities necessitate improved signal processing algorithms to produce accurate and unambiguous tomographic reconstructions. Consequently, this work also describes a new algorithm for improving the extraction of multi-mode arrivals from guided wave signals. Previous work relied solely on the first arriving mode for the time-of-flight measurements. To improve the LWT, HUT, and MUT reconstructions, improved signal processing methods are needed to extract the arrival times of the later-arriving modes. Because each mode has a different through-thickness displacement profile, the modes are sensitive to different types of flaws, and the information gained from multi-mode analysis improves understanding of the structural integrity of the inspected material. Both tomographic frequency compounding and mode sorting algorithms are introduced, and it is shown that each of these methods improves the reconstructed images both qualitatively and quantitatively.

  17. Early effects of ageing on the mechanical performance of isolated locomotory (EDL) and respiratory (diaphragm) skeletal muscle using the work-loop technique.

    PubMed

    Tallis, Jason; James, Rob S; Little, Alexander G; Cox, Val M; Duncan, Michael J; Seebacher, Frank

    2014-09-15

    Previous isolated muscle studies examining the effects of ageing on contractility have used isometric protocols, which have been shown to have poor relevance to dynamic muscle performance in vivo. The present study uniquely uses the work-loop technique for a more realistic estimation of in vivo muscle function to examine changes in mammalian skeletal muscle mechanical properties with age. Measurements of maximal isometric stress, activation and relaxation time, maximal power output, and sustained power output during repetitive activation and recovery are compared in locomotory extensor digitorum longus (EDL) and core diaphragm muscle isolated from 3-, 10-, 30-, and 50-wk-old female mice to examine the early onset of ageing. A progressive age-related reduction in maximal isometric stress that was of greater magnitude than the decrease in maximal power output occurred in both muscles. Maximal force and power developed earlier in diaphragm than EDL muscle but demonstrated a greater age-related decline. The present study indicates that ability to sustain skeletal muscle power output through repetitive contraction is age- and muscle-dependent, which may help rationalize previously reported equivocal results from examination of the effect of age on muscular endurance. The age-related decline in EDL muscle performance is prevalent without a significant reduction in muscle mass, and biochemical analysis of key marker enzymes suggests that although there is some evidence of a more oxidative fiber type, this is not the primary contributor to the early age-related reduction in muscle contractility. Copyright © 2014 the American Physiological Society.

  18. A cryocooler for applications requiring low magnetic and mechanical interference

    NASA Technical Reports Server (NTRS)

    Zimmerman, J. E.; Daney, D. E.; Sullivan, D. B.

    1983-01-01

    A very low-power, low-interference Stirling cryocooler is being developed based on principles and techniques described in several previous publications over the last four years. It differs in several important details from those built previously. It uses a tapered displacer based upon an analytical optimization procedure. The displacer is driven by an auxiliary piston and cylinder (rather than by mechanical linkage) using some of the working fluid itself to provide the driving force. This provides smooth, vibration-free motion, and, more importantly, allows complete mechanical and spatial separation of the cryostat from the pressure-wave generator. Either of two different pressure-wave generators can be used. One is a non-contaminating, unlubricated ceramic piston and cylinder. The other is a compressed-air-operated rubber diaphragm with motor-driven valves to cycle the pressure between appropriate limits.

  19. Bethe-Boltzmann hydrodynamics and spin transport in the XXZ chain

    NASA Astrophysics Data System (ADS)

    Bulchandani, Vir B.; Vasseur, Romain; Karrasch, Christoph; Moore, Joel E.

    2018-01-01

    Quantum integrable systems, such as the interacting Bose gas in one dimension and the XXZ quantum spin chain, have an extensive number of local conserved quantities that endow them with exotic thermalization and transport properties. We discuss recently introduced hydrodynamic approaches for such integrable systems from the viewpoint of kinetic theory and extend previous work by proposing a numerical scheme to solve the hydrodynamic equations for finite times and arbitrary locally equilibrated initial conditions. We then discuss how such methods can be applied to describe nonequilibrium steady states involving ballistic heat and spin currents. In particular, we show that the spin Drude weight in the XXZ chain, previously accessible only by rigorous techniques of limited scope or controversial thermodynamic Bethe ansatz arguments, may be evaluated from hydrodynamics in very good agreement with density-matrix renormalization group calculations.

  20. Chemical Dynamics of nano-Aluminum and Iodine Based Oxidizers

    NASA Astrophysics Data System (ADS)

    Little, Brian; Ridge, Claron; Overdeep, Kyle; Slizewski, Dylan; Lindsay, Michael

    2017-06-01

    As observed in previous studies of nanoenergetic powder composites, micro/nano-structural features such as particle morphology and/or reactant spatial distance are expected to strongly influence properties that govern the combustion behavior of energetic materials (EM). In this study, highly reactive composites containing crystalline iodine (V) oxide or iodate salts with nano-sized aluminum (nAl) were blended by two different processing techniques and then collected as a powder for characterization. Physiochemical techniques such as thermal gravimetric analysis, calorimetry, X-ray diffraction, electron microscopy, high speed photography, pressure profile analysis, temperature programmed reactions, and spectroscopy were employed to characterize these EM with emphasis on correlating the chemical reactivity with inherent structural features and variations in stoichiometry. This work is a continuation of efforts to probe the chemical dynamics of nAl-iodine based composites.

  1. Soft magnetic tweezers: a proof of principle.

    PubMed

    Mosconi, Francesco; Allemand, Jean François; Croquette, Vincent

    2011-03-01

    We present here the principle of soft magnetic tweezers, which improve on traditional magnetic tweezers by allowing the simultaneous application and measurement of an arbitrary torque on a deoxyribonucleic acid (DNA) molecule. They take advantage of a nonlinear coupling regime that appears when a fast-rotating magnetic field is applied to a superparamagnetic bead immersed in a viscous fluid. In this work, we present the development of the technique and compare it with other techniques capable of measuring the torque applied to a DNA molecule. In this proof of principle, we use standard electromagnets for our experiments. Despite technical difficulties related to the present implementation of these electromagnets, the agreement of our measurements with previous experiments is remarkable. Finally, we propose a simple modification of the electromagnet design that should bring the performance of the device to a competitive level.

  2. Research on slow electron collision processes in gases. Final report, September 15, 1970--December 31, 1972

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baldwin, G C

    1974-04-30

    Research on low-energy electron collisions in gases by the time-of-flight velocity selection technique included, as a preliminary to total cross section measurements, investigations of the statistical and systematic errors inherent in the technique. In particular, thermal transpiration and instrumental fluctuation errors in manometry were investigated, and the results were embodied in computer programs for data reduction. The instrumental system was improved to permit extended periods of data accumulation without manual attention. Total cross section measurements in helium, made before the supporting work was completed, and in molecular nitrogen, made after, are reported. The total cross section of helium is found to be higher than reported in previous beam determinations. That of nitrogen is found to be structureless at low energies.
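
    The velocity-selection principle reduces to a one-line nonrelativistic energy conversion from flight time over a field-free path; a sketch (the path length and flight time below are illustrative, not the apparatus's actual values):

```python
M_E = 9.109e-31   # electron mass, kg
EV = 1.602e-19    # joules per electron-volt

def tof_energy_ev(path_m, time_s):
    """Electron kinetic energy (eV) from time of flight over a
    field-free path, E = (1/2) m (L/t)^2 (nonrelativistic)."""
    v = path_m / time_s
    return 0.5 * M_E * v**2 / EV

# e.g. a 0.5 m flight path traversed in 0.84 microseconds ~ 1 eV
E = tof_energy_ev(0.5, 0.84e-6)
```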

  3. Accurate, fast, and secure biometric fingerprint recognition system utilizing sensor fusion of fingerprint patterns

    NASA Astrophysics Data System (ADS)

    El-Saba, Aed; Alsharif, Salim; Jagapathi, Rajendarreddy

    2011-04-01

    Fingerprint recognition was one of the first techniques used for automatically identifying people, and today it remains one of the most popular and effective biometric techniques. With this increase in fingerprint biometric use, issues related to accuracy, security, and processing time are major challenges facing fingerprint recognition systems. Previous work has shown that polarization enhancement/encoding of fingerprint patterns increases the accuracy and security of fingerprint systems without burdening the processing time. This is mainly because polarization enhancement/encoding is inherently a hardware process and does not introduce a detrimental time delay into the overall process. Unpolarized images, however, possess high visual contrast, and when fused (without digital enhancement) properly with polarized ones, they are shown to increase the recognition accuracy and security of the biometric system without any significant processing time delay.
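
    A minimal sketch of pixel-wise fusion of the two image types (a plain weighted average; the paper's actual fusion rule is not specified here, so the weight `alpha` is a hypothetical parameter):

```python
import numpy as np

def fuse(unpolarized, polarized, alpha=0.5):
    """Pixel-wise weighted fusion of a high-contrast unpolarized image
    with a polarization-enhanced one (alpha is a hypothetical weight)."""
    fused = alpha * unpolarized.astype(float) \
        + (1.0 - alpha) * polarized.astype(float)
    return np.clip(fused, 0, 255).astype(np.uint8)

# Toy 8-bit images: uniform gray levels 200 and 100 fuse to 150
u = np.full((2, 2), 200, dtype=np.uint8)
p = np.full((2, 2), 100, dtype=np.uint8)
f = fuse(u, p)
```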

  4. In vivo measurement of mechanical properties of human long bone by using sonic sound

    NASA Astrophysics Data System (ADS)

    Hossain, M. Jayed; Rahman, M. Moshiur; Alam, Morshed

    2016-07-01

    Vibration analysis has been evaluated as a non-invasive technique for the in vivo assessment of bone mechanical properties. The relation between the resonant frequencies, long bone geometry, and mechanical properties can be obtained by vibration analysis. In vivo measurements were performed on the human ulna, modeled as a simple beam, with an experimental technique and associated apparatus. The resonant frequency of the ulna was obtained by Fast Fourier Transform (FFT) analysis of the vibration response of a piezoelectric accelerometer. Both the elastic modulus and the speed of sound were inferred from the resonant frequency. Measurement error in the improved experimental setup was comparable with previous work. The in vivo determination of bone elastic response has potential value in screening programs for metabolic bone disease, early detection of osteoporosis, and evaluation of the skeletal effects of various therapeutic modalities.
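
    The inference step, resonant frequency to elastic modulus, can be sketched with the first free-free bending mode of a uniform Euler-Bernoulli beam, f = ((βL)²/2πL²)·√(EI/ρA) with βL ≈ 4.730, solved for E (the geometry and density values below are illustrative placeholders, not measured ulna data):

```python
import math

def elastic_modulus(freq_hz, length, rho, area, inertia, beta_l=4.730):
    """Young's modulus inferred from the first free-free bending
    resonance of a uniform beam (Euler-Bernoulli theory;
    beta_l = 4.730 is the mode-1 free-free eigenvalue)."""
    k = 2.0 * math.pi * freq_hz * length**2 / beta_l**2
    return rho * area / inertia * k**2

# Illustrative numbers: 0.25 m beam, ~400 Hz first resonance
E = elastic_modulus(freq_hz=400.0, length=0.25, rho=1800.0,
                    area=8e-5, inertia=2e-9)
```

    The speed of sound then follows from c = √(E/ρ) once E is known.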

  5. Fe Oxides on Ag Surfaces: Structure and Reactivity

    DOE PAGES

    Shipilin, M.; Lundgren, E.; Gustafson, J.; ...

    2016-09-09

    One layer thick iron oxide films are attractive from both applied and fundamental science perspectives. The structural and chemical properties of these systems can be tuned by changing the substrate, making them promising materials for heterogeneous catalysis. In the present work, we investigate the structure of FeO(111) monolayer films grown on Ag(100) and Ag(111) substrates by means of microscopy and diffraction techniques and compare it with the structure of FeO(111) grown on other substrates reported in the literature. We also study the NO adsorption properties of the FeO(111)/Ag(100) and FeO(111)/Ag(111) systems utilizing different spectroscopic techniques. Finally, we discuss similarities and differences in the data obtained from the adsorption experiments and compare them with previous results for FeO(111)/Pt(111).

  7. Generalized Robertson-Walker Space-Time Admitting Evolving Null Horizons Related to a Black Hole Event Horizon

    PubMed Central

    2016-01-01

    A new technique is used to study a family of time-dependent null horizons, called “Evolving Null Horizons” (ENHs), of generalized Robertson-Walker (GRW) space-time (M¯,g¯) such that the metric g¯ satisfies a kinematic condition. This work differs from our earlier papers on the same issue, in which we used (1 + n)-splitting space-time, a formalism that holds only for some special subcases of GRW space-time. Also, in contrast to previous work, we prove that each member of the ENHs is totally umbilical in (M¯,g¯). Finally, we show that there exists an ENH which is always a null horizon evolving into a black hole event horizon, and we suggest some open problems. PMID:27722202

  8. Error analysis in inverse scatterometry. I. Modeling.

    PubMed

    Al-Assaad, Rayan M; Byrne, Dale M

    2007-02-01

    Scatterometry is an optical technique that has been studied and tested in recent years in semiconductor fabrication metrology for critical dimensions. Previous work presented an iterative linearized method to retrieve surface-relief profile parameters from reflectance measurements upon diffraction. With the iterative linear solution model in this work, rigorous models are developed to represent the random and deterministic or offset errors in scatterometric measurements. The propagation of different types of error from the measurement data to the profile parameter estimates is then presented. The improvement in solution accuracies is then demonstrated with theoretical and experimental data by adjusting for the offset errors. In a companion paper (in process) an improved optimization method is presented to account for unknown offset errors in the measurements based on the offset error model.

  9. Relativistic algorithm for time transfer in Mars missions under IAU Resolutions: an analytic approach

    NASA Astrophysics Data System (ADS)

    Pan, Jun-Yang; Xie, Yi

    2015-02-01

    With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends previous work by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model might be used on an onboard computer, given its limited capability to perform calculations. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station, and the spacecraft dominate the relativistic corrections in the model.
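
    At leading order, the clock-rate part of such a time-transfer model (ignoring the signal-propagation terms the paper adds) reduces to the familiar gravitational-plus-velocity correction dτ/dt ≈ 1 − (U/c² + v²/2c²); a sketch with Mars-like orbital numbers (illustrative, not actual Yinghuo-1 values):

```python
C = 299792458.0            # speed of light, m/s
GM_SUN = 1.32712440018e20  # heliocentric gravitational parameter, m^3/s^2

def proper_rate(r_m, v_ms):
    """Leading-order dtau/dt for a clock at heliocentric distance r
    moving at speed v: gravitational potential plus time-dilation
    terms only (signal-propagation effects omitted)."""
    return 1.0 - (GM_SUN / r_m + 0.5 * v_ms**2) / C**2

# Mars-like orbit: r ~ 2.28e11 m, v ~ 24.1 km/s
rate = proper_rate(2.28e11, 24100.0)
drift_per_day = (1.0 - rate) * 86400.0   # seconds lost per day
```

    The resulting drift is of order a millisecond per day, which is why such corrections cannot be neglected in deep space navigation.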

  10. Tire Changes, Fresh Air, and Yellow Flags: Challenges in Predictive Analytics for Professional Racing.

    PubMed

    Tulabandhula, Theja; Rudin, Cynthia

    2014-06-01

    Our goal is to design a prediction and decision system for real-time use during a professional car race. In designing a knowledge discovery process for racing, we faced several challenges that were overcome only when domain knowledge of racing was carefully infused within statistical modeling techniques. In this article, we describe how we leveraged expert knowledge of the domain to produce a real-time decision system for tire changes within a race. Our forecasts have the potential to impact how racing teams can optimize strategy by making tire-change decisions to benefit their rank position. Our work significantly expands previous research on sports analytics, as it is the only work on analytical methods for within-race prediction and decision making for professional car racing.

  11. User guide for MODPATH Version 7—A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2016-09-26

    MODPATH is a particle-tracking post-processing program designed to work with MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. MODPATH version 7 is the fourth major release since its original publication. Previous versions were documented in USGS Open-File Reports 89–381 and 94–464 and in USGS Techniques and Methods 6–A41. MODPATH version 7 works with MODFLOW-2005 and MODFLOW–USG. Support for unstructured grids in MODFLOW–USG is limited to smoothed, rectangular-based quadtree and quadpatch grids. A software distribution package containing the computer program and supporting documentation, such as input instructions, output file descriptions, and example problems, is available from the USGS over the Internet (http://water.usgs.gov/ogw/modpath/).

  12. Rotating Reverse Osmosis for Wastewater Reuse

    NASA Technical Reports Server (NTRS)

    Lueptow, Richard M.; Yoon, Yeomin; Pederson, Cynthia

    2004-01-01

    Our previous work established the concept of a low-pressure rotating reverse osmosis membrane system. The rotation of the cylindrical RO filter produces shear and Taylor vortices in the annulus of the device that decrease the concentration polarization and fouling commonly seen with conventional RO filtration techniques. A mathematical model based on the film theory and the solution-diffusion model agrees well with the experimental results obtained using this first generation prototype. However, based on the model, the filtrate flux and contaminant rejection depend strongly on the transmembrane pressure. Therefore, the goal of our current work is to improve the flux of the device by increasing the transmembrane pressure by a factor of 3 to 4. In addition, the rejections for a wider variety of inorganic and organic compounds typically found in space mission wastewater are measured.
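
    The film-theory/solution-diffusion coupling described above can be sketched as a fixed-point iteration between permeate flux and membrane-wall concentration, J = A(Δp − Δπ) with C_m = C_b·exp(J/k) and a van't Hoff osmotic pressure (all parameter values below are illustrative, not those of the prototype):

```python
import math

def ro_flux(A, dp, c_bulk, k_mass, R=0.083145, T=298.0, i_vh=2,
            rejection=1.0):
    """Solution-diffusion flux with film-theory concentration
    polarization, solved by fixed-point iteration.
    A in m/(s*bar), dp in bar, c in mol/L, k_mass in m/s;
    R = 0.083145 L*bar/(mol*K) gives osmotic pressure in bar."""
    J = A * dp  # initial guess ignoring osmotic pressure
    for _ in range(100):
        c_mem = c_bulk * math.exp(J / k_mass)      # wall concentration
        d_pi = i_vh * R * T * c_mem * rejection    # van't Hoff, bar
        J_new = A * (dp - d_pi)
        if abs(J_new - J) < 1e-15:
            break
        J = J_new
    return J

# Illustrative: A = 1e-7 m/(s*bar), 10 bar, 0.01 M NaCl, k = 2e-5 m/s
J = ro_flux(1e-7, 10.0, 0.01, 2e-5)
```

    The converged flux is below the pure-water value A·Δp because polarization raises the wall osmotic pressure; increasing Δp raises J, consistent with the transmembrane-pressure dependence noted above.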

  13. Theoretical estimates of the width of light-meson states in the SO(4) (2+1)-flavor limit

    NASA Astrophysics Data System (ADS)

    Yépez-Martínez, Tochtli; Civitarese, Osvaldo; Hess, Peter Otto

    The low-energy sector of the mesonic spectrum exhibits some features which may be understood in terms of the SO(4) symmetry contained in the QCD-Hamiltonian written in the Coulomb Gauge. In our previous work, we have shown that this is indeed the case when the Instantaneous Color-Charge Interaction (ICCI) is treated by means of nonperturbative many-body techniques. Continuing along this line of description, in this work we calculate the width of meson states belonging to the low portion of the spectrum (E < 1 GeV). In spite of the rather simple structure of the Hamiltonian used to calculate the spectra of pseudoscalar and vector mesons, the results for the width of these states follow the pattern of the data.

  14. Design, Manufacturing and Characterization of Functionally Graded Flextensional Piezoelectric Actuators

    NASA Astrophysics Data System (ADS)

    Amigo, R. C. R.; Vatanabe, S. L.; Silva, E. C. N.

    2013-03-01

    Previous works have shown several advantages of using Functionally Graded Materials (FGMs) in flextensional devices, such as reduced stress concentrations and gains in reliability. In this work, the FGM concept is explored in the design of graded devices by using the Topology Optimization Method (TOM) to determine optimal topologies and gradations of the coupled structures of piezoactuators. The graded pieces are manufactured using the Spark Plasma Sintering (SPS) technique and are bonded to piezoelectric ceramics. The graded actuators are then tested using a modular vibrometer system to measure output displacements, in order to validate the numerical simulations. The technological path developed here represents an initial step toward the manufacture of an integral piezoelectric device, constituted of piezoelectric and non-piezoelectric materials without bonding layers.

  15. Cell separation by immunoaffinity partitioning with polyethylene glycol-modified Protein A in aqueous polymer two-phase systems

    NASA Technical Reports Server (NTRS)

    Karr, Laurel J.; Van Alstine, James M.; Snyder, Robert S.; Shafer, Steven G.; Harris, J. Milton

    1988-01-01

    Previous work has shown that polyethylene glycol (PEG)-bound antibodies can be used as affinity ligands in PEG-dextran two-phase systems to provide selective partitioning of cells to the PEG-rich phase. In the present work it is shown that immunoaffinity partitioning can be simplified by use of PEG-modified Protein A which complexes with unmodified antibody and cells and shifts their partitioning into the PEG-rich phase, thus eliminating the need to prepare a PEG-modified antibody for each cell type. In addition, the paper provides a more rigorous test of the original technique with PEG-bound antibodies by showing that it is effective at shifting the partitioning of either cell type of a mixture of two cell populations.

  16. Cascade Error Projection: A Learning Algorithm for Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.; Daud, Taher

    1996-01-01

    In this paper, we work out a detailed mathematical analysis for a new learning algorithm termed Cascade Error Projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters. Furthermore, the CEP learning algorithm operates on only one layer, whereas the other set of weights can be calculated deterministically. In association with the dynamic stepsize change concept used to convert the weight update from an infinite space into a finite space, the relation between the current stepsize and the previous energy level is also given, and the estimation procedure for the optimal stepsize is used to validate our proposed technique. Weight values of zero are used to start the learning for every layer, and a single hidden unit is applied instead of a pool of candidate hidden units as in the cascade correlation scheme; simplicity in hardware implementation is thereby also obtained. Furthermore, this analysis allows us to select from other methods (such as conjugate gradient descent or Newton's second-order method) one that will be a good candidate for the learning technique. The choice of learning technique depends on the constraints of the problem (e.g., speed, performance, and hardware implementation); one technique may be more suitable than others. Moreover, for a discrete weight space, the theoretical analysis presents the capability of learning with limited weight quantization. Finally, 5- to 8-bit parity and chaotic time series prediction problems are investigated; the simulation results demonstrate that 4-bit or more weight quantization is sufficient for learning a neural network using CEP. In addition, it is demonstrated that this technique is able to compensate for lower bit weight resolution by incorporating additional hidden units. However, generalization results may suffer somewhat with lower bit weight quantization.
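
    A toy sketch of the cascade idea that one set of weights is computed deterministically (this is not the CEP update itself: the new hidden unit's input weights are random here purely for illustration, and the output weights are obtained by a least-squares solve):

```python
import numpy as np

def add_hidden_unit_and_solve(X, y, rng):
    """One cascade-style step: attach a single new hidden unit
    (random input weights, for illustration) and compute the output
    weights deterministically by least squares."""
    h = np.tanh(X @ rng.standard_normal(X.shape[1]))  # new unit output
    F = np.column_stack([X, h, np.ones(len(X))])      # inputs+unit+bias
    w, *_ = np.linalg.lstsq(F, y, rcond=None)         # deterministic solve
    return F @ w, w

# Toy regression target (exactly linear, so the solve fits it)
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
y = X @ np.array([0.5, -1.0, 2.0]) + 0.3
pred, w = add_hidden_unit_and_solve(X, y, rng)
```

    Solving the output layer in closed form, rather than by gradient descent, is what makes this style of cascade training attractive for hardware.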

  17. Local liquid velocity measurement of Trickle Bed Reactor using Digital Industrial X-ray Radiography

    NASA Astrophysics Data System (ADS)

    Mohd Salleh, Khairul Anuar

    Trickle Bed Reactors (TBRs) are fixed beds of particles in which both liquid and gas flow concurrently downward. They are widely used to produce not only fuels but also lubrication products. Measurements of local liquid velocities (VLL) in TBRs are scarce, yet such knowledge is essential for advancing the understanding of TBR hydrodynamics and for validating computational fluid dynamics (CFD) simulations. Therefore, this work focused on developing a new, non-invasive, statistically reliable technique that can be used to measure VLL in two dimensions (2-D), by combining Digital Industrial X-ray Radiography (DIR) and Particle Tracking Velocimetry (PTV) techniques. This work also opens the way to three-dimensional (3-D) VLL measurements in TBRs. Measurements taken with the combined, novel technique, once validated, were found to be comparable to those of another technique (a two-point fiber-optic probe) currently being developed at Missouri University of Science and Technology. The results from this study indicate that, for a gas-liquid-solid bed, the measured VLL can have a maximum range between 35 and 51 times the superficial liquid velocity (VSL). Without gas flow, the measured VLL can have a maximum range between 4 and 4.7 times the VSL. At a higher VSL, the particle tracer was widely distributed and became carried away by the high liquid flow rate. Neither the variance nor the range of measured VLL varied across replications, confirming the reproducibility of the experimental measurements regardless of the VSL. The liquid's movement inside the pore was consistent with findings from previous studies that used various techniques.
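
    The PTV step, matching tracer particles between consecutive radiographs to obtain local velocity vectors, can be sketched with a simple nearest-neighbour matcher (particle positions, frame interval, and search radius here are illustrative, not the study's actual data):

```python
import numpy as np

def ptv_velocities(pos_a, pos_b, dt, max_disp):
    """Nearest-neighbour particle tracking between two frames:
    each particle in frame A is matched to its closest particle in
    frame B within max_disp, giving a local velocity vector."""
    tracks = []
    for p in pos_a:
        d = np.linalg.norm(pos_b - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_disp:
            tracks.append((pos_b[j] - p) / dt)
    return np.array(tracks)

# Two frames 0.1 s apart (mm); tracers fell downward
a = np.array([[0.0, 0.0], [10.0, 0.0]])
b = np.array([[0.0, -1.0], [10.0, -1.2]])
v = ptv_velocities(a, b, dt=0.1, max_disp=2.0)
```

    Real PTV implementations add ambiguity checks and sub-pixel localization, but the displacement-over-interval principle is the same.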

  18. A robust multilevel simultaneous eigenvalue solver

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1993-01-01

    Multilevel (ML) algorithms for eigenvalue problems are often faced with several types of difficulties, such as: the mixing of approximated eigenvectors by the solution process, the approximation of incomplete clusters of eigenvectors, the poor representation of the solution on coarse levels, and the existence of close or equal eigenvalues. Algorithms that do not treat these difficulties appropriately usually fail, or their performance degrades when facing them. These issues motivated the development of a robust adaptive ML algorithm that treats these difficulties, for the calculation of a few eigenvectors and their corresponding eigenvalues. The main techniques used in the new algorithm include: the adaptive completion and separation of the relevant clusters on different levels, the simultaneous treatment of solutions within each cluster, and robustness tests which monitor the algorithm's efficiency and convergence. The eigenvector separation efficiency is based on a new ML projection technique generalizing the Rayleigh-Ritz projection, combined with a backrotation technique. These separation techniques, when combined with an FMG formulation, in many cases lead to algorithms of O(qN) complexity, for q eigenvectors of size N on the finest level. Previously developed ML algorithms are less focused on the mentioned difficulties. Moreover, algorithms which employ fine-level separation techniques are of O(q²N) complexity and usually do not overcome all these difficulties. Computational examples are presented in which Schrödinger-type eigenvalue problems in 2-D and 3-D, having equal and closely clustered eigenvalues, are solved with the efficiency of the Poisson multigrid solver. A second-order approximation is obtained in O(qN) work, where the total computational work is equivalent to only a few fine-level relaxations per eigenvector.
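
    The Rayleigh-Ritz projection underlying the cluster-separation step can be sketched as follows (a toy 4×4 symmetric matrix with a close eigenvalue pair stands in for a fine-level operator, and the basis V is a deliberately mixed span of that pair):

```python
import numpy as np

def rayleigh_ritz(A, V):
    """Rayleigh-Ritz projection: given a basis V approximately
    spanning a cluster of eigenvectors of symmetric A, separate the
    cluster into Ritz pairs via the small projected eigenproblem."""
    Q, _ = np.linalg.qr(V)           # orthonormalize the basis
    H = Q.T @ A @ Q                  # projected (q x q) matrix
    theta, S = np.linalg.eigh(H)     # Ritz values and coefficients
    return theta, Q @ S              # Ritz pairs in the full space

A = np.diag([1.0, 1.1, 5.0, 9.0])   # close pair {1.0, 1.1}
rng = np.random.default_rng(0)
# mixed basis spanning the invariant subspace of the close pair
V = np.eye(4)[:, :2] @ rng.standard_normal((2, 2))
theta, X = rayleigh_ritz(A, V)
```

    Because the toy basis spans an exactly invariant subspace, the Ritz values recover the clustered eigenvalues; in the multilevel setting the same projection is applied to approximate cluster bases on each level.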

  19. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Zhen Yu, Jian

    2018-03-01

Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to a lack of consideration, or inappropriate handling, of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous work by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept by WODR and YR is overestimated, and the degree of bias is more pronounced with a low-R² XY dataset. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to bias in both the slope and intercept estimation. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the described measurement error cannot be trusted, DR, WODR and YR can provide the least-biased slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
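As a minimal sketch of one of the tested estimators: Deming regression has a closed-form solution once the error-variance ratio λ = Var(y-error)/Var(x-error) is specified. This is an illustrative implementation, not the authors' Scatter Plot program:

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression: slope and intercept accounting for errors in
    both x and y, given lam = Var(y-error)/Var(x-error)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm) ** 2)
    syy = np.sum((y - ym) ** 2)
    sxy = np.sum((x - xm) * (y - ym))
    d = syy - lam * sxx
    slope = (d + np.sqrt(d * d + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
    return slope, ym - slope * xm

# A noise-free line y = 2x + 1 is recovered exactly for any lam > 0.
x = np.arange(10.0)
slope, intercept = deming(x, 2.0 * x + 1.0)
```

With lam = 1 this reduces to orthogonal regression; the sensitivity of the fit to a mis-specified lam is exactly the λ-bias effect studied in the abstract.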

  20. Mapping the Binding Interface of VEGF and a Monoclonal Antibody Fab-1 Fragment with Fast Photochemical Oxidation of Proteins (FPOP) and Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wecksler, Aaron T.; Molina, Patricia; Deperalta, Galahad; Gross, Michael L.

    2017-05-01

    We previously analyzed the Fab-1:VEGF (vascular endothelial growth factor) system described in this work, with both native top-down mass spectrometry and bottom-up mass spectrometry (carboxyl-group or GEE footprinting) techniques. This work continues bottom-up mass spectrometry analysis using a fast photochemical oxidation of proteins (FPOP) platform to map the solution binding interface of VEGF and a fragment antigen binding region of an antibody (Fab-1). In this study, we use FPOP to compare the changes in solvent accessibility by quantitating the extent of oxidative modification in the unbound versus bound states. Determining the changes in solvent accessibility enables the inference of the protein binding sites (epitope and paratopes) and a comparison to the previously published Fab-1:VEGF crystal structure, adding to the top-down and bottom-up data. Using this method, we investigated peptide-level and residue-level changes in solvent accessibility between the unbound proteins and bound complex. Mapping these data onto the Fab-1:VEGF crystal structure enabled successful characterization of both the binding region and regions of remote conformation changes. These data, coupled with our previous higher order structure (HOS) studies, demonstrate the value of a comprehensive toolbox of methods for identifying the putative epitopes and paratopes for biotherapeutic antibodies.

  1. International Technical Working Group Round Robin Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudder, Gordon B.; Hanlen, Richard C.; Herbillion, Georges M.

The goal of nuclear forensics is to develop a preferred approach to support illicit trafficking investigations. This approach must be widely understood and accepted as credible. The principal objectives of the Round Robin Tests are to prioritize forensic techniques and methods, evaluate attribution capabilities, and examine the utility of databases. The HEU (Highly Enriched Uranium) Round Robin, and the previous Plutonium Round Robin, have made tremendous contributions to fulfilling these goals through a collaborative learning experience that resulted from the outstanding efforts of the nine participating international laboratories. A prioritized list of techniques and methods has been developed based on this exercise. Current work is focused on the extent to which the techniques and methods can be generalized. The HEU Round Robin demonstrated a rather high level of capability to determine the important characteristics of the materials and processes using analytical methods. When this capability is combined with the appropriate knowledge/database, it results in a significant capability to attribute the source of the materials to a specific process or facility. A number of shortfalls were also identified in the current capabilities, including procedures for non-nuclear forensics and the lack of a comprehensive network of data/knowledge bases. The results of the Round Robin will be used to develop guidelines, or a "recommended protocol", to be made available to interested authorities and countries for use in real cases.

  2. Stress Management and Relaxation Techniques use among underserved inpatients in an inner city hospital.

    PubMed

    Gardiner, Paula; Sadikova, Ekaterina; Filippelli, Amanda C; Mitchell, Suzanne; White, Laura F; Saper, Robert; Kaptchuk, Ted J; Jack, Brian W; Fredman, Lisa

    2015-06-01

Little is known about the use of Stress Management and Relaxation Techniques (SMART) among racially diverse inpatients. We aimed to identify socioeconomic status (SES) factors, health behavior factors, and clinical factors associated with the use of SMART. We conducted a secondary analysis of baseline data from 623 hospitalized patients enrolled in the Re-Engineered Discharge (RED) clinical trial. We assessed socio-demographic characteristics and use of SMART, and used bivariate and multivariate logistic regression to test the association of SMART with socio-demographic characteristics, health behaviors, and clinical factors. A total of 26.6% of participants reported using SMART and 23.6% used mind-body techniques. Thirty-six percent of work-disabled patients, 39% of illicit drug users, and 38% of participants with depressive symptoms used SMART. Patients who both reported illicit drug use and screened positive for depression had significantly increased odds of using SMART [OR=4.94, 95% CI (1.59, 15.13)]. Compared to non-Hispanic whites, non-Hispanic blacks [0.55 (0.34-0.87)] and Hispanic/other race individuals [0.40 (0.20-0.76)] were less likely to use SMART. We found greater utilization of SMART among all racial groups compared to previous national studies. In the inner-city inpatient setting, patients with depression, illicit drug use, and work disability reported higher rates of using SMART. Copyright © 2015 Elsevier Ltd. All rights reserved.
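The reported odds ratios and confidence intervals follow the standard 2×2-table arithmetic: OR = (a·d)/(b·c) with a Wald interval on log(OR). A minimal sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(orr) - z * se)
    hi = math.exp(math.log(orr) + z * se)
    return orr, lo, hi

# Hypothetical counts, for illustration only.
orr, lo, hi = odds_ratio_ci(30, 70, 20, 180)
```

A multivariate logistic regression, as used in the study, generalizes this by adjusting each OR for the other covariates.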

  3. Fluctuating volume-current formulation of electromagnetic fluctuations in inhomogeneous media: Incandescence and luminescence in arbitrary geometries

    NASA Astrophysics Data System (ADS)

    Polimeridis, Athanasios G.; Reid, M. T. H.; Jin, Weiliang; Johnson, Steven G.; White, Jacob K.; Rodriguez, Alejandro W.

    2015-10-01

    We describe a fluctuating volume-current formulation of electromagnetic fluctuations that extends our recent work on heat exchange and Casimir interactions between arbitrarily shaped homogeneous bodies [A. W. Rodriguez, M. T. H. Reid, and S. G. Johnson, Phys. Rev. B 88, 054305 (2013), 10.1103/PhysRevB.88.054305] to situations involving incandescence and luminescence problems, including thermal radiation, heat transfer, Casimir forces, spontaneous emission, fluorescence, and Raman scattering, in inhomogeneous media. Unlike previous scattering formulations based on field and/or surface unknowns, our work exploits powerful techniques from the volume-integral equation (VIE) method, in which electromagnetic scattering is described in terms of volumetric, current unknowns throughout the bodies. The resulting trace formulas (boxed equations) involve products of well-studied VIE matrices and describe power and momentum transfer between objects with spatially varying material properties and fluctuation characteristics. We demonstrate that thanks to the low-rank properties of the associated matrices, these formulas are susceptible to fast-trace computations based on iterative methods, making practical calculations tractable. We apply our techniques to study thermal radiation, heat transfer, and fluorescence in complicated geometries, checking our method against established techniques best suited for homogeneous bodies as well as applying it to obtain predictions of radiation from complex bodies with spatially varying permittivities and/or temperature profiles.
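The fast-trace computations mentioned above rest on estimating Tr(M) from matrix-vector products alone, without forming M explicitly. A generic Hutchinson-type stochastic trace estimator (an illustrative sketch, not the authors' VIE implementation) looks like:

```python
import numpy as np

def hutchinson_trace(matvec, n, n_samples=2000, seed=0):
    """Estimate Tr(M) using only matrix-vector products, with
    Rademacher probe vectors z satisfying E[z^T M z] = Tr(M)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ matvec(z)
    return total / n_samples

# Demo: for a diagonal operator every probe gives the trace exactly,
# since z_i^2 = 1 for Rademacher entries.
D = np.diag(np.arange(1.0, 11.0))  # trace = 55
est = hutchinson_trace(lambda v: D @ v, 10, n_samples=50)
```

For dense low-rank matrices, as in the boxed trace formulas, each matvec can itself be applied iteratively, which is what makes the overall computation tractable.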

  4. New markers to identify the provenance of lapis lazuli: trace elements in pyrite by means of micro-PIXE

    NASA Astrophysics Data System (ADS)

    Re, A.; Angelici, D.; Lo Giudice, A.; Maupas, E.; Giuntini, L.; Calusi, S.; Gelli, N.; Massi, M.; Borghi, A.; Gallo, L. M.; Pratesi, G.; Mandò, P. A.

    2013-04-01

Lapis lazuli has been used for glyptics and carving since the fifth millennium BC to produce jewels, amulets, seals, inlays, etc.; the identification of the origin of the stone used for carving artworks may be valuable for reconstructing old trade routes. Since ancient lapis lazuli art objects are precious, only non-destructive techniques can be used to identify their provenance, and ion beam analysis (IBA) techniques allow us to characterise this stone in a fully non-invasive way. In addition, by using an ion microprobe, we have been able to focus the analysis on single crystals, as their typical dimensions may range from a few microns to hundreds of microns. Provenance markers, identified in previous IBA studies and already presented elsewhere, were based on the presence/absence of mineral phases, on the presence/quantity of trace elements inside a phase, and on characteristic features of the luminescence spectra. In this work, a systematic study of pyrite crystals, a common accessory mineral in lapis lazuli, was carried out following a multi-technique approach: optical microscopy and SEM-EDX to select crystals for successive trace element micro-PIXE measurements at two Italian facilities, the INFN Laboratori Nazionali di Legnaro and the INFN LABEC laboratory in Firenze. The results of this work allowed us to obtain new markers for lapis lazuli provenance identification.

  5. Noncontact phase-sensitive dynamic optical coherence elastography at megahertz rate

    NASA Astrophysics Data System (ADS)

    Singh, Manmohan; Wu, Chen; Liu, Chih-Hao; Li, Jiasong; Schill, Alexander; Nair, Achuth; Kistenev, Yury V.; Larin, Kirill V.

    2016-03-01

    Dynamic optical coherence elastography (OCE) techniques have shown great promise at quantitatively obtaining the biomechanical properties of tissue. However, the majority of these techniques have required multiple temporal OCT acquisitions (M-B mode) and corresponding excitations, which lead to clinically unfeasible acquisition times and potential tissue damage. Furthermore, the large data sets and extended laser exposures hinder their translation to the clinic, where patient discomfort and safety are critical criteria. In this work we demonstrate noncontact true kilohertz frame-rate dynamic optical coherence elastography by directly imaging a focused air-pulse induced elastic wave with a home-built phase-sensitive OCE system based on a 4X buffered Fourier Domain Mode Locked swept source laser with an A-scan rate of ~1.5 MHz. The elastic wave was imaged at a frame rate of ~7.3 kHz using only a single excitation. In contrast to previous techniques, successive B-scans were acquired over the measurement region (B-M mode) in this work. The feasibility of this method was validated by quantifying the elasticity of tissue-mimicking agar phantoms as well as porcine corneas ex vivo at different intraocular pressures. The results demonstrate that this method can acquire a depth-resolved elastogram in milliseconds. The reduced data set enabled a rapid elasticity assessment, and the ultra-fast acquisition speed allowed for a clinically safe laser exposure to the cornea.

  6. Signal processing for the detection of explosive residues on varying substrates using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie

    2011-05-01

Laser-induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive chemical analysis of substances with the benefit of little to no sample preparation. LIBS is therefore a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection; however, classification techniques developed on such data perform best against residue/substrate pairs that were included in model training and do not perform well when the residue/substrate pairs are not in the training set. Specifically, residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model LIBS spectra resulting from the residue and substrate to attempt to separate the response from each of the two components. This separation process is performed jointly with classifier design to ensure that the resulting classifier is able to detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to variations in substrate compared to standard chemometric techniques for residue detection.
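The standard chemometric baseline (principal components analysis followed by a simple classifier) can be sketched on toy synthetic "spectra". The data and the nearest-centroid rule here are illustrative assumptions, not the authors' residue/substrate model:

```python
import numpy as np

def pca_fit(X, n_comp):
    """Fit PCA on the rows of X; return the mean and principal axes."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp]

def pca_scores(X, mu, axes):
    return (X - mu) @ axes.T

# Toy "spectra": two classes with peaks at different bins, plus noise.
rng = np.random.default_rng(0)
def make_class(peak, n=40, bins=100):
    base = np.exp(-0.5 * ((np.arange(bins) - peak) / 3.0) ** 2)
    return base + 0.05 * rng.standard_normal((n, bins))

X = np.vstack([make_class(30), make_class(70)])
y = np.array([0] * 40 + [1] * 40)
mu, axes = pca_fit(X, 2)
S = pca_scores(X, mu, axes)
# Nearest-centroid classification in the 2-D PCA score space.
centroids = np.array([S[y == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(((S[:, None, :] - centroids) ** 2).sum(-1), axis=1)
```

The failure mode the abstract describes corresponds to a substrate shift moving test spectra away from the training centroids, which is what motivates the joint residue/substrate modeling.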

  7. Assessment of occupational exposure to asbestos fibers: Contribution of analytical transmission electron microscopy analysis and comparison with phase-contrast microscopy.

    PubMed

    Eypert-Blaison, Céline; Romero-Hariot, Anita; Clerc, Frédéric; Vincent, Raymond

    2018-03-01

    From November 2009 to October 2010, the French general directorate for labor organized a large field-study using analytical transmission electron microscopy (ATEM) to characterize occupational exposure to asbestos fibers during work on asbestos containing materials (ACM). The primary objective of this study was to establish a method and to validate the feasibility of using ATEM for the analysis of airborne asbestos of individual filters sampled in various occupational environments. For each sampling event, ATEM data were compared to those obtained by phase-contrast optical microscopy (PCOM), the WHO-recommended reference technique. A total of 265 results were obtained from 29 construction sites where workers were in contact with ACM. Data were sorted depending on the combination of the ACM type and the removal technique. For each "ACM-removal technique" combination, ATEM data were used to compute statistical indicators on short, fine and WHO asbestos fibers. Moreover, exposure was assessed taking into account the use of respiratory protective devices (RPD). As in previous studies, no simple relationship was found between results by PCOM and ATEM counting methods. Some ACM, such as asbestos-containing plasters, generated very high dust levels, and some techniques generated considerable levels of dust whatever the ACM treated. On the basis of these observations, recommendations were made to measure and control the occupational exposure limit. General prevention measures to be taken during work with ACM are also suggested. Finally, it is necessary to continue acquiring knowledge, in particular regarding RPD and the dust levels measured by ATEM for the activities not evaluated during this study.

  8. Label-free imaging of the native, living cellular nanoarchitecture using partial-wave spectroscopic microscopy

    PubMed Central

    Almassalha, Luay M.; Bauer, Greta M.; Chandler, John E.; Gladstein, Scott; Cherkezyan, Lusik; Stypula-Cyrus, Yolanda; Weinberg, Samuel; Zhang, Di; Thusgaard Ruhoff, Peder; Roy, Hemant K.; Subramanian, Hariharan; Chandel, Navdeep S.; Szleifer, Igal; Backman, Vadim

    2016-01-01

The organization of chromatin is a regulator of molecular processes including transcription, replication, and DNA repair. The structures within chromatin that regulate these processes span from the nucleosomal (10-nm) to the chromosomal (>200-nm) levels, with little known about the dynamics of chromatin structure between these scales due to a lack of quantitative imaging techniques for live cells. Previous work using partial-wave spectroscopic (PWS) microscopy, a quantitative imaging technique with sensitivity to macromolecular organization between 20 and 200 nm, has shown that transformation of chromatin at these length scales is a fundamental event during carcinogenesis. As the dynamics of chromatin likely play a critical regulatory role in cellular function, it is critical to develop live-cell imaging techniques that can probe the real-time temporal behavior of the chromatin nanoarchitecture. Therefore, we developed a live-cell PWS technique that allows high-throughput, label-free study of the causal relationship between nanoscale organization and molecular function in real time. In this work, we use live-cell PWS to study the change in chromatin structure due to DNA damage and expand on the link between metabolic function and the structure of higher-order chromatin. In particular, we studied the temporal changes to chromatin during UV light exposure, show that live-cell DNA-binding dyes induce damage to chromatin within seconds, and demonstrate a direct link between higher-order chromatin structure and mitochondrial membrane potential. Because biological function is tightly paired with structure, live-cell PWS is a powerful tool to study the nanoscale structure–function relationship in live cells. PMID:27702891

  9. Research in robust control for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Calise, A. J.

    1994-01-01

    The research during the third reporting period focused on fixed order robust control design for hypersonic vehicles. A new technique was developed to synthesize fixed order H(sub infinity) controllers. A controller canonical form is imposed on the compensator structure and a homotopy algorithm is employed to perform the controller design. Various reduced order controllers are designed for a simplified version of the hypersonic vehicle model used in our previous studies to demonstrate the capabilities of the code. However, further work is needed to investigate the issue of numerical ill-conditioning for large order systems and to make the numerical approach more reliable.

  10. A techno-economic approach to plasma gasification

    NASA Astrophysics Data System (ADS)

    Ramos, Ana; Rouboa, Abel

    2018-05-01

Among the most widely used Waste-to-Energy technologies, plasma gasification is recent and therefore not yet widely commercialized. It is thus necessary to conduct a viability study to support the thorough understanding and implementation of this thermal treatment. This paper aims to assess technical, environmental and economic aspects of plasma gasification, paving the way for a more sustainable waste management system, as well as taking advantage of the commodity assets granted by the technique. To this end, results from previously published studies were updated and highlighted as a preliminary starting point, in order to potentially evolve into a complete and systematic work.

  11. THE OPTICS OF REFRACTIVE SUBSTRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Michael D.; Narayan, Ramesh, E-mail: mjohnson@cfa.harvard.edu

    2016-08-01

    Newly recognized effects of refractive scattering in the ionized interstellar medium have broad implications for very long baseline interferometry (VLBI) at extreme angular resolutions. Building upon work by Blandford and Narayan, we present a simplified, geometrical optics framework, which enables rapid, semi-analytic estimates of refractive scattering effects. We show that these estimates exactly reproduce previous results based on a more rigorous statistical formulation. We then derive new expressions for the scattering-induced fluctuations of VLBI observables such as closure phase, and we demonstrate how to calculate the fluctuations for arbitrary quantities of interest using a Monte Carlo technique.
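The Monte Carlo approach to fluctuation statistics can be illustrated with a toy example (a generic sketch, not the authors' scattering model). Unlike station-based calibration errors, scattering-induced phase errors are baseline-based and do not cancel on a closure triangle, so three independent Gaussian baseline errors of standard deviation σ yield a closure-phase RMS of √3·σ:

```python
import numpy as np

def closure_phase_rms(sigma, n_trials=200_000, seed=0):
    """Monte Carlo RMS of the closure phase when each of the three
    baseline phases carries an independent Gaussian error of std sigma."""
    rng = np.random.default_rng(seed)
    errs = rng.normal(0.0, sigma, size=(n_trials, 3))
    closure = errs.sum(axis=1)  # phi_12 + phi_23 + phi_31
    return closure.std()

rms = closure_phase_rms(0.1)  # expect approximately sqrt(3) * 0.1
```

In the paper's setting the per-baseline statistics come from the semi-analytic refractive scattering estimates rather than from an assumed Gaussian, but the sampling logic is the same.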

  12. Practical adaptive quantum tomography

    NASA Astrophysics Data System (ADS)

    Granade, Christopher; Ferrie, Christopher; Flammia, Steven T.

    2017-11-01

We introduce a fast and accurate heuristic for adaptive tomography that addresses many of the limitations of prior methods. Previous approaches were either too computationally intensive or tailored to handle special cases such as single qubits or pure states. By contrast, our approach combines the efficiency of online optimization with generally applicable and well-motivated data-processing techniques. We numerically demonstrate these advantages in several scenarios, including mixed states, higher-dimensional systems, and restricted measurements. Complete data and source code for this work are available online [1] (http://cgranade.com) and can be previewed at https://goo.gl/koiWxR.

  13. Local Charge Injection and Extraction on Surface-Modified Al2O3 Nanoparticles in LDPE.

    PubMed

    Borgani, Riccardo; Pallon, Love K H; Hedenqvist, Mikael S; Gedde, Ulf W; Haviland, David B

    2016-09-14

    We use a recently developed scanning probe technique to image with high spatial resolution the injection and extraction of charge around individual surface-modified aluminum oxide nanoparticles embedded in a low-density polyethylene (LDPE) matrix. We find that the experimental results are consistent with a simple band structure model where localized electronic states are available in the band gap (trap states) in the vicinity of the nanoparticles. This work offers experimental support to a previously proposed mechanism for enhanced insulating properties of nanocomposite LDPE and provides a powerful experimental tool to further investigate such properties.

  14. Self-consistent analysis of high drift velocity measurements with the STARE system

    NASA Technical Reports Server (NTRS)

    Reinleitner, L. A.; Nielsen, E.

    1985-01-01

    The use of the STARE and SABRE coherent radar systems as valuable tools for geophysical research has been enhanced by a new technique called the Superimposed-Grid-Point method. This method permits an analysis of E-layer plasma irregularity phase velocity versus flow angle utilizing only STARE or SABRE data. As previous work with STARE has indicated, this analysis has clearly shown that the cosine law assumption breaks down for velocities near and exceeding the local ion acoustic velocities. Use of this method is improving understanding of naturally-occurring plasma irregularities in the E-layer.

  15. Higher-order binding corrections to the Lamb shift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pachucki, K.

    1993-08-15

    In this work a new analytical method for calculating the one-loop self-energy correction to the Lamb shift is presented in detail. The technique relies on division into the low and the high energy parts. The low energy part is calculated using the multipole expansion and the high energy part is calculated by expanding the Dirac-Coulomb propagator in powers of the Coulomb field. The obtained results are in agreement with those previously known, but are more accurate. A new theoretical value of the Lamb shift is also given. 47 refs., 2 figs., 1 tab.

  16. Thermal control of power supplies with electronic packaging techniques

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The analysis, design, and development work to reduce the weight and size of a standard modular power supply with a 350 watt output was summarized. By integrating low cost commercial heat pipes in the redesign of this power supply, weight was reduced by 30% from that of the previous design. The temperature was also appreciably reduced, increasing the environmental capability of the unit. A demonstration unit with a 100 watt output and a 15 volt regulator module, plus simulated output modules, was built and tested to evaluate the thermal performance of the redesigned power supply.

  17. Laboratory verification respiratory measurements. IMBLMS phase B.4, appendix C, section 13

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The B-4 IMBLMS preliminary design of the respiratory measurement element includes certain techniques and apparatus which are quite different from those included in the B-3 version previously delivered to NASA-MSC. A working model was constructed in the laboratory to prove the feasibility of certain key features. The most critical of these is the capability of switching sample gases into the mass spectrometer from two different sources during a single breath cycle. Results proved the feasibility of all of the concepts which were tested, and certain refinements and improvements were included, as well.

  18. Effects of low energy proton, electron, and simultaneously combined proton and electron environments in silicon and GaAs solar cells

    NASA Technical Reports Server (NTRS)

    Horne, W. E.; Day, A. C.; Russell, D. A.

    1980-01-01

    Degradation of silicon and GaAs solar cells due to exposures to low energy proton and electron environments and annealing data for these cells are discussed. Degradation of silicon cells in simultaneously combined electron and low energy proton environments and previous experimental work is summarized and evaluated. The deficiencies in current solar array damage prediction techniques indicated by these data and the relevance of these deficiencies to specific missions such as intermediate altitude orbits and orbital transfer vehicles using solar electric propulsion systems are considered.

  19. Study of the Anatomy of the X-Ray and Neutron Production Scaling Laws in the Plasma Focus.

    DTIC Science & Technology

    1980-05-15

    plasma focus discharge in deuterium as an extension of our previous work on scaling laws of x-ray and neutron production. The structure of dense plasmoids which emit MeV ions has been recorded by ion imaging with pinhole camera and contact print techniques. The plasmoids are generated in the same region in which particle beams, neutron and x-ray emission reach a maximum of intensity. Sharply defined boundaries of the ion-beam source and of plasmoids have been obtained by ion track etching on plastic material

  20. Fabrication Materials for a Closed Cycle Brayton Turbine Wheel

    NASA Technical Reports Server (NTRS)

    Khandelwal, Suresh; Hah, Chunill; Powers, Lynn M.; Stewart, Mark E.; Suresh, Ambady; Owen, Albert K.

    2006-01-01

A multidisciplinary analysis of a radial inflow turbine rotor is presented. This work couples high-fidelity fluid, structural, and thermal simulations in a seamless multidisciplinary analysis to investigate the consequences of material selection. This analysis extends multidisciplinary techniques previously demonstrated on rocket turbopumps and hypersonic engines. Since no design information is available for the anticipated Brayton rotating machinery, an existing rotor design (the Brayton Rotating Unit (BRU)) was used in the analysis. Steady-state analysis results for a notional turbine rotor indicate that stress levels are easily manageable at the turbine inlet temperature, and that the anticipated stress levels can be accommodated using either superalloys or ceramics.

  1. Propagation-based x-ray phase contrast imaging using an iterative phase diversity technique

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2018-03-01

    Through the use of a phase diversity technique, we demonstrate a near-field in-line x-ray phase contrast algorithm that provides improved object reconstruction when compared to our previous iterative methods for a homogeneous sample. Like our previous methods, the new technique uses the sample refractive index distribution during the reconstruction process. The technique complements existing monochromatic and polychromatic methods and is useful in situations where experimental phase contrast data is affected by noise.

  2. 5-D interpolation with wave-front attributes

    NASA Astrophysics Data System (ADS)

    Xie, Yujiang; Gajewski, Dirk

    2017-11-01

Most 5-D interpolation and regularization techniques reconstruct the missing data in the frequency domain by using mathematical transforms. An alternative type of interpolation method uses wave-front attributes, that is, quantities with a specific physical meaning like the angle of emergence and wave-front curvatures. These attributes include structural information on subsurface features such as the dip and strike of a reflector. The wave-front attributes work in a 5-D data space (e.g. common-midpoint coordinates in x and y, offset, azimuth and time), leading to a 5-D interpolation technique. Since the process is based on stacking, a pre-stack data enhancement is achieved alongside the interpolation, improving the signal-to-noise ratio (S/N) of interpolated and recorded traces. The wave-front attributes are determined in a data-driven fashion, for example with the Common Reflection Surface (CRS) method. As one of the wave-front-attribute-based interpolation techniques, the 3-D partial CRS method was proposed to enhance the quality of 3-D pre-stack data with low S/N. In past work on 3-D partial stacks, two potential problems remained unsolved. For high-quality wave-front attributes, we suggest a global optimization strategy instead of the pragmatic search approach used so far. In previous works, the interpolation of 3-D data was performed along a specific azimuth, which is acceptable for narrow-azimuth acquisition but does not exploit the potential of wide-, rich- or full-azimuth acquisitions. The conventional 3-D partial CRS method is improved in this work, and we call the result wave-front-attribute-based 5-D interpolation (5-D WABI), as the two problems mentioned above are addressed. Data examples demonstrate the improved performance of the 5-D WABI method when compared with the conventional 3-D partial CRS approach. A comparison of the rank-reduction-based 5-D seismic interpolation technique with the proposed 5-D WABI method is also given. The comparison reveals that the 5-D WABI method has significant advantages for steeply dipping events when compared to the rank-reduction-based 5-D interpolation technique. Diffraction tails benefit substantially from this improved performance of the partial CRS stacking approach, while the CPU time is comparable to that consumed by the rank-reduction-based method.

  3. Efficient use of mobile devices for quantification of pressure injury images.

    PubMed

    Garcia-Zapirain, Begonya; Sierra-Sosa, Daniel; Ortiz, David; Isaza-Monsalve, Mariano; Elmaghraby, Adel

    2018-01-01

    Pressure Injuries are chronic wounds that form due to the constriction of soft tissues against bone prominences. To assess these injuries, medical personnel carry out evaluation and diagnosis using visual methods and manual measurements, which can be inaccurate and may cause discomfort to patients. By using segmentation techniques, Pressure Injuries can be extracted from an image and accurately parameterized, leading to a correct diagnosis. In general, these techniques are based on the solution of differential equations, and the numerical methods involved are demanding in terms of computational resources. In previous work, we proposed a technique based on toroidal parametric equations for image decomposition and segmentation without solving differential equations. In this paper, we present the development of a mobile application for the non-contact assessment of Pressure Injuries based on the toroidal decomposition of images. This technique allows us to achieve an accurate segmentation almost 8 times faster than the Active Contours without Edges (ACWE) and Dynamic Contours methods. We describe the techniques and the implementation for Android devices using Python and Kivy. The application allows for the segmentation and parameterization of injuries, obtaining relevant information for the diagnosis, and tracking the evolution of the patient's injuries.

  4. Results and Error Estimates from GRACE Forward Modeling over Antarctica

    NASA Astrophysics Data System (ADS)

    Bonin, Jennifer; Chambers, Don

    2013-04-01

    Forward modeling using a weighted least squares technique allows GRACE information to be projected onto a pre-determined collection of local basins. This decreases the impact of spatial leakage, allowing estimates of mass change to be better localized. The technique is especially valuable where models of current-day mass change are poor, such as over Antarctica. However, when tested previously, the least squares technique has required constraints in the form of added process noise in order to be reliable. Poor choice of local basin layout has also adversely affected results, as has the choice of spatial smoothing used with GRACE. To develop design parameters that will result in correct high-resolution mass detection, and to estimate the systematic errors of the method over Antarctica, we use a "truth" simulation of the Antarctic signal. We apply the optimal parameters found from the simulation to RL05 GRACE data across Antarctica and the surrounding ocean. We particularly focus on separating the Antarctic peninsula's mass signal from that of the rest of western Antarctica. Additionally, we characterize how well the technique works for removing land leakage signal from the nearby ocean, particularly near the Drake Passage.
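
    The forward-modeling step described above amounts to a weighted least squares projection of gridded observations onto predefined basin patterns. The following is a minimal sketch, not the actual GRACE processing: the basin layout, noise level, and weights are invented for the toy example, and the real design matrix would encode basin averaging kernels and spatial smoothing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 3 basins, 200 grid points; A[i, j] = 1 if point i lies in basin j.
n_points, n_basins = 200, 3
A = np.zeros((n_points, n_basins))
A[np.arange(n_points), rng.integers(0, n_basins, n_points)] = 1.0

true_mass = np.array([5.0, -2.0, 0.5])   # mass change per basin (arbitrary units)
obs = A @ true_mass + rng.normal(0, 0.3, n_points)   # "GRACE-like" gridded signal

# Weighted least squares: weight each observation by its inverse noise variance.
W = np.eye(n_points) / 0.3**2
est = np.linalg.solve(A.T @ W @ A, A.T @ W @ obs)
print(est)  # close to [5.0, -2.0, 0.5]
```

    In the real problem, added process-noise constraints stabilize this solve when the basin patterns are poorly separated by the data.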

  5. Damage Diagnosis in Semiconductive Materials Using Electrical Impedance Measurements

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.; Hinton, Yolanda L.

    2008-01-01

    Recent aerospace industry trends have resulted in an increased demand for real-time, effective techniques for in-flight structural health monitoring. A promising technique for damage diagnosis uses electrical impedance measurements of semiconductive materials. By applying a small electrical current into a material specimen and measuring the corresponding voltages at various locations on the specimen, changes in the electrical characteristics due to the presence of damage can be assessed. An artificial neural network uses these changes in electrical properties to provide an inverse solution that estimates the location and magnitude of the damage. The advantage of the electrical impedance method over other damage diagnosis techniques is that it uses the material as the sensor. Simple voltage measurements can be used instead of discrete sensors, resulting in a reduction in weight and system complexity. This research effort extends previous work by employing finite element method models to improve accuracy of complex models with anisotropic conductivities and by enhancing the computational efficiency of the inverse techniques. The paper demonstrates a proof of concept of a damage diagnosis approach using electrical impedance methods and a neural network as an effective tool for in-flight diagnosis of structural damage to aircraft components.

  6. Watermarking and copyright labeling of printed images

    NASA Astrophysics Data System (ADS)

    Hel-Or, Hagit Z.

    2001-07-01

    Digital watermarking is a labeling technique for digital images which embeds a code into the digital data so that the data are marked. Watermarking techniques previously developed deal with on-line digital data. These techniques have been designed to withstand digital attacks such as image processing, image compression and geometric transformations. However, one must also consider the readily available attack of printing and scanning. The available watermarking techniques are not reliable under printing and scanning. In fact, one must consider the availability of watermarks for printed images as well as for digital images. An important issue is to intercept and prevent forgery in printed material such as currency notes, bank checks, etc., and to track and validate sensitive and secret printed material. Watermarking in such printed material can be used not only for verification of ownership but as an indicator of the date and type of transaction or the date and source of the printed data. In this work we propose a method of embedding watermarks in printed images by inherently taking advantage of the printing process. The method is visually unobtrusive in the printed image, and the watermark is easily extracted and is robust under reconstruction errors. The decoding algorithm is automatic given the watermarked image.

  7. Application of Function-Failure Similarity Method to Rotorcraft Component Design

    NASA Technical Reports Server (NTRS)

    Roberts, Rory A.; Stone, Robert E.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of the potential failure modes in the designs that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analyses, to determine the potential failure modes of aircraft. During the design of aircraft, a general technique is needed to ensure that every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to specific components, which are described by their functionality. The failure modes are then linked to the basic functions carried out within the components of the aircraft. Using this technique, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. The fundamentals of this method were previously introduced for a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, this technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.

  8. Bubble nucleation in simple and molecular liquids via the largest spherical cavity method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Miguel A., E-mail: m.gonzalez12@imperial.ac.uk; Department of Chemistry, Imperial College London, London SW7 2AZ; Abascal, José L. F.

    2015-04-21

    In this work, we propose a methodology to compute bubble nucleation free energy barriers using trajectories generated via molecular dynamics simulations. We follow the bubble nucleation process by means of a local order parameter, defined by the volume of the largest spherical cavity (LSC) formed in the nucleating trajectories. This order parameter simplifies considerably the monitoring of the nucleation events, as compared with previous approaches, which require ad hoc criteria to classify the atoms and molecules as liquid or vapor. The combination of the LSC and the mean first passage time technique can then be used to obtain the free energy curves. Upon computation of the cavity distribution function, the nucleation rate and free-energy barrier can then be computed. We test our method against recent computations of bubble nucleation in simple liquids and water at negative pressures. We obtain free-energy barriers in good agreement with the previous works. The LSC method provides a versatile and computationally efficient route to estimate the volume of critical bubbles and the nucleation rate, and to compute bubble nucleation free energies in both simple and molecular liquids.
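
    The largest-spherical-cavity order parameter can be illustrated on a toy configuration: probe a grid of points and take the maximal distance to the nearest atom, under periodic boundaries, as the cavity radius. This is a schematic sketch, not the authors' implementation; the box size, particle count, and grid resolution are arbitrary.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)
L = 10.0
atoms = rng.uniform(0, L, (500, 3))      # toy "liquid" configuration

# Probe grid covering the periodic box (endpoint excluded to avoid duplicates).
g = np.linspace(0, L, 30, endpoint=False)
grid = np.stack(np.meshgrid(g, g, g, indexing="ij"), -1).reshape(-1, 3)

# Periodic nearest-neighbour distances from every probe point to the atoms.
tree = cKDTree(atoms, boxsize=L)
r_min, _ = tree.query(grid)

radius = r_min.max()                      # LSC radius at this snapshot
volume = 4 / 3 * np.pi * radius ** 3      # the local order parameter
print(radius)
```

    Tracking this volume along nucleating trajectories is what lets the mean-first-passage-time analysis be applied without liquid/vapor classification of individual molecules.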

  9. Combined magnetic resonance imaging approach for the assessment of in vivo knee joint kinematics under full weight-bearing conditions.

    PubMed

    Al Hares, Ghaith; Eschweiler, Jörg; Radermacher, Klaus

    2015-06-01

    The development of detailed and specific knowledge on the biomechanical behavior of loaded knee structures has received increased attention in recent years. Stress magnetic resonance imaging techniques have been introduced in previous work to study knee kinematics under load conditions. Previous studies captured the knee movement either in atypical supine loading positions, or in upright positions with the help of inclined supporting backrests, which are insufficient for movement capture under full-body weight-bearing conditions. In this work, we used a combined magnetic resonance imaging approach for the measurement and assessment of knee kinematics under full-body weight-bearing in single-legged stance. The proposed method is based on registration of high-resolution static magnetic resonance imaging data acquired in the supine position with low-resolution, quasi-static upright magnetic resonance imaging data acquired in loaded positions for different degrees of knee flexion. The proposed method was applied for the measurement of tibiofemoral kinematics in 10 healthy volunteers. The combined magnetic resonance imaging approach allows the non-invasive measurement of knee kinematics in single-legged stance and under physiological loading conditions. We believe that this method can provide an enhanced understanding of loaded knee kinematics. © IMechE 2015.
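
    The registration of supine and upright data sets can be sketched with a rigid point-set alignment such as the Kabsch algorithm, shown here on synthetic landmarks; the paper's actual registration pipeline is image-based and more involved, so this is only an illustration of the underlying rigid-transform estimation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy rigid registration: align "supine" landmark points to "upright" ones.
supine = rng.normal(size=(10, 3))
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))    # random orthogonal matrix
if np.linalg.det(R_true) < 0:
    R_true[:, 0] *= -1                               # force a proper rotation
t_true = np.array([5.0, -2.0, 1.0])
upright = supine @ R_true.T + t_true

# Kabsch algorithm: optimal rotation from the SVD of the cross-covariance.
a = supine - supine.mean(0)
b = upright - upright.mean(0)
U, _, Vt = np.linalg.svd(a.T @ b)
d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflections
R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
t = upright.mean(0) - supine.mean(0) @ R.T

residual = np.linalg.norm(supine @ R.T + t - upright)
print(residual)  # essentially zero: exact correspondences, no noise
```

    With noisy, partially corresponding features the same least-squares rotation is typically embedded in an iterative matching loop.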

  10. Modeling and quantification of repolarization feature dependency on heart rate.

    PubMed

    Minchole, A; Zacur, E; Pueyo, E; Laguna, P

    2014-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Biosignal Interpretation: Advanced Methods for Studying Cardiovascular and Respiratory Systems". This work aims at providing an efficient method to estimate the parameters of a nonlinear model including memory, previously proposed to characterize rate adaptation of repolarization indices. The physiological restrictions on the model parameters have been included in the cost function in such a way that unconstrained optimization techniques, such as descent optimization methods, can be used for parameter estimation. The proposed method has been evaluated on electrocardiogram (ECG) recordings of healthy subjects performing a tilt test, where rate adaptation of QT and Tpeak-to-Tend (Tpe) intervals has been characterized. The proposed strategy results in an efficient methodology to characterize rate adaptation of repolarization features, improving the convergence time with respect to previous strategies. Moreover, the Tpe interval adapts faster to changes in heart rate than the QT interval. In this work an efficient estimation of the parameters of a model aimed at characterizing rate adaptation of repolarization features has been proposed. The Tpe interval has been shown to be rate related and to have a shorter memory lag than the QT interval.
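
    Folding a physiological restriction into the parameterization so that a plain unconstrained method applies can be sketched as follows. This toy example fits a single-exponential adaptation curve rather than the authors' full memory model; the log-transform of the time constant, which keeps it positive for any real search value, is the illustrative device.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = np.linspace(0, 100, 200)
true_a, true_tau = 0.8, 20.0
y = true_a * (1 - np.exp(-t / true_tau)) + rng.normal(0, 0.01, t.size)

# Constraint tau > 0 is enforced by optimizing theta = log(tau): every real
# theta maps to a valid tau, so an unconstrained method can be used directly.
def cost(p):
    a, log_tau = p
    model = a * (1 - np.exp(-t / np.exp(log_tau)))
    return np.sum((y - model) ** 2)

res = minimize(cost, x0=[0.5, np.log(5.0)], method="Nelder-Mead")
a_hat, tau_hat = res.x[0], np.exp(res.x[1])
print(a_hat, tau_hat)  # close to 0.8 and 20.0
```

    Penalty terms added to the cost function achieve the same effect when the restriction cannot be absorbed into a change of variables.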

  11. Blind Forensics of Successive Geometric Transformations in Digital Images Using Spectral Method: Theory and Applications.

    PubMed

    Chen, Chenglong; Ni, Jiangqun; Shen, Zhaoyi; Shi, Yun Qing

    2017-06-01

    Geometric transformations, such as resizing and rotation, are almost always needed when two or more images are spliced together to create convincing image forgeries. In recent years, researchers have developed many digital forensic techniques to identify these operations. Most previous works in this area focus on the analysis of images that have undergone single geometric transformations, e.g., resizing or rotation. In several recent works, researchers have addressed yet another practical and realistic situation: successive geometric transformations, e.g., repeated resizing, resizing-rotation, rotation-resizing, and repeated rotation. We will also concentrate on this topic in this paper. Specifically, we present an in-depth analysis in the frequency domain of the second-order statistics of the geometrically transformed images. We give an exact formulation of how the parameters of the first and second geometric transformations influence the appearance of periodic artifacts. The expected positions of characteristic resampling peaks are analytically derived. The theory developed here helps to address the gap left by previous works on this topic and is useful for image security and authentication, in particular, the forensics of geometric transformations in digital images. As an application of the developed theory, we present an effective method that allows one to distinguish between the aforementioned four different processing chains. The proposed method can further estimate all the geometric transformation parameters. This may provide useful clues for image forgery detection.
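
    The periodic artifacts left by resampling can be demonstrated in one dimension: after linear upsampling, the second-order difference vanishes at interpolated samples, so its magnitude is periodic with the resampling factor and produces a sharp spectral peak. A minimal sketch, not the paper's 2-D second-order-statistics analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=512)                 # "original" noise-like signal

# Upsample by a factor of 2 with linear interpolation (a resizing operation).
up = np.interp(np.arange(1024) / 2, np.arange(512), x)

# Second-order difference: zero at interpolated samples, nonzero at original
# ones, so its magnitude is periodic with period equal to the factor.
d2 = np.abs(np.diff(up, n=2))
spec = np.abs(np.fft.rfft(d2 - d2.mean()))

peak = np.argmax(spec[1:]) + 1           # skip the DC bin
print(peak / len(d2))  # 0.5, i.e. period 2: resampling factor 2
```

    The paper's contribution is the analytical prediction of where such peaks land when two geometric transformations are chained.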

  12. Numerical Simulation of the Diffusion Processes in Nanoelectrode Arrays Using an Axial Neighbor Symmetry Approximation.

    PubMed

    Peinetti, Ana Sol; Gilardoni, Rodrigo S; Mizrahi, Martín; Requejo, Felix G; González, Graciela A; Battaglini, Fernando

    2016-06-07

    Nanoelectrode arrays have introduced a complete new battery of devices with fascinating electrocatalytic, sensitivity, and selectivity properties. To understand and predict the electrochemical response of these arrays, a theoretical framework is needed. Cyclic voltammetry is a well-suited experimental technique for understanding the underlying diffusion and kinetic processes. Previous works describing microelectrode arrays have exploited the interelectrode distance to simulate their behavior as the summation of individual electrodes. This approach becomes limited when the size of the electrodes decreases to the nanometer scale due to their strong radial effect, with the consequent overlapping of the diffusional fields. In this work, we present a computational model able to simulate the electrochemical behavior of arrays working either as the summation of individual electrodes or as affected by the overlapping of the diffusional fields, without prior assumptions. Our computational model relies on dividing a regular electrode array into cells. In each of them, there is a central electrode surrounded by neighbor electrodes; these neighbor electrodes are transformed into a ring maintaining the same active electrode area as the summation of the closest neighbor electrodes. Using this axial neighbor symmetry approximation, the problem acquires a cylindrical symmetry, being applicable to any diffusion pattern. The model is validated against micro- and nanoelectrode arrays, showing its ability to predict their behavior and therefore to be used as a design tool.

  13. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    NASA Astrophysics Data System (ADS)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high-resistance resistors in the resistor mesh model. In this work, an automated damage detection strategy is introduced that works by placing high-value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high-value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes and exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
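
    The resistor mesh idea can be sketched with nodal analysis on a small grid: raise one resistor's value to mimic a crack, solve Kirchhoff's equations, and locate the damage from the node-voltage shift. A toy illustration, not the authors' model or their sequential Monte Carlo search; the grid size and resistor values are invented.

```python
import numpy as np

def node_voltages(conductances, edges, n_nodes, inj):
    # Assemble the conductance (weighted Laplacian) matrix and solve
    # Kirchhoff's current law with node 0 grounded.
    G = np.zeros((n_nodes, n_nodes))
    for (a, b), g in zip(edges, conductances):
        G[a, a] += g; G[b, b] += g
        G[a, b] -= g; G[b, a] -= g
    v = np.zeros(n_nodes)
    v[1:] = np.linalg.solve(G[1:, 1:], inj[1:])
    return v

# 2x3 grid of nodes; resistors along grid edges (all 100 ohm when healthy).
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (0, 3), (1, 4), (2, 5)]
g = np.full(len(edges), 1 / 100)

inj = np.zeros(6); inj[5] = 1e-3          # inject 1 mA at node 5, return at node 0
v_healthy = node_voltages(g, edges, 6, inj)

g_damaged = g.copy()
g_damaged[3] = 1 / 10000                  # edge (4, 5) "cracked": resistance x100
v_damaged = node_voltages(g_damaged, edges, 6, inj)

# The node-voltage shift is largest next to the damaged resistor.
shift = np.abs(v_damaged - v_healthy)
print(np.argmax(shift))  # 5: the injection-side endpoint of the cracked edge
```

    The paper's contribution is automating this localization over a dense mesh with a sequential Monte Carlo search instead of an exhaustive comparison.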

  14. Instrumentation for Studies of Electron Emission and Charging From Insulators

    NASA Technical Reports Server (NTRS)

    Thomson, C. D.; Zavyalov, V.; Dennison, J. R.

    2004-01-01

    Making measurements of electron emission properties of insulators is difficult since insulators can charge either negatively or positively under charged-particle bombardment. In addition, high incident energies or high fluences can result in modification of a material's conductivity, bulk and surface charge profile, structural makeup through bond breaking and defect creation, and emission properties. We discuss here some of the charging difficulties associated with making insulator-yield measurements and review the methods used in previous studies of electron emission from insulators. We present work undertaken by our group to make consistent and accurate measurements of the electron/ion yield properties of numerous thin-film and thick insulator materials using innovative instrumentation and techniques. We also summarize some of the necessary instrumentation developed for this purpose, including fast-response, low-noise, high-sensitivity ammeters; signal isolation and interfacing to standard computer data acquisition apparatus using opto-isolation, sample-and-hold, and boxcar integration techniques; computer control, automation and timing using LabVIEW software; a multiple-sample carousel; a pulsed, compact, low-energy charge-neutralization electron flood gun; and pulsed visible and UV light neutralization sources. This work is supported through funding from the NASA Space Environments and Effects Program and the NASA Graduate Research Fellowship Program.

  15. Development of a direct three-dimensional biomicrofabrication concept based on electrospraying a custom made siloxane sol

    PubMed Central

    Sullivan, Alice C.; Jayasinghe, Suwan N.

    2007-01-01

    We demonstrate here the discovery of a unique and direct three-dimensional biomicrofabrication concept possessing the ability to revolutionize the jet-based fabrication arena. Previous work carried out on similar jet-based approaches has been successful in fabricating only vertical wall/pillar structures by the controlled deposition of stacked droplets. However, these advanced jet techniques have not been able to directly fabricate self-supporting arches/links (without molds or reaction methods) between adjacent structures (walls or pillars). Our work reported here gives birth to a unique type of jet driven by high-intensity electric fields, which is derived from a specially formulated siloxane sol. The sol studied here has been chosen for its attractive properties (ranging from an excellent cross-linking nature and the ability to polymerize via polycondensation on deposition, to its biocompatibility), which promote the direct forming of biostructures with nanometer (<50 nm) sized droplets in three dimensions. We foresee that this direct three-dimensional biomicrofabrication jet technique, coupled with a variety of formulated sols having focused and enhanced functionality, will be explored throughout the physical and life sciences. PMID:19693359

  16. Mammalian cell culture process for monoclonal antibody production: nonlinear modelling and parameter estimation.

    PubMed

    Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad; Roman, Monica

    2015-01-01

    Monoclonal antibodies (mAbs) are at present one of the fastest growing products of pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAbs production processes is predominantly based on empirical knowledge, the improvements being achieved by using trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. By using a dynamical model of such kind of processes, an optimization-based technique for estimation of kinetic parameters in the model of mammalian cell culture process is developed. The estimation is achieved as a result of minimizing an error function by a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed in this work by using a particular model of mammalian cell culture, as a case study, but is generic for this class of bioprocesses. The presented case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies.
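
    The PSO-based minimization of the error function can be sketched with a generic global-best PSO on a toy kinetic model. A Monod-type rate law stands in here for the mammalian cell culture model, and all coefficients, swarm sizes, and bounds are invented for the example; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "kinetics": Monod-type rate mu(S) = mu_max * S / (Ks + S);
# recover (mu_max, Ks) from noisy rate measurements.
S = np.linspace(0.1, 10, 30)
true = np.array([0.6, 2.0])
data = true[0] * S / (true[1] + S) + rng.normal(0, 0.002, S.size)

def error(p):
    return np.sum((data - p[0] * S / (p[1] + S)) ** 2)

# Minimal global-best PSO: constant inertia, bounds enforced by clipping.
n, iters = 30, 200
lo, hi = np.array([0.01, 0.01]), np.array([5.0, 10.0])
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros((n, 2))
pbest = pos.copy()
pbest_err = np.array([error(p) for p in pos])
gbest = pbest[np.argmin(pbest_err)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    err = np.array([error(p) for p in pos])
    improved = err < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], err[improved]
    gbest = pbest[np.argmin(pbest_err)].copy()

print(gbest)  # close to [0.6, 2.0]
```

    In the bioprocess setting the error function would compare simulated model trajectories against measured concentrations rather than a closed-form rate curve.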

  17. Integrating Retraction Modeling Into an Atlas-Based Framework for Brain Shift Prediction

    PubMed Central

    Chen, Ishita; Ong, Rowena E.; Simpson, Amber L.; Sun, Kay; Thompson, Reid C.

    2015-01-01

    In recent work, an atlas-based statistical model for brain shift prediction, which accounts for uncertainty in the intraoperative environment, has been proposed. Previous work reported in the literature using this technique did not account for local deformation caused by surgical retraction. It is challenging to precisely localize the retractor location prior to surgery and the retractor is often moved in the course of the procedure. This paper proposes a technique that involves computing the retractor-induced brain deformation in the operating room through an active model solve and linearly superposing the solution with the precomputed deformation atlas. As a result, the new method takes advantage of the atlas-based framework’s accounting for uncertainties while also incorporating the effects of retraction with minimal intraoperative computing. This new approach was tested using simulation and phantom experiments. The results showed an improvement in average shift correction from 50% (ranging from 14 to 81%) for gravity atlas alone to 80% using the active solve retraction component (ranging from 73 to 85%). This paper presents a novel yet simple way to integrate retraction into the atlas-based brain shift computation framework. PMID:23864146

  18. Situated Agents and Humans in Social Interaction for Elderly Healthcare: From Coaalas to AVICENA.

    PubMed

    Gómez-Sebastià, Ignasi; Moreno, Jonathan; Álvarez-Napagao, Sergio; Garcia-Gasulla, Dario; Barrué, Cristian; Cortés, Ulises

    2016-02-01

    Assistive Technologies (AT) are an application area where several Artificial Intelligence techniques and tools have been successfully applied to support elderly or impaired people in their daily activities. However, approaches to AT tend to center on the user-tool interaction, neglecting the user's connection with their social environment (such as caretakers, relatives and health professionals) and the possibility of monitoring undesired behaviour, providing both adaptation to a dynamic environment and early response to potentially dangerous situations. In previous work we presented COAALAS, an intelligent social and norm-aware device for elderly people that is able to autonomously organize, reorganize and interact with the different actors involved in elderly care, either human actors or other devices. In this paper we put our work into context by first examining what the desirable properties of such a system are, analysing the state of the art on the relevant topics, and verifying the validity of our proposal in a larger context that we call AVICENA. AVICENA's aim is to develop a semi-autonomous (collaborative) tool to promote monitored, intensive, extended and personalized therapeutic regime adherence at home based on adaptation techniques.

  19. Barriers to Physical Activity in a Mass Transit Population: A Qualitative Study.

    PubMed

    Das, Bhibha M; Petruzzello, Steven J

    2016-01-01

    The physical inactivity epidemic continues to be one of the greatest public health challenges in contemporary society in the United States. The transportation industry is at greater risk of physical inactivity compared with other sectors of the workforce. The aim of this study was to use the Nominal Group Technique, a focus group technique, to examine mass transit employees' perceptions of the barriers to physical activity at their worksite. Three focus groups (n = 31) were conducted. Salient barriers included (1) changing work schedules, (2) poor weather conditions, and (3) lack of scheduled and timely breaks. Findings were consistent with previous research demonstrating that shift work, poor weather, and lack of breaks can negatively impact mass transit employees' ability to be physically active. Although physical activity barriers for this population have been consistent for the last 20 years, public health practice and policy have not changed to address these barriers. Future studies should include conducting focus groups stratified by job classification (e.g., operators, maintenance, and clerical) along with implementing and evaluating worksite-based physical activity interventions and policy changes.

  20. Parabolic aircraft solidification experiments

    NASA Technical Reports Server (NTRS)

    Workman, Gary L. (Principal Investigator); Smith, Guy A.; OBrien, Susan

    1996-01-01

    A number of solidification experiments have been utilized throughout the Materials Processing in Space Program to provide an experimental environment which minimizes variables in solidification experiments. Two techniques of interest are directional solidification and isothermal casting. Because of the widespread use of these experimental techniques in space-based research, several MSAD experiments have been manifested for space flight. In addition to the microstructural analysis used to interpret the experimental results from previous work with parabolic flights, it has become apparent that the phenomena occurring during solidification could be better understood if direct visualization of the solidification interface were possible. Our university has performed several such experimental studies in recent years. The most recent involved visualizing the effect of convective flow phenomena on the KC-135, and prior to that were several successive contracts to perform directional solidification and isothermal casting experiments on the KC-135. Included in this work was the modification and utilization of the Convective Flow Analyzer (CFA), the Aircraft Isothermal Casting Furnace (ICF), and the Three-Zone Directional Solidification Furnace. These studies have contributed heavily to the mission of the Microgravity Science and Applications' Materials Science Program.

  1. Mammalian Cell Culture Process for Monoclonal Antibody Production: Nonlinear Modelling and Parameter Estimation

    PubMed Central

    Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad

    2015-01-01

    Monoclonal antibodies (mAbs) are at present one of the fastest growing products of pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAbs production processes is predominantly based on empirical knowledge, the improvements being achieved by using trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. By using a dynamical model of such kind of processes, an optimization-based technique for estimation of kinetic parameters in the model of mammalian cell culture process is developed. The estimation is achieved as a result of minimizing an error function by a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed in this work by using a particular model of mammalian cell culture, as a case study, but is generic for this class of bioprocesses. The presented case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies. PMID:25685797

  2. Detection of chewing from piezoelectric film sensor signals using ensemble classifiers.

    PubMed

    Farooq, Muhammad; Sazonov, Edward

    2016-08-01

    Selection and use of pattern recognition algorithms is application dependent. In this work, we explored the use of several ensembles of weak classifiers to classify signals captured from a wearable sensor system to detect food intake based on chewing. Three sensor signals (piezoelectric sensor, accelerometer, and hand-to-mouth gesture) were collected from 12 subjects in free-living conditions for 24 hrs. Sensor signals were divided into 10-second epochs, and for each epoch a combination of time- and frequency-domain features was computed. In this work, we present a comparison of three different ensemble techniques: boosting (AdaBoost), bootstrap aggregation (bagging), and stacking, each trained with 3 different weak classifiers (Decision Trees, Linear Discriminant Analysis (LDA), and Logistic Regression). The type of feature normalization used can also impact the classification results. For each ensemble method, three feature normalization techniques (no normalization, z-score normalization, and min-max normalization) were tested. A 12-fold cross-validation scheme was used to evaluate each model, with performance evaluated in terms of precision, recall, and accuracy. The best results achieved here show an improvement of about 4% over our previous algorithms.
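
    Bootstrap aggregation with a weak learner can be sketched in a few lines: resample the training set, fit a decision stump on each resample, and majority-vote the stumps. A schematic illustration on synthetic two-class features with z-score normalization (one of the scalings compared above); it is not the paper's sensor pipeline, and the data are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy two-class data standing in for "chewing" vs "not chewing" feature epochs.
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(2.5, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# z-score normalization of each feature column.
X = (X - X.mean(0)) / X.std(0)

def fit_stump(X, y):
    # Weak learner: best single-feature threshold by training accuracy.
    best = (0, 0.0, 0.0)                  # (feature, threshold, accuracy)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            acc = np.mean((X[:, f] > t) == y)
            if acc > best[2]:
                best = (f, t, acc)
    return best[:2]

# Bagging: train each stump on a bootstrap resample, then majority-vote.
stumps = []
for _ in range(25):
    idx = rng.integers(0, len(y), len(y))
    stumps.append(fit_stump(X[idx], y[idx]))

votes = np.mean([(X[:, f] > t) for f, t in stumps], axis=0)
pred = (votes > 0.5).astype(int)
accuracy = np.mean(pred == y)
print(accuracy)
```

    Boosting differs by reweighting misclassified epochs between rounds, and stacking by training a meta-classifier on the weak learners' outputs.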

  3. Observation of thermally etched grain boundaries with the FIB/TEM technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palizdar, Y., E-mail: y.palizdar@merc.ac.ir; San Martin, D.; Ward, M.

    2013-10-15

    Thermal etching is a method which is able to reveal and characterize grain boundaries, twins or dislocation structures, determine parameters such as grain boundary energies and surface diffusivities, or study phase transformations in steels, intermetallics or ceramic materials. This method relies on the preferential transfer of matter away from grain boundaries on a polished sample during heating at high temperatures in an inert/vacuum atmosphere. The evaporation/diffusion of atoms at high temperatures results in the formation of grooves at the intersections of the planes of grain/twin boundaries with the polished surface. This work describes how the combined use of Focused Ion Beam and Transmission Electron Microscopy can be used to characterize not only the grooves and their profile with the surface, but also the grain boundary line below the groove, this method being complementary to the commonly used scanning probe techniques. - Highlights: • Thermally etched low-carbon steel samples have been characterized by FIB/TEM • Grain boundary (GB) lines below the groove have been characterized in this way • Absence of ghost traces and a large θ angle suggests that GBs are not stationary but mobile • Observations correlate well with previous works and Mullins' investigations [22].

  4. Production of gold and ruby-red lustres in Gubbio (Umbria, Italy) during the Renaissance period

    NASA Astrophysics Data System (ADS)

    Padeletti, G.; Fermo, P.

    The aim of this work is to gain further insight into the production process of lustre-decorated ancient majolicas. Lustre is a sophisticated technique employed in the decoration of majolicas as used in central Italy during the Renaissance period. It consists of a beautiful iridescent gold or ruby-red thin metallic film, containing silver, copper and other substances, obtained in a reducing atmosphere on a previously glazed ceramic. Nowadays, it is not possible to replicate the outstanding results obtained by the ancient ceramicists, since the original recipes were lost. It is therefore of great interest to study lustre-production technology by means of analytical techniques now employed for advanced research on materials (XRD, ETAAS, ICP-OES, TEM-EDX-SAED and UV-Vis). In this work, we have focussed our attention on ceramic fragments decorated with both gold and ruby-red lustres, which were difficult to obtain due to the complex reduction conditions required and which were a prerogative of Gubbio production. The two lustre colours differ in their chemical composition as well as in their nanostructure. The presence of bismuth was disclosed and was ascertained to be a distinctive feature of the Italian production.

  5. Experimental characterization of a small custom-built double-acting gamma-type Stirling engine

    NASA Astrophysics Data System (ADS)

    Intsiful, Peter; Mensah, Francis; Thorpe, Arthur

    This paper investigates the characterization of a small custom-built double-acting gamma-type Stirling engine. A Stirling-cycle engine is a reciprocating energy conversion machine with working spaces operating under conditions of oscillating pressure and flow. These conditions may be due to compressibility as well as pressure and temperature fluctuations. Research in the standard literature indicates that there is a lack of basic physics to account for the transport phenomena that manifest themselves in the working spaces of reciprocating engines. Previous techniques involve the governing equations of mass, momentum and energy; some authors use engineering thermodynamics. None of these approaches addresses this particular engine. A technique for observing and analyzing the behavior of this engine via parametric spectral profiles has been developed, using laser beams. These profiles enabled the generation of pv-curves and other trajectories for investigating the thermo-physical and thermo-hydrodynamic phenomena that manifest in the exchangers. The engine's performance was examined. The results indicate that with a current load of 35.78 A, electric power of 0.505 kW was generated at a speed of 240 rpm, and an efficiency of 29.50 percent was obtained. Supported by NASA grants to Howard University (NASA/HBCU-NHRETU & CSTEA).

  6. Doing that thing that scientists do: A discovery-driven module on protein purification and characterization for the undergraduate biochemistry laboratory classroom.

    PubMed

    Garrett, Teresa A; Osmundson, Joseph; Isaacson, Marisa; Herrera, Jennifer

    2015-01-01

    In traditional introductory biochemistry laboratory classes students learn techniques for protein purification and analysis by following provided, established, step-by-step procedures. Students are exposed to a variety of biochemical techniques but are often not developing procedures or collecting new, original data. In this laboratory module, students develop research skills through work on an original research project and gain confidence in their ability to design and execute an experiment while faculty can enhance their scholarly pursuits through the acquisition of original data in the classroom laboratory. Students are prepared for a 6-8 week discovery-driven project on the purification of the Escherichia coli cytidylate kinase (CMP kinase) through in class problems and other laboratory exercises on bioinformatics and protein structure analysis. After a minimal amount of guidance on how to perform the CMP kinase in vitro enzyme assay, SDS-PAGE, and the basics of protein purification, students, working in groups of three to four, develop a protein purification protocol based on the scientific literature and investigate some aspect of CMP kinase that interests them. Through this process, students learn how to implement a new but perhaps previously worked out procedure to answer their research question. In addition, they learn the importance of keeping a clear and thorough laboratory notebook and how to interpret their data and use that data to inform the next set of experiments. Following this module, students had increased confidence in their ability to do basic biochemistry techniques and reported that the "self-directed" nature of this lab increased their engagement in the project. © 2015 The International Union of Biochemistry and Molecular Biology.

  7. Combination of acoustic levitation with small angle scattering techniques and synchrotron radiation circular dichroism. Application to the study of protein solutions.

    PubMed

    Cristiglio, Viviana; Grillo, Isabelle; Fomina, Margarita; Wien, Frank; Shalaev, Evgenyi; Novikov, Alexey; Brassamin, Séverine; Réfrégiers, Matthieu; Pérez, Javier; Hennet, Louis

    2017-01-01

    The acoustic levitation technique is a useful sample handling method for small solid and liquid samples, suspended in air by means of an ultrasonic field. This method was previously used at synchrotron sources for studying pharmaceutical liquids and protein solutions using x-ray diffraction and small angle x-ray scattering (SAXS). In this work we combined for the first time this containerless method with small angle neutron scattering (SANS) and synchrotron radiation circular dichroism (SRCD) to study the structural behavior of proteins in solution during water evaporation. SANS results are also compared with SAXS experiments. The aggregation behavior of 45 μl droplets of lysozyme protein diluted in water was followed during the continuous increase of the sample concentration as the solvent evaporated. The evaporation kinetics was followed at different drying stages by SANS and SAXS with good data quality. In a prospective work using SRCD, we also studied the evolution of the secondary structure of the myoglobin protein in water solution under the same evaporation conditions. Acoustic levitation was applied for the first time with SANS, and the high performance of the neutron instruments used made it possible to monitor fast containerless reactions in situ. A preliminary work using SRCD shows the potential of its combination with acoustic levitation for studying the evolution of protein structure with time. This multi-technique approach could give novel insights into crystallization and self-assembly phenomena of biological compounds, with promising applications in the pharmaceutical, food and cosmetics industries. This article is part of a Special Issue entitled "Science for Life" Guest Editor: Dr. Austen Angell, Dr. Salvatore Magazù and Dr. Federica Migliardo. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. The effect of hydrostatic vs. shock pressure treatment on plant seeds

    NASA Astrophysics Data System (ADS)

    Mustey, Adrian; Leighs, James; Appleby-Thomas, Gareth; Wood, David; Hazael, Rachael; McMillan, Paul; Hazell, Paul

    2013-06-01

    The hydrostatic pressure and shock response of plant seeds have both been previously investigated (driven primarily by an interest in reducing bacterial contamination of crops and by the theory of panspermia, respectively). However, comparisons have not previously been made between these two methods of applying pressure to plant seeds. Here such a comparison has been undertaken on the premise that any correlations in such data may provide a route to inform understanding of damage mechanisms in the seeds under test. In this work two varieties of plant seeds were subjected to hydrostatic pressure via a non-end-loaded piston-cylinder set-up and to shock compression via a 50-mm bore, single-stage gas gun using the flyer-plate technique. Results from germination tests of recovered seed samples have been compared and contrasted, and initial conclusions drawn regarding causes of trends in the resultant data-set.

  9. Mechanical property studies of human gallstones.

    PubMed

    Stranne, S K; Cocks, F H; Gettliffe, R

    1990-08-01

    The recent development of gallstone fragmentation methods has increased the significance of the study of the mechanical properties of human gallstones. In the present work, fracture strength data and microhardness values of gallstones of various chemical compositions are presented as tested in both dry and simulated bile environments. Generally, both gallstone hardness and fracture strength values were significantly less than kidney stone values found in previous studies. However, a single calcium carbonate stone was found to have an outer shell hardness exceeding those values found for kidney stones. Diametral compression measurements in simulated bile conclusively demonstrated low gallstone fracture strength as well as brittle fracture in the stones tested. Based on the results of this study, one may conclude that the wide range of gallstone microhardnesses found may explain the reported difficulties previous investigators have experienced using various fragmentation techniques on specific gallstones. Moreover, gallstone mechanical properties may be relatively sensitive to bile-environment composition.

  10. Relativistic, correlation, and polarization effects in two-photon photoionization of Xe

    NASA Astrophysics Data System (ADS)

    Lagutin, B. M.; Petrov, I. D.; Sukhorukov, V. L.; Demekhin, Ph. V.; Knie, A.; Ehresmann, A.

    2017-06-01

    Two-photon ionization of xenon was investigated theoretically for exciting-photon energies from 6.7 to 11.5 eV, which results in the ionization of Xe between the 5p1/2 (13.43 eV) and 5s (23.40 eV) thresholds. We describe the extension of a previously developed computational technique for the inclusion of relativistic effects to calculate the energies of intermediate resonance states and cross sections for two-photon ionization. Reasonable consistency of cross sections calculated in length and velocity forms was obtained only after considering many-electron correlations. Agreement between calculated and measured resonance energies is found when core polarization is additionally included in the calculations. The presently computed two-photon photoionization cross sections of Xe are compared with the Ar cross sections from our previous work. Photoelectron angular distribution parameters calculated here indicate that intermediate resonances strongly influence the photoelectron angular distribution of Xe.

  11. Strain Measurements of Chondrules and Refractory Inclusions in Allende

    NASA Technical Reports Server (NTRS)

    Tait, Alastair W.; Fisher, Kent R.; Simon, Justin I.

    2013-01-01

    This study uses traditional strain measurement techniques, combined with X-ray computerized tomography (CT), to evaluate petrographic evidence in the Allende CV3 chondrite for preferred orientation and to measure strain in three dimensions. The existence of petrofabrics and lineations was first observed in carbonaceous meteorites in the 1960s. Yet, fifty years later only a few studies have reported that meteorites record such features. Impacts are often cited as the mechanism for this feature, although plastic deformation from overburden and nebular imbrication have also been proposed. Previous work on the Leoville CV3 and the Parnallee LL3 chondrites found minimum uniaxial shortenings of 33% and 21%, respectively. Petrofabrics in Allende CV3 have been examined before; previous workers using Electron Back Scatter Diffraction (EBSD) found a major-axis alignment of olivine inside dark inclusions and an "augen"-like preferred orientation of olivine grains around more competent chondrules.

  12. Does overgeneral autobiographical memory result from poor memory for task instructions?

    PubMed

    Yanes, Paula K; Roberts, John E; Carlos, Erica L

    2008-10-01

    Considerable previous research has shown that retrieval of overgeneral autobiographical memories (OGM) is elevated among individuals suffering from various emotional disorders and those with a history of trauma. Although previous theories suggest that OGM serves the function of regulating acute negative affect, it is also possible that OGM results from difficulties in keeping the instruction set for the Autobiographical Memory Test (AMT) in working memory, or what has been coined "secondary goal neglect" (Dalgleish, 2004). The present study tested whether OGM is associated with poor memory for the task's instruction set, and whether an instruction set reminder would improve memory specificity over repeated trials. Multilevel modelling data-analytic techniques demonstrated a significant relationship between poor recall of instruction set and probability of retrieving OGMs. Providing an instruction set reminder for the AMT relative to a control task's instruction set improved memory specificity immediately afterward.

  13. Observation of partial relaxation mechanisms via anisotropic strain relief on epitaxial islands using semiconductor nanomembranes

    NASA Astrophysics Data System (ADS)

    Rosa, Barbara L. T.; Marçal, Lucas A. B.; Ribeiro Andrade, Rodrigo; Dornellas Pinto, Luciana; Rodrigues, Wagner N.; Lustoza Souza, Patrícia; Pamplona Pires, Mauricio; Wagner Nunes, Ricardo; Malachias, Angelo

    2017-07-01

    In this work we attempt to directly observe anisotropic partial relaxation of epitaxial InAs islands using transmission electron microscopy (TEM) and synchrotron x-ray diffraction on a 15 nm thick InAs:GaAs nanomembrane. We show that under such conditions TEM provides improved real-space statistics, allowing the observation of partial relaxation processes that were not previously detected by other techniques or by usual TEM cross section images. Besides the fully coherent and fully relaxed islands that are known to exist above previously established critical thickness, we prove the existence of partially relaxed islands, where incomplete 60° half-loop misfit dislocations lead to a lattice relaxation along one of the <110> directions, keeping a strained lattice in the perpendicular direction. Although individual defects cannot be directly observed, their implications to the resulting island registry are identified and discussed within the frame of half-loops propagations.

  14. Absolute I* quantum yields for the ICN A state by diode laser gain versus absorption spectroscopy

    NASA Technical Reports Server (NTRS)

    Hess, Wayne P.; Leone, Stephen R.

    1987-01-01

    Absolute I* quantum yields were measured as a function of wavelength for room temperature photodissociation of the ICN A state continuum. The quantum yields are obtained by the technique of time-resolved diode laser gain-versus-absorption spectroscopy. Quantum yields are evaluated at seven wavelengths from 248 to 284 nm. The yield at 266 nm is 66.0 +/- 2%, and it falls off to 53.4 +/- 2% and 44.0 +/- 4% at 284 and 248 nm, respectively. The latter values are significantly higher than those obtained by previous workers using infrared fluorescence. Estimates of I* quantum yields obtained from analysis of CN photofragment rotational distributions, as discussed by other workers, are in good agreement with the I* yields. The results are considered in conjunction with recent theoretical and experimental work on the CN rotational distributions and with previous I* yield results.

  15. Observation of partial relaxation mechanisms via anisotropic strain relief on epitaxial islands using semiconductor nanomembranes.

    PubMed

    Rosa, Barbara L T; Marçal, Lucas A B; Andrade, Rodrigo Ribeiro; Pinto, Luciana Dornellas; Rodrigues, Wagner N; Souza, Patrícia Lustoza; Pires, Mauricio Pamplona; Nunes, Ricardo Wagner; Malachias, Angelo

    2017-07-28

    In this work we attempt to directly observe anisotropic partial relaxation of epitaxial InAs islands using transmission electron microscopy (TEM) and synchrotron x-ray diffraction on a 15 nm thick InAs:GaAs nanomembrane. We show that under such conditions TEM provides improved real-space statistics, allowing the observation of partial relaxation processes that were not previously detected by other techniques or by usual TEM cross section images. Besides the fully coherent and fully relaxed islands that are known to exist above previously established critical thickness, we prove the existence of partially relaxed islands, where incomplete 60° half-loop misfit dislocations lead to a lattice relaxation along one of the 〈110〉 directions, keeping a strained lattice in the perpendicular direction. Although individual defects cannot be directly observed, their implications to the resulting island registry are identified and discussed within the frame of half-loops propagations.

  16. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
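    As a minimal illustration of data parallelism (the same computation applied independently to every element), the per-pixel thresholding below is a hypothetical stand-in for a GPU kernel, run here on CPU threads:

```python
from concurrent.futures import ThreadPoolExecutor

# Fake thermographic intensities; each pixel is processed independently.
frame = [[0.1, 0.8, 0.3], [0.9, 0.2, 0.7]]

def detect_hotspots(row, threshold=0.5):
    """Per-pixel thresholding: each pixel's result depends on no other pixel."""
    return [1 if px > threshold else 0 for px in row]

# Rows can be dispatched in any order, on any number of workers.
with ThreadPoolExecutor(max_workers=2) as pool:
    mask = list(pool.map(detect_hotspots, frame))
```

    Because no element depends on another, the same kernel maps directly onto the thousands of GPU cores with no synchronization between pixels.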

  17. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.

  18. In vivo measurement of mechanical properties of human long bone by using sonic sound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, M. Jayed, E-mail: zed.hossain06@gmail.com; Rahman, M. Moshiur, E-mail: razib-121@yahoo.com; Alam, Morshed

    Vibration analysis has been evaluated as a non-invasive technique for the in vivo assessment of bone mechanical properties. The relation between the resonant frequencies, long bone geometry and mechanical properties can be obtained by vibration analysis. In vivo measurements were performed on the human ulna, modeled as a simple beam, with an experimental technique and associated apparatus. The resonant frequency of the ulna was obtained by Fast Fourier Transform (FFT) analysis of the vibration response of a piezoelectric accelerometer. Both the elastic modulus and the speed of sound were inferred from the resonant frequency. Measurement error in the improved experimental setup was comparable with the previous work. The in vivo determination of bone elastic response has potential value in screening programs for metabolic bone disease, early detection of osteoporosis and evaluation of skeletal effects of various therapeutic modalities.
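    The inference from resonant frequency to elastic modulus and sound speed can be sketched with a uniform free-free Euler-Bernoulli beam model. The geometry, density, and resonant frequency below are illustrative assumptions, not the paper's measurements:

```python
import math

# First free-free flexural mode of a uniform beam:
#   f1 = (beta1*L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A)),  beta1*L ~ 4.730
BETA1_L = 4.730

def elastic_modulus(f1, length, density, area, inertia):
    """Invert the free-free beam resonance formula for E."""
    k = 2 * math.pi * f1 * length**2 / BETA1_L**2
    return density * area / inertia * k**2

r = 0.007               # beam radius, m (assumed solid circular section)
A = math.pi * r**2      # cross-sectional area
I = math.pi * r**4 / 4  # second moment of area
rho = 1800.0            # bone density, kg/m^3 (assumed)
E = elastic_modulus(f1=630.0, length=0.25, density=rho, area=A, inertia=I)
c = math.sqrt(E / rho)  # speed of sound in the material
```

    With these assumed values the model returns an elastic modulus of roughly 18 GPa and a sound speed near 3200 m/s, both in the range typically quoted for cortical bone.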

  19. Target Fishing for Chemical Compounds using Target-Ligand Activity data and Ranking based Methods

    PubMed Central

    Wale, Nikil; Karypis, George

    2009-01-01

    In recent years the development of computational techniques that identify all the likely targets for a given chemical compound, also termed the problem of Target Fishing, has been an active area of research. Identification of the likely targets of a chemical compound helps to understand problems such as toxicity, lack of efficacy in humans, and poor physical properties associated with that compound in the early stages of drug discovery. In this paper we present a set of techniques whose goal is to rank or prioritize targets in the context of a given chemical compound such that most targets that this compound may show activity against appear higher in the ranked list. These methods are based on our extensions to the SVM and Ranking Perceptron algorithms for this problem. Our extensive experimental study shows that the methods developed in this work outperform previous approaches by 2% to 60% under different evaluation criteria. PMID:19764745
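    A pairwise ranking perceptron of the kind extended in this work can be sketched as follows; the target feature vectors and the active/inactive split are invented for illustration:

```python
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def train_ranking_perceptron(pairs, dim, epochs=50, seed=0):
    """pairs: (x_above, x_below) vectors where x_above should outrank x_below.
    On each violated pair, move w toward x_above and away from x_below."""
    rng = random.Random(seed)
    w = [0.0] * dim
    for _ in range(epochs):
        rng.shuffle(pairs)
        for xa, xb in pairs:
            if dot(w, xa) <= dot(w, xb):  # ranking constraint violated
                w = [wi + a - b for wi, a, b in zip(w, xa, xb)]
    return w

# Hypothetical target profiles for one compound: actives should rank first.
actives = [[1, 0, 1], [1, 1, 1]]
inactives = [[0, 1, 0], [0, 0, 1]]
pairs = [(a, i) for a in actives for i in inactives]
w = train_ranking_perceptron(pairs, dim=3)
ranked = sorted(actives + inactives, key=lambda t: dot(w, t), reverse=True)
```

    Sorting all candidate targets by the learned score produces the prioritized list the paper aims for, with likely targets at the top.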

  20. Application of the fuzzy topsis multi-attribute decision making method to determine scholarship recipients

    NASA Astrophysics Data System (ADS)

    Irvanizam, I.

    2018-03-01

    Some scholarships have been routinely offered by the Ministry of Research, Technology and Higher Education of the Republic of Indonesia for students at Syiah Kuala University. In practice, the scholarship selection process is a subjective and highly complex problem. Multi-Attribute Decision Making (MADM) techniques can be a solution to the scholarship selection problem. In this study, we demonstrate the application of fuzzy TOPSIS, an MADM technique, using a numerical example: the triangular fuzzy numbers representing the fuzzy data are converted into normalized weights, which are then used to construct the normalized fuzzy decision matrix. Finally, fuzzy TOPSIS ranks the alternatives in descending order of their relative closeness to the ideal solution. The final ranking differs slightly from that of the previous work.
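    A minimal sketch of fuzzy TOPSIS with triangular fuzzy numbers (l, m, u) follows. The applicants, criteria, ratings and weights are invented, and the conventions used (benefit-type criteria, vertex distance, (1,1,1)/(0,0,0) ideal solutions) are common textbook choices, not necessarily those of the paper:

```python
import math

def tfn_distance(a, b):
    """Vertex distance between two triangular fuzzy numbers."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def fuzzy_topsis(ratings, weights):
    """ratings[i][j] = TFN rating of alternative i on benefit criterion j."""
    ncrit = len(weights)
    # Normalize each benefit criterion by its largest upper bound.
    umax = [max(r[j][2] for r in ratings) for j in range(ncrit)]
    norm = [[tuple(v / umax[j] for v in r[j]) for j in range(ncrit)]
            for r in ratings]
    # Apply the fuzzy criterion weights (also TFNs), element-wise.
    weighted = [[tuple(v * w for v, w in zip(cell, weights[j]))
                 for j, cell in enumerate(row)] for row in norm]
    fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)  # fuzzy ideal solutions
    scores = []
    for row in weighted:
        dpos = sum(tfn_distance(cell, fpis) for cell in row)
        dneg = sum(tfn_distance(cell, fnis) for cell in row)
        scores.append(dneg / (dpos + dneg))  # closeness coefficient
    return scores

# Three hypothetical applicants rated on two benefit criteria (GPA, need).
ratings = [
    [(7, 9, 10), (5, 7, 9)],
    [(3, 5, 7),  (7, 9, 10)],
    [(1, 3, 5),  (3, 5, 7)],
]
weights = [(0.5, 0.7, 0.9), (0.3, 0.5, 0.7)]  # fuzzy criterion weights
scores = fuzzy_topsis(ratings, weights)
ranking = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
```

    Alternatives are then awarded in order of their closeness coefficient; the third applicant, dominated on both criteria, ranks last.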

  1. Citrus juice extraction systems: effect on chemical composition and antioxidant activity of clementine juice.

    PubMed

    Álvarez, Rafael; Carvalho, Catarina P; Sierra, Jelver; Lara, Oscar; Cardona, David; Londoño-Londoño, Julian

    2012-01-25

    Clementines are especially appreciated for their delicious flavor, and recent years have seen a great increase in the consumption of clementine juice. In recent decades, antioxidant compounds have received particular attention because of their widely demonstrated beneficial health effects. In this work, the organoleptic quality, volatile flavor profile, and antioxidant quality of clementine juice were studied with regard to the influence of two different juice extraction systems: plug inside fruit and rotating cylinders. The results showed that juice extracted by the former method presented higher yields and hesperidin content, which was related to higher antioxidant activity, as demonstrated by ORAC and LDL assays. The organoleptic quality was not affected by the processing technique, whereas there were significant differences in the chemical flavor profile. There are important differences in chemical and functional quality between juice extraction techniques, which must be taken into account when employing processing systems to produce high-quality products.

  2. Factorizing the factorization - a spectral-element solver for elliptic equations with linear operation count

    NASA Astrophysics Data System (ADS)

    Huismann, Immo; Stiller, Jörg; Fröhlich, Jochen

    2017-10-01

    The paper proposes a novel factorization technique for static condensation of a spectral-element discretization matrix that yields a linear operation count of just 13N multiplications for the residual evaluation, where N is the total number of unknowns. In comparison to previous work it saves a factor of more than 3 and outpaces unfactored variants for all polynomial degrees. Using the new technique as a building block for a preconditioned conjugate gradient method yields linear scaling of the runtime with N, which is demonstrated for polynomial degrees from 2 to 32. This makes the spectral-element method cost-effective even for low polynomial degrees. Moreover, the dependence of the iterative solution on the element aspect ratio is addressed, showing only a slight increase in the number of iterations for aspect ratios up to 128. Hence, the solver is very robust for practical applications.

  3. Marine environment pollution: The contribution of mass spectrometry to the study of seawater.

    PubMed

    Magi, Emanuele; Di Carro, Marina

    2016-09-09

    The study of marine pollution has traditionally focused on persistent chemicals, generally known as priority pollutants; a current trend in environmental analysis is a shift toward "emerging pollutants," defined as newly identified or previously unrecognized contaminants. The present review is focused on the particular contribution of mass spectrometry (MS) to the study of pollutants in the seawater compartment. The work is organized in five sections where the most relevant groups of pollutants, both "classical" and "emerging," are presented and discussed, highlighting the relevant data obtained by means of different MS techniques. The hyphenation of MS and separative techniques, together with the development of different ion sources, makes MS and tandem MS the analytical tool of choice for the determination of trace organic contaminants in seawater. © 2016 Wiley Periodicals, Inc. Mass Spec Rev.

  4. Characterising encapsulated nuclear waste using cosmic-ray muon tomography

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Yang, G.; Zimmerman, C.

    2015-03-01

    Tomographic imaging techniques using the Coulomb scattering of cosmic-ray muons have been shown previously to successfully identify and characterise low- and high-Z materials within an air matrix using a prototype scintillating-fibre tracker system. Those studies were performed as the first in a series to assess the feasibility of this technology and image reconstruction techniques in characterising the potential high-Z contents of legacy nuclear waste containers for the U.K. Nuclear Industry. The present work continues the feasibility study and presents the first images reconstructed from experimental data collected using this small-scale prototype system of low- and high-Z materials encapsulated within a concrete-filled stainless-steel container. Clear discrimination is observed between the thick steel casing, the concrete matrix and the sample materials assayed. These reconstructed objects are presented and discussed in detail alongside the implications for future industrial scenarios.
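    A widely used reconstruction technique in muon scattering tomography is the point of closest approach (PoCA): the incoming and outgoing tracks rarely intersect exactly, so the scattering vertex is estimated as the midpoint of their closest approach. The sketch below, with invented track geometry, is a generic illustration rather than this study's exact reconstruction:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def poca(p1, d1, p2, d2):
    """Midpoint of closest approach between lines p1 + t*d1 and p2 + s*d2."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:          # parallel tracks: no unique vertex
        return None
    t = (b * e - c * d) / denom     # parameter on the incoming track
    s = (a * e - b * d) / denom     # parameter on the outgoing track
    q1, q2 = add(p1, scale(d1, t)), add(p2, scale(d2, s))
    return scale(add(q1, q2), 0.5)

# Incoming track along +z; outgoing track deflected near z = 50,
# as if scattered by a high-Z object inside the concrete matrix.
vertex = poca((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
              (0.0, 5.0, 100.0), (0.0, 0.1, 1.0))
```

    Accumulating many such vertices, weighted by scattering angle, builds up the density image in which high-Z material stands out from the concrete and steel.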

  5. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    PubMed

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  6. MONTE CARLO SIMULATION OF THE BREMSSTRAHLUNG RADIATION FOR THE MEASUREMENT OF AN INTERNAL CONTAMINATION WITH PURE-BETA EMITTERS IN VIVO.

    PubMed

    Fantínová, K; Fojtík, P; Malátová, I

    2016-09-01

    Rapid measurement techniques are required for large-scale emergency monitoring of people. In vivo measurement of the bremsstrahlung radiation produced by incorporated pure-beta emitters can offer a rapid technique for the determination of such radionuclides in the human body. This work presents a method for the calibration of spectrometers, based on the use of the UPh-02T (so-called IGOR) phantom and specific (90)Sr/(90)Y sources, which can account for recent as well as previous contaminations. The process of whole- and partial-body counter calibration, in combination with the application of a Monte Carlo code, readily extends to other pure-beta emitters and various exposure scenarios. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. A new technique for calculations of binary stellar evolution, with application to magnetic braking

    NASA Technical Reports Server (NTRS)

    Rappaport, S.; Joss, P. C.; Verbunt, F.

    1983-01-01

    The development of appropriate computer programs has made it possible to conduct studies of stellar evolution which are more detailed and accurate than the investigations previously feasible. However, the use of such programs can also entail some serious drawbacks related to the time and expense required for the work. One approach for overcoming these drawbacks involves the employment of simplified stellar evolution codes which incorporate the essential physics of the problem of interest without attempting either great generality or maximal accuracy. Rappaport et al. (1982) developed a simplified code to study the evolution of close binary stellar systems composed of a collapsed object and a low-mass secondary. The present investigation is concerned with a more general, but still simplified, technique for calculating the evolution of close binary systems with collapsed objects and mass-losing secondaries.

  8. A hybrid, auto-adaptive and rule-based multi-agent approach using evolutionary algorithms for improved searching

    NASA Astrophysics Data System (ADS)

    Izquierdo, Joaquín; Montalvo, Idel; Campbell, Enrique; Pérez-García, Rafael

    2016-08-01

    Selecting the most appropriate heuristic for solving a specific problem is not easy, for many reasons. This article focuses on one of them: traditionally, the solution search process has operated in a given manner regardless of the specific problem being solved, and the process has been the same regardless of the size, complexity and domain of the problem. To cope with this situation, search processes should mould the search into areas of the search space that are meaningful for the problem. This article builds on previous work in the development of a multi-agent paradigm using techniques derived from knowledge discovery (data-mining techniques) on databases of solutions visited so far. The aim is to improve the search mechanisms, increase computational efficiency and use rules to enrich the formulation of optimization problems, while reducing the search space and catering to realistic problems.

  9. Exploring patterns of epigenetic information with data mining techniques.

    PubMed

    Aguiar-Pulido, Vanessa; Seoane, José A; Gestal, Marcos; Dorado, Julián

    2013-01-01

Data mining, a part of the Knowledge Discovery in Databases (KDD) process, is the process of extracting patterns from large data sets by combining methods from statistics and artificial intelligence with database management. Analyses of epigenetic data have evolved towards genome-wide and high-throughput approaches, thus generating great amounts of data for which data mining is essential. Part of these data may contain patterns of epigenetic information which are mitotically and/or meiotically heritable, determining gene expression and cellular differentiation, as well as cellular fate. Epigenetic lesions and genetic mutations are acquired by individuals during their life and accumulate with ageing. Both defects, either together or individually, can result in losing control over cell growth and, thus, cause cancer development. Data mining techniques could then be used to extract such patterns. This work reviews some of the most important applications of data mining to epigenetics.

  10. Out of the Autoclave Fabrication of LaRC[TradeMark] PETI-9 Polyimide Laminates

    NASA Technical Reports Server (NTRS)

    Cano, Robert J.; Jensen, Brian J.

    2013-01-01

The NASA Langley Research Center-developed polyimide system LaRC PETI-9 has been successfully processed into composites by high temperature vacuum assisted resin transfer molding (HT-VARTM). To extend the application of this high use temperature material to other out-of-autoclave (OOA) processing techniques, the fabrication of PETI-9 laminates was evaluated using only a vacuum bag and oven cure. A LaRC PETI-9 polyimide solution in NMP was prepared and successfully utilized to fabricate unidirectional IM7 carbon fiber prepreg that was subsequently processed into composites with a vacuum bag and oven cure OOA process. Composite panels of good quality were successfully fabricated and mechanically tested. Processing characteristics, composite panel quality and mechanical properties are presented in this work. The resultant properties are compared to previously developed LaRC material systems processed by both autoclave and OOA techniques, including the well-characterized, autoclave-processed LaRC PETI-5.

  11. Diffusion length measurements in bulk and epitaxially grown III-V semiconductors using charge collection microscopy

    NASA Technical Reports Server (NTRS)

    Leon, R. P.

    1987-01-01

    Diffusion lengths and surface recombination velocities were measured in GaAs diodes and InP finished solar cells. The basic technique used was charge collection microscopy, also known as electron beam induced current (EBIC). The normalized currents and distances from the pn junction were read directly from the calibrated curves obtained using the line scan mode in an SEM. These values were then equated to integral and infinite-series expressions resulting from the solution of the diffusion equation with both extended-generation and point-generation functions. This expands previous work by examining both thin and thick samples. The surface recombination velocity was either treated as an unknown in a system of two equations, or measured directly using low electron-beam accelerating voltages. These techniques give accurate results by accounting for the effects of surface recombination and the finite size of the generation volume.
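As a hedged illustration of the line-scan analysis described above (the study itself fits integral and infinite-series expressions; the one-parameter model below is only the textbook thick-sample limit with negligible surface recombination, and all names and numbers are synthetic): far from the pn junction the normalized EBIC current decays roughly as I(x) = I0·exp(−x/L), so the diffusion length L can be read off the slope of ln(I) versus distance.

```python
import numpy as np

# Simplest EBIC limit: I(x) = I0 * exp(-x / L) far from the junction,
# so ln(I) is linear in x with slope -1/L.
def diffusion_length(x, current):
    """Estimate L (same units as x) from an exponential EBIC decay."""
    slope, _intercept = np.polyfit(x, np.log(current), 1)
    return -1.0 / slope

# Synthetic line scan with a known diffusion length of 3.0 um
x_um = np.linspace(1.0, 10.0, 50)
current = 0.8 * np.exp(-x_um / 3.0)
L_um = diffusion_length(x_um, current)  # recovers ~3.0 um
```

In the real measurement, surface recombination and the finite generation volume distort this simple decay, which is exactly why the study fits the fuller expressions instead.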

  12. Chemical reactivation of quenched fluorescent protein molecules enables resin-embedded fluorescence microimaging

    PubMed Central

    Xiong, Hanqing; Zhou, Zhenqiao; Zhu, Mingqiang; Lv, Xiaohua; Li, Anan; Li, Shiwei; Li, Longhui; Yang, Tao; Wang, Siming; Yang, Zhongqin; Xu, Tonghui; Luo, Qingming; Gong, Hui; Zeng, Shaoqun

    2014-01-01

    Resin embedding is a well-established technique for preparing biological specimens for microscopic imaging. However, it is not compatible with modern green fluorescent protein (GFP) labelling techniques because it significantly quenches the fluorescence of GFP and its variants. Previous empirical optimization efforts work for thin tissue but have not succeeded on macroscopic tissue blocks, as the quenching mechanism remains uncertain. Here we show that most of the quenched GFP molecules are structurally preserved and not denatured after routine embedding in resin, and can be chemically reactivated to a fluorescent state by an alkaline buffer during imaging. We observe up to 98% preservation in the yellow fluorescent protein case, and improve the fluorescence intensity 11.8-fold compared with unprocessed samples. We demonstrate fluorescence microimaging of resin-embedded EGFP/EYFP-labelled tissue blocks without noticeable loss of labelled structures. This work provides a turning point for the imaging of fluorescent protein-labelled specimens after resin embedding. PMID:24886825

  13. Transport relaxation processes in supercritical fluids

    NASA Astrophysics Data System (ADS)

    Jonas, J.

    The technique for solubility measurements of solids in compressed supercritical fluids using NMR, and the theoretical analysis of experimental data on collision-induced scattering, were examined. Initial tests for the determination of solid solubilities in supercritical fluids without mixing were previously described, and these preparations have continued. Supercritical carbon dioxide dissolving naphthalene, for which solubility data are already available (M. McHugh, M.E. Paulaitis, J. Chem. Eng. Data, Vol. 25 (4), 1980), is being studied. This initial testing of the NMR technique for measuring solubilities in a well-characterized system should prove very valuable for our later determinations with the proposed mixing probe. Systematic experimental studies of collision-induced spectra in several supercritical fluids using both Raman and Rayleigh scattering are continuing. The experimental work on SF6 and CH4 was finished, and the experimental data are being analyzed to test the various theoretical models for collision-induced scattering.

  14. A new improved study of cyanotoxins presence from experimental cyanobacteria concentrations in the Trasona reservoir (Northern Spain) using the MARS technique.

    PubMed

    García Nieto, P J; Alonso Fernández, J R; Sánchez Lasheras, F; de Cos Juez, F J; Díaz Muñiz, C

    2012-07-15

    Cyanotoxins, poisonous substances produced by cyanobacteria, pose health risks in drinking and recreational water use. The aim of this study is to improve our previous successful work on cyanotoxin prediction from experimental cyanobacteria concentrations in the Trasona reservoir (Asturias, Northern Spain) using the multivariate adaptive regression splines (MARS) technique at a local scale. The improvement consists of using not only biological variables but also physical-chemical ones. As a result, the coefficient of determination improved from 0.84 to 0.94; that is to say, more accurate predictive calculations and a better approximation to the real problem were obtained. Finally, the agreement of the MARS model with experimental data confirmed its good performance. Copyright © 2012 Elsevier B.V. All rights reserved.
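MARS models are built from piecewise-linear "hinge" basis functions; as a rough sketch of that building block (purely synthetic data, one hand-picked knot — the real MARS algorithm selects knots and terms by a forward/backward stepwise search over all predictor variables):

```python
import numpy as np

# A MARS model is a linear combination of hinge functions
# max(0, x - t) and max(0, t - x) at knots t, plus an intercept.
def hinge_basis(x, knots):
    cols = [np.ones_like(x)]                 # intercept
    for t in knots:
        cols.append(np.maximum(0.0, x - t))  # right hinge
        cols.append(np.maximum(0.0, t - x))  # left hinge
    return np.column_stack(cols)

# A piecewise-linear target with a kink at x = 0.5 is fit exactly
# by a single knot placed at the kink.
x = np.linspace(0.0, 1.0, 200)
y = np.where(x < 0.5, 1.0 - x, 0.5 + 2.0 * (x - 0.5))
B = hinge_basis(x, [0.5])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
r2 = 1.0 - np.sum((B @ coef - y) ** 2) / np.sum((y - y.mean()) ** 2)
```

The coefficient of determination reported in the abstract is the same r² statistic computed here, just over the reservoir's biological and physical-chemical predictors rather than a toy function.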

  15. Doppler ultrasound surveillance in deep tunneling compressed-air work with Trimix breathing: bounce dive technique compared to saturation-excursion technique.

    PubMed

    Vellinga, T P van Rees; Sterk, W; de Boer, A G E M; van der Beek, A J; Verhoeven, A C; van Dijk, F J H

    2008-01-01

    The Western Scheldt Tunneling Project in The Netherlands provided a unique opportunity to evaluate two deep-diving techniques with Doppler ultrasound surveillance. Divers initially used the bounce diving technique for repair and maintenance of the tunnel boring machine (TBM). When the TBM jammed at its deepest depth, the available work time proved insufficient, so a saturation diving technique was developed that permitted longer work time at great depth. Thirty-one divers were involved in this project, of whom twenty-three were examined using Doppler ultrasound. Data analysis addressed 52 exposures to Trimix at 4.6-4.8 bar gauge using the bounce technique and 354 exposures to Trimix at 4.0-6.9 bar gauge on saturation excursions. No decompression incidents occurred with either technique during the described phase of the project. Doppler ultrasound revealed that the bubble loads in both techniques were generally low. Despite longer working hours, shorter decompression times and larger physical workloads, the saturation-excursion technique was associated with significantly lower bubble grades than the bounce technique. We conclude that the saturation-excursion technique with Trimix is a good option for deep and long exposures in caisson work. The Doppler technique proved valuable, and it should be incorporated in future compressed-air work.

  16. Signal processing methods for in-situ creep specimen monitoring

    NASA Astrophysics Data System (ADS)

    Guers, Manton J.; Tittmann, Bernhard R.

    2018-04-01

    Previous work investigated using guided waves for monitoring creep deformation during accelerated life testing. The basic objective was to relate observed changes in the time-of-flight to changes in the environmental temperature and specimen gage length. The work presented in this paper investigated several signal processing strategies for possible application in the in-situ monitoring system. Signal processing methods for both group velocity (wave-packet envelope) and phase velocity (peak tracking) time-of-flight were considered. Although the Analytic Envelope found via the Hilbert transform is commonly applied for group velocity measurements, erratic behavior in the indicated time-of-flight was observed when this technique was applied to the in-situ data. The peak tracking strategies tested had generally linear trends, and tracking local minima in the raw waveform ultimately showed the most consistent results.
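The two time-of-flight strategies compared above (analytic-envelope tracking versus peak tracking in the raw waveform) can be sketched on a synthetic wave packet; the signal, sampling rate and function names below are illustrative assumptions, not the in-situ creep-monitoring data:

```python
import numpy as np
from scipy.signal import hilbert

def tof_envelope(signal, fs):
    """Group-velocity-style arrival time: peak of the analytic
    (Hilbert) envelope of the received waveform."""
    envelope = np.abs(hilbert(signal))
    return np.argmax(envelope) / fs

def tof_peak(signal, fs):
    """Peak-tracking arrival time: largest extremum of the raw waveform."""
    return np.argmax(np.abs(signal)) / fs

# Gaussian wave packet with a 50 kHz carrier, centred at t0 = 1.0 ms
fs = 1.0e6                                  # 1 MHz sampling rate
t = np.arange(0.0, 2.0e-3, 1.0 / fs)
t0 = 1.0e-3
packet = np.exp(-((t - t0) / 1.0e-4) ** 2) * np.sin(2.0 * np.pi * 5.0e4 * t)
```

On a clean packet both estimators land near t0; the abstract's observation is that on real in-situ data the envelope estimate can jump erratically between lobes, whereas tracking a consistent extremum in the raw waveform behaves more linearly.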

  17. Wall function treatment for bubbly boundary layers at low void fractions.

    PubMed

    Soares, Daniel V; Bitencourt, Marcelo C; Loureiro, Juliana B R; Silva Freire, Atila P

    2018-01-01

    The present work investigates the role of different treatments of the lower boundary condition on the numerical prediction of bubbly flows. Two different wall function formulations are tested against experimental data obtained for bubbly boundary layers: (i) a new analytical solution derived through asymptotic techniques and (ii) the previous formulation of Troshko and Hassan (IJHMT, 44, 871-875, 2001a). A modified k-ε model is used to close the averaged Navier-Stokes equations together with the hypothesis that turbulence can be modelled by a linear superposition of bubble- and shear-induced eddy viscosities. The work shows, in particular, how four corrections must be implemented in the standard single-phase k-ε model to account for the effects of bubbles. The numerical implementation of the near-wall functions is made through a finite-element code.

  18. Building NYX [Engineering Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-01

    From April 1951 to August 1954, the New York Shipbuilding Corporation carried out a subcontract with the E.I. du Pont de Nemours company that was without parallel in the shipyard's history. The work, designated the "NYX Project" for reasons of security, was vital to the operations of the Savannah River Plant, Aiken, S.C., which was then being designed and constructed by du Pont for the Atomic Energy Commission. It consisted of three broad parts: development and experimental work; fabrication and testing of a prototype unit; and fabrication of production units. Five production units were ultimately built, one of them converted from the prototype. All were fabricated from stainless steel, and involved welding techniques, control of thermal distortion, and tolerances never previously attempted on assemblies of comparable size. This report provides engineering drawings for this project.

  19. Building NYX [Engineering Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-31

    From April 1951 to August 1954, the New York Shipbuilding Corporation carried out a subcontract with E.I. du Pont de Nemours & Company that was without parallel in the shipyard's history. The work, designated the "NYX Project" for reasons of security, was vital to the operations of the Savannah River Plant, Aiken, S.C., which was then being designed and constructed by du Pont for the Atomic Energy Commission. It consisted of three broad parts: development and experimental work; fabrication and testing of a prototype unit; and fabrication of production units. Five production units were ultimately built, one of them converted from the prototype. All were fabricated from stainless steel, and involved welding techniques, control of thermal distortion, and tolerances never previously attempted on assemblies of comparable size. This report provides engineering drawings for this project.

  20. Hydraphiles enhance antimicrobial potency against Escherichia coli, Pseudomonas aeruginosa, and Bacillus subtilis.

    PubMed

    Patel, Mohit B; Garrad, Evan C; Stavri, Ariel; Gokel, Michael R; Negin, Saeedeh; Meisel, Joseph W; Cusumano, Zachary; Gokel, George W

    2016-06-15

    Hydraphiles are synthetic amphiphiles that form ion-conducting pores in liposomal membranes. These pores exhibit open-close behavior when studied by planar bilayer conductance techniques. In previous work, we showed that when co-administered with various antibiotics to the DH5α strain of Escherichia coli, they enhanced the drugs' potency. We report here potency enhancements at low concentrations of hydraphiles for the structurally and mechanistically unrelated antibiotics erythromycin, kanamycin, rifampicin, and tetracycline against Gram-negative E. coli (DH5α and K-12) and Pseudomonas aeruginosa, as well as Gram-positive Bacillus subtilis. Earlier work suggested that potency increases correlated to ion transport function. The data presented here comport with the function of hydraphiles to enhance membrane permeability in addition to, or instead of, their known function as ion conductors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Image Reconstruction for a Partially Collimated Whole Body PET Scanner

    PubMed Central

    Alessio, Adam M.; Schmitz, Ruth E.; MacDonald, Lawrence R.; Wollenweber, Scott D.; Stearns, Charles W.; Ross, Steven G.; Ganin, Alex; Lewellen, Thomas K.; Kinahan, Paul E.

    2008-01-01

    Partially collimated PET systems have less collimation than conventional 2-D systems and have been shown to offer count rate improvements over 2-D and 3-D systems. Despite this potential, previous efforts have not established image-based improvements with partial collimation and have not customized the reconstruction method for partially collimated data. This work presents an image reconstruction method tailored for partially collimated data. Simulated and measured sensitivity patterns are presented and provide a basis for modification of a fully 3-D reconstruction technique. The proposed method uses a measured normalization correction term to account for the unique sensitivity to true events. This work also proposes a modified scatter correction based on simulated data. Measured image quality data supports the use of the normalization correction term for true events, and suggests that the modified scatter correction is unnecessary. PMID:19096731

  2. Image Reconstruction for a Partially Collimated Whole Body PET Scanner.

    PubMed

    Alessio, Adam M; Schmitz, Ruth E; Macdonald, Lawrence R; Wollenweber, Scott D; Stearns, Charles W; Ross, Steven G; Ganin, Alex; Lewellen, Thomas K; Kinahan, Paul E

    2008-06-01

    Partially collimated PET systems have less collimation than conventional 2-D systems and have been shown to offer count rate improvements over 2-D and 3-D systems. Despite this potential, previous efforts have not established image-based improvements with partial collimation and have not customized the reconstruction method for partially collimated data. This work presents an image reconstruction method tailored for partially collimated data. Simulated and measured sensitivity patterns are presented and provide a basis for modification of a fully 3-D reconstruction technique. The proposed method uses a measured normalization correction term to account for the unique sensitivity to true events. This work also proposes a modified scatter correction based on simulated data. Measured image quality data supports the use of the normalization correction term for true events, and suggests that the modified scatter correction is unnecessary.

  3. Improving Lunar Exploration with Robotic Follow-up

    NASA Technical Reports Server (NTRS)

    Fong, T.; Bualat, M.; Deans, M.; Heggy, E.; Helper, M.; Hodges, K.; Lee, P.

    2011-01-01

    We are investigating how augmenting human field work with subsequent robot activity can improve lunar exploration. Robotic "follow-up" might involve: completing geology observations; making tedious or long-duration measurements of a target site or feature; curating samples in-situ; and performing unskilled, labor-intensive work. To study this technique, we have begun conducting a series of lunar analog field tests at Haughton Crater (Canada). Motivation: In most field geology studies on Earth, explorers often find themselves left with a set of observations they would have liked to make, or samples they would have liked to take, if only they had been able to stay longer in the field. For planetary field geology, we can imagine mobile robots - perhaps teleoperated vehicles previously used for manned exploration or dedicated planetary rovers - being deployed to perform such follow-up activities [1].

  4. Strain, doping, and disorder effects in GaAs/Ge/Si heterostructures: A Raman spectroscopy investigation

    NASA Astrophysics Data System (ADS)

    Mlayah, A.; Carles, R.; Leycuras, A.

    1992-01-01

    The present work is devoted to a Raman study of GaAs/Ge/Si heterostructures grown by the vapor-phase epitaxy technique. We first show that the GaAs epilayers are subjected to a biaxial tensile strain. The strain relaxation generates misfit dislocations and thus disorder effects, which we analyze in terms of loss of translational invariance and violation of the Raman selection rules. The first-order Raman spectra of annealed samples exhibit an unexpected broad band which we identify as due to scattering by a coupled LO phonon-damped plasmon mode. This is corroborated by an accurate line-shape analysis which accounts for the recorded spectra and makes evident the presence of free carriers within the GaAs layers. Their density is estimated from the deduced plasmon frequency and also using a method we presented in a previous work.

  5. Absolute calorimetric calibration of low energy brachytherapy sources

    NASA Astrophysics Data System (ADS)

    Stump, Kurt E.

    In the past decade there has been a dramatic increase in the use of permanent radioactive source implants in the treatment of prostate cancer. A small radioactive source encapsulated in a titanium shell is used in this type of treatment. The radioisotopes used are generally 125I or 103Pd. Both of these isotopes have relatively short half-lives, 59.4 days and 16.99 days, respectively, and have low-energy emissions and a low dose rate. These factors make these sources well suited for this application, but the calibration of these sources poses significant metrological challenges. The current standard calibration technique involves the measurement of ionization in air to determine the source air-kerma strength. While this has proved to be an improvement over previous techniques, the method has been shown to be metrologically impure and may not be the ideal means of calibrating these sources. Calorimetric methods have long been viewed as the most fundamental means of determining source strength for a radiation source. This is because calorimetry provides a direct measurement of source energy. However, due to the low energy and low power of the sources described above, current calorimetric methods are inadequate. This thesis presents work oriented toward developing novel methods to provide direct and absolute measurements of source power for low-energy, low-dose-rate brachytherapy sources. The method is the first use of an actively temperature-controlled radiation absorber using the electrical substitution method to determine the total contained source power of these sources. The instrument described operates at cryogenic temperatures. The method employed provides a direct measurement of source power. The work presented here is focused upon building a metrological foundation upon which to establish power-based calibrations of clinical-strength sources. To that end, instrument performance has been assessed for these source strengths.
The intent is to establish the limits of the current instrument to direct further work in this field. It has been found that for sources with powers above approximately 2 μW the instrument is able to determine the source power in agreement to within less than 7% of what is expected based upon the current source strength standard. For lower power sources, the agreement is still within the uncertainty of the power measurement, but the calorimeter noise dominates. Thus, to provide absolute calibration of lower power sources additional measures must be taken. The conclusion of this thesis describes these measures and how they will improve the factors that limit the current instrument. The results of the work presented in this thesis establish the methodology of active radiometric calorimetry for the absolute calibration of radioactive sources. The method is an improvement over previous techniques in that the source measurements do not rely upon the thermal properties of the materials used or the heat-flow pathways. The initial work presented here will help to shape future refinements of this technique to allow lower power sources to be calibrated with high precision and high accuracy.

  6. Techniques for detection and localization of weak hippocampal and medial frontal sources using beamformers in MEG.

    PubMed

    Mills, Travis; Lalancette, Marc; Moses, Sandra N; Taylor, Margot J; Quraan, Maher A

    2012-07-01

    Magnetoencephalography provides precise information about the temporal dynamics of brain activation and is an ideal tool for investigating rapid cognitive processing. However, in many cognitive paradigms visual stimuli are used, which evoke strong brain responses (typically 40-100 nAm in V1) that may impede the detection of weaker activations of interest. This is particularly a concern when beamformer algorithms are used for source analysis, due to artefacts such as "leakage" of activation from the primary visual sources into other regions. We have previously shown (Quraan et al. 2011) that we can effectively reduce leakage patterns and detect weak hippocampal sources by subtracting the functional images derived from the experimental task and a control task with similar stimulus parameters. In this study we assess the performance of three different subtraction techniques. In the first technique we follow the same post-localization subtraction procedures as in our previous work. In the second and third techniques, we subtract the sensor data obtained from the experimental and control paradigms prior to source localization. Using simulated signals embedded in real data, we show that when beamformers are used, subtraction prior to source localization allows for the detection of weaker sources and higher localization accuracy. The improvement in localization accuracy exceeded 10 mm at low signal-to-noise ratios, and sources down to below 5 nAm were detected. We applied our techniques to empirical data acquired with two different paradigms designed to evoke hippocampal and frontal activations, and demonstrated our ability to detect robust activations in both regions with substantial improvements over image subtraction. 
We conclude that removal of the common-mode dominant sources through data subtraction prior to localization further improves the beamformer's ability to project the n-channel sensor-space data to reveal weak sources of interest and allows more accurate localization.
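The core idea of subtracting sensor data before localization can be sketched numerically; everything below (topographies, amplitudes, sensor counts) is synthetic toy data, not MEG recordings or any beamformer implementation from the study:

```python
import numpy as np

# A strong source common to the experimental and control conditions
# cancels in the subtracted sensor data, while the weak source of
# interest (present only in the experimental condition) survives.
rng = np.random.default_rng(1)
n_sensors, n_times = 64, 500
t = np.linspace(0.0, 1.0, n_times)

strong_topo = rng.standard_normal(n_sensors)   # dominant (e.g. visual) source
weak_topo = rng.standard_normal(n_sensors)     # weak source of interest
strong_wave = 50.0 * np.sin(2.0 * np.pi * 10.0 * t)
weak_wave = 5.0 * np.sin(2.0 * np.pi * 4.0 * t)

def noise():
    return rng.standard_normal((n_sensors, n_times))

experimental = (np.outer(strong_topo, strong_wave)
                + np.outer(weak_topo, weak_wave) + noise())
control = np.outer(strong_topo, strong_wave) + noise()

diff = experimental - control   # subtraction before source localization

def projected_power(topo, data):
    return np.abs(topo @ data).mean()

# Fraction of the dominant source's projected power that survives
# subtraction: close to zero, so beamformer leakage from it is removed.
strong_residual = (projected_power(strong_topo, diff)
                   / projected_power(strong_topo, experimental))
```

This is why pre-localization subtraction helps the beamformer: the common-mode dominant source, and the leakage artefacts it would generate, are removed from the n-channel data before the spatial filter is ever computed.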

  7. Segmentation and feature extraction of cervical spine x-ray images

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Thoma, George R.

    1999-05-01

    As part of an R&D project in mixed text/image database design, the National Library of Medicine has archived a collection of 17,000 digitized x-ray images of the cervical and lumbar spine which were collected as part of the second National Health and Nutrition Examination Survey (NHANES II). To make this image data available and usable to a wide audience, we are investigating techniques for indexing the image content by automated or semi-automated means. Indexing of the images by features of interest to researchers in spine disease and structure requires effective segmentation of the vertebral anatomy. This paper describes work in progress toward this segmentation of the cervical spine images into anatomical components of interest, including anatomical landmarks for vertebral location, and segmentation and identification of individual vertebrae. Our work includes developing a reliable method for automatically fixing an anatomy-based coordinate system in the images, and work to adaptively threshold the images, using methods previously applied by researchers in cardioangiography. We describe the motivation for our work and present our current results in both areas.

  8. Earth observational research using multistage EOS-like data

    NASA Technical Reports Server (NTRS)

    Johannsen, C. J.; Landgrebe, D. A.

    1993-01-01

    This grant is funded as a part of a program in which both research and educational impact are intended. Research work under this grant is directed at the understanding and use of future hyperspectral data such as that from imaging spectrometers. Specifically, the objectives of the work are (1) to prepare suitable means for analyzing data from sensors which have large numbers of spectral bands, (2) to advance the fundamental understanding of the manner in which soils and vegetative materials reflect high spectral resolution optical wavelength radiation, and (3) to maximize the impact of the results on the educational community. Over the life of the grant, the work has thus involved basic Earth science research and information system technique understanding and development in a mutually supportive way; however, more recently it has become necessary to focus the work primarily on areas (1) and (3). During the last year, the level of effort on this grant has been reduced to half its previous value. We have also been advised that this grant will end with the current year; thus, this will be the penultimate semiannual progress summary.

  9. a Study on Satellite Diagnostic Expert Systems Using Case-Based Approach

    NASA Astrophysics Data System (ADS)

    Park, Young-Tack; Kim, Jae-Hoon; Park, Hyun-Soo

    1997-06-01

    Many research efforts are ongoing to monitor and diagnose diverse malfunctions of satellite systems as the complexity and number of satellites increase. Currently, much monitoring and diagnosis is carried out by human experts, but there is a need to automate much of their routine work. Hence, it is necessary to study expert systems which can perform routine work automatically, thereby allowing human experts to devote their expertise to the more critical and important areas of monitoring and diagnosis. In this paper, we employ artificial intelligence techniques to model human experts' knowledge and to perform inference over the constructed knowledge. In particular, case-based approaches, which draw on previous typical exemplars, are used to construct a knowledge base modelling human expert capabilities. We have designed and implemented a prototype case-based system for diagnosing satellite malfunctions using cases. Our system remembers typical failure cases and diagnoses a current malfunction by indexing the case base. Diverse methods are used to build a more user-friendly interface which allows human experts to build a knowledge base easily.

  10. Final Project Report for DOE Grant NO.: DE-SC0010534 Period: Sept 2013-March 31, 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunaydin, Murat

    2016-08-01

    Higher spin theories have been an active area of research in recent years. One of the main research activities of the PI, Murat Gunaydin, over the period of this grant has been the application of quasiconformal methods to construct and study higher spin (HS) algebras and superalgebras in various dimensions. Over the past decade, work on amplitudes in gauge theories, supergravity and string theories has been a very active area of research. Enormous progress has been made in the understanding of the structure of amplitudes in these theories. The novel methods and results obtained have made it possible to do calculations in gauge theories and supergravity theories that go well beyond the calculations one can do using the old-fashioned Feynman diagram techniques. Work on amplitudes in matter-coupled supergravity theories has been the second main focus of the PI during the funding period. The previous work of the PI on supergravity theories has played a fundamentally important role in the current work on amplitudes.

  11. Prosthetic valve sparing aortic root replacement: an improved technique.

    PubMed

    Leacche, Marzia; Balaguer, Jorge M; Umakanthan, Ramanan; Byrne, John G

    2008-10-01

    We describe a modified surgical technique to treat patients with a previous history of isolated aortic valve replacement who now require aortic root replacement for an aneurysmal or dissected aorta. This technique consists of replacing the aortic root with a Dacron conduit, leaving intact the previously implanted prosthesis, and re-implanting the coronary arteries in the Dacron graft. Our technique differs from other techniques in that we do not leave behind any aortic tissue remnant and also in that we use a felt strip to obliterate any gap between the old sewing ring and the newly implanted graft. In our opinion, this promotes better hemostasis. We demonstrate that this technique is safe, feasible, and results in acceptable outcomes.

  12. Paul Ehrlich and the Early History of Granulocytes.

    PubMed

    Kay, A Barry

    2016-08-01

    Paul Ehrlich's techniques, published between 1879 and 1880, for staining blood films using coal tar dyes, and his method of differential blood cell counting, ended years of speculation regarding the classification of white cells. Acidic and basic dyes had allowed him to recognize eosinophil and basophil granules, respectively, work that was a direct continuation of his discovery of the tissue mast cell described in his doctoral thesis. Ehrlich went on to develop neutral dyes that identified epsilon granules in neutrophils ("cells with polymorphous nuclei"). He also speculated, for the most part correctly, on the formation, function, and fate of blood neutrophils and eosinophils. Before Ehrlich, a number of important observations had been made on white cells and their role in health and disease. Among the most notable were William Hewson's studies of blood and lymph; the early descriptions of leukemia by Alfred Donné, John Hughes Bennett, Rudolf Virchow, and others; as well as seminal observations on inflammation by William Addison, Friedrich von Recklinghausen, and Julius Cohnheim. Eosinophils were almost certainly recognized previously by others. In 1846, Thomas Wharton Jones (1808-1891) described "granule blood-cells" in several species including humans. The term "granule cell" had also been used by Julius Vogel (1814-1880), who had previously observed similar cells in inflammatory exudates. Vogel, in turn, was aware of the work of Gottlieb Gluge (1812-1898), who had observed "compound inflammatory globules" in pus and serum that resembled eosinophils. Almost 20 years before Ehrlich developed his staining methods, Max Johann Schultze (1825-1874) performed functional experiments on fine and coarse granular cells using a warm stage microscopic technique and showed they had amoeboid movement and phagocytic abilities. Despite these earlier observations, it was Ehrlich's use of stains that heralded the modern era of studies of leukocyte biology and pathology.

  13. Meteor showers of the southern hemisphere

    NASA Astrophysics Data System (ADS)

    Molau, Sirko; Kerr, Steve

    2014-04-01

    We present the results of an exhaustive meteor shower search in the southern hemisphere. The underlying data set is a subset of the IMO Video Meteor Database comprising 50,000 single-station meteors obtained by three Australian cameras between 2001 and 2012. The detection technique was similar to that of our previous single-station analyses. In the data set we find 4 major and 6 minor northern hemisphere meteor showers, and 12 segments of the Antihelion source (including the Northern and Southern Taurids and six streams from the MDC working list). We present details for 14 southern hemisphere showers plus the Centaurid and Puppid-Velid complex, with the η Aquariids and the Southern δ Aquariids being the strongest southern showers. Two of the showers (θ^2 Sagittariids and τ Cetids) were previously unknown and have received preliminary designations by the MDC. Overall we find that the fraction of southern meteor showers south of -30° declination (roughly 25%) is clearly smaller than the fraction of northern meteor showers north of +30° declination (more than 50%) obtained in our previous analysis.

  14. Infrared Spectra of the n-PROPYL and i-PROPYL Radicals in Solid Para-Hydrogen

    NASA Astrophysics Data System (ADS)

    Pullen, Gregory T.; Franke, Peter R.; Douberly, Gary E.; Lee, Yuan-Pern

    2017-06-01

    We report the infrared spectra of the n-propyl and i-propyl radicals measured in solid para-hydrogen (p-H_2) matrices at 3.2 K. n-Propyl and i-propyl radicals were produced via the 248 nm irradiation of matrices formed by co-depositing p-H_2 and either 1-iodopropane (n-propyl) or 2-iodopropane (i-propyl). Secondary photolysis was used to group spectral lines arising from the same carrier. Lines in the C-H stretching region were compared to previous work using the Helium Nanodroplet Isolation (HENDI) technique, and are in excellent agreement. In addition to a few lines previously measured in Ar matrices, we observe many previously unreported bands below 2000 cm^{-1}, which we attribute to the n-propyl and i-propyl radicals. The assignment of features below 2000 cm^{-1} is made via comparison to anharmonic VPT2+K frequency computations. Peter R. Franke, Daniel P. Tabor, Christopher P. Moradi, Gary E. Douberly, Jay Agarwal, Henry F. Schaefer III, and Edwin L. Sibert III, Journal of Chemical Physics 145, 224304 (2016).

  15. Uncover the mantle: rediscovering Gregório Lopes palette and technique with a study on the painting "Mater Misericordiae"

    NASA Astrophysics Data System (ADS)

    Antunes, Vanessa; Candeias, António; Oliveira, Maria J.; Carvalho, Maria L.; Dias, Cristina Barrocas; Manhita, Ana; Francisco, Maria J.; Costa, Sónia; Lauw, Alexandra; Manso, Marta

    2016-11-01

    Gregório Lopes (c. 1490-1550) was one of the most prominent painters of the Renaissance and Mannerism in Portugal. The painting "Mater Misericordiae" made for the Sesimbra Holy House of Mercy, circa 1535-1538, is one of the most significant works of the artist, and his only painting on this theme, being also one of the most significant Portuguese paintings of the sixteenth century. The recent restoration provided the opportunity to carry out the first material study of the painting, with a multianalytical methodology incorporating portable energy-dispersive X-ray fluorescence spectroscopy, scanning electron microscopy-energy-dispersive spectroscopy, micro-X-ray diffraction, micro-Raman spectroscopy and high-performance liquid chromatography coupled to diode array and mass spectrometry detectors. The analytical study was complemented by infrared reflectography, allowing study of the underdrawing technique, and by dendrochronology to confirm the date of the wooden panels (1535-1538). The results of this study were compared with previous ones on the painter's workshop, and significant differences and similarities were found in the materials and techniques used.

  16. Ab initio quantum direct dynamics simulations of ultrafast photochemistry with Multiconfigurational Ehrenfest approach

    NASA Astrophysics Data System (ADS)

    Makhov, Dmitry V.; Symonds, Christopher; Fernandez-Alberti, Sebastian; Shalashilin, Dmitrii V.

    2017-08-01

    The Multiconfigurational Ehrenfest (MCE) method is a quantum dynamics technique which allows treatment of a large number of quantum nuclear degrees of freedom. This paper presents a review of MCE and its recent applications, providing a summary of the formalisms, including its ab initio direct dynamics versions and also giving a summary of recent results. Firstly, we describe the Multiconfigurational Ehrenfest version 2 (MCEv2) method and its applicability to direct dynamics and report new calculations which show that the approach converges to the exact result in model systems with tens of degrees of freedom. Secondly, we review previous "on the fly" ab initio Multiple Cloning (AIMC-MCE) MCE dynamics results obtained for systems of a similar size, in which the calculations treat every electron and every nucleus of a polyatomic molecule on a fully quantum basis. We also review the Time Dependent Diabatic Basis (TDDB) version of the technique and give an example of its application. We summarise the details of the sampling techniques and interpolations used for calculation of the matrix elements, which make our approach efficient. Future directions of work are outlined.

  17. Development of tritium permeation barriers on Al base in Europe

    NASA Astrophysics Data System (ADS)

    Benamati, G.; Chabrol, C.; Perujo, A.; Rigal, E.; Glasbrenner, H.

    The development of the water cooled lithium lead (WCLL) DEMO fusion reactor requires the production of a material capable of acting as a tritium permeation barrier (TPB). In the DEMO blanket, permeation barriers on the structural material are required to reduce tritium permeation from the Pb-17Li or the plasma into the cooling water to acceptable levels (<1 g/d). On the basis of previous experimental work, Al-based coatings are among the most promising TPB candidates. Within the EU a large R&D programme is in progress to develop a TPB fabrication technique compatible with the structural materials requirements and capable of producing coatings with acceptable performance. The research is focused on chemical vapour deposition (CVD), hot dipping, hot isostatic pressing (HIP) technology and spray deposition techniques (the last also developed for repair). The final goal is to select a reference technique to be used in the blanket of the DEMO reactor and in the fabrication of the ITER test module. The activities performed in four European laboratories are summarised here.

  18. Proteomic Characterization of Dermal Interstitial Fluid Extracted Using a Novel Microneedle-Assisted Technique.

    PubMed

    Tran, Bao Quoc; Miller, Philip R; Taylor, Robert M; Boyd, Gabrielle; Mach, Phillip M; Rosenzweig, C Nicole; Baca, Justin T; Polsky, Ronen; Glaros, Trevor

    2018-01-05

    As wearable fitness devices have gained commercial acceptance, interest in real-time monitoring of an individual's physiological status using noninvasive techniques has grown. Microneedles have been proposed as a minimally invasive technique for sampling the dermal interstitial fluid (ISF) for clinical monitoring and diagnosis, but little is known about ISF composition. In this study, a novel microneedle array was used to collect dermal ISF from three healthy human donors, and the fluid was compared with matching serum and plasma samples. Using a shotgun quantitative proteomic approach, 407 proteins were quantified with at least one unique peptide, and of those, 135 proteins were differentially expressed at least 2-fold. Collectively, these proteins tended to originate from the cytoplasm, membrane-bound vesicles, and extracellular vesicular exosomes. Proteomic analysis confirmed previously published work indicating that ISF is highly similar to both plasma and serum; in this study, less than one percent of proteins were uniquely identified in ISF. Taken together, these results suggest that ISF could serve as a minimally invasive alternative to blood-derived fluids, with potential for real-time monitoring applications.

  19. Automated simultaneous multiple feature classification of MTI data

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Theiler, James P.; Balick, Lee K.; Pope, Paul A.; Szymanski, John J.; Perkins, Simon J.; Porter, Reid B.; Brumby, Steven P.; Bloch, Jeffrey J.; David, Nancy A.; Galassi, Mark C.

    2002-08-01

    Los Alamos National Laboratory has developed and demonstrated a highly capable system, GENIE, for the two-class problem of detecting a single feature against a background of non-feature. In addition to the two-class case, however, a commonly encountered remote sensing task is the segmentation of multispectral image data into a larger number of distinct feature classes or land cover types. To this end we have extended our existing system to allow the simultaneous classification of multiple features/classes from multispectral data. The technique builds on previous work and its core continues to utilize a hybrid evolutionary-algorithm-based system capable of searching for image processing pipelines optimized for specific image feature extraction tasks. We describe the improvements made to the GENIE software to allow multiple-feature classification and describe the application of this system to the automatic simultaneous classification of multiple features from MTI image data. We show the application of the multiple-feature classification technique to the problem of classifying lava flows on Mauna Loa volcano, Hawaii, using MTI image data and compare the classification results with standard supervised multiple-feature classification techniques.

  20. Study of an Acid-Free Technique for the Preparation of Glycyrrhetinic Acid from Ammonium Glycyrrhizinate in Subcritical Water.

    PubMed

    Lekar, Anna V; Borisenko, Sergey N; Vetrova, Elena V; Filonova, Olga V; Maksimenko, Elena V; Borisenko, Nikolai I; Minkin, Vladimir I

    2015-11-01

    The aim of this work was to study the application of a previously developed, expedient acid-free technique for the preparation of glycyrrhetinic acid from ammonium glycyrrhizinate that requires no acids or toxic organic solvents. Subcritical water, serving as both reactant and solvent, was used to obtain glycyrrhetinic acid in good yield starting from ammonium glycyrrhizinate. It has been shown that varying a single process parameter (temperature) allows alteration of the composition of the hydrolysis products. A new method was used for the synthesis of glycyrrhetinic acid (the glycyrrhizic acid aglycone) and its monoglycoside. HPLC combined with mass spectrometry and NMR spectroscopy was used to determine the quantitative and qualitative compositions of the obtained products. The method developed for the production of glycyrrhetinic acid in subcritical water is environmentally friendly and faster than conventional hydrolysis methods that use acids and expensive, toxic organic solvents. The proposed technique has potential for the future development of inexpensive and environmentally friendly technologies for the production of new pharmaceutical plant-based substances.

  1. Industrial and occupational ergonomics in the petrochemical process industry: a regression trees approach.

    PubMed

    Bevilacqua, M; Ciarapica, F E; Giacchetta, G

    2008-07-01

    This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify the important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The Occupational Injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
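The CART method described above grows a tree by repeatedly choosing the split that most reduces class impurity. A minimal sketch of a single Gini-impurity split is shown below; the variable name and accident records are hypothetical, invented for illustration only.

```python
# Minimal sketch of one CART-style split using the Gini index.

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(records, labels, feature):
    """Threshold on `feature` that minimises the weighted Gini impurity."""
    best = (None, float("inf"))
    values = sorted({r[feature] for r in records})
    for i in range(len(values) - 1):
        thr = (values[i] + values[i + 1]) / 2
        left = [l for r, l in zip(records, labels) if r[feature] <= thr]
        right = [l for r, l in zip(records, labels) if r[feature] > thr]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if w < best[1]:
            best = (thr, w)
    return best

# Hypothetical data: hours worked in a shift vs. injury occurred (1) or not (0).
records = [{"shift_hours": h} for h in [4, 5, 6, 8, 9, 10, 11, 12]]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
thr, impurity = best_split(records, labels, "shift_hours")
print(thr, impurity)  # 8.5 0.0 -- a clean split between 8 and 9 hours
```

A full CART implementation recurses on each side of the split and then prunes; the resulting root-to-leaf paths are the decision-making rules the abstract refers to.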

  2. Improved spatial and temporal characteristics of ionospheric irregularities and polar mesospheric summer echoes using coherent MIMO and aperture synthesis radar imaging

    NASA Astrophysics Data System (ADS)

    Chau, J. L.; Urco, J. M.; Milla, M. A.; Vierinen, J.

    2017-12-01

    We have recently implemented multiple-input multiple-output (MIMO) radar techniques to resolve temporal and spatial ambiguities of ionospheric and atmospheric irregularities, with improved capabilities compared with previous experiments using single-input multiple-output (SIMO) techniques. In the atmospheric and ionospheric coherent scatter radar field, SIMO techniques are usually called aperture synthesis radar imaging. Our implementations have been carried out at the Jicamarca Radio Observatory (JRO) in Lima, Peru, and at the Middle Atmosphere Alomar Radar System (MAARSY) in Andenes, Norway, to study equatorial electrojet (EEJ) field-aligned irregularities and polar mesospheric summer echoes (PMSE), respectively. Figure 1 shows an example of a configuration used at MAARSY and a comparison between the resulting SIMO and MIMO antenna point spread functions. Although in this work we present the details of the implementations at each facility, we will focus on the observed peculiarities of each phenomenon, with emphasis on the underlying physical mechanisms that govern their existence and their spatial and temporal modulation. For example, what are the typical horizontal scales of PMSE variability in both intensity and wind field?

  3. New Ground Truth Capability from InSAR Time Series Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, S; Vincent, P; Yang, D

    2005-07-13

    We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter-per-year surface movements when sufficient data exist for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
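The inversion step mentioned above, turning a network of interferograms into cumulative displacement at each acquisition date, can be sketched for a single pixel as a small least-squares problem. The date pairs and displacement values below are synthetic and purely illustrative; real processing works per pixel on millions of such systems and must also handle phase unwrapping and atmospheric corrections.

```python
# Toy time-series inversion: each interferogram measures the displacement
# difference between two acquisition dates; least squares recovers the
# cumulative displacement at each date relative to date 0.

def solve(A, b):
    """Solve the normal equations (A^T A) x = A^T b by Gauss-Jordan elimination."""
    n = len(A[0])
    M = [[sum(A[r][i] * A[r][j] for r in range(len(A))) for j in range(n)]
         + [sum(A[r][i] * b[r] for r in range(len(A)))] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Interferograms as (earlier date index, later date index, displacement in cm).
ifgs = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 3.0), (2, 3, 0.5), (1, 3, 1.5)]
ndates = 4  # unknowns: displacement at dates 1..3 relative to date 0

# Design matrix: +1 at the later date, -1 at the earlier date (date 0 fixed).
A = [[(1.0 if j + 1 == l else 0.0) - (1.0 if j + 1 == e else 0.0)
      for j in range(ndates - 1)] for e, l, _ in ifgs]
b = [d for _, _, d in ifgs]
x = solve(A, b)
print([round(v, 6) for v in x])  # cumulative displacement per date: [2.0, 3.0, 3.5]
```

The redundancy of the network (more interferograms than dates) is what lets least squares average down noise in the individual displacement maps.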

  4. Thermal Imaging of Convecting Opaque Fluids using Ultrasound

    NASA Technical Reports Server (NTRS)

    Xu, Hongzhou; Fife, Sean; Andereck, C. David

    2002-01-01

    An ultrasound technique has been developed to non-intrusively image temperature fields in small-scale systems of opaque fluids undergoing convection. Fluids such as molten metals, semiconductors, and polymers are central to many industrial processes, and are often found in situations where natural convection occurs, or where thermal gradients are otherwise important. However, typical thermal and velocimetric diagnostic techniques rely upon transparency of the fluid and container, or require the addition of seed particles, or require mounting probes inside the fluid, all of which either fail altogether in opaque fluids, or necessitate significant invasion of the flow and/or modification of the walls of the container to allow access to the fluid. The idea behind our work is to use the temperature dependence of sound velocity, and the ease of propagation of ultrasound through fluids and solids, to probe the thermal fields of convecting opaque fluids non-intrusively and without the use of seed particles. The technique involves the timing of the return echoes from ultrasound pulses, a variation on an approach used previously in large-scale systems.
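The core idea, inverting echo return times through the temperature dependence of sound speed, can be sketched as follows. The linear sound-speed law and its coefficients here are assumptions for illustration, not values from the paper.

```python
# Sketch of the time-of-flight idea: infer the mean fluid temperature along
# an acoustic path from the round-trip echo time, assuming a linear
# sound-speed law c(T) = c0 + k * (T - T0). All constants are hypothetical.

C0, T0, K = 1480.0, 20.0, 2.5   # m/s, deg C, (m/s)/deg C  (assumed)
PATH = 0.05                      # one-way path length in metres (assumed)

def echo_time(temp_c):
    """Round-trip travel time of a pulse through fluid at temp_c."""
    return 2.0 * PATH / (C0 + K * (temp_c - T0))

def temp_from_echo(t):
    """Invert the linear law: T = T0 + (2L/t - c0) / k."""
    return T0 + (2.0 * PATH / t - C0) / K

t = echo_time(35.0)
print(round(temp_from_echo(t), 6))  # recovers 35.0
```

In practice the measured quantity is the shift in echo timing relative to an isothermal reference, so the method resolves thermal gradients without any optical access to the fluid.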

  5. Comparison of different estimation techniques for biomass concentration in large scale yeast fermentation.

    PubMed

    Hocalar, A; Türker, M; Karakuzu, C; Yüzgeç, U

    2011-04-01

    In this study, five previously developed state estimation methods are examined and compared for estimation of biomass concentrations in a production-scale fed-batch bioprocess. These methods are: (i) estimation based on a kinetic model of overflow metabolism; (ii) estimation based on a metabolic black-box model; (iii) estimation based on an observer; (iv) estimation based on an artificial neural network; and (v) estimation based on differential evolution. Biomass concentrations are estimated from available measurements and compared with experimental data obtained from large-scale fermentations. The advantages and disadvantages of the presented techniques are discussed with regard to accuracy, reproducibility, number of primary measurements required and adaptability to different working conditions. Among the various techniques, the metabolic black-box method seems to have advantages, although it requires more measurements than the other methods. However, the required extra measurements are based on instruments commonly employed in an industrial environment. This method is used for developing a model-based control of fed-batch yeast fermentations. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  6. The missed inferior alveolar block: a new look at an old problem.

    PubMed

    Milles, M

    1984-01-01

    A variation of a previously described technique to obtain mandibular block anesthesia is presented. This technique differs from those previously described in that it uses palpable anatomic landmarks, both extraoral and intraoral, to orient the placement of the needle. The technique relies on several readily observed landmarks and their integration. Because palpable landmarks are used, consistent results can easily be obtained even in patients who present with the wide variety of anatomical variations that would otherwise make this injection technique difficult and prone to failure.

  7. Evaluation of MOSFET-type glucose sensor using platinum electrode with glucose oxidase

    NASA Astrophysics Data System (ADS)

    Ooe, Katsutoshi; Hamamoto, Yasutaro; Hirano, Yoshiaki

    2005-02-01

    As the population ages, health management will become an increasingly important issue, and the development of safe, MEMS-based medical devices for the human body will be a primary research area. We have developed a glucose sensor, one such medical device, for use in a Health Monitoring System (HMS), a device that continuously monitors human health conditions using, for example, blood as the monitoring target. The glucose sensor detects the glucose level of the blood and monitors the glucose concentration as the blood sugar level. This glucose sensor has a "separated Au electrode" onto which GOx is immobilized. In our previous work, GOx was immobilized onto an Au electrode by the self-assembled monolayer (SAM) method, and a sensor using this working electrode detected the glucose concentration of an aqueous glucose solution. In this report, we used a GOx-immobilized Pt electrode as the working electrode. The previously used Au electrode dissolved on application of current in the presence of chloride ions; the new GOx-immobilized working electrode was therefore produced from Pt, which does not share this weakness. Pt working electrodes were produced using both the covalent binding method and the cross-linking method, and both displayed good sensing properties. In addition, an electrode using glutaraldehyde (GA) and bovine serum albumin (BSA) as crosslinking agents was produced, and it displayed better characteristics than the electrode using GA alone. These techniques confirmed an improvement in the performance of the sensor.

  8. Examining the Efficacy of the Modified Story Memory Technique (mSMT) in Persons With TBI Using Functional Magnetic Resonance Imaging (fMRI): The TBI-MEM Trial.

    PubMed

    Chiaravalloti, Nancy D; Dobryakova, Ekaterina; Wylie, Glenn R; DeLuca, John

    2015-01-01

    New learning and memory deficits are common following traumatic brain injury (TBI). Yet few studies have examined the efficacy of memory retraining in TBI through a methodologically rigorous randomized clinical trial. Our previous research demonstrated that the modified Story Memory Technique (mSMT) significantly improves new learning and memory in multiple sclerosis. The present double-blind, placebo-controlled, randomized clinical trial examined changes in cerebral activation on functional magnetic resonance imaging following mSMT treatment in persons with TBI. Eighteen individuals with TBI were randomly assigned to treatment (n = 9) or placebo (n = 9) groups. Baseline and follow-up functional magnetic resonance imaging was collected during a list-learning task. Significant differences in cerebral activation from before to after treatment were noted in regions belonging to the default mode network and executive control network in the treatment group only. Results are interpreted in light of these networks. Activation differences between the groups likely reflect increased use of strategies taught during treatment. This study demonstrates a significant change in cerebral activation resulting from the mSMT in a TBI sample. Findings are consistent with previous work in multiple sclerosis. Behavioral interventions can produce significant changes in the brain, validating their clinical utility.

  9. Manual therapy for tension-type headache related to quality of work life and work presenteeism: Secondary analysis of a randomized controlled trial.

    PubMed

    Monzani, Lucas; Espí-López, Gemma Victoria; Zurriaga, Rosario; Andersen, Lars L

    2016-04-01

    The objective of this research is to evaluate the efficacy of manual therapy for tension-type headache (TTH) in restoring workers' quality of work life, and how work presenteeism affects this relation. This study is a secondary analysis of a factorial, randomized clinical trial on manual therapy interventions. Altogether, 80 patients (85% women) with TTH and without current symptoms of any other concomitant disease participated. An experienced therapist delivered the treatments: myofascial inhibitory technique (IT), articulatory technique (AT), combined technique (IT and AT), and control group (no treatment). In general, all treatments had a large effect (f ≥ .69) on the improvement of participants' quality of work life compared with the control group. Work presenteeism interacted with treatment type efficacy on participants' quality of work life. The inhibitory technique led to higher reported quality of work life than other treatment options only for participants with a very low frequency of work presenteeism. In turn, the articulatory treatment techniques resulted in higher reported quality of work life at high to very high work presenteeism frequencies. The articulatory manipulation technique is thus the most efficient treatment for improving quality of work life when the frequency of work presenteeism is high. Implications for future research and practice are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    PubMed

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
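The two reliability criteria quoted above (proportional agreement > 0.7 and kappa > 0.4) are standard agreement statistics. A minimal sketch of how they are computed, on hypothetical ratings from two observers:

```python
# Proportional agreement and Cohen's kappa for two raters.
# The rating sequences below are hypothetical, for illustration only.

def proportional_agreement(a, b):
    """Fraction of items on which the two raters give the same category."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = proportional_agreement(a, b)
    cats = set(a) | set(b)
    # Expected chance agreement from each rater's marginal frequencies.
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (p_o - p_e) / (1 - p_e)

rater1 = ["ok", "ok", "poor", "ok", "poor", "ok", "ok", "ok", "poor", "ok"]
rater2 = ["ok", "ok", "poor", "ok", "ok",   "ok", "ok", "ok", "poor", "ok"]
print(proportional_agreement(rater1, rater2))      # 0.9
print(round(cohens_kappa(rater1, rater2), 3))      # 0.737
```

Kappa is lower than raw agreement because it discounts the matches two raters would produce by chance alone, which is why the study requires both thresholds to be met.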

  11. The pearls of using real-world evidence to discover social groups

    NASA Astrophysics Data System (ADS)

    Cardillo, Raymond A.; Salerno, John J.

    2005-03-01

    In previous work, we introduced a new paradigm called Uni-Party Data Community Generation (UDCG) and a new methodology to discover social groups (a.k.a., community models) called Link Discovery based on Correlation Analysis (LDCA). We further advanced this work by experimenting with a corpus of evidence obtained from a Ponzi scheme investigation. That work identified several UDCG algorithms, developed what we called "Importance Measures" to compare the accuracy of the algorithms against ground truth, and presented a Concept of Operations (CONOPS) that criminal investigators could use to discover social groups. However, that work used a rather small random sample of manually edited documents, because the evidence contained far too many OCR and other extraction errors. Deferring the evidence extraction errors allowed us to continue experimenting with UDCG algorithms, but used only a small fraction of the available evidence. In an attempt to discover techniques that are more practical in the near term, our most recent work focuses on using an entire corpus of real-world evidence to discover social groups. This paper discusses the complications of extracting evidence, suggests a method of performing name resolution, presents a new UDCG algorithm, and discusses our future direction in this area.

  12. An all-optical fiber optic photoacoustic transducer

    NASA Astrophysics Data System (ADS)

    Thathachary, Supriya V.; Motameni, Cameron; Ashkenazi, Shai

    2018-02-01

    A highly sensitive fiber-optic Fabry-Perot photoacoustic transducer is proposed in this work. The transducer will consist of separate transmit and receive fibers. The receiver will be composed of a Fabry-Perot ultrasound sensor with a self-written waveguide, providing all-optical ultrasound detection with high sensitivity. In previous work, we showed an increase in resonator Q-factor from 1900 to 3200 for a simulated Fabry-Perot ultrasound detector of 45 μm thickness upon including a waveguide to limit lateral power losses. Subsequently, we demonstrated a prototype device with 30 nm gold mirrors and a cavity composed of the photosensitive polymer benzocyclobutene. This 80 μm thick device showed an improvement in its Q-factor from 2500 to 5200 after a self-aligned waveguide was written into the cavity using UV exposure. Current work uses a significantly faster fabrication technique based on a combination of UV-cured epoxies for the cavity medium and the waveguide within it. This reduces the fabrication time from several hours to a few minutes and significantly lowers the cost of fabrication. We use a dip-coating technique to deposit the polymer layer. Future work will include the use of dielectric Bragg mirrors in place of gold to achieve better reflectivity, thereby further improving the Q-factor of the device. The complete transducer presents an ideal solution for intravascular imaging in cases where tissue differentiation is desirable, an important feature in interventional procedures where arterial perforation is a risk. The final design proposed comprises the transducer within a guidewire to guide interventions for chronic total occlusions, a disease state for which there are currently no invasive imaging options.

  13. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks for composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge-Based Reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a Case-Based Reasoning system was developed using the Design Memory Utility for Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, along with warnings of potential problems and pitfalls.
The case base complemented the knowledge base and extended the problem-solving capability beyond the limits of the well-defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.

  14. Stress Management and Relaxation Techniques Use among Underserved Inpatients in an Inner City Hospital

    PubMed Central

    Gardiner, Paula; Sadikova, Ekaterina; Filippelli, Amanda C.; Mitchell, Suzanne; White, Laura F.; Saper, Robert; Kaptchuk, Ted J.; Jack, Brian W.; Fredman, Lisa

    2015-01-01

    Objective Little is known about the use of Stress Management and Relaxation Techniques (SMART) in racially diverse inpatients. We aimed to identify socioeconomic status (SES) factors, health behavior factors, and clinical factors associated with the use of SMART. Design and Main Outcome Measures We conducted a secondary analysis of baseline data from 623 hospitalized patients enrolled in the Re-Engineered Discharge (RED) clinical trial. We assessed socio-demographic characteristics and use of SMART. We used bivariate and multivariate logistic regression to test the association of SMART with socio-demographic characteristics, health behaviors, and clinical factors. Results A total of 26.6% of participants reported using SMART and 23.6% used mind-body techniques. Thirty-six percent of work-disabled patients, 39% of illicit drug users, and 38% of participants with depressive symptoms used SMART. Patients who both reported illicit drug use and screened positive for depression had significantly increased odds of using SMART [OR=4.94, 95% CI (1.59, 15.13)]. Compared to non-Hispanic whites, non-Hispanic blacks [0.55, (0.34 to 0.87)] and Hispanic/other race individuals [0.40, (0.20 to 0.76)] were less likely to use SMART. Conclusions We found greater utilization of SMART among all racial groups compared to previous national studies. In the inner-city inpatient setting, patients with depression, illicit drug use, and work disability reported higher rates of using SMART. PMID:26051576

  15. Network module detection: Affinity search technique with the multi-node topological overlap measure

    PubMed Central

    Li, Ai; Horvath, Steve

    2009-01-01

    Background Many clustering procedures only allow the user to input a pairwise dissimilarity or distance measure between objects. We propose a clustering method that can input a multi-point dissimilarity measure d(i1, i2, ..., iP) where the number of points P can be larger than 2. The work is motivated by gene network analysis where clusters correspond to modules of highly interconnected nodes. Here, we define modules as clusters of network nodes with high multi-node topological overlap. The topological overlap measure is a robust measure of interconnectedness which is based on shared network neighbors. In previous work, we have shown that the multi-node topological overlap measure yields biologically meaningful results when used as input to network neighborhood analysis. Findings We adapt network neighborhood analysis for use in module detection. We propose the Module Affinity Search Technique (MAST), which is a generalized version of the Cluster Affinity Search Technique (CAST). MAST can accommodate a multi-node dissimilarity measure. Clusters grow around user-defined or automatically chosen seeds (e.g. hub nodes). We propose both local and global cluster growth stopping rules. We use several simulations and a gene co-expression network application to argue that the MAST approach leads to biologically meaningful results. We compare MAST with hierarchical clustering and partitioning around medoid clustering. Conclusion Our flexible module detection method is implemented in the MTOM software which can be downloaded from the following webpage: http://www.genetics.ucla.edu/labs/horvath/MTOM/ PMID:19619323
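    The pairwise topological overlap measure that MAST generalizes can be sketched directly: for an unweighted adjacency matrix it counts shared neighbors, normalized by the smaller of the two node connectivities. The implementation below is a minimal illustration of the pairwise (P = 2) case only, not of the multi-node generalization.

```python
import numpy as np

# Sketch of the pairwise topological overlap measure for an unweighted
# network (no self-loops). The multi-node measure used by MAST generalizes
# this same shared-neighbor idea to P > 2 nodes.

def topological_overlap(A):
    """Pairwise topological overlap matrix for a 0/1 adjacency matrix A."""
    A = np.asarray(A, dtype=float)
    L = A @ A                       # L[i, j] = number of neighbors shared by i and j
    k = A.sum(axis=1)               # node connectivities
    kmin = np.minimum.outer(k, k)
    tom = (L + A) / (kmin + 1.0 - A)
    np.fill_diagonal(tom, 1.0)
    return tom

# Triangle 0-1-2 with a pendant node 3 attached to node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
tom = topological_overlap(A)
# tom[0, 1] == 1.0 (maximal overlap inside the triangle), tom[0, 3] == 0.5
```

    Nodes inside the triangle reach the maximal overlap of 1.0, which is why modules defined by high topological overlap correspond to densely interconnected subnetworks.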

  16. B-scan technique for localization and characterization of fatigue cracks around fastener holes in multi-layered structures

    NASA Astrophysics Data System (ADS)

    Hopkins, Deborah; Datuin, Marvin; Aldrin, John; Warchol, Mark; Warchol, Lyudmila; Forsyth, David

    2018-04-01

    The work presented here aims to develop and transition angled-beam shear-wave inspection techniques for crack localization at fastener sites in multi-layer aircraft structures. This requires moving beyond detection to achieve reliable crack location and size, thereby providing invaluable information for maintenance actions and service-life management. The technique presented is based on imaging cracks in "True" B-scans (depth view projected in the sheets along the beam path). The crack traces that contribute to localization in the True B-scans depend on small, diffracted signals from the crack edges and tips that are visible in simulations and experimental data acquired with sufficient gain. The most recent work shows that cracks rotated toward and away from the central ultrasonic beam also yield crack traces in True B-scans that allow localization in simulations, even for large obtuse angles where experimental and simulation results show very small or no indications in the C-scans. Similarly, for two sheets joined by sealant, simulations show that cracks in the second sheet can be located in True B-scans for all locations studied: cracks that intersect the front or back wall of the second sheet, as well as relatively small mid-bore cracks. These results are consistent with previous model verification and sensitivity studies that demonstrate crack localization in True B-scans for a single sheet and cracks perpendicular to the ultrasonic beam.

  17. Network module detection: Affinity search technique with the multi-node topological overlap measure.

    PubMed

    Li, Ai; Horvath, Steve

    2009-07-20

    Many clustering procedures only allow the user to input a pairwise dissimilarity or distance measure between objects. We propose a clustering method that can input a multi-point dissimilarity measure d(i1, i2, ..., iP) where the number of points P can be larger than 2. The work is motivated by gene network analysis where clusters correspond to modules of highly interconnected nodes. Here, we define modules as clusters of network nodes with high multi-node topological overlap. The topological overlap measure is a robust measure of interconnectedness which is based on shared network neighbors. In previous work, we have shown that the multi-node topological overlap measure yields biologically meaningful results when used as input to network neighborhood analysis. We adapt network neighborhood analysis for use in module detection. We propose the Module Affinity Search Technique (MAST), which is a generalized version of the Cluster Affinity Search Technique (CAST). MAST can accommodate a multi-node dissimilarity measure. Clusters grow around user-defined or automatically chosen seeds (e.g. hub nodes). We propose both local and global cluster growth stopping rules. We use several simulations and a gene co-expression network application to argue that the MAST approach leads to biologically meaningful results. We compare MAST with hierarchical clustering and partitioning around medoid clustering. Our flexible module detection method is implemented in the MTOM software which can be downloaded from the following webpage: http://www.genetics.ucla.edu/labs/horvath/MTOM/

  18. The importance of accurate measurement of aortic stiffness in patients with chronic kidney disease and end-stage renal disease.

    PubMed

    Adenwalla, Sherna F; Graham-Brown, Matthew P M; Leone, Francesca M T; Burton, James O; McCann, Gerry P

    2017-08-01

    Cardiovascular (CV) disease is the leading cause of death in chronic kidney disease (CKD) and end-stage renal disease (ESRD). A key driver in this pathology is increased aortic stiffness, which is a strong, independent predictor of CV mortality in this population. Aortic stiffening is a potentially modifiable biomarker of CV dysfunction and a candidate tool for risk stratification in patients with CKD and ESRD. Previous work has suggested that therapeutic modification of aortic stiffness may ameliorate CV mortality. Nevertheless, future clinical implementation relies on the ability to accurately and reliably quantify stiffness in renal disease. Pulse wave velocity (PWV) is an indirect measure of stiffness and is the accepted standard for non-invasive assessment of aortic stiffness. It has typically been measured using techniques such as applanation tonometry, which is easy to use but hindered by issues such as the inability to visualize the aorta. Advances in cardiac magnetic resonance imaging now allow direct measurement of stiffness, using aortic distensibility, in addition to PWV. These techniques allow measurement of aortic stiffness locally and are obtainable as part of a comprehensive, multiparametric CV assessment. The evidence cannot yet provide a definitive answer regarding which technique or parameter can be considered superior. This review discusses the advantages and limitations of non-invasive methods that have been used to assess aortic stiffness, the key studies that have assessed aortic stiffness in patients with renal disease and why these tools should be standardized for use in clinical trial work.
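    The PWV definition underlying all of these techniques is simply path length divided by pulse transit time. The sketch below estimates transit time between two synthetic waveforms from the lag of maximum cross-correlation; this is an illustrative simplification, as clinical devices typically use foot-to-foot algorithms, and all numerical values are invented.

```python
import numpy as np

# Minimal sketch of the pulse wave velocity definition: PWV = path length /
# pulse transit time. Transit time is estimated here as the cross-correlation
# lag between proximal and distal waveforms; sampling rate, delay, and path
# length are illustrative values.

fs = 1000.0                                 # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
pulse = np.exp(-((t - 0.2) / 0.03) ** 2)    # synthetic proximal waveform
delay_samples = 60                          # true transit time: 60 ms
distal = np.roll(pulse, delay_samples)      # synthetic distal waveform

lags = np.arange(-len(t) + 1, len(t))
xcorr = np.correlate(distal, pulse, mode="full")
transit_time = lags[np.argmax(xcorr)] / fs  # seconds

path_length = 0.5                           # metres between measurement sites
pwv = path_length / transit_time            # m/s; here 0.5 / 0.06 ≈ 8.3 m/s
```

    In practice the dominant error sources are the path-length estimate (hence the interest in MRI, which can image the actual aortic centreline) and the transit-time algorithm.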

  19. Extraction of Black Hole Shadows Using Ridge Filtering and the Circle Hough Transform

    NASA Astrophysics Data System (ADS)

    Hennessey, Ryan; Akiyama, Kazunori; Fish, Vincent

    2018-01-01

    Supermassive black holes are widely considered to reside at the center of most large galaxies. One of the foremost tasks in modern astronomy is to image the centers of local galaxies, such as that of Messier 87 (M87) and Sagittarius A* at the center of our own Milky Way, to gain the first glimpses of black holes and their surrounding structures. Using data obtained from the Event Horizon Telescope (EHT), a global collection of millimeter-wavelength telescopes designed to perform very long baseline interferometry, new imaging techniques will likely be able to yield images of these structures at fine enough resolutions to compare with the predictions of general relativity and give us more insight into the formation of black holes, their surrounding jets and accretion disks, and galaxies themselves. Techniques to extract features from these images are already being developed. In this work, we present a new method for measuring the size of the black hole shadow, a feature that encodes information about the black hole mass and spin, using ridge filtering and the circle Hough transform. Previous methods have succeeded in extracting the black hole shadow with an accuracy of about 10-20%, but using this new technique we are able to measure the shadow size with even finer accuracy. Our work indicates that the EHT will be able to significantly reduce the uncertainty in the estimate of the mass of the supermassive black hole in M87.
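    The circle Hough transform named above has a compact voting formulation: each edge (or ridge-filtered) pixel votes for every circle it could lie on, and the accumulator maximum gives the best-fit radius and centre. The toy implementation below is a generic sketch of that transform on a synthetic ring, not the authors' pipeline.

```python
import numpy as np

# Toy sketch of the circle Hough transform: each edge pixel votes for every
# candidate (radius, centre) circle passing through it; the accumulator
# maximum identifies the best-fitting circle. Real shadow-extraction
# pipelines first ridge-filter the image to isolate the bright ring.

def hough_circle(edge_points, shape, radii):
    h, w = shape
    acc = np.zeros((len(radii), h, w), dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 180, endpoint=False)
    for (y, x) in edge_points:
        for ri, r in enumerate(radii):
            cx = np.rint(x - r * np.cos(thetas)).astype(int)
            cy = np.rint(y - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            np.add.at(acc, (ri, cy[ok], cx[ok]), 1)   # cast votes
    ri, cy, cx = np.unravel_index(np.argmax(acc), acc.shape)
    return radii[ri], (int(cy), int(cx))

# Synthetic ring of radius 10 centred at (32, 32) on a 64x64 grid.
ang = np.linspace(0, 2 * np.pi, 120, endpoint=False)
ring = [(int(round(32 + 10 * np.sin(a))), int(round(32 + 10 * np.cos(a))))
        for a in ang]
radius, center = hough_circle(ring, (64, 64), radii=[8, 9, 10, 11, 12])
```

    Because the vote count at the true centre grows with the number of ring pixels while spurious cells collect only scattered votes, the method is robust to noise, which is what makes it attractive for the low signal-to-noise EHT reconstructions.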

  20. High-sensitivity explosives detection using dual-excitation-wavelength resonance-Raman detector

    NASA Astrophysics Data System (ADS)

    Yellampalle, Balakishore; McCormick, William B.; Wu, Hai-Shan; Sluch, Mikhail; Martin, Robert; Ice, Robert V.; Lemoff, Brian

    2014-05-01

    A key challenge for standoff explosive sensors is to distinguish explosives, with high confidence, from a myriad of unknown background materials that may have interfering spectral peaks. To meet this challenge, a sensor needs to exhibit high specificity and high sensitivity in detection at low signal-to-noise ratio levels. We previously proposed a Dual-Excitation-Wavelength Resonance-Raman Detector (DEWRRED) to address this need. In our previous work, we discussed various components designed at WVHTCF for a DEWRRED sensor. In this work, we show a completely assembled laboratory prototype of a DEWRRED sensor and utilize it to detect explosives from two standoff distances. The sensor system includes two novel, compact CW deep-ultraviolet (DUV) lasers, a compact dual-band high-throughput DUV spectrometer, and a highly sensitive detection algorithm. We chose DUV excitation because Raman intensities from explosive traces are enhanced and fluorescence and solar background are absent. The DEWRRED technique exploits the excitation-wavelength dependence of Raman signal strength, arising from a complex interplay of resonant enhancement, self-absorption and laser penetration depth. We show measurements from >10 explosives/precursor materials at different standoff distances. The sensor showed high sensitivity in explosive detection even when the signal-to-noise ratio was close to one (~1.6). We measured receiver-operating-characteristics, which show a clear benefit in using the dual-excitation-wavelength technique as compared to a single-excitation-wavelength technique. Our measurements also show improved specificity using the amplitude variation information in the dual-excitation spectra.

  1. General Relativistic Precession in Small Solar System Bodies

    NASA Astrophysics Data System (ADS)

    Sekhar, Aswin; Werner, Stephanie; Hoffmann, Volker; Asher, David; Vaubaillon, Jeremie; Hajdukova, Maria; Li, Gongjie

    2016-10-01

    Introduction: One of the greatest successes of Einstein's General Theory of Relativity (GR) was the correct prediction of the precession of the perihelion of Mercury. The closed-form expression for this precession tells us that substantial GR precession occurs only if a body combines a moderately small perihelion distance with a moderately small semi-major axis. Minimum Orbit Intersection Distance (MOID) is a quantity which helps us to understand the closest proximity of two orbits in space. Hence evaluating MOID is crucial to understanding close encounter and collision scenarios. In this work, we look at the possible scenarios where a small GR precession in argument of pericentre (ω) can create substantial changes in MOID for small bodies ranging from meteoroids to comets and asteroids. Analytical Approach and Numerical Integrations: Previous works have looked into neat analytical techniques to understand different collision scenarios, and we use those standard expressions to compute MOID analytically. We find the nature of this mathematical function is such that a relatively small GR precession can lead to drastic changes in MOID values depending on the initial value of ω. Numerical integrations were performed with the MERCURY package, incorporating a GR code, to test the same effects. The numerical approach showed the same interesting relationship (as shown by the analytical theory) between values of ω and the peaks/dips in MOID values. Previous works have shown that GR precession suppresses Kozai oscillations, and this aspect was verified using our integrations. There is an overall agreement between both analytical and numerical methods. Summary and Discussion: We find that GR precession could play an important role in the calculations pertaining to MOID and close encounter scenarios in the case of certain small solar system bodies (depending on their initial orbital elements). 
Previous works have looked into impact probabilities and collision scenarios on planets from different small body populations. This work aims to find certain subsets of orbits where GR could play an interesting role. Certain parallels are drawn between the cases of asteroids, comets and small perihelion distance meteoroid streams.
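    The closed-form expression referred to above is the standard GR pericentre advance per orbit, Δω = 6πGM / (c²a(1−e²)). As a sanity check it can be evaluated for Mercury, where it reproduces the classic ~43 arcseconds per century:

```python
import math

# GR precession of the argument of pericentre per orbit,
# delta_omega = 6*pi*G*M / (c^2 * a * (1 - e^2)),
# evaluated for Mercury as a check against the classic ~43 arcsec/century.

GM_SUN = 1.32712440018e20        # heliocentric gravitational parameter, m^3/s^2
C = 2.99792458e8                 # speed of light, m/s
ARCSEC_PER_RAD = 180 * 3600 / math.pi

def gr_precession_per_orbit(a, e):
    """GR pericentre advance per orbit in radians (a in metres)."""
    return 6 * math.pi * GM_SUN / (C**2 * a * (1 - e**2))

a_mercury = 5.7909e10            # semi-major axis, m
e_mercury = 0.20563              # eccentricity
period_days = 87.969

per_orbit = gr_precession_per_orbit(a_mercury, e_mercury)
orbits_per_century = 36525 / period_days
arcsec_per_century = per_orbit * orbits_per_century * ARCSEC_PER_RAD
# arcsec_per_century evaluates to ~43, matching the observed anomalous precession
```

    The 1/(a(1−e²)) dependence is why only bodies with small perihelion distance and small semi-major axis accumulate enough Δω for the MOID effects discussed above.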

  2. A Fast Hartley Transform based novel optical OFDM system for VLC indoor application with constant envelope PAPR reduction technique using frequency modulation

    NASA Astrophysics Data System (ADS)

    Singh, Vinay Kumar; Dalal, U. D.

    2017-10-01

    In this work we present a unique optical OFDM system for Visible Light Communication (VLC) intended for indoor application, which uses a non-conventional transform, the Fast Hartley Transform, and an effective method to reduce the peak-to-average power ratio (PAPR) of the OFDM signal based on frequency modulation, leading to a constant-envelope (CE) signal. The proposed system is analyzed with a complete mathematical model and verified by concurrent simulation results. The use of the non-conventional transform makes the system computationally more desirable, as it does not require the Hermitian symmetry constraint to yield real signals. Frequency modulation of the baseband signal converts random peaks into a CE signal. This alleviates the nonlinearity effects of the LED used in the link for electrical-to-optical conversion. The PAPR is reduced to 2 dB by this technique in this work. The impact of the modulation index on the performance of the system is also investigated; an optimum modulation depth of 30% gives the best results. The additional phase discontinuity incurred on the demodulated signal at the receiver is also significantly reduced. A comparison of the improvement in phase discontinuity of the proposed PAPR-reduction technique with the previously known phase modulation technique is also presented in this work. Based on the channel metrics we evaluate the system performance and report an improvement of 1.2 dB at the FEC threshold. The proposed system is simple in design and computationally efficient, and can be incorporated into the present VLC system without much alteration, thereby making it a cost-effective solution.
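    The key property claimed above, that the Hartley transform avoids the Hermitian-symmetry constraint, can be illustrated directly: the discrete Hartley transform (DHT) of a real sequence is itself real, so real data symbols map to a real, intensity-modulatable signal. The sketch below is a generic illustration using the FFT identity DHT(x) = Re(FFT(x)) − Im(FFT(x)), with an invented PAM symbol set; it does not reproduce the paper's frequency-modulation PAPR-reduction stage.

```python
import numpy as np

# The DHT of a real sequence is real, so no Hermitian symmetry is needed
# to obtain a real time-domain OFDM signal. DHT computed via the FFT
# identity DHT(x) = Re(FFT(x)) - Im(FFT(x)); the DHT is its own inverse
# up to a factor of 1/N.

def dht(x):
    X = np.fft.fft(x)
    return np.real(X) - np.imag(X)

def idht(X):
    return dht(X) / len(X)

def papr_db(x):
    """Peak-to-average power ratio of a signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
symbols = rng.choice([-3.0, -1.0, 1.0, 3.0], size=64)  # real PAM symbols
signal = idht(symbols)                                  # real time-domain signal

assert np.isrealobj(signal)                  # no Hermitian constraint required
assert np.allclose(dht(signal), symbols)     # transform round-trips exactly
papr = papr_db(signal)                       # high before any PAPR reduction
```

    The high PAPR of this raw multicarrier signal is exactly what motivates the constant-envelope frequency-modulation stage described in the abstract.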

  3. On a two-dimensional mode-matching technique for sound generation and transmission in axial-flow outlet guide vanes

    NASA Astrophysics Data System (ADS)

    Bouley, Simon; François, Benjamin; Roger, Michel; Posson, Hélène; Moreau, Stéphane

    2017-09-01

    The present work deals with the analytical modeling of two aspects of outlet guide vane aeroacoustics in axial-flow fan and compressor rotor-stator stages. The first addressed mechanism is the downstream transmission of rotor noise through the outlet guide vanes, the second one is the sound generation by the impingement of the rotor wakes on the vanes. The elementary prescribed excitation of the stator is an acoustic wave in the first case and a hydrodynamic gust in the second case. The solution for the response of the stator is derived using the same unified approach in both cases, within the scope of a linearized and compressible inviscid theory. It is provided by a mode-matching technique: modal expressions are written in the various sub-domains upstream and downstream of the stator as well as inside the inter-vane channels, and matched according to the conservation laws of fluid dynamics. This quite simple approach is uniformly valid in the whole range of subsonic Mach numbers and frequencies. It is presented for a two-dimensional rectilinear cascade of zero-staggered flat-plate vanes and completed by the implementation of a Kutta condition. It is then validated in sound generation and transmission test cases by comparison with a previously reported model based on the Wiener-Hopf technique and with reference numerical simulations. Finally it is used to analyze the tonal rotor-stator interaction noise in a typical low-speed fan architecture. A key advantage of the mode-matching technique is that it can easily be transposed to a three-dimensional annular cascade in cylindrical coordinates in future work. This makes it an attractive alternative to the classical strip-theory approach.

  4. Distributed Denial of Service Attack Source Detection Using Efficient Traceback Technique (ETT) in Cloud-Assisted Healthcare Environment.

    PubMed

    Latif, Rabia; Abbas, Haider; Latif, Seemab; Masood, Ashraf

    2016-07-01

    Security and privacy are the first and foremost concerns that should be given special attention when dealing with Wireless Body Area Networks (WBANs). As WBAN sensors operate in an unattended environment and carry critical patient health information, the Distributed Denial of Service (DDoS) attack is one of the major attacks in the WBAN environment that not only exhausts the available resources but also influences the reliability of information being transmitted. This research work is an extension of our previous work, in which a machine-learning-based attack detection algorithm was proposed to detect DDoS attacks in the WBAN environment. However, in order to avoid complexity, no consideration was given to the traceback mechanism. During traceback, the challenge lies in reconstructing the attack path leading to the identification of the attack source. Among existing traceback techniques, the Probabilistic Packet Marking (PPM) approach is the most commonly used technique in conventional IP-based networks. However, since marking probability assignment has a significant effect on both the convergence time and performance of a scheme, it is not directly applicable in the WBAN environment due to high convergence time and overhead on intermediate nodes. Therefore, in this paper we have proposed a new scheme called Efficient Traceback Technique (ETT) based on the Dynamic Probability Packet Marking (DPPM) approach, which uses the MAC header in place of the IP header. Instead of using a fixed marking probability, the proposed scheme uses a variable marking probability based on the number of hops travelled by a packet to reach the target node. Finally, path reconstruction algorithms are proposed to trace back the attacker. Evaluation and simulation results indicate that the proposed solution outperforms fixed PPM in terms of convergence time and computational overhead on nodes.
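    The intuition behind hop-dependent marking can be shown with a toy simulation. In one common DPPM variant (sketched here as an illustration; the paper's exact scheme may differ), the router at hop i marks with probability 1/i, so the mark that survives to the victim is uniformly distributed over the path, unlike fixed-probability PPM, which strongly favours routers close to the victim.

```python
import random

# Toy simulation of hop-dependent packet marking: the router at hop i
# (1-indexed from the attacker) marks with probability 1/i, overwriting any
# earlier mark. P(router i's mark survives) = (1/i) * prod_{j>i}(1 - 1/j)
# = 1/n, i.e. uniform over an n-hop path. Router names are illustrative.

def dppm_mark(path):
    """Return the router mark carried by one packet after traversing path."""
    mark = None
    for hops, router in enumerate(path, start=1):
        if random.random() < 1.0 / hops:    # dynamic marking probability
            mark = router
    return mark

path = ["R1", "R2", "R3", "R4"]             # attacker -> victim
random.seed(42)
counts = {r: 0 for r in path}
for _ in range(40000):
    counts[dppm_mark(path)] += 1
# each router collects roughly 10000 marks, so the victim reconstructs the
# whole path with far fewer packets than under fixed-probability marking
```

    Uniform mark arrival is what drives down the convergence time: no router on the path is starved of samples, so path reconstruction needs fewer packets.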

  5. Investigation of the Feasibility of Utilizing Gamma Emission Computed Tomography in Evaluating Fission Product Migration in Irradiated TRISO Fuel Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason M. Harp; Paul A. Demkowicz

    2014-10-01

    In the High Temperature Gas-Cooled Reactor (HTGR), the TRISO particle fuel serves as the primary fission product containment. However, the large number of TRISO particles present in proposed HTGRs dictates that there will be a small fraction (~10^-4 to 10^-5) of as-manufactured and in-pile particle failures that will lead to some fission product release. The matrix material surrounding the TRISO particles in fuel compacts and the structural graphite holding the TRISO particles in place can also serve as sinks for containing any released fission products. However, data on the migration of solid fission products through these materials is lacking. One of the primary goals of the AGR-3/4 experiment is to study fission product migration from failed TRISO particles in prototypic HTGR components such as structural graphite and compact matrix material. In this work, the potential for a Gamma Emission Computed Tomography (GECT) technique to non-destructively examine the fission product distribution in AGR-3/4 components and other irradiation experiments is explored. Specifically, the feasibility of using the Idaho National Laboratory (INL) Hot Fuels Examination Facility (HFEF) Precision Gamma Scanner (PGS) system for this GECT application is considered. To test the feasibility, the response of the PGS system to idealized fission product distributions has been simulated using Monte Carlo radiation transport simulations. Previous work that applied similar techniques during the AGR-1 experiment is also discussed, as well as planned uses for the GECT technique during the post-irradiation examination of the AGR-2 experiment. The GECT technique has also been applied to other irradiated nuclear fuel systems currently available in the HFEF hot cell, including oxide fuel pins, metallic fuel pins, and monolithic plate fuel.

  6. Laser microdissection and capture of pure cardiomyocytes and fibroblasts from infarcted heart regions: perceived hyperoxia induces p21 in peri-infarct myocytes.

    PubMed

    Kuhn, Donald E; Roy, Sashwati; Radtke, Jared; Khanna, Savita; Sen, Chandan K

    2007-03-01

    Myocardial infarction caused by ischemia-reperfusion in the coronary vasculature is a focal event characterized by an infarct core, a bordering peri-infarct zone, and a remote noninfarct zone. Recently, we reported the first technique, based on laser microdissection pressure catapulting (LMPC), enabling the dissection of infarction-induced biological responses in multicellular regions of the heart. Molecular mechanisms in play at the peri-infarct zone are central to myocardial healing. At the infarct site, myocytes are more sensitive to insult than robust fibroblasts. Understanding of cell-specific responses in these zones is therefore critical. In this work, we describe the first technique to collect the myocardial tissue with a single-cell resolution. The infarcted myocardium was identified by using a truncated hematoxylin-eosin stain. Cell elements from the infarct, peri-infarct, and noninfarct zones were collected in a chaotropic RNA lysis solution with micron-level surgical precision. Isolated RNA was analyzed for quality by employing microfluidics technology and reverse transcribed to generate cDNA. Purity of the collected specimen was established by real-time PCR analyses of cell-specific genes. Previously, we have reported that the oxygen-sensitive induction of p21/Cip1/Waf1/Sdi1 in cardiac fibroblasts in the peri-infarct zone plays a vital role in myocardial remodeling. Using the novel LMPC technique developed herein, we confirmed that finding and report for the first time that the induction of p21 in the peri-infarct zone is not limited to fibroblasts but is also evident in myocytes. This work presents the first account of an analytical technique that applies the LMPC technology to study myocardial remodeling with a cell-type specific resolution.

  7. Effects of vitamin D receptor knockout on cornea epithelium gap junctions.

    PubMed

    Lu, Xiaowen; Watsky, Mitchell A

    2014-05-06

    Gap junctions are present in all corneal cell types and have been shown to have a critical role in cell phenotype determination. Vitamin D has been shown to influence cell differentiation, and recent work demonstrates the presence of vitamin D in the ocular anterior segment. This study measured and compared gap junction diffusion coefficients among different cornea epithelium phenotypes and in keratocytes using a noninvasive technique, fluorescence recovery after photobleaching (FRAP), and examined the influence of vitamin D receptor (VDR) knockout on epithelial gap junction communication in intact corneas. Previous gap junction studies in cornea epithelium and keratocytes were performed using cultured cells or ex vivo invasive techniques. These invasive techniques were unable to measure diffusion coefficients and were likely disruptive to normal cell physiology. Corneas from VDR knockout and control mice were stained with 5(6)-carboxyfluorescein diacetate (CFDA). Gap junction diffusion coefficients of the corneal epithelium phenotypes and of keratocytes, residing in intact corneas, were detected using FRAP. Diffusion coefficients equaled 18.7, 9.8, 5.6, and 4.2 μm²/s for superficial squamous cells, middle wing cells, basal cells, and keratocytes, respectively. Corneal thickness, superficial cell size, and the superficial squamous cell diffusion coefficient of 10-week-old VDR knockout mice were significantly lower than those of control mice (P < 0.01). The superficial cell diffusion coefficient of heterozygous mice was significantly lower than that of control mice (P < 0.05). Our results demonstrate differences in gap junction dye spread among the epithelial cell phenotypes, mirroring the epithelial developmental axis. The VDR knockout influences cell-to-cell communication in the superficial epithelium, an effect not previously reported.
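    A diffusion coefficient can be extracted from a FRAP recovery curve with the common circular-spot approximation D ≈ 0.224·w²/τ½ (Axelrod/Soumpasis), where w is the bleach-spot radius and τ½ the recovery half-time. The sketch below illustrates that relation with invented numbers; it is not the analysis pipeline used in the study.

```python
# Sketch of estimating a diffusion coefficient from a FRAP recovery curve
# using the circular-spot approximation D ~ 0.224 * w^2 / tau_half
# (Axelrod/Soumpasis). All numerical values below are illustrative.

def diffusion_coefficient(spot_radius_um, tau_half_s):
    """Approximate D in um^2/s from bleach-spot radius (um) and recovery half-time (s)."""
    return 0.224 * spot_radius_um ** 2 / tau_half_s

def tau_half(times, intensities, baseline, plateau):
    """First time at which recovery passes halfway from post-bleach baseline to plateau."""
    half = baseline + 0.5 * (plateau - baseline)
    for t, f in zip(times, intensities):
        if f >= half:
            return t
    raise ValueError("recovery never reached half-maximum")

# Example: a hypothetical 10 um spot recovering to half-maximum in 1.2 s
D = diffusion_coefficient(10.0, 1.2)   # ~18.7 um^2/s, the superficial-cell scale
```

    Faster half-times for a fixed spot size mean larger D, which is how the superficial squamous cells (18.7 μm²/s) separate from keratocytes (4.2 μm²/s) in the data above.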

  8. High resolution time-to-space conversion of sub-picosecond pulses at 1.55µm by non-degenerate SFG in PPLN crystal.

    PubMed

    Shayovitz, Dror; Herrmann, Harald; Sohler, Wolfgang; Ricken, Raimund; Silberhorn, Christine; Marom, Dan M

    2012-11-19

    We demonstrate high resolution and increased efficiency background-free time-to-space conversion using spectrally resolved non-degenerate and collinear SFG in a bulk PPLN crystal. A serial-to-parallel resolution factor of 95 and a time window of 42 ps were achieved. A 60-fold increase in conversion efficiency slope compared with our previous work using a BBO crystal [D. Shayovitz and D. M. Marom, Opt. Lett. 36, 1957 (2011)] was recorded. Finally the measured 40 GHz narrow linewidth of the output SFG signal implies the possibility to extract phase information by employing coherent detection techniques.

  9. Application of radar for automotive collision avoidance. Volume 2: Development plan and progress reports

    NASA Technical Reports Server (NTRS)

    Lichtenberg, Christopher L. (Editor)

    1987-01-01

    The purpose of this project was research and development of an automobile collision avoidance radar system. Items within the scope of the one-year effort were to: (1) review previous authors' work in this field; (2) select a suitable radar approach; (3) develop a system design; (4) perform basic analyses and observations pertinent to radar design, performance, and effects; (5) fabricate and collect radar data from a data collection radar; (6) analyze and derive conclusions from the radar data; and (7) make recommendations about the likelihood of success of the investigated radar techniques. The final technical report presenting all conclusions is contained in Volume 1.

  10. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE PAGES

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    2017-01-18

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work demonstrates that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
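    The quantitative basis of such voltammetric measurements is the Randles-Sevcik relation, in which the peak current is linear in concentration and in the square root of scan rate. The sketch below evaluates that textbook relation with illustrative values; the corrections the paper develops (uncompensated resistance, cylindrical diffusion) are deliberately not modelled.

```python
import math

# Randles-Sevcik relation for a reversible couple under planar diffusion:
# i_p = 0.4463 * n*F*A*C * sqrt(n*F*v*D / (R*T)).
# Linear in concentration C and in sqrt(scan rate v); deviations from this
# ideal law at high loadings are what the improved approach corrects.
# All numerical inputs below are illustrative.

F = 96485.332          # Faraday constant, C/mol
R = 8.314462           # gas constant, J/(mol K)

def randles_sevcik(n, A_cm2, C_mol_cm3, D_cm2_s, v_V_s, T=773.0):
    """Peak current (A); T defaults to a representative molten-salt temperature."""
    return 0.4463 * n * F * A_cm2 * C_mol_cm3 * math.sqrt(
        n * F * v_V_s * D_cm2_s / (R * T))

i1 = randles_sevcik(n=3, A_cm2=0.5, C_mol_cm3=2e-4, D_cm2_s=1e-5, v_V_s=0.1)
i2 = randles_sevcik(n=3, A_cm2=0.5, C_mol_cm3=2e-4, D_cm2_s=1e-5, v_V_s=0.4)
# quadrupling the scan rate doubles the peak current (sqrt(v) scaling)
```

    Measured peak currents falling below this line at high weight loadings is precisely the underprediction the abstract describes.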

  11. Ovonic switching in tin selenide thin films. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Baxter, C. R.

    1974-01-01

    Amorphous tin selenide thin films which possess Ovonic switching properties were fabricated using vacuum deposition techniques. Results obtained indicate that memory-type Ovonic switching does occur in these films; the energy density required for switching from a high-impedance to a low-impedance state is dependent on the spacing between the electrodes of the device. The switching is also a function of the magnitude of the applied voltage pulse. A completely automated, computer-controlled testing procedure was developed which allows precise control over the shape of the applied voltage switching pulse. A survey of previous experimental and theoretical work in the area of Ovonic switching is also presented.

  12. Validating a UAV artificial intelligence control system using an autonomous test case generator

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy; Huber, Justin

    2013-05-01

    The validation of safety-critical applications, such as autonomous UAV operations in an environment which may include human actors, is an ill-posed problem. To build confidence in the autonomous control technology, numerous scenarios must be considered. This paper expands upon previous work, related to autonomous testing of robotic control algorithms in a two-dimensional plane, to evaluate the suitability of similar techniques for validating artificial intelligence control in three dimensions, where a minimum level of airspeed must be maintained. The results of human-conducted testing are compared to this automated testing in terms of error detection, speed, and testing cost.

  13. In situ spectroscopic study of the plastic deformation of amorphous silicon under nonhydrostatic conditions induced by indentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerbig, Yvonne B.; Michaels, C. A.; Bradby, Jodie E.

    Indentation-induced plastic deformation of amorphous silicon (a-Si) thin films was studied by in situ Raman imaging of the deformed contact region of an indented sample, employing a Raman spectroscopy-enhanced instrumented indentation technique (IIT). The occurrence and evolving spatial distribution of changes in the a-Si structure caused by processes, such as polyamorphization and crystallization, induced by indentation loading were observed. Furthermore, the obtained experimental results are linked with previously published work on the plastic deformation of a-Si under hydrostatic compression and shear deformation to establish a model for the deformation behavior of a-Si under indentation loading.

  14. Professional representation and the free-lance medical illustrator.

    PubMed

    Mount, K N; Daugherty, J

    1994-01-01

    We researched factors related to the success or failure in working relationships between free-lance medical illustrators and artists' representatives. In the fall of 1992, surveys were mailed to 230 medical illustrators; 105 (46%) completed surveys were returned. Respondents were divided into three categories: 1) medical illustrators currently represented, 2) medical illustrators previously represented, and 3) medical illustrators who had never been represented. Comparisons made among illustrators from the three groups included business practices, clientele, experience, and self-promotion techniques. These comparisons revealed notable differences and similarities among the three groups and were subsequently analyzed to identify the characteristics of medical illustrators who would benefit from professional representation.

  15. Synthesis of Optimal Constant-Gain Positive-Real Controllers for Passive Systems

    NASA Technical Reports Server (NTRS)

    Mao, Y.; Kelkar, A. G.; Joshi, S. M.

    1999-01-01

    This paper presents synthesis methods for the design of constant-gain positive real controllers for passive systems. The results presented in this paper, in conjunction with the previous work by the authors on passification of non-passive systems, offer a useful synthesis tool for the design of passivity-based robust controllers for non-passive systems as well. Two synthesis approaches are given for minimizing an LQ-type performance index, resulting in optimal controller gains. Two separate algorithms, one for each of these approaches, are given. The synthesis techniques are demonstrated using two numerical examples: control of a flexible structure and longitudinal control of a fighter aircraft.

  16. MindDigger: Feature Identification and Opinion Association for Chinese Movie Reviews

    NASA Astrophysics Data System (ADS)

    Zhao, Lili; Li, Chunping

    In this paper, we present a prototype system called MindDigger, which can be used to analyze the opinions in Chinese movie reviews. Unlike previous research, which applied such techniques to product reviews, we focus on Chinese movie reviews, in which opinions are expressed in subtle and varied ways. The system designed in this work aims to extract opinion expressions and assign them to the corresponding features. The core tasks include feature and opinion extraction, and feature-opinion association. To deal with Chinese effectively, several novel approaches based on syntactic analysis are proposed in this paper. Experimental results show that the system's performance is satisfactory.

  17. Detecting duplicate biological entities using Shortest Path Edit Distance.

    PubMed

    Rudniy, Alex; Song, Min; Geller, James

    2010-01-01

    Duplicate entity detection in biological data is an important research task. In this paper, we propose a novel and context-sensitive Shortest Path Edit Distance (SPED), extending and supplementing our previous work on Markov Random Field-based Edit Distance (MRFED). SPED transforms the edit distance computational problem into the calculation of the shortest path between two selected vertices of a graph. We produce several modifications of SPED by applying Levenshtein, arithmetic mean, histogram difference, and TFIDF techniques to solve subtasks. We compare SPED's performance to other well-known distance algorithms for biological entity matching. The experimental results show that SPED produces competitive outcomes.
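    The core idea of SPED, computing an edit distance as a shortest path, can be illustrated with the classic Levenshtein distance: comparing two strings becomes a shortest-path search from vertex (0, 0) to vertex (m, n) of a grid graph whose edges are insertions, deletions, and substitutions. The sketch below is illustrative only and is not the authors' SPED implementation:

```python
import heapq

def edit_distance_shortest_path(a, b):
    """Levenshtein distance computed as the shortest path between the
    start vertex (0, 0) and the end vertex (len(a), len(b)) of a grid
    graph whose edges are insertions, deletions, and substitutions."""
    m, n = len(a), len(b)
    dist = {(0, 0): 0}
    heap = [(0, 0, 0)]
    while heap:
        d, i, j = heapq.heappop(heap)
        if (i, j) == (m, n):
            return d
        if d > dist.get((i, j), float("inf")):
            continue  # stale queue entry
        moves = []
        if i < m:
            moves.append((i + 1, j, 1))            # delete a[i]
        if j < n:
            moves.append((i, j + 1, 1))            # insert b[j]
        if i < m and j < n:
            moves.append((i + 1, j + 1, 0 if a[i] == b[j] else 1))  # match/substitute
        for ni, nj, w in moves:
            nd = d + w
            if nd < dist.get((ni, nj), float("inf")):
                dist[(ni, nj)] = nd
                heapq.heappush(heap, (nd, ni, nj))
    return dist[(m, n)]

print(edit_distance_shortest_path("kitten", "sitting"))  # → 3
```

    SPED generalizes this picture to richer graphs and plugs in different subtask metrics (Levenshtein, histogram difference, TFIDF) as edge weights.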

  18. The role of fluctuation-induced transport in a toroidal plasma with strong radial electric fields

    NASA Technical Reports Server (NTRS)

    Roth, J. R.; Krawczonek, W. M.; Powers, E. J.; Hong, J. Y.; Kim, Y. C.

    1981-01-01

    Previous work employing digitally implemented spectral analysis techniques is extended to demonstrate that radial fluctuation-induced transport is the dominant ion transport mechanism in an electric field dominated toroidal plasma. Such transport can be made to occur against a density gradient, and hence may have a very beneficial effect on confinement in toroidal plasmas of fusion interest. It is shown that Bohm or classical diffusion down a density gradient, the collisional Pedersen-current mechanism, and the collisionless electric field gradient mechanism described by Cole (1976) all played a minor role, if any, in the radial transport of this plasma.

  19. Evolution of Structure and Composition in Saturn's Rings Due to Ballistic Transport of Micrometeoroid Impact Ejecta

    NASA Astrophysics Data System (ADS)

    Estrada, P. R.; Durisen, R. H.; Cuzzi, J. N.

    2014-04-01

    We introduce improved numerical techniques for simulating the structural and compositional evolution of planetary rings due to micrometeoroid bombardment and subsequent ballistic transport of impact ejecta. Our current, robust code, which is based on the original structural code of [1] and on the pollution transport code of [3], is capable of modeling structural changes and pollution transport simultaneously over long times on both local and global scales. We provide demonstrative simulations to compare with, and extend upon previous work, as well as examples of how ballistic transport can maintain the observed structure in Saturn's rings using available Cassini occultation optical depth data.

  20. Finding text in color images

    NASA Astrophysics Data System (ADS)

    Zhou, Jiangying; Lopresti, Daniel P.; Tasdizen, Tolga

    1998-04-01

    In this paper, we consider the problem of locating and extracting text from WWW images. A previous algorithm based on color clustering and connected components analysis works well as long as the color of each character is relatively uniform and the typography is fairly simple. It breaks down quickly, however, when these assumptions are violated. In this paper, we describe more robust techniques for dealing with this challenging problem. We present an improved color clustering algorithm that measures similarity based on both RGB values and spatial proximity. Layout analysis is also incorporated to handle more complex typography. These changes significantly enhance the performance of our text detection procedure.
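    As a rough illustration of clustering on combined color and spatial features (not the authors' algorithm), a toy k-means can operate on 5-D feature vectors (R, G, B, x, y), so pixels are grouped only when they are both similar in color and close together. The pixel data and deterministic initialization below are hypothetical:

```python
def kmeans(points, k, iters=10):
    """Toy k-means on 5-D (R, G, B, x, y) features, so cluster distance
    reflects both color similarity and spatial proximity. Illustrative
    sketch only; deterministic evenly-spaced initialization."""
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((pv - cv) ** 2
                                      for pv, cv in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [tuple(sum(vals) / len(cl) for vals in zip(*cl))
                   for cl in clusters if cl]
    return sorted(centers)

# Dark "text" pixels on the left, bright "background" pixels on the right
# of a hypothetical image row.
pixels = [(10, 10, 10, x, 5) for x in range(10)] + \
         [(240, 240, 240, x, 5) for x in range(90, 100)]
dark, bright = kmeans(pixels, 2)
```

    With spatial coordinates in the feature vector, two same-colored characters far apart can land in different clusters, which is what makes the connected-components step downstream more reliable.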

  1. Validation of catchment models for predicting land-use and climate change impacts. 1. Method

    NASA Astrophysics Data System (ADS)

    Ewen, J.; Parkin, G.

    1996-02-01

    Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new generally applicable method is proposed. This involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).

  2. A novel load balanced energy conservation approach in WSN using biogeography based optimization

    NASA Astrophysics Data System (ADS)

    Kaushik, Ajay; Indu, S.; Gupta, Daya

    2017-09-01

    Clustering sensor nodes is an effective technique to reduce the energy consumption of sensor nodes and maximize the lifetime of wireless sensor networks. Balancing the load of the cluster heads is an important factor in the long-run operation of WSNs. In this paper we propose a novel load balancing approach using biogeography-based optimization (LB-BBO). LB-BBO uses two separate fitness functions to perform load balancing of equal and unequal loads, respectively. The proposed method is simulated using MATLAB and compared with existing methods. It shows better performance than all previous works on energy conservation in WSN.
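    A minimal sketch of the biogeography-based optimization loop underlying approaches like LB-BBO, under the usual BBO conventions (emigration rates favor fit habitats, immigration rates favor unfit ones, plus elitism and light mutation). The test function and all parameter values are illustrative assumptions, not those of the paper:

```python
import random

def bbo_minimize(fitness, dim, pop_size=20, gens=50, seed=1):
    """Minimal biogeography-based optimization sketch: habitats share
    solution features through migration; emigration rates favor fit
    habitats, immigration rates favor unfit ones. Elitism keeps the
    best habitat across generations."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                         # rank 0 = best habitat
        n = pop_size
        mu = [(n - i) / (n + 1) for i in range(n)]    # emigration: high for good
        lam = [1 - m for m in mu]                     # immigration: high for poor
        new_pop = [pop[0][:]]                         # elitism
        for i in range(1, n):
            habitat = pop[i][:]
            for d in range(dim):
                if rng.random() < lam[i]:
                    # roulette-wheel choice of an emigrating habitat
                    j = rng.choices(range(n), weights=mu)[0]
                    habitat[d] = pop[j][d]
                if rng.random() < 0.02:               # small mutation rate
                    habitat[d] = rng.uniform(-5, 5)
            new_pop.append(habitat)
        pop = new_pop
    return min(pop, key=fitness)

best = bbo_minimize(lambda x: sum(v * v for v in x), dim=3)
```

    In LB-BBO the fitness function would instead score cluster-head load balance; here a simple sphere function stands in for it.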

  3. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. This work demonstrates that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.

  4. A SAT Based Effective Algorithm for the Directed Hamiltonian Cycle Problem

    NASA Astrophysics Data System (ADS)

    Jäger, Gerold; Zhang, Weixiong

    The Hamiltonian cycle problem (HCP) is an important combinatorial problem with applications in many areas. While thorough theoretical and experimental analyses have been made on the HCP in undirected graphs, little is known for the HCP in directed graphs (DHCP). The contribution of this work is an effective algorithm for the DHCP. Our algorithm explores and exploits the close relationship between the DHCP and the Assignment Problem (AP) and utilizes a technique based on Boolean satisfiability (SAT). By combining effective algorithms for the AP and SAT, our algorithm significantly outperforms previous exact DHCP algorithms including an algorithm based on the award-winning Concorde TSP algorithm.
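    The DHCP-AP relationship the algorithm exploits can be seen in miniature: a Hamiltonian cycle is a vertex permutation (an assignment) whose consecutive pairs are all directed edges and which forms a single cycle. The brute-force checker below, on a hypothetical 4-vertex digraph, illustrates only the problem definition, not the SAT-based algorithm:

```python
from itertools import permutations

def directed_hamiltonian_cycle(n, edges):
    """Brute-force DHCP check: a Hamiltonian cycle is a vertex
    permutation in which consecutive vertices (cyclically) are all
    joined by directed edges -- i.e., an Assignment Problem solution
    that happens to form a single cycle."""
    adj = set(edges)
    for perm in permutations(range(1, n)):        # fix vertex 0 to break symmetry
        tour = (0,) + perm
        if all((tour[i], tour[(i + 1) % n]) in adj for i in range(n)):
            return tour
    return None

# A hypothetical 4-vertex digraph containing the cycle 0→1→2→3→0.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (2, 0)]
print(directed_hamiltonian_cycle(4, edges))  # → (0, 1, 2, 3)
```

    The paper's contribution is to avoid this factorial enumeration by solving the AP relaxation and using SAT to exclude solutions that decompose into multiple short cycles.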

  5. In situ spectroscopic study of the plastic deformation of amorphous silicon under nonhydrostatic conditions induced by indentation

    DOE PAGES

    Gerbig, Yvonne B.; Michaels, C. A.; Bradby, Jodie E.; ...

    2015-12-17

    Indentation-induced plastic deformation of amorphous silicon (a-Si) thin films was studied by in situ Raman imaging of the deformed contact region of an indented sample, employing a Raman spectroscopy-enhanced instrumented indentation technique (IIT). The occurrence and evolving spatial distribution of changes in the a-Si structure caused by processes, such as polyamorphization and crystallization, induced by indentation loading were observed. Furthermore, the obtained experimental results are linked with previously published work on the plastic deformation of a-Si under hydrostatic compression and shear deformation to establish a model for the deformation behavior of a-Si under indentation loading.

  6. Signal Processing Methods for Liquid Rocket Engine Combustion Spontaneous Stability and Rough Combustion Assessments

    NASA Technical Reports Server (NTRS)

    Kenny, R. Jeremy; Casiano, Matthew; Fischbach, Sean; Hulka, James R.

    2012-01-01

    Liquid rocket engine combustion stability assessments are traditionally broken into three categories: dynamic stability, spontaneous stability, and rough combustion. This work focuses on comparing the spontaneous stability and rough combustion assessments for several liquid engine programs. The techniques used are those developed at Marshall Space Flight Center (MSFC) for the J-2X Workhorse Gas Generator program. Stability assessment data from the Integrated Powerhead Demonstrator (IPD), FASTRAC, and Common Extensible Cryogenic Engine (CECE) programs are compared against previously processed J-2X Gas Generator data. Prior metrics for spontaneous stability assessments are updated based on the compilation of all data sets.

  7. Optimal non-linear health insurance.

    PubMed

    Blomqvist, A

    1997-06-01

    Most theoretical and empirical work on efficient health insurance has been based on models with linear insurance schedules (a constant co-insurance parameter). In this paper, dynamic optimization techniques are used to analyse the properties of optimal non-linear insurance schedules in a model similar to one originally considered by Spence and Zeckhauser (American Economic Review, 1971, 61, 380-387) and reminiscent of those that have been used in the literature on optimal income taxation. The results of a preliminary numerical example suggest that the welfare losses from the implicit subsidy to employer-financed health insurance under US tax law may be a good deal smaller than previously estimated using linear models.

  8. A model for sequential decoding overflow due to a noisy carrier reference. [communication performance prediction

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1974-01-01

    An approximate analysis of the effect of a noisy carrier reference on the performance of sequential decoding is presented. The analysis uses previously developed techniques for evaluating noisy reference performance for medium-rate uncoded communications, adapted to sequential decoding for data rates of 8 to 2048 bits/s. In estimating the 10^-4 deletion probability thresholds for Helios, the model agrees with experimental data to within the experimental tolerances. The computational problem involved in sequential decoding, carrier loop effects, the main characteristics of the medium-rate model, modeled decoding performance, and perspectives on future work are discussed.

  9. Application of separable parameter space techniques to multi-tracer PET compartment modeling.

    PubMed

    Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J

    2016-02-07

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.
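    The separable parameter space idea can be sketched on a one-parameter toy model y(t) = a·exp(-kt): for each candidate value of the nonlinear parameter k, the linear amplitude a has a closed-form least-squares solution, so an exhaustive search only needs to cover the nonlinear dimension. The synthetic data and grid below are illustrative assumptions, far simpler than a multi-tracer compartment model:

```python
import math

def separable_fit(ts, ys, k_grid):
    """Separable least-squares sketch for y(t) = a * exp(-k t): for each
    candidate nonlinear rate k, the amplitude a is solved in closed form,
    reducing the exhaustive search to one dimension over k."""
    best = None
    for k in k_grid:
        basis = [math.exp(-k * t) for t in ts]
        # closed-form linear least-squares amplitude for this k
        a = sum(y * b for y, b in zip(ys, basis)) / sum(b * b for b in basis)
        sse = sum((y - a * b) ** 2 for y, b in zip(ys, basis))
        if best is None or sse < best[0]:
            best = (sse, k, a)
    return best[1], best[2]

ts = [i * 0.1 for i in range(50)]
ys = [2.0 * math.exp(-0.5 * t) for t in ts]      # noiseless synthetic curve
k_grid = [i * 0.01 for i in range(1, 200)]       # exhaustive 1-D search over k
k_hat, a_hat = separable_fit(ts, ys, k_grid)
```

    In the paper the same separation is applied with multiple linear amplitudes (solved by linear least squares) per candidate set of nonlinear rate constants, which is what makes exhaustive search over the nonlinear subspace tractable.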

  10. Directly manipulated free-form deformation image registration.

    PubMed

    Tustison, Nicholas J; Avants, Brian B; Gee, James C

    2009-03-01

    Previous contributions to both the research and open source software communities detailed a generalization of a fast scalar field fitting technique for cubic B-splines based on the work originally proposed by Lee. One advantage of our proposed generalized B-spline fitting approach is its immediate application to a class of nonrigid registration techniques frequently employed in medical image analysis. Specifically, these registration techniques fall under the rubric of free-form deformation (FFD) approaches, in which the object to be registered is embedded within a B-spline object. The deformation of the B-spline object describes the transformation of the image registration solution. Representative of this class of techniques, and often cited within the relevant community, is the formulation of Rueckert, who employed cubic splines with normalized mutual information to study breast deformation. Similar techniques from various groups provided incremental novelty in the form of disparate explicit regularization terms, as well as the employment of various image metrics and tailored optimization methods. For several algorithms, the underlying gradient-based optimization retained the essential characteristics of Rueckert's original contribution. The contribution which we provide in this paper is two-fold: 1) the observation that the generic FFD framework is intrinsically susceptible to problematic energy topographies, and 2) the demonstration that the standard gradient used in FFD image registration can be modified to a well-understood preconditioned form which substantially improves performance. This is demonstrated with theoretical discussion and comparative evaluation experiments.

  11. A new model for simulating 3-d crystal growth and its application to the study of antifreeze proteins.

    PubMed

    Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao

    2003-01-22

    A novel computational technique for modeling crystal formation has been developed that combines three-dimensional (3-D) molecular representation and detailed energetics calculations of molecular mechanics techniques with the less-sophisticated probabilistic approach used by statistical techniques to study systems containing millions of molecules undergoing billions of interactions. Because our model incorporates both the structure of and the interaction energies between participating molecules, it enables the 3-D shape and surface properties of these molecules to directly affect crystal formation. This increase in model complexity has been achieved while simultaneously increasing the number of molecules in simulations by several orders of magnitude over previous statistical models. We have applied this technique to study the inhibitory effects of antifreeze proteins (AFPs) on ice-crystal formation. Modeling involving both fish and insect AFPs has produced results consistent with experimental observations, including the replication of ice-etching patterns, ice-growth inhibition, and specific AFP-induced ice morphologies. Our work suggests that the degree of AFP activity results more from AFP ice-binding orientation than from AFP ice-binding strength. This technique could readily be adapted to study other crystal and crystal inhibitor systems, or to study other noncrystal systems that exhibit regularity in the structuring of their component molecules, such as those associated with the new nanotechnologies.

  12. Application of separable parameter space techniques to multi-tracer PET compartment modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Jeff L.; Morey, A. Michael; Kadrmas, Dan J.

    2016-02-01

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.

  13. A Brain System for Auditory Working Memory.

    PubMed

    Kumar, Sukhbinder; Joseph, Sabine; Gander, Phillip E; Barascud, Nicolas; Halpern, Andrea R; Griffiths, Timothy D

    2016-04-20

    The brain basis for auditory working memory, the process of actively maintaining sounds in memory over short periods of time, is controversial. Using functional magnetic resonance imaging in human participants, we demonstrate that the maintenance of single tones in memory is associated with activation in auditory cortex. In addition, sustained activation was observed in hippocampus and inferior frontal gyrus. Multivoxel pattern analysis showed that patterns of activity in auditory cortex and left inferior frontal gyrus distinguished the tone that was maintained in memory. Functional connectivity during maintenance was demonstrated between auditory cortex and both the hippocampus and inferior frontal cortex. The data support a system for auditory working memory based on the maintenance of sound-specific representations in auditory cortex by projections from higher-order areas, including the hippocampus and frontal cortex. In this work, we demonstrate a system for maintaining sound in working memory based on activity in auditory cortex, hippocampus, and frontal cortex, and functional connectivity among them. Specifically, our work makes three advances over previous work. First, we robustly demonstrate hippocampal involvement in all phases of auditory working memory (encoding, maintenance, and retrieval); the role of the hippocampus in working memory is controversial. Second, using a pattern classification technique, we show that activity in the auditory cortex and inferior frontal gyrus is specific to the maintained tones in working memory. Third, we show long-range connectivity of auditory cortex to hippocampus and frontal cortex, which may be responsible for keeping such representations active during working memory maintenance. Copyright © 2016 Kumar et al.

  14. Methodological integrative review of the work sampling technique used in nursing workload research.

    PubMed

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods, and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
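    The basic work-sampling estimate the reviewed studies rely on is simple: the proportion of randomly timed observations in which an activity is seen estimates the proportion of time spent on it, with a binomial confidence interval. A minimal sketch with hypothetical observation data:

```python
import math

def activity_proportion(observations, activity, z=1.96):
    """Work-sampling estimate: the fraction of randomly timed
    observations showing an activity, with a normal-approximation
    95% confidence interval (z = 1.96)."""
    n = len(observations)
    k = sum(1 for obs in observations if obs == activity)
    p = k / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, (p - half_width, p + half_width)

# 400 hypothetical instantaneous observations of one nurse's activity.
obs = ["direct care"] * 160 + ["documentation"] * 120 + ["other"] * 120
p, (lo, hi) = activity_proportion(obs, "direct care")
print(round(p, 2), round(lo, 3), round(hi, 3))  # → 0.4 0.352 0.448
```

    The interval width shows why the number of observations and the sampling scheme matter so much for comparability between studies.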

  15. Ground-based radiometric calibration of the Landsat 8 Operational Land Imager (OLI) using in situ techniques

    NASA Astrophysics Data System (ADS)

    Czapla-Myers, J.

    2013-12-01

    Landsat 8 was successfully launched from Vandenberg Air Force Base in California on 11 February 2013, and was placed into the orbit previously occupied by Landsat 5. Landsat 8 is the latest platform in the 40-year history of the Landsat series of satellites, and it contains two instruments that operate in the solar-reflective and thermal infrared regimes. The Operational Land Imager (OLI) is a pushbroom sensor that contains eight multispectral bands ranging from 400-2300 nm, and one panchromatic band. The spatial resolution of the multispectral bands is 30 m and that of the panchromatic band is 15 m, both similar to previous Landsat sensors. The 12-bit radiometric resolution of OLI improves upon the 8-bit resolution of the Enhanced Thematic Mapper Plus (ETM+) onboard Landsat 7. An important requirement for the Landsat program is the long-term radiometric continuity of its sensors. Ground-based vicarious techniques have been used for over 20 years to determine the absolute radiometric calibration of sensors that encompass a wide variety of spectral and spatial characteristics. This work presents the early radiometric calibration results of Landsat 8 OLI that were obtained using the traditional reflectance-based approach. University of Arizona personnel used five sites in Arizona, California, and Nevada to collect ground-based data. In addition, a unique set of in situ data was collected in March 2013, when Landsat 7 and Landsat 8 were observing the same site within minutes of each other. The tandem overfly schedule occurred while Landsat 8 was shifting to the WRS-2 orbital grid, and lasted only a few days. The ground-based data also include results obtained using the University of Arizona's Radiometric Calibration Test Site (RadCaTS), which is an automated suite of instruments located at Railroad Valley, Nevada. The results presented in this work include a comparison to the L1T at-sensor spectral radiance and the top-of-atmosphere reflectance, both of which are standard products available from the US Geological Survey.

  16. Intracorporeal hybrid single port vs conventional laparoscopic appendectomy in children.

    PubMed

    Karam, Paul Anthony; Hiuser, Amy; Magnuson, David; Seifarth, Federico Gian Filippo

    2016-12-20

    Transumbilical laparoscopic assisted appendectomy combines laparoscopic single port dissection with open appendectomy after exteriorization of the appendix through the port site. Compared to the conventional three-port approach, this technique provides an alternative with excellent cosmetic outcome. We developed a safe and effective technique to perform an intracorporeal single port appendectomy, using the same laparoscope employed in the extracorporeal procedure. Retrospective review of 71 consecutively performed intracorporeal single port appendectomies and 30 conventional three-port appendectomies in children 6 to 17 years of age. A straight 10-mm Storz telescope with inbuilt 6 mm working channel is used to dissect the appendix, combined with one port-less 2.3 mm percutaneous grasper. Polymer WECK® hem-o-lock® clips are applied to seal the base of the appendix and the appendiceal vessels. No intraoperative complications were reported with the hybrid intracorporeal single port appendectomy or three-port appendectomy. There were two post-operative complications in the group treated with the single port hybrid technique: one intra-abdominal abscess and one surgical site infection. Groups did not differ in age, weight, and types of appendicitis. Operative times were shorter for the hybrid technique (70 vs 79 minutes) but did not differ significantly (P=0.19). This modification of a previously described single port extracorporeal appendectomy is easy to master and implement. It provides exposure similar to a three-port laparoscopic appendectomy, while maintaining virtually scarless results and potentially reducing the risk of surgical site infections compared to the extracorporeal technique.

  17. Chemotaxis in P. Aeruginosa Biofilm Formation

    NASA Astrophysics Data System (ADS)

    Bienvenu, Samuel; Strain, Shinji; Thatcher, Travis; Gordon, Vernita

    2010-10-01

    Pseudomonas biofilms form infections in the lungs of Cystic Fibrosis (CF) patients that damage lung tissue and lead to death. Previous work shows that chemotaxis is important for Pseudomonas in CF lungs; that work studied swimming bacteria at high concentrations. In contrast, medically relevant biofilms initiate from sparse populations of surface-bound bacteria. The recent development of software techniques for automated, high-throughput bacteria tracking leaves us well-poised to quantitatively study these chemotactic conditions. We will develop experimental systems for such studies, focusing on L-Arginine (an amino acid), D-Galactose (a sugar present in lungs), and succinate and glucose (carbon sources for bacteria). This suite of chemoattractants will allow us to study how chemoattractant characteristics--size and diffusion behavior--change bacterial response; the interaction of competing chemoattractants; and differences in bacterial behaviors, like motility modes, in response to different types of chemoattractants and varying neighbor cell density.

  18. High-Resolution Infrared Spectroscopy of Carbon-Sulfur Chains: II. C_5S and SC_5S

    NASA Astrophysics Data System (ADS)

    Thorwirth, Sven; Salomon, Thomas; Dudek, John B.

    2016-06-01

    Unbiased high-resolution infrared survey scans of the ablation products from carbon-sulfur targets in the 2100 to 2150 cm-1 regime reveal two bands not previously observed in the gas phase. On the basis of comparison against laboratory matrix-isolation work and new high-level quantum-chemical calculations, these bands are attributed to the linear C_5S and SC_5S clusters. While polar C_5S was studied earlier using Fourier-transform microwave techniques, the present work marks the first gas-phase spectroscopic detection of SC_5S. References: H. Wang, J. Szczepanski, P. Brucat, and M. Vala 2005, Int. J. Quant. Chem. 102, 795; Y. Kasai, K. Obi, Y. Ohshima, Y. Hirahara, Y. Endo, K. Kawaguchi, and A. Murakami 1993, ApJ 410, L45; V. D. Gordon, M. C. McCarthy, A. J. Apponi, and P. Thaddeus 2001, ApJS 134, 311

  19. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two dimensions by procedures that can readily be generalized to treat complex shapes in three dimensions. We have therefore developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and the drag minimization problem.
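    The computational payoff of the control-theory (adjoint) approach is that one extra adjoint solve yields the gradient with respect to all design variables at once, instead of one flow solve per variable. This can be illustrated on a toy linear "flow" problem; the matrices below are synthetic stand-ins, not the Euler discretization of the paper:

    ```python
    import numpy as np

    # Toy stand-in for the flow solve: state u(x) satisfies A(x) u = b,
    # cost J = 0.5 * u^T u, design variables x enter A linearly.
    rng = np.random.default_rng(0)
    n, m = 6, 3                          # state size, number of design variables
    A0 = 4.0 * np.eye(n)
    A1 = [0.1 * rng.standard_normal((n, n)) for _ in range(m)]
    b = rng.standard_normal(n)

    def A(x):
        return A0 + sum(xi * Ai for xi, Ai in zip(x, A1))

    def cost_and_gradient(x):
        Ax = A(x)
        u = np.linalg.solve(Ax, b)       # "flow" solve
        J = 0.5 * u @ u
        lam = np.linalg.solve(Ax.T, u)   # single adjoint solve (dJ/du = u)
        grad = np.array([-lam @ (Ai @ u) for Ai in A1])
        return J, grad

    x = np.zeros(m)
    J, g_adj = cost_and_gradient(x)

    # Finite-difference check: the adjoint gradient cost is independent of m,
    # whereas finite differences need m extra solves.
    eps = 1e-6
    g_fd = np.array([(cost_and_gradient(x + eps * e)[0] - J) / eps
                     for e in np.eye(m)])
    print(np.allclose(g_adj, g_fd, atol=1e-4))   # True
    ```

    The identity used here, dJ/dx_i = -lambda^T (dA/dx_i) u with A^T lambda = dJ/du, is the discrete analogue of the adjoint equation that supplies the gradient in the paper's optimization loop.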

  20. Using a trichromatic CCD camera for spectral skylight estimation.

    PubMed

    López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier; Olmo, F J; Cazorla, A; Alados-Arboledas, L

    2008-12-01

    In a previous work [J. Opt. Soc. Am. A 24, 942-956 (2007)] we showed how to design an optimal multispectral system aimed at spectral recovery of skylight. Since high-resolution multispectral images of skylight would be of interest to many scientific disciplines, here we also propose a non-optimal but much cheaper and faster approach to achieving this goal using a trichromatic RGB charge-coupled device (CCD) digital camera. The camera is attached to a fish-eye lens, permitting us to obtain a spectrum for every point of the skydome corresponding to each pixel of the image. In this work we show how to apply multispectral techniques to the sensor responses of a common trichromatic camera in order to obtain skylight spectra from them. This spectral information is accurate enough to estimate experimental values of some climate parameters or to be used in algorithms for automatic cloud detection, among many other possible scientific applications.
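    The recovery step described here can be sketched with one standard multispectral estimation technique, a linear (least-squares) map trained from sensor responses to spectra. The Gaussian sensor sensitivities and smooth training spectra below are synthetic placeholders, not the camera data or estimator of the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    wl = np.linspace(400, 700, 31)                    # wavelengths, nm

    def gauss(mu, sig):
        return np.exp(-0.5 * ((wl - mu) / sig) ** 2)

    # Assumed RGB camera sensitivities (Gaussian approximations).
    S = np.stack([gauss(600, 40), gauss(540, 40), gauss(460, 40)])   # 3 x 31

    # Synthetic smooth "skylight-like" training spectra.
    train = np.stack([gauss(mu, 80) + 0.2
                      for mu in rng.uniform(420, 680, 200)])         # 200 x 31
    rho_train = train @ S.T                           # camera responses, 200 x 3

    # Learn the 31 x 3 recovery matrix W by least squares: spectrum ~= W @ rho.
    W, *_ = np.linalg.lstsq(rho_train, train, rcond=None)
    W = W.T

    # Recover an unseen spectrum from its three sensor responses.
    s_true = gauss(550, 80) + 0.2
    s_est = W @ (S @ s_true)
    print(np.corrcoef(s_true, s_est)[0, 1] > 0.9)
    ```

    Applied per pixel of the fish-eye image, such a linear estimator turns the three CCD responses at each pixel into an estimated skylight spectrum; recovery quality hinges on how well the training spectra span the skylight conditions encountered.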
