Sample records for the query "analysis techniques provide":

  1. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  2. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    USDA-ARS's Scientific Manuscript database

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...

  3. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research, in an attempt to investigate inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and to suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, seven of which additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and on the criteria used to determine whether an individual is categorised as a responder or a non-responder. Finally, it provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques, offering a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
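
    As a hedged illustration of the clustering step this review surveys, the sketch below groups simulated post-stimulation response patterns with k-means and labels each cluster. The data shapes, amplitudes, and the responder threshold are assumptions made for illustration, not values from the review.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # rows = participants; cols = normalised response amplitude at four
    # post-stimulation time points (simulated stand-in data)
    responses = np.vstack([
        rng.normal(1.3, 0.15, size=(15, 4)),   # facilitation-like pattern
        rng.normal(0.95, 0.15, size=(15, 4)),  # no-change pattern
    ])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(responses)
    for label in np.unique(kmeans.labels_):
        mean_resp = responses[kmeans.labels_ == label].mean()
        group = "responder" if mean_resp > 1.1 else "non-responder"
        print(f"cluster {label}: mean response {mean_resp:.2f} -> {group}")
    ```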

  4. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  5. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selecting the most appropriate technique. All program risk assessment techniques are shown to be based on the elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given; for each, the steps involved, strengths and weaknesses, time required, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
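
    A minimal sketch of one step the handbook describes: encoding expert estimates as subjective probability distributions and propagating them through a simple Monte Carlo risk assessment. The triangular distribution and the cost figures are illustrative assumptions, not values from the handbook.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # expert elicitation: (low, most likely, high) cost for two program elements
    design = rng.triangular(0.8, 1.0, 1.6, size=n)   # $M
    testing = rng.triangular(0.4, 0.5, 1.1, size=n)  # $M

    total = design + testing
    print(f"mean total cost: {total.mean():.2f} $M")
    print(f"80th-percentile cost: {np.percentile(total, 80):.2f} $M")
    ```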

  6. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes recent developments in rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, and portable gas chromatography. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. Such coupling techniques have the potential to provide a basis for establishing selective, precise and quantitative rapid detection methods in food security analysis.

  7. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as its rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
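
    As a rough sketch of the chemometric step described, the code below runs principal component analysis on binned spectra; real work would start from DART-TOFMS peak lists rather than the simulated stand-in array used here.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # rows = replicate spectra (4 vehicles x 5 replicates), cols = m/z bins
    spectra = np.repeat(rng.random((4, 200)), 5, axis=0)
    spectra += rng.normal(0, 0.02, spectra.shape)  # replicate-to-replicate noise

    scores = PCA(n_components=2).fit_transform(
        StandardScaler().fit_transform(spectra))
    print(scores.shape)  # (20, 2): plot PC1 vs PC2 to inspect vehicle grouping
    ```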

  8. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  9. Rasch Analysis: A Primer for School Psychology Researchers and Practitioners

    ERIC Educational Resources Information Center

    Boone, William J.; Noltemeyer, Amity

    2017-01-01

    In order to progress as a field, school psychology research must be informed by effective measurement techniques. One approach to address the need for careful measurement is Rasch analysis. This technique can (a) facilitate the development of instruments that provide useful data, (b) provide data that can be used confidently for both descriptive…

  10. An R package for the integrated analysis of metabolomics and spectral data.

    PubMed

    Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel

    2016-06-01

    Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, and infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, of the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines.
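
    specmine itself is an R package, so as a language-neutral illustration the sketch below mimics, in Python, the pipeline stages the abstract lists (loading, pre-processing, multivariate analysis, univariate feature screening); none of these names come from the package's actual API.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import f_classif

    rng = np.random.default_rng(7)
    X = rng.random((30, 500))       # 30 samples x 500 spectral features
    y = np.repeat([0, 1], 15)       # two experimental groups

    X = (X - X.mean(axis=0)) / X.std(axis=0)    # pre-processing: autoscaling
    pcs = PCA(n_components=3).fit_transform(X)  # multivariate overview
    F, p = f_classif(X, y)                      # univariate feature screening
    print("top-ranked features:", np.argsort(p)[:10])
    ```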

  11. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques: statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
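
    As a minimal sketch of two of the review's domains, the code below computes one statistical measure (coefficient of variation) and one geometric measure (Poincare SD1/SD2) on a simulated RR-interval series; a clinical series would of course come from an ECG rather than a random generator.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    rr = 0.8 + 0.05 * rng.standard_normal(500)   # RR intervals, seconds

    cv = rr.std() / rr.mean()                    # statistical domain
    d = np.diff(rr)
    sd1 = np.sqrt(d.var() / 2)                   # geometric domain (Poincare plot)
    sd2 = np.sqrt(2 * rr.var() - d.var() / 2)
    print(f"CV={cv:.3f}  SD1={sd1 * 1000:.1f} ms  SD2={sd2 * 1000:.1f} ms")
    ```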

  12. A review of intelligent systems for heart sound signal analysis.

    PubMed

    Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S

    2017-10-01

    Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with suggestions about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques, and the state of the art of phonocardiogram (PCG) signal analysis. The published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that CAD systems for PCG signal analysis remain an open problem. Related studies are compared in terms of their datasets, feature extraction techniques and classifiers. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.

  13. A Sparsity-based Framework for Resolution Enhancement in Optical Fault Analysis of Integrated Circuits

    DTIC Science & Technology

    2015-01-01

    …for IC fault detection. This section provides background information on inversion methods; conventional inversion techniques and their shortcomings are… physical techniques: electron beam imaging/analysis, ion beam techniques, scanning probe techniques. Electrical tests are used to detect faults in an… On the other hand, there is also the second harmonic technique, through which duty cycle degradation faults are detected by collecting the magnitude and the phase of…

  14. Digital Dental X-ray Database for Caries Screening

    NASA Astrophysics Data System (ADS)

    Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila

    2016-06-01

    A standard database is an essential requirement for comparing the performance of image analysis techniques. A main obstacle in dental image analysis has been the lack of an available image database, which this paper provides. Periapical dental X-ray images that are suitable for analysis and approved by many dental experts were collected. This type of dental radiograph imaging is common and inexpensive, and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 periapical X-ray images spanning the upper and lower jaws. This digital dental database was constructed to provide a source for researchers to apply and compare image analysis techniques and to improve the performance of each technique.

  15. Rasch Analysis for Instrument Development: Why, When, and How?

    PubMed Central

    Boone, William J.

    2016-01-01

    This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to construct “Wright maps” to explain the meaning of a test score or survey score and develop alternative forms of tests and surveys. Rasch techniques provide a mechanism by which the quality of life sciences–related tests and surveys can be optimized and the techniques can be used to provide a context (e.g., what topics a student has mastered) when explaining test and survey results. PMID:27856555
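
    A minimal sketch of the dichotomous Rasch model the essay describes, with item difficulties recovered from simulated 0/1 responses by a simple centred log-odds (PROX-style) approximation; real instrument work would use a dedicated Rasch package with person estimates and fit statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    theta = rng.normal(0, 1, 200)          # person abilities
    b = np.array([-1.0, -0.3, 0.2, 0.9])   # true item difficulties

    # Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    X = (rng.random(p.shape) < p).astype(int)   # simulated response matrix

    s = X.sum(axis=0)                           # item scores
    b_hat = np.log((X.shape[0] - s) / s)        # log-odds difficulty estimate
    b_hat -= b_hat.mean()                       # centre the scale
    print(np.round(b_hat, 2))                   # compare against b
    ```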

  16. A combination of selected mapping and clipping to increase energy efficiency of OFDM systems

    PubMed Central

    Lee, Byung Moo; Rim, You Seung

    2017-01-01

    We propose an energy efficient combination design for OFDM systems based on selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. The combination of two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it can take advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former is quite simple and effective, and the latter does not cause any signal distortion. We provide the structure and the systematic operating method, and show various analyses to derive the EE gain based on the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to using the SLM technique alone. PMID:29023591
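
    A minimal sketch of the combination analysed in the paper: selected mapping picks the candidate phase rotation with the lowest PAPR, and the chosen signal is then clipped. The parameters (64 subcarriers, QPSK, 4 SLM candidates, clipping at 1.5x the RMS envelope) are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    N, U = 64, 4                      # subcarriers, SLM candidate sequences
    sym = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N) / np.sqrt(2)  # QPSK

    def papr(x):
        return np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)

    # SLM: apply U random phase sequences, keep the lowest-PAPR candidate
    phases = np.exp(2j * np.pi * rng.random((U, N)))
    candidates = np.fft.ifft(sym * phases, axis=1)
    best = candidates[np.argmin([papr(c) for c in candidates])]

    # clipping: limit the envelope to 1.5x the RMS level
    rms = np.sqrt(np.mean(np.abs(best) ** 2))
    a = np.abs(best)
    scale = np.minimum(1.0, 1.5 * rms / np.maximum(a, 1e-12))
    clipped = best * scale
    print(f"PAPR: {10 * np.log10(papr(best)):.2f} dB after SLM, "
          f"{10 * np.log10(papr(clipped)):.2f} dB after clipping")
    ```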

  17. GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration

    PubMed Central

    Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng

    2015-01-01

    The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315

  18. Predicting Effective Course Conduction Strategy Using Datamining Techniques

    ERIC Educational Resources Information Center

    Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.

    2017-01-01

    Data analysis techniques can be used to analyze patterns in data from different fields. Based on the analysis results, suggestions can be provided to decision-making authorities. Data mining techniques can be used in the educational domain to improve the outcomes of educational sectors. The authors carried out this research…

  19. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    NASA Astrophysics Data System (ADS)

    Pierścińska, D.

    2018-01-01

    This review focuses on the theoretical foundations, experimental implementation and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change in reflectivity of the laser facet surface, which provides thermal images useful in hot spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of TR to various types of lasers are presented, showing that the thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows thermal degradation processes occurring at laser facets to be studied. Additionally, thermal processes and the basic mechanisms of degradation of semiconductor lasers are discussed.
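
    A minimal sketch of the relation thermoreflectance rests on: the relative reflectivity change is proportional to the temperature change, dR/R = kappa * dT, so a measured dR/R map converts to a temperature map once kappa is calibrated. The kappa value and the map below are placeholders, not measured data.

    ```python
    import numpy as np

    kappa = 2e-4                # 1/K; material- and wavelength-dependent
    rng = np.random.default_rng(2)
    dR_over_R = 1e-3 * rng.random((64, 64))   # measured relative change map

    dT = dR_over_R / kappa      # temperature-rise map, kelvin
    print(f"peak facet temperature rise: {dT.max():.1f} K")
    ```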

  1. Applications of mass spectrometry techniques to autoclave curing of materials

    NASA Technical Reports Server (NTRS)

    Smith, A. C.

    1983-01-01

    Mass spectrometer analysis of gases evolved from polymer materials during a cure cycle can provide a wealth of information useful for studying cure properties and procedures. In this paper data is presented for two materials to support the feasibility of using mass spectrometer gas analysis techniques to enhance the knowledge of autoclave curing of composite materials and provide additional information for process control evaluation. It is expected that this technique will also be useful in working out the details involved in determining the proper cure cycle for new or experimental materials.

  2. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  3. Evidential Reasoning in Expert Systems for Image Analysis.

    DTIC Science & Technology

    1985-02-01

    …techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the… (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify… approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths…

  4. The Generation of Novel MR Imaging Techniques to Visualize Inflammatory/Degenerative Mechanisms and the Correlation of MR Data with 3D Microscopic Changes

    DTIC Science & Technology

    2013-09-01

    …existing MR scanning systems, providing the ability to visualize structures that are impossible with current methods. Using techniques to concurrently… and unique system for analysis of affected brain regions which, coupled with other imaging techniques and molecular measurements, holds significant…

  5. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... overfilling and spill, or overpressurization and venting through relief valves or rupture disks; and (v... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as...

  6. Separation and Analysis of Citral Isomers.

    ERIC Educational Resources Information Center

    Sacks, Jeff; And Others

    1983-01-01

    Provides background information, procedures, and results of an experiment designed to introduce undergraduates to the technique of steam distillation as a means of isolating thermally sensitive compounds. Chromatographic techniques (HPLC) and mass spectrometric analysis are used in the experiment, which requires three laboratory periods. (JN)

  7. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  8. Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.

    PubMed

    Rodriguez-Cruz, Sandra E

    2006-01-01

    The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.

  9. Thermal radiation analysis system TRASYS 2: User's manual

    NASA Technical Reports Server (NTRS)

    Goble, R. G.; Jensen, C. L.

    1980-01-01

    The Thermal Radiation Analyzer System (TRASYS) program put thermal radiation analysis on the same basis as thermal analysis using program systems such as MITAS and SINDA. The user is provided the powerful options of writing his own executive, or driver, logic and choosing, among several available options, the most desirable solution technique(s) for the problem at hand. This User's Manual serves the twofold purpose of instructing the user in all applications and providing a convenient reference book that presents the features and capabilities in a concise, easy-to-find manner.

  10. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  11. Determining Tooth Occlusal Surface Relief Indicator by Means of Automated 3d Shape Analysis

    NASA Astrophysics Data System (ADS)

    Gaboutchian, A. V.; Knyaz, V. A.

    2017-05-01

    Determining the occlusal surface relief indicator plays an important role in odontometric tooth shape analysis. Analysis of surface relief indicator parameters provides valuable information about the closure of dental arches (occlusion) and lifetime changes in tooth structure. Such data are relevant for dentistry and anthropology applications. Descriptive techniques commonly used for surface relief evaluation have limited precision and therefore do not support reliable conclusions about the structure and functioning of teeth. Parametric techniques developed for such applications require special facilities and are time-consuming, which limits their spread and ease of access. Nevertheless, the use of 3D models obtained by photogrammetric techniques allows the required measurement accuracy to be attained and has potential for process automation. We introduce new approaches for determining the tooth occlusal surface relief indicator and provide data on the efficiency of different indicators in evaluating natural attrition.
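
    As a hedged sketch of one commonly used relief indicator (3D occlusal surface area divided by its 2D projected area), the code below evaluates it on a synthetic height map standing in for a photogrammetric tooth model; uniform grid spacing is assumed.

    ```python
    import numpy as np

    x, y = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
    z = 0.5 * np.exp(-4 * (x ** 2 + y ** 2))   # a single synthetic cusp

    dx = x[0, 1] - x[0, 0]
    zy, zx = np.gradient(z, dx)                # surface slopes
    area_3d = np.sum(np.sqrt(1 + zx ** 2 + zy ** 2)) * dx * dx
    area_2d = z.size * dx * dx                 # projected (occlusal-plane) area
    print(f"relief index (3D/2D area): {area_3d / area_2d:.3f}")
    ```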

  12. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  13. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  14. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  15. Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.

    ERIC Educational Resources Information Center

    Anderson, Margaret A.

    1986-01-01

    Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)

  16. Behavior Analysis: Methodological Foundations.

    ERIC Educational Resources Information Center

    Owen, James L.

    Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline…

  17. Low-thrust chemical propulsion system propellant expulsion and thermal conditioning study. Executive summary

    NASA Technical Reports Server (NTRS)

    Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.

    1982-01-01

    Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.

  18. Performance of Modified Test Statistics in Covariance and Correlation Structure Analysis under Conditions of Multivariate Nonnormality.

    ERIC Educational Resources Information Center

    Fouladi, Rachel T.

    2000-01-01

    Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…

  19. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.

  20. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    2000-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques, with this issue's section discussing Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.

  1. Panel Discussion on Multi-Disciplinary Analysis

    NASA Technical Reports Server (NTRS)

    Garcia, Robert

    2002-01-01

    The Marshall Space Flight Center (MSFC) is hosting the Thermal and Fluids Analysis Workshop (TFAWS) during the week of September 10, 2001. Included in this year's TFAWS is a panel session on Multidisciplinary Analysis techniques. The intent is to provide an opportunity for the users to gain information as to what product may be best suited for their applications environment and to provide feedback to you, the developers, on future desired developments. Potential users of multidisciplinary analysis (MDA) techniques are often overwhelmed by the number of choices available to them via commercial products and by the pace of new developments in this area. The purpose of this panel session is to provide a forum wherein MDA tools available and under development can be discussed, compared, and contrasted. The intent of this panel is to provide the end-user with the information necessary to make educated decisions on how to proceed with selecting their MDA tool. It is anticipated that the discussions this year will focus on MDA techniques that couple discipline codes or algorithms (as opposed to monolithic, unified MDA approaches). The MDA developers will be asked to prepare a product overview presentation addressing specific questions provided by the panel organizers. The purpose of these questions will be to establish the method employed by the particular MDA technique for communication between the discipline codes, to establish the similarities and differences amongst the various approaches, and to establish the range of experience and applications for each particular MDA approach.

  2. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota, was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
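
    A minimal sketch of two of the techniques compared: vegetation-index (NDVI) differencing and a two-band change-vector magnitude. The arrays are synthetic stand-ins for co-registered 1987 and 1996 Landsat TM red and near-infrared bands; the threshold is an illustrative choice.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    red87, nir87 = rng.random((2, 100, 100)) * 0.3 + 0.05
    red96, nir96 = red87.copy(), nir87.copy()
    nir96[40:60, 40:60] += 0.25           # simulated afforestation patch

    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    dndvi = ndvi(nir96, red96) - ndvi(nir87, red87)   # index differencing
    cvm = np.hypot(red96 - red87, nir96 - nir87)      # change-vector magnitude

    change_mask = dndvi > dndvi.mean() + 2 * dndvi.std()
    print(f"pixels flagged as change: {change_mask.sum()}")
    ```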

  3. The discrimination of geoforensic trace material from close proximity locations by organic profiling using HPLC and plant wax marker analysis by GC.

    PubMed

    McCulloch, G; Dawson, L A; Ross, J M; Morgan, R M

    2018-07-01

    There is a need to develop a wider empirical research base to expand the scope for utilising the organic fraction of soil in forensic geoscience, and to demonstrate the capability of the analytical techniques used in forensic geoscience to discriminate samples at close proximity locations. The determination of wax markers from soil samples by GC analysis has been used extensively in court and is known to be effective in discriminating samples from different land use types. A new HPLC method for the analysis of the organic fraction of forensic sediment samples has also been shown recently to add value in conjunction with existing inorganic techniques for the discrimination of samples derived from close proximity locations. This study compares the ability of these two organic techniques to discriminate samples derived from close proximity locations and finds that the GC technique provides good discrimination at this scale, offering quantification of known compounds, whilst the HPLC technique offers a shorter and simpler sample preparation method and provides very good discrimination between groups of samples of different provenance in most cases. The use of both data sets together gave further improved accuracy rates in some cases, suggesting that a combined organic approach can provide added benefits in certain case scenarios and crime reconstruction contexts.

  4. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved with a transformation matrix. Applying such analysis techniques to residual acceleration data provides more information than a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
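
    A minimal sketch of the two operations the abstract describes: an FFT to expose the dominant frequency component of an accelerometer record, and a rotation matrix to re-express the record in an experiment-fixed frame. The signal content and the 30-degree rotation are illustrative assumptions.

    ```python
    import numpy as np

    fs = 100.0                                  # sample rate, Hz
    t = np.arange(0, 60, 1 / fs)
    ax = 1e-4 * np.sin(2 * np.pi * 17 * t)      # 17 Hz structural vibration
    ay = 5e-5 * np.sin(2 * np.pi * 0.1 * t)     # low-frequency drift
    az = np.zeros_like(t)

    spec = np.abs(np.fft.rfft(ax)) / len(t)
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    print(f"dominant component: {freqs[spec.argmax()]:.1f} Hz")

    # rotate the x-y axes by 30 degrees into the experiment's orientation
    c, s = np.cos(np.radians(30)), np.sin(np.radians(30))
    R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    rotated = R @ np.vstack([ax, ay, az])       # 3 x N transformed record
    ```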

  5. SAINT: A combined simulation language for modeling man-machine systems

    NASA Technical Reports Server (NTRS)

    Seifert, D. J.

    1979-01-01

    SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.

  6. Bayesian Techniques for Plasma Theory to Bridge the Gap Between Space and Lab Plasmas

    NASA Astrophysics Data System (ADS)

    Crabtree, Chris; Ganguli, Gurudas; Tejero, Erik

    2017-10-01

    We will show how Bayesian techniques provide a general data analysis methodology that is better suited to investigate phenomena that require a nonlinear theory for an explanation. We will provide short examples of how Bayesian techniques have been successfully used in the radiation belts to provide precise nonlinear spectral estimates of whistler mode chorus and how these techniques have been verified in laboratory plasmas. We will demonstrate how Bayesian techniques allow for the direct competition of different physical theories with data acting as the necessary arbitrator. This work is supported by the Naval Research Laboratory base program and by the National Aeronautics and Space Administration under Grant No. NNH15AZ90I.

  7. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  8. Fourier Analysis and the Rhythm of Conversation.

    ERIC Educational Resources Information Center

    Dabbs, James M., Jr.

    Fourier analysis, a common technique in engineering, breaks down a complex wave form into its simple sine wave components. Communication researchers have recently suggested that this technique may provide an index of the rhythm of conversation, since vocalizing and pausing produce a complex wave form pattern of alternation between two speakers. To…

  9. Asking the Right Questions: Techniques for Collaboration and School Change. 2nd Edition.

    ERIC Educational Resources Information Center

    Holcomb, Edie L.

    This work provides school change leaders with tools, techniques, tips, examples, illustrations, and stories about promoting school change. Tools provided include histograms, surveys, run charts, weighted voting, force-field analysis, decision matrices, and many others. Chapter 1, "Introduction," applies a matrix for asking questions…

  10. Informatics for Metabolomics.

    PubMed

    Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote

    2016-01-01

    Metabolome profiling of biological systems can provide biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large volumes of metabolomics data have accumulated since the pre-metabolomics era, driven by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases developed for the metabolomics community provide useful information such as chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, from preprocessing through functional interpretation. This framework can serve as a scaffold for translational biomedical research, which can in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.

  11. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    A series of interdisciplinary modeling and analysis techniques specialized to address three specific hot section components is presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.

  12. Text mining and its potential applications in systems biology.

    PubMed

    Ananiadou, Sophia; Kell, Douglas B; Tsujii, Jun-ichi

    2006-12-01

    With biomedical literature increasing at a rate of several thousand papers per week, it is impossible to keep abreast of all developments; therefore, automated means to manage the information overload are required. Text mining techniques, which involve the processes of information retrieval, information extraction and data mining, provide a means of solving this. By adding meaning to text, these techniques produce a more structured analysis of textual knowledge than simple word searches, and can provide powerful tools for the production and analysis of systems biology models.

  13. Iterative categorization (IC): a systematic technique for analysing qualitative data

    PubMed Central

    2016-01-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  14. A constant current charge technique for low Earth orbit life testing

    NASA Technical Reports Server (NTRS)

    Glueck, Peter

    1991-01-01

    A constant current charge technique for low earth orbit testing of nickel cadmium cells is presented. The method mimics the familiar taper charge of the constant potential technique while maintaining cell independence for statistical analysis. A detailed example application is provided and the advantages and disadvantages of this technique are discussed.

  15. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
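
    As a hedged sketch of the central quantity, the code below extracts a normalized contrast evolution from a synthetic flash-thermography video: anomaly-region temperature compared with a nearby sound-area reference, frame by frame. The decay law, region locations, and normalization are illustrative assumptions, not the method's calibrated simulation.

    ```python
    import numpy as np

    decay = 0.5 / np.sqrt(np.arange(1.0, 101.0))     # post-flash surface cooling
    video = 20 + decay[:, None, None] * np.ones((1, 32, 32))
    # anomaly region cools more slowly than the surrounding sound material
    video[:, 14:18, 14:18] += 0.3 * np.sqrt(decay)[:, None, None]

    anomaly = video[:, 14:18, 14:18].mean(axis=(1, 2))
    sound = video[:, 2:6, 2:6].mean(axis=(1, 2))      # sound-area reference
    contrast = (anomaly - sound) / (sound - 20)       # normalized contrast
    print(f"peak contrast {contrast.max():.2f} at frame {contrast.argmax()}")
    ```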

  16. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.

  17. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
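
    A minimal sketch of the model fitting the Standard covers: linear, quadratic, and exponential trends fitted to a time series and compared by residual sum of squares. The data are simulated; the exponential fit uses the usual log-linear shortcut, valid when all values are positive.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    t = np.arange(24, dtype=float)                   # e.g. months
    y = 5 * np.exp(0.08 * t) + rng.normal(0, 0.5, t.size)

    fits = {
        "linear": np.polyval(np.polyfit(t, y, 1), t),
        "quadratic": np.polyval(np.polyfit(t, y, 2), t),
    }
    b, log_a = np.polyfit(t, np.log(y), 1)           # log-linear exponential fit
    fits["exponential"] = np.exp(log_a) * np.exp(b * t)

    for name, yhat in fits.items():
        print(f"{name:11s} RSS = {np.sum((y - yhat) ** 2):.2f}")
    ```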

  18. A Flipped Classroom Approach to Teaching Systems Analysis, Design and Implementation

    ERIC Educational Resources Information Center

    Tanner, Maureen; Scott, Elsje

    2015-01-01

    This paper describes a flipped classroom approach followed to teach systems analysis, design and implementation at university level. The techniques employed are described. These techniques were underpinned by a theory of coherent practice: a pedagogy that provides a framework for the design of highly structured interventions to guide students in…

  19. A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries

    Treesearch

    Vijay S. Reddy; Robert J. Bush; Ronen Roudik

    1996-01-01

    Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...
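
    A minimal sketch of the decompositional idea, with invented profiles and ratings: overall preference ratings for multiattribute product profiles are decomposed into per-attribute part-worths by least squares.

    ```python
    import numpy as np

    # Dummy-coded profiles: [oak (vs pine), glossy (vs matte)]; ratings invented
    profiles = np.array([[1, 1], [1, 0], [0, 1], [0, 0]], dtype=float)
    ratings = np.array([8.0, 6.5, 6.0, 4.0])

    design = np.column_stack([np.ones(len(profiles)), profiles])
    partworths, *_ = np.linalg.lstsq(design, ratings, rcond=None)
    # partworths[1:] quantify each attribute's contribution to preference,
    # i.e., the consumer preference function over the attribute levels
    ```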

  20. Qualitative research in nutrition and dietetics: data analysis issues.

    PubMed

    Fade, S A; Swift, J A

    2011-04-01

    Although much of the analysis conducted in qualitative research falls within the broad church of thematic analysis, the wide scope of qualitative enquiry presents the researcher with a number of choices regarding data analysis techniques. This review, the third in the series, provides an overview of a number of techniques and practical steps that can be taken to provide some structure and focus to the intellectual work of thematic analysis in nutrition and dietetics. Because appropriate research methods are crucial to ensure high-quality research, it also describes a process for choosing appropriate analytical methods that considers the extent to which they help answer the research question(s) and are compatible with the philosophical assumptions about ontology, epistemology and methodology that underpin the overall design of a study. Other reviews in this series provide a model for embarking on a qualitative research project in nutrition and dietetics, an overview of the principal techniques of data collection, sampling and quality assessment of this kind of research and some practical advice relevant to nutrition and dietetics, along with glossaries of key terms. © 2010 The Authors. Journal compilation © 2010 The British Dietetic Association Ltd.

  1. Column ratio mapping: a processing technique for atomic resolution high-angle annular dark-field (HAADF) images.

    PubMed

    Robb, Paul D; Craven, Alan J

    2008-12-01

    An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated process of measuring atomic column intensity ratios in high-resolution HAADF images. This technique was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende structured AlAs/GaAs superlattice using the 1 Å-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.
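
    A toy stand-in for the automated measurement described, assuming the HAADF image is available as a 2-D array; the peak-finding rule and the neighbour pairing used here are simplifications, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.ndimage import center_of_mass, label, maximum_filter

    def column_ratio_map(haadf, window=5):
        """Locate atomic-column intensity peaks, then ratio each column's
        intensity to that of the next column in row-major order."""
        peaks = (haadf == maximum_filter(haadf, size=window)) & (haadf > haadf.mean())
        labelled, n = label(peaks)
        centres = np.array(center_of_mass(haadf, labelled, range(1, n + 1)))
        order = np.lexsort((centres[:, 1], centres[:, 0]))
        rows, cols = np.round(centres[order]).astype(int).T
        intensities = haadf[rows, cols]
        return intensities[1:] / intensities[:-1]   # map of column ratios
    ```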

  2. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    PubMed

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  3. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of teeth arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of teeth parameters and on designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are placed on the teeth, and a wire of given shape that is clamped by these brackets to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach supports accurate measurement of the teeth parameters needed for adequate planning, design of the correct teeth positions and monitoring of the treatment process. The developed technique applies photogrammetric means for teeth arch 3D model generation, bracket position determination and teeth shifting analysis.

  4. A review of second law techniques applicable to basic thermal science research

    NASA Astrophysics Data System (ADS)

    Drost, M. Kevin; Zamorski, Joseph R.

    1988-11-01

    This paper reports the results of a review of second law analysis techniques which can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and location of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information of the source and nature of irreversibilities on a macroscopic scale. This information will help to identify new research topics and will support the evaluation of current research efforts. Second, the differential approach can provide information on the causes and spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.
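
    A sketch of the "differential approach" for one concrete case, with assumed property values: the volumetric entropy generation rate for one-dimensional heat conduction, s_gen = k (dT/dx)^2 / T^2, evaluated over an assumed temperature profile to locate where the irreversibility occurs.

    ```python
    import numpy as np

    x = np.linspace(0.0, 0.1, 200)        # m, position through a slab
    T = 400.0 - 1000.0 * x                # K, assumed linear temperature profile
    k = 15.0                              # W/(m K), assumed conductivity

    dTdx = np.gradient(T, x)
    s_gen = k * dTdx**2 / T**2            # W/(m^3 K), local irreversibility
    total_per_area = np.trapz(s_gen, x)   # integrate to sum the losses
    ```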

  5. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
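
    The report's aerodynamic sensitivities are not reproduced here; the sketch shows the verification step it describes, comparing an analytic gradient of a toy objective against central finite differences.

    ```python
    import numpy as np

    def drag(x):
        """Toy objective of two design variables (illustrative only)."""
        return x[0]**2 + 3.0 * np.sin(x[1]) + x[0] * x[1]

    def drag_grad_analytic(x):
        return np.array([2.0 * x[0] + x[1], 3.0 * np.cos(x[1]) + x[0]])

    def drag_grad_fd(x, h=1e-6):
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x); e[i] = h
            g[i] = (drag(x + e) - drag(x - e)) / (2.0 * h)
        return g

    x0 = np.array([1.0, 0.5])
    # Agreement between the two gradients establishes the accuracy of the
    # cheaper (semi-)analytical sensitivities, as done in the report.
    print(drag_grad_analytic(x0), drag_grad_fd(x0))
    ```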

  6. Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling

    PubMed Central

    Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.

    2013-01-01

    Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers and allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information including molecular transport dynamics, interactions, and stoichiometry from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111
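
    A minimal sketch of one fluorescence fluctuation computation, assuming an intensity time series from a single detection spot; normalization conventions vary across the literature, so this is illustrative only.

    ```python
    import numpy as np

    def fluctuation_autocorrelation(intensity):
        """G(tau) = <dI(t) dI(t+tau)> / <I>^2 for a fluorescence time series;
        the decay of G reflects molecular transport dynamics."""
        dI = intensity - intensity.mean()
        n = len(dI)
        raw = np.correlate(dI, dI, mode="full")[n - 1:]
        counts = np.arange(n, 0, -1)                 # samples per lag
        return raw / counts / intensity.mean() ** 2
    ```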

  7. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
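
    A toy two-gate fault tree evaluated both ways; the event names, probabilities, and triangular fuzzy numbers are invented, and the component-wise fuzzy arithmetic is a crude stand-in for what IRRAS and FUZZYFTA actually do.

    ```python
    def and_gate(p, q):          # independent basic events
        return p * q

    def or_gate(p, q):
        return p + q - p * q

    # Probabilistic: redundant pumps AND-ed, then OR-ed with a valve fault
    pump, valve = 1e-3, 5e-4
    top_prob = or_gate(and_gate(pump, pump), valve)

    # Fuzzy: carry triangular numbers (low, mode, high) through the same gates
    def f_and(x, y): return tuple(and_gate(a, b) for a, b in zip(x, y))
    def f_or(x, y):  return tuple(or_gate(a, b) for a, b in zip(x, y))

    pump_f, valve_f = (5e-4, 1e-3, 2e-3), (2e-4, 5e-4, 1e-3)
    top_fuzzy = f_or(f_and(pump_f, pump_f), valve_f)   # vagueness -> interval
    print(top_prob, top_fuzzy)
    ```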

  8. A survey of visualization systems for network security.

    PubMed

    Shiravi, Hadi; Shiravi, Ali; Ghorbani, Ali A

    2012-08-01

    Security Visualization is a very young term. It expresses the idea that common visualization techniques have been designed for use cases that are not supportive of security-related data, demanding novel techniques fine-tuned for the purpose of thorough analysis. A significant amount of work has been published in this area, but little work has been done to study this emerging visualization discipline. We offer a comprehensive review of network security visualization and provide a taxonomy in the form of five use-case classes encompassing nearly all recent works in this area. We outline the incorporated visualization techniques and data sources and provide an informative table to display our findings. From the analysis of these systems, we examine issues and concerns regarding network security visualization and provide guidelines and directions for future researchers and visual system developers.

  9. Classroom Observation Techniques. IDEA Paper No. 4.

    ERIC Educational Resources Information Center

    Acheson, Keith A.

    Techniques for observing the classroom behavior of teachers and students are examined. These techniques provide a framework for analyzing and understanding classroom interaction, for making decisions about what should be happening, and for changing instructional behavior when it is necessary. The observation methods allow collection, analysis, and…

  10. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development processes interact with each other at different phases through the development life cycle. (2) Reviews are emphasized in both system and software development; for some reviews (e.g., SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  11. Monitoring Air Quality with Leaf Yeasts.

    ERIC Educational Resources Information Center

    Richardson, D. H. S.; And Others

    1985-01-01

    Proposes that leaf yeast serve as quick, inexpensive, and effective techniques for monitoring air quality. Outlines procedures and provides suggestions for data analysis. Includes results from sample school groups who employed this technique. (ML)

  12. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis

    PubMed Central

    Steele, Joe; Bastola, Dhundy

    2014-01-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
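
    A minimal word-frequency sketch: k-mer count vectors and the D2 statistic (the inner product of the two count vectors), one of the alignment-free similarity measures the review discusses.

    ```python
    from collections import Counter

    def kmer_counts(seq, k=3):
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def d2(seq_a, seq_b, k=3):
        """D2: sum over shared k-mers of the product of their counts."""
        ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
        return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

    print(d2("ACGTACGTGACG", "ACGTTGACACGA"))
    ```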

  13. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules

    PubMed Central

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A.M.

    2017-01-01

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, i.e. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. PMID:28077562
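
    A toy digital read-out consistent with the idea described, assuming the enriched RCA products appear as bright fluorescent objects in a single field of view; the threshold rule is an arbitrary choice, not the authors' method.

    ```python
    import numpy as np
    from scipy import ndimage

    def count_rca_products(image, threshold=None):
        """Digital quantification: count discrete fluorescent objects
        (amplified single molecules) in one field of view."""
        if threshold is None:
            threshold = image.mean() + 3 * image.std()
        mask = image > threshold
        _, n_spots = ndimage.label(mask)
        return n_spots
    ```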

  14. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

    The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
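
    A small sketch, assuming networkx and invented job records: jobs become graph nodes, and edge weights express relationships such as shared compute nodes, temporal proximity, and a common user, in the spirit of the semantic graph the report describes.

    ```python
    import itertools
    import networkx as nx

    # Invented job records: (job_id, user, compute nodes used, start hour)
    jobs = [("j1", "alice", {"n01", "n02"}, 0),
            ("j2", "bob",   {"n02", "n03"}, 1),
            ("j3", "alice", {"n04"},        9)]

    G = nx.Graph()
    for jid, user, nodes, start in jobs:
        G.add_node(jid, user=user, start=start)

    for (a, ua, na, ta), (b, ub, nb, tb) in itertools.combinations(jobs, 2):
        w = len(na & nb) + (1 if abs(ta - tb) <= 1 else 0) + (1 if ua == ub else 0)
        if w:
            G.add_edge(a, b, weight=w)   # relationship strength

    print(G.edges(data=True))   # input to clustering / failure analysis
    ```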

  15. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Time Tagging the Data

    DTIC Science & Technology

    2015-09-01

    The process described in this report made use of posttest processing techniques to provide packet-level time tagging with an accuracy close to 3 µs relative to Coordinated Universal Time (UTC) for each set of test records.

  16. An Overview Of Wideband Signal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Speiser, Jeffrey M.; Whitehouse, Harper J.

    1989-11-01

    This paper provides a unifying perspective for several narrowband and wideband signal processing techniques. It considers narrowband ambiguity functions and Wigner-Ville distributions, together with the wideband ambiguity function and several proposed approaches to a wideband version of the Wigner-Ville distribution (WVD). A unifying perspective is provided by the methodology of unitary representations and ray representations of transformation groups.
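
    A brute-force sketch of the narrowband ambiguity function |A(tau, nu)| for a sampled signal; the circular shift used to implement the delay is a simplification adequate only for illustration.

    ```python
    import numpy as np

    def narrowband_ambiguity(s, delays, dopplers, fs):
        """|A(tau, nu)| = |sum_n s[n] conj(s[n - tau]) exp(-j 2 pi nu n / fs)|."""
        n = np.arange(len(s))
        A = np.empty((len(delays), len(dopplers)))
        for i, d in enumerate(delays):               # delay in samples
            lagged = np.roll(s, d)                   # circular shift (toy)
            for j, nu in enumerate(dopplers):        # Doppler in Hz
                A[i, j] = abs(np.sum(s * np.conj(lagged)
                                     * np.exp(-2j * np.pi * nu * n / fs)))
        return A
    ```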

  17. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the three main phases of the application, which are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
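
    A compact sketch of the object recognition and quantitative analysis phases, assuming the image has already been segmented to binary; property names and the tiny test image are illustrative.

    ```python
    import numpy as np
    from scipy import ndimage

    def analyze_binary_image(binary):
        """Label connected components, then compute simple geometric
        properties (count, area, centroid) for each object."""
        labels, count = ndimage.label(binary)
        objects = []
        for i in range(1, count + 1):
            mask = labels == i
            cy, cx = ndimage.center_of_mass(mask)
            objects.append({"label": i, "area": int(mask.sum()),
                            "centroid": (cy, cx)})
        return objects

    img = np.zeros((8, 8), dtype=int)
    img[1:3, 1:3] = 1
    img[5:7, 4:8] = 1
    print(analyze_binary_image(img))   # two objects with areas 4 and 8
    ```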

  18. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.

  19. Utilization of a CRT display light pen in the design of feedback control systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. G.; Young, K. R.

    1972-01-01

    A hierarchical structure of interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are described that provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description and of analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.
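
    One step of the pipeline described, sketched under the usual controllable-canonical convention: a strictly proper transfer function num(s)/den(s) reduced to a chain of integrators (state-space form) suitable for digital simulation. This is a textbook construction, not the report's code.

    ```python
    import numpy as np

    def tf_to_integrator_chain(num, den):
        """Controllable canonical state-space (x' = Ax + Bu, y = Cx) for a
        strictly proper transfer function num(s)/den(s)."""
        den = np.asarray(den, float)
        num = np.asarray(num, float) / den[0]
        a = den / den[0]                       # monic denominator
        n = len(a) - 1
        A = np.zeros((n, n))
        A[:-1, 1:] = np.eye(n - 1)             # the chain of integrators
        A[-1] = -a[:0:-1]                      # feedback of denominator terms
        B = np.zeros((n, 1)); B[-1, 0] = 1.0
        C = np.zeros((1, n)); C[0, :len(num)] = num[::-1]
        return A, B, C

    print(tf_to_integrator_chain([1.0], [1.0, 3.0, 2.0]))  # 1/(s^2+3s+2)
    ```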

  20. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete only to have to tell them that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques are available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on microbiological parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy, but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.
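
    One of the named techniques in a few lines, with invented measurements: a one-way ANOVA comparing mean microbial abundance across three sampling locations.

    ```python
    from scipy import stats

    # Hypothetical log CFU/g measurements from three sampling locations
    site_a = [5.1, 5.4, 5.0, 5.3]
    site_b = [4.2, 4.5, 4.4, 4.1]
    site_c = [5.0, 4.9, 5.2, 5.1]

    f_stat, p_value = stats.f_oneway(site_a, site_b, site_c)
    # A small p-value suggests mean abundance differs among the sites
    ```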

  1. CPM and PERT in Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1989-01-01

    Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation and Review Technique (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)
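
    A minimal sketch of the CPM forward pass on an invented activity network: each activity's earliest finish is its duration plus the latest earliest-finish among its predecessors, and the project length is the maximum over all activities.

    ```python
    # Toy activity network: name -> (duration, list of predecessors)
    tasks = {"spec": (3, []), "order": (2, ["spec"]), "build": (5, ["order"]),
             "docs": (4, ["spec"]), "ship": (1, ["build", "docs"])}

    earliest = {}
    def finish(t):
        if t not in earliest:
            dur, preds = tasks[t]
            earliest[t] = dur + max((finish(p) for p in preds), default=0)
        return earliest[t]

    project_length = max(finish(t) for t in tasks)   # 11 for this network
    # Activities whose delay would extend project_length form the critical path
    ```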

  2. NPS transportation innovative finance options

    DOT National Transportation Integrated Search

    2013-05-01

    This paper provides a summary of innovative transportation finance techniques and discusses their applicability to the National Park Service (NPS). The primary finding of this analysis is that while NPS is engaging in innovative finance techniques su...

  3. Earth orientation from lunar laser ranging and an error analysis of polar motion services

    NASA Technical Reports Server (NTRS)

    Dickey, J. O.; Newhall, X. X.; Williams, J. G.

    1985-01-01

    Lunar laser ranging (LLR) data are obtained on the basis of the timing of laser pulses travelling from observatories on earth to retroreflectors placed on the moon's surface during the Apollo program. The modeling and analysis of the LLR data can provide valuable insights into earth's dynamics. The ability to model the lunar orbit accurately over the full 13-year observation span makes it possible to conduct relatively long-term studies of variations in the earth's rotation. A description is provided of general analysis techniques, and the calculation of universal time (UT1) from LLR is discussed. Attention is also given to a summary of intercomparisons with different techniques, polar motion results and intercomparisons, and a polar motion error analysis.

  4. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  5. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation, including finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used to compare with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to remain cautious and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to complexity of the aircraft structure and low modal densities.

  6. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal-methods analysis technique, model checking, which was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  7. Application of remote sensing to land and water resource planning: The Pocomoke River Basin, Maryland

    NASA Technical Reports Server (NTRS)

    Wildesen, S. E.; Phillips, E. P.

    1981-01-01

    Because of the size of the Pocomoke River Basin, the inaccessibility of certain areas, and study time constraints, several remote sensing techniques were used to collect base information on the river corridor (a 23.2 km channel) and on a 1.2 km wooded floodplain. This information provided an adequate understanding of the environment and its resources, thus enabling effective management options to be designed. The remote sensing techniques used for assessment included manual analysis of high altitude color-infrared photography, computer-assisted analysis of LANDSAT-2 imagery, and the application of airborne oceanographic Lidar for topographic mapping. Results show that each technique was valuable in providing the base data necessary for resource planning.

  8. Uranium determination in natural water by the fission-track technique

    USGS Publications Warehouse

    Reimer, G.M.

    1975-01-01

    The fission track technique, utilizing the neutron-induced fission of uranium-235, provides a versatile analytical method for the routine analysis of uranium in liquid samples of natural water. A detector is immersed in the sample and both are irradiated. The fission track density observed in the detector is directly proportional to the uranium concentration. The specific advantages of this technique are: (1) only a small quantity of sample, typically 0.1-1 ml, is needed; (2) no sample concentration is necessary; (3) it is capable of providing analyses with a lower reporting limit of 1 µg per liter; and (4) the actual time spent on an analysis can be only a few minutes. This paper discusses and describes the method. © 1975.

  9. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
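
    A sketch of the Monte Carlo approach the paper promotes, applied to a dicentric-style dose estimate; the calibration coefficients, their uncertainties, and the observed yield are all invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Hypothetical yield curve Y = c + a*D + b*D^2 with uncertain coefficients
    yield_obs = 25 / 500                     # dicentrics per cell (observed)
    a = rng.normal(0.02, 0.005, N)           # Gy^-1
    b = rng.normal(0.06, 0.01, N)            # Gy^-2
    c = 0.001                                # background yield

    # Solve b*D^2 + a*D + (c - Y) = 0 for dose D in each Monte Carlo draw
    D = (-a + np.sqrt(a**2 - 4 * b * (c - yield_obs))) / (2 * b)
    print(np.percentile(D, [2.5, 50, 97.5]))  # dose with a 95% interval
    ```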

  10. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the author's techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The author's method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.

  11. High-resolution measurements of the multilayer ultra-structure of articular cartilage and their translational potential

    PubMed Central

    2014-01-01

    Current musculoskeletal imaging techniques usually target the macro-morphology of articular cartilage or use histological analysis. These techniques are able to reveal advanced osteoarthritic changes in articular cartilage but fail to give detailed information to distinguish early osteoarthritis from healthy cartilage, and this necessitates high-resolution imaging techniques measuring cells and the extracellular matrix within the multilayer structure of articular cartilage. This review provides a comprehensive exploration of the cellular components and extracellular matrix of articular cartilage as well as high-resolution imaging techniques, including magnetic resonance imaging, electron microscopy, confocal laser scanning microscopy, second harmonic generation microscopy, and laser scanning confocal arthroscopy, in the measurement of multilayer ultra-structures of articular cartilage. This review also provides an overview for micro-structural analysis of the main components of normal or osteoarthritic cartilage and discusses the potential and challenges associated with developing non-invasive high-resolution imaging techniques for both research and clinical diagnosis of early to late osteoarthritis. PMID:24946278

  12. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  13. Combinations of techniques that effectively change health behavior: evidence from Meta-CART analysis.

    PubMed

    Dusseldorp, Elise; van Genugten, Lenneke; van Buuren, Stef; Verheijden, Marieke W; van Empelen, Pepijn

    2014-12-01

    Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects. This study overcomes some of the shortcomings in the current methodology by applying classification and regression trees (CART) to meta-analytic data in a special way, referred to as Meta-CART. The aim was to identify particular combinations of BCTs that explain intervention success. A reanalysis of data from Michie, Abraham, Whittington, McAteer, and Gupta (2009) was performed. These data included effect sizes from 122 interventions targeted at physical activity and healthy eating, and the coding of the interventions into 26 BCTs. A CART analysis was performed using the BCTs as predictors and treatment success (i.e., effect size) as outcome. A subgroup meta-analysis using a mixed effects model was performed to compare the treatment effect in the subgroups found by CART. Meta-CART identified the following most effective combinations: Provide information about behavior-health link with Prompt intention formation (mean effect size ḡ = 0.46), and Provide information about behavior-health link with Provide information on consequences and Use of follow-up prompts (ḡ = 0.44). Least effective interventions were those using Provide feedback on performance without using Provide instruction (ḡ = 0.05). Specific combinations of BCTs increase the likelihood of achieving change in health behavior, whereas other combinations decrease this likelihood. Meta-CART successfully identified these combinations and thus provides a viable methodology in the context of meta-analysis.
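
    A schematic Meta-CART step with synthetic data standing in for the real coding of the 122 interventions: a regression tree partitions interventions by BCT presence/absence, and its leaves define the subgroups for the follow-up subgroup meta-analysis.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)

    # Synthetic stand-in: rows = interventions, columns = 26 BCT indicators,
    # outcome = per-study effect size g (real data are from Michie et al.)
    X = rng.integers(0, 2, size=(122, 26))
    g = 0.2 + 0.25 * (X[:, 0] & X[:, 1]) + rng.normal(0, 0.1, 122)

    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10)
    tree.fit(X, g)
    # Leaves of the fitted tree are subgroups defined by BCT combinations;
    # a mixed-effects subgroup meta-analysis then compares their mean effects.
    ```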

  14. Chromosome Analysis

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Perceptive Scientific Instruments, Inc., provides the foundation for the Powergene line of chromosome analysis and molecular genetic instrumentation. This product employs image processing technology from NASA's Jet Propulsion Laboratory and image enhancement techniques from Johnson Space Center. Originally developed to send pictures back to earth from space probes, digital imaging techniques have been developed and refined for use in a variety of medical applications, including diagnosis of disease.

  15. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    PubMed

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  16. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  17. A Stereo Imaging Velocimetry Technique for Analyzing Structure of Flame Balls at Low Lewis-Number (SOFBALL) Data

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

    Stereo Imaging Velocimetry (SIV) is a NASA Glenn Research Center (GRC) developed fluid physics technique for measuring three-dimensional (3-D) velocities in any optically transparent fluid that can be seeded with tracer particles. SIV provides a means to measure 3-D fluid velocities quantitatively and qualitatively at many points. This technique provides full-field 3-D analysis of any optically clear fluid or gas experiment using standard off-the-shelf CCD cameras to provide accurate and reproducible 3-D velocity profiles for experiments that require 3-D analysis. A flame ball is a steady flame in a premixed combustible atmosphere which, due to the transport properties (low Lewis-number) of the mixture, does not propagate but is instead supplied by diffusive transport of the reactants, forming a premixed flame. This flame geometry presents a unique environment for testing combustion theory. We present our analysis of flame ball phenomena utilizing SIV technology in order to calculate accurately the 3-D position of flame balls during an experiment, which can be used as a direct comparison with numerical simulations.
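
    The core geometric step of any stereo velocimetry pipeline, sketched with the standard linear (DLT) method; the 3x4 projection matrices would come from camera calibration, which is assumed here rather than shown.

    ```python
    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear triangulation of one 3-D point from two camera views:
        P1, P2 are 3x4 projection matrices; x1, x2 are pixel coordinates
        of the same tracer feature in each view."""
        A = np.vstack([x1[0] * P1[2] - P1[0],
                       x1[1] * P1[2] - P1[1],
                       x2[0] * P2[2] - P2[0],
                       x2[1] * P2[2] - P2[1]])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]        # inhomogeneous 3-D coordinates
    ```

    Triangulating the same feature in consecutive frames and differencing the positions over the frame interval yields the 3-D velocity at that point.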

  18. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus together with several aspects regarding the analysis and terminology used in the determination of this element are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA) is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the detection instrumental technique used with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow-methods reported in the literature are provided in the form of a table and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms are tabulated.

  19. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
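
    A compact sketch of the estimator at the heart of such techniques: the spike-triggered average of a white-noise stimulus, simulated here with an assumed linear filter and a logistic spiking nonlinearity (all values invented).

    ```python
    import numpy as np

    def spike_triggered_average(stimulus, spikes, window):
        """Average the stimulus segment preceding each spike; for a
        white-noise stimulus this recovers the linear filter up to scale."""
        times = np.flatnonzero(spikes)
        segments = [stimulus[t - window:t] for t in times if t >= window]
        return np.mean(segments, axis=0)

    rng = np.random.default_rng(0)
    stim = rng.normal(size=50_000)                    # white-noise stimulus
    kernel = np.exp(-np.arange(20)[::-1] / 5.0)       # assumed true filter
    drive = np.convolve(stim, kernel)[:len(stim)]     # linear stage
    spikes = rng.random(len(stim)) < 1 / (1 + np.exp(3.0 - drive))
    sta = spike_triggered_average(stim, spikes, window=20)
    ```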

  20. Myocardial blood flow: Roentgen videodensitometry techniques

    NASA Technical Reports Server (NTRS)

    Smith, H. C.; Robb, R. A.; Wood, E. H.

    1975-01-01

    The current status of roentgen videodensitometric techniques that provide an objective assessment of blood flow at selected sites within the coronary circulation is described. Roentgen videodensitometry employs conventional radiopaque indicators, radiological equipment, and coronary angiographic techniques. The techniques were developed in the laboratory over the past nine years and, for the past three years, have been applied to the analysis of angiograms in the clinical cardiac catheterization laboratory.

  1. Efficient morse decompositions of vector fields.

    PubMed

    Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene

    2008-01-01

    Existing topology-based vector field analysis techniques rely on the ability to extract the individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
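
    A skeletal version of the decomposition, assuming the tau-map has already been discretized into a directed graph over mesh cells: the Morse sets correspond to the strongly connected components, and the MCG is the condensation of that graph (networkx supplies both operations).

    ```python
    import networkx as nx

    def morse_connection_graph(flow_edges):
        """Given directed edges encoding where the flow can carry each mesh
        cell over time tau (the discretized tau-map), the Morse sets are the
        strongly connected components and the MCG is the condensation."""
        G = nx.DiGraph(flow_edges)
        return nx.condensation(G)

    edges = [("a", "b"), ("b", "c"), ("c", "b"), ("c", "d")]
    mcg = morse_connection_graph(edges)
    print(list(mcg.nodes(data="members")))   # {b, c} forms a recurrent set
    ```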

  2. Chemical fingerprinting of Arabidopsis using Fourier transform infrared (FT-IR) spectroscopic approaches.

    PubMed

    Gorzsás, András; Sundberg, Björn

    2014-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a fast, sensitive, inexpensive, and nondestructive technique for chemical profiling of plant materials. In this chapter we discuss the instrumental setup, the basic principles of analysis, and the possibilities for and limitations of obtaining qualitative and semiquantitative information by FT-IR spectroscopy. We provide detailed protocols for four fully customizable techniques: (1) Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS): a sensitive and high-throughput technique for powders; (2) attenuated total reflectance (ATR) spectroscopy: a technique that requires no sample preparation and can be used for solid samples as well as for cell cultures; (3) microspectroscopy using a single element (SE) detector: a technique used for analyzing sections at low spatial resolution; and (4) microspectroscopy using a focal plane array (FPA) detector: a technique for rapid chemical profiling of plant sections at cellular resolution. Sample preparation, measurement, and data analysis steps are listed for each of the techniques to help the user collect the best quality spectra and prepare them for subsequent multivariate analysis.

  3. Personal Constructions of Biological Concepts--The Repertory Grid Approach

    ERIC Educational Resources Information Center

    McCloughlin, Thomas J. J.; Matthews, Philip S. C.

    2017-01-01

    This work discusses repertory grid analysis as a tool for investigating the structures of students' representations of biological concepts. Repertory grid analysis provides the researcher with a variety of techniques that are not associated with standard methods of concept mapping for investigating conceptual structures. It can provide valuable…

  4. Mole Pi: Using New Technology to Teach the Magnitude of a Mole

    ERIC Educational Resources Information Center

    Geyer, Michael J.

    2014-01-01

    A modified technique for demonstrating the magnitude of Avogadro's number using a new Raspberry Pi computer and the Python language is described. The technique also provides students the opportunity to review dimensional analysis.

  6. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the space station program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provisions for efficient research and access to information.

  7. Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis

    NASA Technical Reports Server (NTRS)

    Lindstrom, D. G.; Normand, E.; Wilcox, A. D.

    1972-01-01

    In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.

  8. Composition and stratigraphy of the paint layers: investigation on the Madonna dei Fusi by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Grassi, N.

    2005-06-01

    In the framework of the extensive study on the wood painting "Madonna dei fusi" attributed to Leonardo da Vinci, Ion Beam Analysis (IBA) techniques were used at the Florence accelerator laboratory to get information about the elemental composition of the paint layers. After a brief description of the basic principle and the general features of IBA techniques, we will illustrate in detail how the analysis allowed us to characterise the pigments of original and restored areas and the substrate composition, and to obtain information about the stratigraphy of the painting, also providing an estimate of the paint layer thickness.

  9. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules.

    PubMed

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A M; Nilsson, Mats

    2017-05-05

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, i.e. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
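
    Digital quantification of this kind ultimately reduces to counting discrete fluorescent objects in the imaged field of view. The sketch below counts bright connected components in a synthetic image; the paper's actual image processing is not described here, so the global threshold rule and minimum spot size are assumptions.

        # Illustrative digital counting of fluorescent spots: threshold the image,
        # label connected components, and count those above a minimum size.
        import numpy as np
        from scipy import ndimage

        def count_spots(image, min_pixels=4):
            """Count bright connected components above an intensity threshold."""
            threshold = image.mean() + 3 * image.std()   # assumed threshold rule
            mask = image > threshold
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            return int(np.sum(sizes >= min_pixels))      # reject single-pixel noise

        # Synthetic example: dark background with three bright spots.
        rng = np.random.default_rng(1)
        img = rng.normal(100, 5, (128, 128))
        for y, x in [(20, 30), (64, 64), (100, 90)]:
            img[y:y + 3, x:x + 3] += 200
        print(count_spots(img))  # expected: 3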

  10. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  11. Radar fall detection using principal component analysis

    NASA Astrophysics Data System (ADS)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides performance improvement over the conventional feature extraction methods.
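
    The eigen-image idea can be sketched as follows: flatten each time-frequency image into a vector, learn principal components from the training set, and classify motions in the reduced space. The random data, component count, and nearest-neighbor classifier below are illustrative stand-ins, not the paper's configuration.

        # Eigen-image classification sketch: project flattened time-frequency
        # images onto principal components, then classify in the reduced space.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        # Stand-ins for radar time-frequency images: 40 samples of 32x32 pixels.
        X = rng.random((40, 32 * 32))
        y = np.array([0] * 20 + [1] * 20)   # 0 = non-fall motion, 1 = fall

        pca = PCA(n_components=10).fit(X)   # eigen images live in pca.components_
        Z = pca.transform(X)                # low-dimensional representation

        clf = KNeighborsClassifier(n_neighbors=3).fit(Z, y)
        print(clf.predict(pca.transform(X[:2])))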

  12. Fundamentals of quantitative dynamic contrast-enhanced MR imaging.

    PubMed

    Paldino, Michael J; Barboriak, Daniel P

    2009-05-01

    Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.

  13. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  14. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  15. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-06-01

    In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile...Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  16. The Use of a Context-Based Information Retrieval Technique

    DTIC Science & Technology

    2009-07-01

    provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies... LSA, which is also known as latent semantic indexing (LSI), uses a statistical and... In contrast, natural language models apply algorithms that combine statistical information with semantic information.
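
    A minimal LSA sketch, assuming the usual construction of a TF-IDF term-document matrix reduced by truncated SVD: documents are embedded in a low-rank latent space where similarity reflects shared context rather than exact word overlap.

        # Latent Semantic Analysis sketch: TF-IDF term-document matrix reduced by
        # truncated SVD; similarity is then measured in the latent space.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "the pilot scanned the instrument panel",
            "the aircraft instrument panel display",
            "semantic analysis of natural language text",
            "statistical language models for text retrieval",
        ]
        X = TfidfVectorizer().fit_transform(docs)
        Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
        print(cosine_similarity(Z))  # documents 0/1 and 2/3 should pair up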

  17. Evaluation of radioisotope tracer and activation analysis techniques for contamination monitoring in space environment simulation chambers

    NASA Technical Reports Server (NTRS)

    Smathers, J. B.; Kuykendall, W. E., Jr.; Wright, R. E., Jr.; Marshall, J. R.

    1973-01-01

    Radioisotope measurement techniques and neutron activation analysis are evaluated for use in identifying and locating contamination sources in space environment simulation chambers. The alpha range method allows the determination of total contaminant concentration in vapor state and condensate state. A Cf-252 neutron activation analysis system for detecting oils and greases tagged with stable elements is described. While neutron activation analysis of tagged contaminants offers specificity, an on-site system is extremely costly to implement and provides only marginal detection sensitivity under even the most favorable conditions.

  18. Group decision-making techniques for natural resource management applications

    USGS Publications Warehouse

    Coughlan, Beth A.K.; Armour, Carl L.

    1992-01-01

    This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study and is applicable to natural resource management issues.

  19. The edge of chaos: A nonlinear view of psychoanalytic technique.

    PubMed

    Galatzer-Levy, Robert M

    2016-04-01

    The field of nonlinear dynamics (or chaos theory) provides ways to expand concepts of psychoanalytic process that have implications for the technique of psychoanalysis. This paper describes how concepts of "the edge of chaos," emergence, attractors, and coupled oscillators can help shape analytic technique, resulting in an approach to doing analysis which is at the same time freer and more firmly based in an enlarged understanding of the ways in which psychoanalysis works than some current recommendations about technique. Illustrations from a lengthy analysis of an analysand with obsessive-compulsive disorder show this approach in action. Copyright © 2016 Institute of Psychoanalysis.

  20. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  1. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    PubMed

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method-combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values had a range between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique were of significant influence.
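
    The statistical comparison reported above is a one-way analysis of variance across material-method groups. A minimal sketch follows; the synthetic deviation data (micrometres) are loosely modeled on the ranges quoted in the abstract, not the study's raw measurements.

        # One-way ANOVA across impression material-method groups.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        group_a = rng.normal(10.9, 2.8, 10)   # one material-method combination
        group_b = rng.normal(16.5, 11.8, 10)  # another combination
        group_c = rng.normal(12.0, 4.0, 10)

        f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p: means differ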

  2. Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.

    ERIC Educational Resources Information Center

    Bertrand, Jane T.; And Others

    1989-01-01

    An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)

  3. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

    This report provides a summary of a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique to perform nuclear fuel accountancy measurements.

  5. Small-Group Instruction: Theory and Practice.

    ERIC Educational Resources Information Center

    Olmstead, Joseph A.

    The volume is an analysis of the state of the art of small-group methods of instruction. It describes some of the more commonly used small-group techniques and the rationale behind them, and provides an analysis of their potential use for various types and conditions of instructional environments. Explicit guidelines are provided to assist…

  6. Guidance on spatial wildland fire analysis: models, tools, and techniques

    Treesearch

    Richard D. Stratton

    2006-01-01

    There is an increasing need for spatial wildland fire analysis in support of incident management, fuel treatment planning, wildland-urban assessment, and land management plan development. However, little guidance has been provided to the field in the form of training, support, or research examples. This paper provides guidance to fire managers, planners, specialists,...

  7. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.

  8. Analysis of Multi-Arm Caliper Data for the U.S. Strategic Petroleum Reserve

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Barry L.

    The U.S. Strategic Petroleum Reserve (SPR) has an increasing reliance on multi-arm caliper surveys to assess the integrity of casing for cavern access wells and to determine priorities for casing remediation. Multi-arm caliper (MAC) surveys provide a view of well casing deformation by reporting radial measurements of the inner casing wall as the tool is drawn through the casing. Over the last several years the SPR has collected a large number of modern MAC surveys. In total, these surveys account for over 100 million individual measurements. The surveys were collected using differing survey vendors and survey hardware. This has resulted in a collection of disparate data sets which confound attempts to make well-to-well or time-dependent evaluations. In addition, the vendor-supplied MAC interpretations often involve variables which are not well defined or which may not be applicable to casings for cavern access wells. These factors reduce the usability of these detailed data sets. In order to address this issue and provide an independent analysis of multi-arm caliper survey data, Sandia National Labs has developed processing techniques and analysis variables which allow for the comparison of MAC survey data regardless of the source of the survey data. These techniques use the raw radial arm information and newly developed analysis variables to assess the casing status and provide a means for well-to-well and time-dependent analyses. Well-to-well and time-dependent investigation of the MAC survey data provides information to prioritize well remediation activities and identify wells with integrity issues. This paper presents the challenges in using disparate MAC survey data, techniques developed to address these challenges, and some of the insights gained from these new techniques.

  9. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center climate division data available at NCDC. Applications for other NOAA offices and Federal agencies are currently being investigated, such as incorporation of tidal data, fish stocks, sea surface temperature, health-related data, and analyses relevant to those datasets. We will describe LCAT, its basic functionality, examples of analyses, and progress being made to provide the tool to a broader audience in support of ocean, fisheries, and health applications.

  10. Looking at Fossils in New Ways

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    2005-01-01

    Existing fossils can be studied from a different perspective using new methods of analysis that gather more information. These new techniques of studying fossils combine new and old techniques and information, providing another way to look at fossils.

  11. Rare cell isolation and analysis in microfluidics

    PubMed Central

    Chen, Yuchao; Li, Peng; Huang, Po-Hsun; Xie, Yuliang; Mai, John D.; Wang, Lin; Nguyen, Nam-Trung; Huang, Tony Jun

    2014-01-01

    Rare cells are low-abundance cells in a much larger population of background cells. Conventional benchtop techniques have limited capabilities to isolate and analyze rare cells because of their generally low selectivity and significant sample loss. Recent rapid advances in microfluidics have been providing robust solutions to the challenges in the isolation and analysis of rare cells. In addition to the apparent performance enhancements resulting in higher efficiencies and sensitivity levels, microfluidics provides other advanced features such as simpler handling of small sample volumes and multiplexing capabilities for high-throughput processing. All of these advantages make microfluidics an excellent platform to deal with the transport, isolation, and analysis of rare cells. Various cellular biomarkers, including physical properties, dielectric properties, as well as immunoaffinities, have been explored for isolating rare cells. In this Focus article, we discuss the design considerations of representative microfluidic devices for rare cell isolation and analysis. Examples from recently published works are discussed to highlight the advantages and limitations of the different techniques. Various applications of these techniques are then introduced. Finally, a perspective on the development trends and promising research directions in this field are proposed. PMID:24406985

  12. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  13. Comparative study of two approaches to model the offshore fish cages

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-peng; Wang, Xin-xin; Decew, Jud; Tsukrov, Igor; Bai, Xiao-dong; Bi, Chun-wei

    2015-06-01

    The goal of this paper is to provide a comparative analysis of two commonly used approaches to discretize offshore fish cages: the lumped-mass approach and the finite element technique. Two case studies are chosen to compare predictions of the LMA (lumped-mass approach) and FEA (finite element analysis) based numerical modeling techniques. In both case studies, we consider several loading conditions consisting of different uniform currents and monochromatic waves. We investigate motion of the cage, its deformation, and the resultant tension in the mooring lines. Both models' predictions are sufficiently close to the experimental data, but for the first experiment, the DUT-FlexSim predictions are slightly more accurate than the ones provided by Aqua-FE™. According to the comparisons, both models can be successfully utilized in the design and analysis of offshore fish cages, provided that an appropriate safety factor is chosen.

  14. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.

  15. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-05-30

    In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile...Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  16. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
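
    Two of the listed analyses, interval root-mean-square acceleration versus time and power spectral density, can be sketched as follows. The sampling rate and signal are synthetic stand-ins, not OARE or SAMS parameters.

        # Interval RMS acceleration versus time, and PSD via Welch's method.
        import numpy as np
        from scipy.signal import welch

        fs = 250.0                                   # Hz (assumed sampling rate)
        t = np.arange(0, 60, 1 / fs)
        accel = 1e-3 * np.sin(2 * np.pi * 17 * t)    # 17 Hz vibration component
        accel += 1e-4 * np.random.default_rng(0).normal(size=t.size)

        # Interval RMS: one value per 1-second block.
        block = int(fs)
        n_blocks = accel.size // block
        blocks = accel[:n_blocks * block].reshape(n_blocks, block)
        rms = np.sqrt((blocks ** 2).mean(axis=1))

        # Power spectral density (units of g^2/Hz if the input is in g).
        freqs, psd = welch(accel, fs=fs, nperseg=1024)
        print(rms[:3], freqs[np.argmax(psd)])        # PSD peak near 17 Hz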

  17. Fitting and Modeling in the ASC Data Analysis Environment

    NASA Astrophysics Data System (ADS)

    Doe, S.; Siemiginowska, A.; Joye, W.; McDowell, J.

    As part of the AXAF Science Center (ASC) Data Analysis Environment, we will provide to the astronomical community a Fitting Application. We present a design of the application in this paper. Our design goal is to give the user the flexibility to use a variety of optimization techniques (Levenberg-Marquardt, maximum entropy, Monte Carlo, Powell, downhill simplex, CERN-Minuit, and simulated annealing) and fit statistics (chi-squared, Cash, variance, and maximum likelihood); our modular design allows the user to easily add their own optimization techniques and/or fit statistics. We also present a comparison of the optimization techniques to be provided by the Application. The high spatial and spectral resolutions that will be obtained with AXAF instruments require a sophisticated data modeling capability. We will provide not only a suite of astronomical spatial and spectral source models, but also the capability of combining these models into source models of up to four data dimensions (i.e., into source functions f(E,x,y,t)). We will also provide tools to create instrument response models appropriate for each observation.

  18. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H I Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
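
    The core building block, separation by statistical independence, can be illustrated with a toy FastICA example. HIEMICA itself is a Bayesian extension tailored to interferometric 21 cm data, so the sketch below shows only the underlying idea on generic mixed signals.

        # Toy independent component analysis: unmix two linearly mixed signals
        # using only their statistical independence (FastICA).
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 8, 2000)
        sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # independent signals
        mixing = np.array([[1.0, 0.5], [0.4, 1.2]])             # unknown mixing
        observed = sources @ mixing.T                           # what we measure

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(observed)   # estimates, up to scale/order
        print(recovered.shape)                    # (2000, 2)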

  19. A novel CT acquisition and analysis technique for breathing motion modeling

    NASA Astrophysics Data System (ADS)

    Low, Daniel A.; White, Benjamin M.; Lee, Percy P.; Thomas, David H.; Gaudio, Sergio; Jani, Shyam S.; Wu, Xiao; Lamb, James M.

    2013-06-01

    To report on a novel technique for providing artifact-free quantitative four-dimensional computed tomography (4DCT) image datasets for breathing motion modeling. Commercial clinical 4DCT methods have difficulty managing irregular breathing. The resulting images contain motion-induced artifacts that can distort structures and inaccurately characterize breathing motion. We have developed a novel scanning and analysis method for motion-correlated CT that utilizes standard repeated fast helical acquisitions, a simultaneous breathing surrogate measurement, deformable image registration, and a published breathing motion model. The motion model differs from the CT-measured motion by an average of 0.65 mm, indicating the precision of the motion model. The integral of the divergence of one of the motion model parameters is predicted to be a constant 1.11 and is found in this case to be 1.09, indicating the accuracy of the motion model. The proposed technique shows promise for providing motion-artifact free images at user-selected breathing phases, accurate Hounsfield units, and noise characteristics similar to non-4D CT techniques, at a patient dose similar to or less than current 4DCT techniques.

  20. Sample preparation for the analysis of isoflavones from soybeans and soy foods.

    PubMed

    Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A

    2009-01-02

    This manuscript provides a review of the current state and the most recent advances, as well as current trends and future prospects, in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.

  1. The use of interactive graphic displays for interpretation of surface design parameters

    NASA Technical Reports Server (NTRS)

    Talcott, N. A., Jr.

    1981-01-01

    An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Display Data method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.

  2. New techniques for imaging and analyzing lung tissue.

    PubMed Central

    Roggli, V L; Ingram, P; Linton, R W; Gutknecht, W F; Mastin, P; Shelburne, J D

    1984-01-01

    The recent technological revolution in the field of imaging techniques has provided pathologists and toxicologists with an expanding repertoire of analytical techniques for studying the interaction between the lung and the various exogenous materials to which it is exposed. Analytical problems requiring elemental sensitivity or specificity beyond the range of that offered by conventional scanning electron microscopy and energy dispersive X-ray analysis are particularly appropriate for the application of these newer techniques. Electron energy loss spectrometry, Auger electron spectroscopy, secondary ion mass spectrometry, and laser microprobe mass analysis each offer unique advantages in this regard, but also possess their own limitations and disadvantages. Diffraction techniques provide crystalline structural information available through no other means. Bulk chemical techniques provide useful cross-checks on the data obtained by microanalytical approaches. It is the purpose of this review to summarize the methodology of these techniques, acknowledge situations in which they have been used in addressing problems in pulmonary toxicology, and comment on the relative advantages and disadvantages of each approach. It is necessary for an investigator to weigh each of these factors when deciding which technique is best suited for any given analytical problem; often it is useful to employ a combination of two or more of the techniques discussed. It is anticipated that there will be increasing utilization of these technologies for problems in pulmonary toxicology in the decades to come. PMID:6090115

  3. Characterizing TPS Microstructure: A Review of Some techniques

    NASA Technical Reports Server (NTRS)

    Gasch, Matthew; Stackpole, Mairead; Agrawal, Parul; Chavez-Garcie, Jose

    2011-01-01

    I. When seeking to understand ablator microstructure and morphology there are several useful techniques. A. SEM: 1) Visual characterization at various length scales. 2) Chemical mapping by backscatter or x-ray highlights areas of interest. 3) Combined with other techniques (density, weight change, chemical analysis), SEM is a powerful tool to aid in explaining thermo/structural data. B. ASAP: 1) Chemical characterization at various length scales. 2) Chemical mapping of pore structure by gas adsorption. 3) Provides a map of pore size vs. pore volume. 4) Provides surface area of exposed TPS. II. Both methods help characterize and understand how ablators react with other chemical species and provide insight into how they oxidize.

  4. Visual enhancement of images of natural resources: Applications in geology

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Neto, G.; Araujo, E. O.; Mascarenhas, N. D. A.; Desouza, R. C. M.

    1980-01-01

    The principal components technique, used in multispectral scanner (MSS) LANDSAT data processing, results in optimum dimensionality reduction. A powerful tool for MSS image enhancement, the method provides a maximum impression of terrain ruggedness; this fact makes the technique well suited for geological analysis.

  5. Collection Evaluation Techniques in the Academic Art Library.

    ERIC Educational Resources Information Center

    Kusnerz, Peggy Ann

    1983-01-01

    Presents an overview of library collection evaluation techniques described in the literature--list-checking, quantitative analysis, use studies, and subject specialist review--and offers suggestions to the librarian for the application of these methods in an art library. Twenty-five references are provided. (EJS)

  6. Optimization Techniques for 3D Graphics Deployment on Mobile Devices

    NASA Astrophysics Data System (ADS)

    Koskela, Timo; Vatjus-Anttila, Jarkko

    2015-03-01

    3D Internet technologies are becoming essential enablers in many application areas including games, education, collaboration, navigation and social networking. The use of 3D Internet applications with mobile devices provides location-independent access and richer use context, but also performance issues. Therefore, one of the important challenges facing 3D Internet applications is the deployment of 3D graphics on mobile devices. In this article, we present an extensive survey on optimization techniques for 3D graphics deployment on mobile devices and qualitatively analyze the applicability of each technique from the standpoints of visual quality, performance and energy consumption. The analysis focuses on optimization techniques related to data-driven 3D graphics deployment, because it supports off-line use, multi-user interaction, user-created 3D graphics and creation of arbitrary 3D graphics. The outcome of the analysis facilitates the development and deployment of 3D Internet applications on mobile devices and provides guidelines for future research.

  7. Metabolic Analysis

    NASA Astrophysics Data System (ADS)

    Tolstikov, Vladimir V.

    Analysis of the metabolome with coverage of all of the possibly detectable components in the sample, rather than analysis of each individual metabolite at a given time, can be accomplished by metabolic analysis. Targeted and/or nontargeted approaches are applied as needed for particular experiments. Monitoring hundreds or more metabolites at a given time requires high-throughput and high-end techniques that enable screening for relative changes in, rather than absolute concentrations of, compounds within a wide dynamic range. Most of the analytical techniques useful for these purposes use GC or HPLC/UPLC separation modules coupled to a fast and accurate mass spectrometer. GC separations require chemical modification (derivatization) before analysis, and work efficiently for the small molecules. HPLC separations are better suited for the analysis of labile and nonvolatile polar and nonpolar compounds in their native form. Direct infusion and NMR-based techniques are mostly used for fingerprinting and snap phenotyping, where applicable. Discovery and validation of metabolic biomarkers are exciting and promising opportunities offered by metabolic analysis applied to biological and biomedical experiments. We have demonstrated that GC-TOF-MS, HPLC/UPLC-RP-MS and HILIC-LC-MS techniques used for metabolic analysis offer sufficient metabolome mapping providing researchers with confident data for subsequent multivariate analysis and data mining.

  8. The Sixth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Sixth Annual Thermal and Fluids Analysis Workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  9. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  10. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    PubMed

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
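
    A minimal sketch of the clustering step, assuming k-means on standardized metric preference ratings; the rating dimensions, segment structure, and choice of k are invented for illustration.

        # Segment respondents by preference patterns with k-means clustering.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Ratings (1-7 scale) on: premium cost, provider choice, coverage breadth.
        seg1 = rng.normal([2, 6, 5], 0.5, (50, 3))   # choice-oriented respondents
        seg2 = rng.normal([6, 2, 3], 0.5, (50, 3))   # cost-oriented respondents
        ratings = np.clip(np.vstack([seg1, seg2]), 1, 7)

        X = StandardScaler().fit_transform(ratings)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))   # two segments of roughly 50 each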

  11. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurements of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment to the absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
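
    The computation at the heart of PSPFP is recovery of the wrapped phase from phase-shifted fringe images: for the standard four-step algorithm with shifts of pi/2, intensities I_n = A + B*cos(phi + n*pi/2) give phi = atan2(I_3 - I_1, I_0 - I_2). The sketch below uses synthetic carrier fringes and is a generic illustration, not this dissertation's measurement system.

        # Four-step phase-shifting calculation with synthetic fringe 'frames'.
        import numpy as np

        x = np.linspace(0, 4 * np.pi, 512)             # fringe carrier phase
        phi_true = 0.8 * np.sin(x / 2)                 # 'surface' phase to recover
        frames = [100 + 50 * np.cos(phi_true + x + n * np.pi / 2) for n in range(4)]

        wrapped = np.arctan2(frames[3] - frames[1], frames[0] - frames[2])
        unwrapped = np.unwrap(wrapped)                 # remove 2*pi jumps
        recovered = unwrapped - x                      # subtract the carrier
        print(np.max(np.abs(recovered - phi_true)))   # small residual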

  12. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.

  13. Navigating complex sample analysis using national survey data.

    PubMed

    Saylor, Jennifer; Friedmann, Erika; Lee, Hyeon Joo

    2012-01-01

    The National Center for Health Statistics conducts the National Health and Nutrition Examination Survey and other national surveys with probability-based complex sample designs. Goals of national surveys are to provide valid data for the population of the United States. Analyses of data from population surveys present unique challenges in the research process but are valuable avenues to study the health of the United States population. The aim of this study was to demonstrate the importance of using complex data analysis techniques for data obtained with a complex multistage sampling design and to provide an example of analysis using the SPSS Complex Samples procedure. Challenges and solutions specific to secondary data analysis of national databases are described using the National Health and Nutrition Examination Survey as the exemplar. Oversampling of small or sensitive groups provides necessary estimates of variability within small groups. Use of weights without complex samples accurately estimates population means and frequency from the sample after accounting for over- or undersampling of specific groups. Weighting alone leads to inappropriate population estimates of variability, because they are computed as if the measures were from the entire population rather than a sample in the data set. The SPSS Complex Samples procedure allows inclusion of all sampling design elements: stratification, clusters, and weights. Use of national data sets allows use of extensive, expensive, and well-documented survey data for exploratory questions but limits analysis to those variables included in the data set. The large sample permits examination of multiple predictors and interactive relationships. Merging data files, availability of data in several waves of surveys, and complex sampling are techniques used to provide a representative sample but present unique challenges. Sophisticated data analysis techniques optimize the use of these data.
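
    The effect of sampling weights on point estimates can be shown in a few lines: an oversampled subgroup biases the unweighted mean, and weighting restores the population estimate. As the article stresses, correct variance estimation additionally requires the strata and cluster information; this sketch covers the weighting step only, with invented numbers.

        # Why survey weights matter for point estimates.
        import numpy as np

        rng = np.random.default_rng(0)
        # Population: 90% group A (mean 120), 10% group B (mean 140, oversampled).
        group_a = rng.normal(120, 15, 900)   # sampled at the base rate
        group_b = rng.normal(140, 15, 500)   # 5x oversample of the 10% stratum

        values = np.concatenate([group_a, group_b])
        weights = np.concatenate([np.full(900, 1.0), np.full(500, 0.2)])

        print(values.mean())                        # biased toward group B
        print(np.average(values, weights=weights))  # near the population mean, 122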

  14. Research in interactive scene analysis

    NASA Technical Reports Server (NTRS)

    Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.

    1976-01-01

    Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.

  15. Parity-expanded variational analysis for nonzero momentum

    NASA Astrophysics Data System (ADS)

    Stokes, Finn M.; Kamleh, Waseem; Leinweber, Derek B.; Mahbub, M. Selim; Menadue, Benjamin J.; Owen, Benjamin J.

    2015-12-01

    In recent years, the use of variational analysis techniques in lattice QCD has been demonstrated to be successful in the investigation of the rest-mass spectrum of many hadrons. However, due to parity mixing, more care must be taken for investigations of boosted states to ensure that the projected correlation functions provided by the variational analysis correspond to the same states at zero momentum. In this paper we present the parity-expanded variational analysis (PEVA) technique, a novel method for ensuring the successful and consistent isolation of boosted baryons through a parity expansion of the operator basis used to construct the correlation matrix.

  16. Decision modeling for fire incident analysis

    Treesearch

    Donald G. MacGregor; Armando González-Cabán

    2009-01-01

    This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...

  17. Practical issues of hyperspectral imaging analysis of solid dosage forms.

    PubMed

    Amigo, José Manuel

    2010-09-01

    Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.

  18. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  19. An improved technique for the 2H/1H analysis of urines from diabetic volunteers

    USGS Publications Warehouse

    Coplen, T.B.; Harper, I.T.

    1994-01-01

    The H2-H2O ambient-temperature equilibration technique for the determination of 2H/1H ratios in urinary waters from diabetic subjects provides improved accuracy over the conventional Zn reduction technique. The standard deviation, ~1-2‰, is at least a factor of three better than that of the Zn reduction technique on urinary waters from diabetic volunteers. Experiments with pure water and solutions containing glucose, urea and albumen indicate that there is no measurable bias in the hydrogen equilibration technique.

  20. A Lagrangian Analysis of a Developing and Non-Developing Disturbance Observed During the PREDICT Experiment

    DTIC Science & Technology

    2012-12-03

    This paper provides an introduction to Lagrangian techniques for locating flow boundaries that encompass regions of recirculation in time-dependent flows... the low- to mid-level embryonic vortex from adverse conditions, while the... wave or disturbance.

  1. Tourism English Teaching Techniques Converged from Two Different Angles.

    ERIC Educational Resources Information Center

    Seong, Myeong-Hee

    2001-01-01

    Provides techniques converged from two different angles (learners and tourism English features) for effective tourism English teaching in a junior college in Korea. Used a questionnaire, needs analysis, an instrument for measuring learners' strategies for oral communication, a small-scale classroom study for learners' preferred teaching…

  2. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
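
    The record predates modern scripting languages, but the idea translates directly; the toy model below (our illustration, with made-up component tolerances and a made-up performance function) draws disturbances and misalignments from their full statistical distributions and propagates them through a simple system response by simulated random sampling.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # number of Monte Carlo trials

    # Hypothetical component disturbances, each with its own distribution.
    gain_error = rng.normal(0.0, 0.02, N)          # amplifier gain error
    misalignment = rng.uniform(-0.5, 0.5, N)       # sensor misalignment, degrees
    bias = rng.normal(0.0, 0.1, N)                 # offset drift

    # Simple system performance model: pointing error as a function of parts.
    pointing_error = (1.0 + gain_error) * np.abs(misalignment) + np.abs(bias)

    # Unbiased performance statistics from the sampled ensemble.
    print(f"mean error     : {pointing_error.mean():.3f} deg")
    print(f"99th percentile: {np.percentile(pointing_error, 99):.3f} deg")
    print(f"P(error > 0.6) : {(pointing_error > 0.6).mean():.4f}")
    ```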

  3. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  4. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  5. Brownian Motion--a Laboratory Experiment.

    ERIC Educational Resources Information Center

    Kruglak, Haym

    1988-01-01

    Introduces an experiment involving the observation of Brownian motion for college students. Describes the apparatus, experimental procedures, data analysis and results, and error analysis. Lists experimental techniques used in the experiment. Provides a circuit diagram, typical data, and graphs. (YP)

  6. MPATHav: A software prototype for multiobjective routing in transportation risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganter, J.H.; Smith, J.D.

    Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. 'Objectives' to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid 'multiobjective spatial analysis' (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
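
    A minimal way to reproduce this behaviour is a weighted-sum scan over two objectives on a road graph. The sketch below (ours, using networkx on an invented network, not the MPATHav code) sweeps the trade-off weight to expose the range of solutions between the short-distance and low-exposure extremes.

    ```python
    import networkx as nx

    # Toy road network; each edge carries two objectives: distance, exposure.
    edges = [
        ("A", "B", 4.0, 9.0), ("B", "D", 4.0, 9.0),   # short but high exposure
        ("A", "C", 7.0, 2.0), ("C", "D", 7.0, 2.0),   # long but low exposure
        ("B", "C", 2.0, 5.0),
    ]
    G = nx.Graph()
    for u, v, dist, exposure in edges:
        G.add_edge(u, v, dist=dist, exposure=exposure)

    # Sweep the weight: alpha = 1 minimizes distance only, 0 exposure only.
    for alpha in (1.0, 0.75, 0.5, 0.25, 0.0):
        for u, v, data in G.edges(data=True):
            data["cost"] = alpha * data["dist"] + (1 - alpha) * data["exposure"]
        path = nx.dijkstra_path(G, "A", "D", weight="cost")
        d = sum(G[a][b]["dist"] for a, b in zip(path, path[1:]))
        e = sum(G[a][b]["exposure"] for a, b in zip(path, path[1:]))
        print(f"alpha={alpha:.2f}  path={'-'.join(path)}  dist={d}  exposure={e}")
    ```

    Enumerating all nondominated paths exactly would require a multiobjective label-correcting algorithm; the weighted-sum scan is the simplest approximation of the solution suites described above.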

  7. Blood volume analysis: a new technique and new clinical interest reinvigorate a classic study.

    PubMed

    Manzone, Timothy A; Dam, Hung Q; Soltis, Daniel; Sagar, Vidya V

    2007-06-01

    Blood volume studies using the indicator dilution technique and radioactive tracers have been performed in nuclear medicine departments for over 50 y. A nuclear medicine study is the gold standard for blood volume measurement, but the classic dual-isotope blood volume study is time-consuming and can be prone to technical errors. Moreover, a lack of normal values and a rubric for interpretation made volume status measurement of limited interest to most clinicians other than some hematologists. A new semiautomated system for blood volume analysis is now available and provides highly accurate results for blood volume analysis within only 90 min. The availability of rapid, accurate blood volume analysis has brought about a surge of clinical interest in using blood volume data for clinical management. Blood volume analysis, long a low-volume nuclear medicine study all but abandoned in some laboratories, is poised to enter the clinical mainstream. This article will first present the fundamental principles of fluid balance and the clinical means of volume status assessment. We will then review the indicator dilution technique and how it is used in nuclear medicine blood volume studies. We will present an overview of the new semiautomated blood volume analysis technique, showing how the study is done, how it works, what results are provided, and how those results are interpreted. Finally, we will look at some of the emerging areas in which data from blood volume analysis can improve patient care. The reader will gain an understanding of the principles underlying blood volume assessment, know how current nuclear medicine blood volume analysis studies are performed, and appreciate their potential clinical impact.
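
    The indicator dilution principle itself is a short calculation: inject a known tracer activity, back-extrapolate the measured plasma concentration to the injection time, and divide. The sketch below is our illustration of that arithmetic on invented numbers, not the semiautomated system's algorithm.

    ```python
    import numpy as np

    # Known injected tracer activity (e.g., counts per minute).
    injected_activity = 1.2e6

    # Hypothetical plasma samples: time (min) and concentration (cpm/mL).
    t = np.array([12.0, 18.0, 24.0, 30.0])
    c = np.array([245.0, 238.0, 231.0, 224.0])

    # Tracer slowly leaves the plasma pool, so fit log-concentration vs. time
    # and back-extrapolate to the moment of injection (t = 0).
    slope, intercept = np.polyfit(t, np.log(c), 1)
    c0 = np.exp(intercept)  # concentration at t = 0, before any tracer loss

    # Indicator dilution: volume = amount injected / initial concentration.
    volume_ml = injected_activity / c0
    print(f"extrapolated C0 = {c0:.1f} cpm/mL")
    print(f"estimated volume = {volume_ml / 1000:.2f} L")
    ```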

  8. Application of Genetic Algorithm (GA) Assisted Partial Least Square (PLS) Analysis on Trilinear and Non-trilinear Fluorescence Data Sets to Quantify the Fluorophores in Multifluorophoric Mixtures: Improving Quantification Accuracy of Fluorimetric Estimations of Dilute Aqueous Mixtures.

    PubMed

    Kumar, Keshav

    2018-03-01

    Excitation-emission matrix fluorescence (EEMF) and total synchronous fluorescence spectroscopy (TSFS) are the two fluorescence techniques that are commonly used for the analysis of multifluorophoric mixtures. These two fluorescence techniques are conceptually different and provide certain advantages over each other. The manual analysis of such highly correlated, large-volume EEMF and TSFS data sets towards developing a calibration model is difficult. Partial least square (PLS) analysis can analyze large volumes of EEMF and TSFS data sets by finding important factors that maximize the correlation between the spectral and concentration information for each fluorophore. However, the application of PLS analysis on entire data sets often does not provide a robust calibration model and requires the application of a suitable pre-processing step. The present work evaluates the application of genetic algorithm (GA) analysis prior to PLS analysis on EEMF and TSFS data sets towards improving the precision and accuracy of the calibration model. The GA essentially combines the advantages provided by stochastic methods with those provided by deterministic approaches and can find the set of EEMF and TSFS variables that correlate well with the concentration of each of the fluorophores present in the multifluorophoric mixtures. The utility of the GA-assisted PLS analysis is successfully validated using (i) EEMF data sets acquired for dilute aqueous mixtures of four biomolecules and (ii) TSFS data sets acquired for dilute aqueous mixtures of four carcinogenic polycyclic aromatic hydrocarbon (PAH) mixtures. In the present work, it is shown that by using the GA it is possible to significantly improve the accuracy and precision of the PLS calibration model developed for both EEMF and TSFS data sets. Hence, GA must be considered as a useful pre-processing technique while developing EEMF and TSFS calibration models.
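
    A self-contained sketch of the idea on synthetic data is given below; it is our reconstruction of the general GA-plus-PLS recipe (a population of binary wavelength masks, cross-validated RMSE as fitness), not the paper's exact algorithm or parameter settings.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)

    # Synthetic "spectra": 40 mixtures x 200 variables from 4 fluorophores.
    n, p, k = 40, 200, 4
    conc = rng.uniform(0, 1, (n, k))                      # concentrations
    profiles = rng.uniform(0, 1, (k, p)) ** 4             # pure-component spectra
    X = conc @ profiles + rng.normal(0, 0.05, (n, p))     # mixtures + noise
    y = conc[:, 0]                                        # quantify fluorophore 1

    def rmsecv(mask):
        """Cross-validated RMSE of a PLS model on the selected variables."""
        if mask.sum() < 5:
            return np.inf
        pred = cross_val_predict(PLSRegression(n_components=3),
                                 X[:, mask], y, cv=5).ravel()
        return float(np.sqrt(np.mean((pred - y) ** 2)))

    # Simple GA over binary variable-selection masks.
    pop = rng.random((30, p)) < 0.3                       # initial population
    for gen in range(25):
        fit = np.array([rmsecv(m) for m in pop])
        pop = pop[np.argsort(fit)]                        # best first (elitism)
        children = []
        while len(children) < len(pop) // 2:
            a, b = pop[rng.integers(0, 10, 2)]            # parents from top 10
            child = np.where(rng.random(p) < 0.5, a, b)   # uniform crossover
            child ^= rng.random(p) < 0.01                 # bit-flip mutation
            children.append(child)
        pop[len(pop) // 2:] = children                    # replace worst half

    best = pop[0]
    print(f"full-spectrum RMSECV: {rmsecv(np.ones(p, bool)):.4f}")
    print(f"GA-selected RMSECV  : {rmsecv(best):.4f} ({best.sum()} variables)")
    ```

    On real EEMF or TSFS matrices the data would first be unfolded to a samples-by-variables table, and the GA settings (population size, mutation rate, generations) would need tuning.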

  9. A comparative analysis of conventional cytopreparatory and liquid based cytological techniques (Sure Path) in evaluation of serous effusion fluids.

    PubMed

    Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas

    2016-11-01

    Clinically, detection of malignant cells in serous body fluids is critical, as their presence implies the upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, currently, the effusion fluid samples are analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparison with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters. A comparative analysis of the advantages and disadvantages of the techniques in detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) and diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a statistically significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided a significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although a reduced screening time and clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality with sharper cytomorphologic features, which is more critical from the cytologic interpretation point of view. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.

  10. Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.

    PubMed

    Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko

    2017-11-01

    A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within these granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates for mean granule sizes. Overall, SA was a time-consuming but accurate technique that provided reliable information for the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates for mean particle size. In general, FS was two to three orders of magnitude faster than SA.

  11. Corporate Advertisements and Environmental Futures: An Educational Odyssey.

    ERIC Educational Resources Information Center

    Thomashow, Mitchell

    1988-01-01

    Analyzes advertisements as vision, myth, and mirror. Provides an interpretive technique that can be used as a curriculum for studying issues advertisements. Offers eight advertisements as examples and provides an analysis and critique of each. (MVL)

  12. Soil chemical insights provided through vibrational spectroscopy

    USDA-ARS?s Scientific Manuscript database

    Vibrational spectroscopy techniques provide a powerful approach to study environmental materials and processes. These multifunctional analysis tools can be used to probe molecular vibrations of solid, liquid, and gaseous samples for characterizing materials, elucidating reaction mechanisms, and exam...

  13. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  14. Nanoscale deformation analysis with high-resolution transmission electron microscopy and digital image correlation

    DOE PAGES

    Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...

    2015-09-10

    We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
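
    At its core, DIC finds the displacement that best aligns one image subset with the next. The sketch below (our simplification, not the HRTEM-DIC pipeline) recovers a known rigid-body translation from noisy synthetic speckle images by FFT phase correlation, in the spirit of the rigid-body translation experiments described above.

    ```python
    import numpy as np

    def phase_correlation(ref, cur):
        """Integer-pixel shift that registers `cur` back onto `ref`."""
        F1, F2 = np.fft.fft2(ref), np.fft.fft2(cur)
        cross = F1 * np.conj(F2)
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks in the upper half of each axis wrap to negative shifts.
        return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

    rng = np.random.default_rng(3)
    ref = rng.normal(size=(128, 128))           # synthetic speckle pattern
    cur = np.roll(ref, (5, -3), axis=(0, 1))    # apply a known translation
    cur = cur + rng.normal(0, 0.1, cur.shape)   # add TEM-like image noise

    # The applied shift was (5, -3); the registration shift is its negative.
    print(phase_correlation(ref, cur))          # expect [-5, 3]
    ```

    Real DIC refines this to subpixel accuracy and solves per-subset deformations, but the correlation step is the same idea.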

  15. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    ...systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in... this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each...

  16. Dual parallel mass spectrometry (LC1/MS2 and LC2/MS2) for lipid and vitamin D analysis

    USDA-ARS?s Scientific Manuscript database

    Atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) and electrospray ionization (ESI) MS are complementary techniques that provide different types of information for lipids such as triacylglycerols, phospholipids, and fat-soluble vitamins. Since no one technique is by itself idea...

  17. Dual Parallel Mass Spectrometry (LC1/MS2 and LC2/MS2) for Lipid and Vitamin D Analysis

    USDA-ARS?s Scientific Manuscript database

    Atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) and electrospray ionization (ESI) MS are complementary techniques that provide different types of information for lipids such as triacylglycerols (TAGs), phospholipids, and fat-soluble vitamins. Since no one technique is by itsel...

  18. Checking of individuality by DNA profiling.

    PubMed

    Brdicka, R; Nürnberg, P

    1993-08-25

    A review of methods of DNA analysis used in forensic medicine for identification, paternity testing, etc. is provided. Among other techniques, DNA fingerprinting using different probes and polymerase chain reaction-based techniques such as amplified sequence polymorphisms and minisatellite variant repeat mapping are thoroughly described and both theoretical and practical aspects are discussed.

  19. Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition

    NASA Astrophysics Data System (ADS)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso

    2005-04-01

    Human movement analysis is generally performed through the utilization of marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers allocated on specific points of the human body. Marker-based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained in both synthetic and experimental tests show a significant reduction in computational cost with no significant loss of tracking accuracy.

  20. Application of gas chromatography to analysis of spirit-based alcoholic beverages.

    PubMed

    Wiśniewska, Paulina; Śliwińska, Magdalena; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-01-01

    Spirit-based beverages are alcoholic drinks; their production processes are dependent on the type and origin of raw materials. The composition of this complex matrix is difficult to analyze, and scientists commonly choose gas chromatography techniques for this reason. With a wide selection of extraction methods and detectors it is possible to provide qualitative and quantitative analysis for many chemical compounds with various functional groups. This article describes different types of gas chromatography techniques and their most commonly used associated extraction techniques (e.g., LLE, SPME, SPE, SFE, and SBME) and detectors (MS, TOFMS, FID, ECD, NPD, AED, O or EPD). Additionally, brief characteristics of internationally popular spirit-based beverages and application of gas chromatography to the analysis of selected alcoholic drinks are presented.

  1. Multivariate statistical analysis: Principles and applications to coorbital streams of meteorite falls

    NASA Technical Reports Server (NTRS)

    Wolf, S. F.; Lipschutz, M. E.

    1993-01-01

    Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855-1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that, by a totally different criterion, labile trace element contents - hence thermal histories - of 13 Cluster 1 meteorites are distinguishable from those of 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
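
    For readers new to these tools, the sketch below shows the flavour of the approach on synthetic stand-in data (our example, not the meteorite measurements): a linear discriminant is trained on labelled falls and then classifies a new one.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(7)

    # Synthetic stand-ins for fall parameters / trace element contents.
    cluster1 = rng.normal([2.0, 5.0, 1.0], 0.6, (17, 3))   # 17 Cluster 1 falls
    others = rng.normal([3.0, 4.0, 1.8], 0.6, (45, 3))     # 45 other H chondrites
    X = np.vstack([cluster1, others])
    y = np.array([1] * 17 + [0] * 45)

    lda = LinearDiscriminantAnalysis()
    lda.fit(X, y)

    print(f"training accuracy: {lda.score(X, y):.2f}")
    print("discriminant weights:", np.round(lda.coef_.ravel(), 2))
    print("new fall classified as:", lda.predict([[2.2, 4.8, 1.1]])[0])
    ```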

  2. Wing download reduction using vortex trapping plates

    NASA Technical Reports Server (NTRS)

    Light, Jeffrey S.; Stremel, Paul M.; Bilanin, Alan J.

    1994-01-01

    A download reduction technique using spanwise plates on the upper and lower wing surfaces has been examined. Experimental and analytical techniques were used to determine the download reduction obtained using this technique. Simple two-dimensional wind tunnel testing confirmed the validity of the technique for reducing two-dimensional airfoil drag. Computations using a two-dimensional Navier-Stokes analysis provided insight into the mechanism causing the drag reduction. Finally, the download reduction technique was tested using a rotor and wing to determine the benefits for a semispan configuration representative of a tilt rotor aircraft.

  3. Flight testing techniques for the evaluation of light aircraft stability derivatives: A review and analysis

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Summery, D. C.; Johnson, W. D.

    1972-01-01

    Techniques quoted in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center was regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the estimated values of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.

  4. Strategies for Fermentation Medium Optimization: An In-Depth Review

    PubMed Central

    Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.

    2017-01-01

    Optimization of the production medium is required to maximize metabolite yield. This can be achieved by using a wide range of techniques, from classical “one-factor-at-a-time” to modern statistical and mathematical techniques, viz. artificial neural networks (ANN), genetic algorithms (GA), etc. Every technique comes with its own advantages and disadvantages, and despite drawbacks some techniques are applied to obtain best results. Use of various optimization techniques in combination also provides desirable results. In this article an attempt has been made to review the currently used media optimization techniques applied during the fermentation process of metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been done, and a logical basis for the design of fermentation media has been given in the present review. Overall, this review will provide the rationale for the selection of a suitable optimization technique for media design employed during the fermentation process of metabolite production. PMID:28111566

  5. Trailing Ballute Aerocapture: Concept and Feasibility Assessment

    NASA Technical Reports Server (NTRS)

    Miller, Kevin L.; Gulick, Doug; Lewis, Jake; Trochman, Bill; Stein, Jim; Lyons, Daniel T.; Wilmoth, Richard G.

    2003-01-01

    Trailing Ballute Aerocapture offers the potential to obtain orbit insertion around a planetary body at a fraction of the mass of traditional methods. This allows for lower costs for launch, faster flight times and additional mass available for science payloads. The technique involves an inflated ballute (balloon-parachute) that provides aerodynamic drag area for use in the atmosphere of a planetary body to provide for orbit insertion in a relatively benign heating environment. To account for atmospheric, navigation and other uncertainties, the ballute is oversized and detached once the desired velocity change (Delta V) has been achieved. Analysis and trades have been performed for the purpose of assessing the feasibility of the technique including aerophysics, material assessments, inflation system and deployment sequence and dynamics, configuration trades, ballute separation and trajectory analysis. Outlined is the technology development required for advancing the technique to a level that would allow it to be viable for use in space exploration missions.

  6. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  7. Plasma properties of hot coronal loops utilizing coordinated SMM and solar research rocket observations

    NASA Technical Reports Server (NTRS)

    Moses, J. Daniel

    1989-01-01

    Three improvements in photographic x-ray imaging techniques for solar astronomy are presented. The testing and calibration of a new film processor was conducted; the resulting product will allow photometric development of sounding rocket flight film immediately upon recovery at the missile range. Two fine-grained photographic films were calibrated and flight tested to provide alternative detector choices when the need for high resolution is greater than the need for high sensitivity. An analysis technique used to obtain the characteristic curve directly from photographs of UV solar spectra was applied to the analysis of soft x-ray photographic images. The resulting procedure provides a more complete and straightforward determination of the parameters describing the x-ray characteristic curve than previous techniques. These improvements fall into the category of refinements instead of revolutions, indicating the fundamental suitability of the photographic process for x-ray imaging in solar astronomy.

  8. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

    Data Fusion requires the ability to combine or “fuse” data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  9. A comparison of solute-transport solution techniques based on inverse modelling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2000-01-01

    Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.

  10. A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data

    DOE PAGES

    Fan, Ya Ju; Kamath, Chandrika

    2016-09-01

    The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.
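
    The basic CS workflow the paper studies can be illustrated in a few lines (our toy example, not the authors' code): a sparse signal is measured through a random Gaussian matrix and recovered with orthogonal matching pursuit, a standard sparse recovery algorithm.

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)
    n, m, k = 400, 120, 10          # signal length, measurements, sparsity

    # Sparse signal: only k nonzero coefficients.
    x = np.zeros(n)
    support = rng.choice(n, k, replace=False)
    x[support] = rng.normal(0, 1, k)

    # Compressed measurements through a random sensing matrix (m << n).
    A = rng.normal(0, 1 / np.sqrt(m), (m, n))
    b = A @ x

    # Sparse recovery: find the k-sparse solution consistent with b.
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
    omp.fit(A, b)
    x_hat = omp.coef_

    err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
    print(f"compression ratio m/n = {m/n:.2f}, relative error = {err:.2e}")
    ```

    When the sparsity condition holds, the recovery is near perfect even though only a fraction of the samples was stored, which is the trade-off the paper quantifies on simulation data.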

  11. A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Ya Ju; Kamath, Chandrika

    The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.

  12. A Photometric Technique for Determining Fluid Concentration using Consumer-Grade Hardware

    NASA Technical Reports Server (NTRS)

    Leslie, F.; Ramachandran, N.

    1999-01-01

    In support of a separate study to produce an exponential concentration gradient in a magnetic fluid, a noninvasive technique for determining species concentration from off-the-shelf hardware has been developed. The approach uses a backlighted fluid test cell photographed with a commercial digital camcorder. Because the light extinction coefficient is wavelength dependent, tests were conducted to determine the best filter color to use, although some guidance was also provided using an absorption spectrophotometer. With the appropriate filter in place, the attenuation of the light passing through the test cell was captured by the camcorder. The digital image was analyzed for intensity using software from Scion Image Corp. downloaded from the Internet. The analysis provides a two-dimensional array of concentration with an average error of 0.0095 ml/ml. This technique is superior to invasive techniques, which require extraction of a sample that disturbs the concentration distribution in the test cell. Refinements of this technique using a true monochromatic laser light source are also discussed.
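
    The photometric principle here is the Beer-Lambert law: transmitted intensity falls exponentially with concentration, so concentration follows from the logarithm of an intensity ratio against a calibration. A minimal version with invented calibration values (not the paper's data):

    ```python
    import numpy as np

    # Calibration: pixel intensities through cells of known concentration.
    conc_cal = np.array([0.00, 0.05, 0.10, 0.20])       # ml/ml
    intensity = np.array([212.0, 158.0, 118.0, 65.0])   # mean pixel values

    # Beer-Lambert: I = I0 * exp(-e*L*c), so ln(I0/I) is linear in c.
    absorbance = np.log(intensity[0] / intensity)
    slope = np.polyfit(conc_cal, absorbance, 1)[0]      # effective e*L

    # Unknown sample: convert its measured intensity to concentration.
    sample_intensity = 95.0
    sample_conc = np.log(intensity[0] / sample_intensity) / slope
    print(f"estimated concentration = {sample_conc:.3f} ml/ml")
    ```

    Applied pixel by pixel, this same conversion yields the two-dimensional concentration array described in the abstract.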

  13. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  14. Ultrafast Method for the Analysis of Fluorescence Lifetime Imaging Microscopy Data Based on the Laguerre Expansion Technique

    PubMed Central

    Jo, Javier A.; Fang, Qiyin; Marcu, Laura

    2007-01-01

    We report a new deconvolution method for fluorescence lifetime imaging microscopy (FLIM) based on the Laguerre expansion technique. The performance of this method was tested on synthetic and real FLIM images. The following interesting properties of this technique were demonstrated. 1) The fluorescence intensity decay can be estimated simultaneously for all pixels, without a priori assumption of the decay functional form. 2) The computation speed is extremely fast, performing at least two orders of magnitude faster than current algorithms. 3) The estimated maps of Laguerre expansion coefficients provide a new domain for representing FLIM information. 4) The number of images required for the analysis is relatively small, allowing reduction of the acquisition time. These findings indicate that the developed Laguerre expansion technique for FLIM analysis represents a robust and extremely fast deconvolution method that enables practical applications of FLIM in medicine, biology, biochemistry, and chemistry. PMID:19444338
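
    The essence of the technique is that the measured decay is expanded on a Laguerre basis, so per-pixel estimation reduces to one linear least-squares solve with no assumed decay form. The sketch below is our simplification on a synthetic decay; the published method additionally deconvolves the instrument response and uses discrete Laguerre functions with a tunable scale parameter.

    ```python
    import numpy as np
    from numpy.polynomial.laguerre import Laguerre

    t = np.linspace(0, 20, 200)

    def laguerre_basis(t, order):
        """Laguerre functions l_n(t) = exp(-t/2) * L_n(t), n = 0..order-1."""
        return np.column_stack(
            [np.exp(-t / 2) * Laguerre.basis(n)(t) for n in range(order)]
        )

    # Synthetic bi-exponential fluorescence decay with noise.
    rng = np.random.default_rng(5)
    decay = 0.7 * np.exp(-t / 1.5) + 0.3 * np.exp(-t / 5.0)
    decay = decay + rng.normal(0, 0.01, t.size)

    # Linear least squares: all expansion coefficients in one solve, with
    # no iteration and no assumed functional form - the source of the speed.
    B = laguerre_basis(t, order=6)
    coeffs, *_ = np.linalg.lstsq(B, decay, rcond=None)
    fitted = B @ coeffs

    rms = np.sqrt(np.mean((fitted - decay) ** 2))
    print("Laguerre coefficients:", np.round(coeffs, 3))
    print(f"fit RMS residual: {rms:.4f}")
    ```

    Because the solve is linear, it can be applied to every pixel of a FLIM stack at once, which is how the method achieves its reported speedup.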

  15. The role of input-output analysis of energy and ecologic systems. In the early development of ecological economics--a personal perspective.

    PubMed

    Hannon, Bruce

    2010-01-01

    A summary is provided of the early history of research on the flow of nonrenewable energy resources through the economy and of the flow of renewable energy resources through a natural ecosystem. The techniques are similar, and many specific applications are provided. A combined economic and ecological technique is also defined. The early history and people of the International Society for Ecological Economics are cited.

  16. Wind profiler signal detection improvements

    NASA Technical Reports Server (NTRS)

    Hart, G. F.; Divis, Dale H.

    1992-01-01

    Research is described on potential improvements to the software used with the NASA 49.25 MHz wind profiler located at Kennedy Space Center. In particular, the analysis and results are provided of a study to (1) identify preferred mathematical techniques for the detection of atmospheric signals that provide wind velocities but are obscured by natural and man-made sources, and (2) analyze one or more preferred techniques to demonstrate proof of the capability to improve the detection of wind velocities.

  17. Model for spectral and chromatographic data

    DOEpatents

    Jarman, Kristin [Richland, WA]; Willse, Alan [Richland, WA]; Wahl, Karen [Richland, WA]; Wahl, Jon [Richland, WA]

    2002-11-26

    A method and apparatus using a spectral analysis technique are disclosed. In one form of the invention, probabilities are selected to characterize the presence (and in another form, also a quantification of a characteristic) of peaks in an indexed data set for samples that match a reference species, and other probabilities are selected for samples that do not match the reference species. An indexed data set is acquired for a sample, and a determination is made according to techniques exemplified herein as to whether the sample matches or does not match the reference species. When quantification of peak characteristics is undertaken, the model is appropriately expanded, and the analysis accounts for the characteristic model and data. Further techniques are provided to apply the methods and apparatuses to process control, cluster analysis, hypothesis testing, analysis of variance, and other procedures involving multiple comparisons of indexed data.

  18. A study of data analysis techniques for the multi-needle Langmuir probe

    NASA Astrophysics Data System (ADS)

    Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.

    2018-06-01

    In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to be better than measurements from incoherent scatter radar and in situ instruments, m-NLPs can be longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
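
    In the orbital-motion-limited regime the square of the electron saturation current grows linearly with probe bias, so a straight-line fit of I^2 against the fixed bias voltages of the needles yields the electron density up to a geometry-dependent constant, without knowing the plasma potential. A schematic version (ours, with invented currents and a made-up calibration constant):

    ```python
    import numpy as np

    # Fixed bias voltages of the four needle probes (volts vs. payload).
    V = np.array([2.5, 4.0, 5.5, 7.0])

    # Hypothetical collected currents (amperes) sampled at one instant.
    I = np.array([1.9e-6, 2.5e-6, 3.0e-6, 3.4e-6])

    # In the OML regime I^2 is linear in bias voltage and its slope scales
    # with n_e^2, so one linear fit per sample gives the density without
    # the plasma potential or electron temperature.
    slope, _ = np.polyfit(V, I**2, 1)

    C = 1.3e-17  # made-up geometry/mass calibration constant, illustration only
    n_e = np.sqrt(slope) / C
    print(f"slope d(I^2)/dV = {slope:.3e} A^2/V")
    print(f"electron density ~ {n_e:.2e} m^-3")
    ```

    The non-linear least squares alternative discussed in the paper instead fits the full probe-current model to all needles simultaneously.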

  19. Combining Raman spectroscopy and digital holographic microscopy for label-free classification of human immune cells (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    McReynolds, Naomi; Cooke, Fiona G. M.; Chen, Mingzhou; Powis, Simon J.; Dholakia, Kishan

    2017-02-01

    Moving towards label-free techniques for cell identification is essential for many clinical and research applications. Raman spectroscopy and digital holographic microscopy (DHM) are both label-free, non-destructive optical techniques capable of providing complementary information. We demonstrate a multi-modal system which may simultaneously take Raman spectra and DHM images to provide both a molecular and a morphological description of our sample. In this study we use Raman spectroscopy and DHM to discriminate between three immune cell populations: CD4+ T cells, B cells, and monocytes, which together comprise key functional immune cell subsets in immune responses to invading pathogens. Various parameters that may be used to describe the phase images are also examined, such as pixel value histograms or texture analysis. Using our system it is possible to consider each technique individually or in combination. Principal component analysis is used on the data set to discriminate between cell types, and leave-one-out cross-validation is used to estimate the efficiency of our method. Raman spectroscopy provides specific chemical information but requires relatively long acquisition times; combining this with a faster modality such as DHM could help achieve faster throughput rates. The combination of these two complementary optical techniques provides a wealth of information for cell characterisation, which is a step towards achieving label-free technology for the identification of human immune cells.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.

  1. The Need For Dedicated Bifurcation Stents: A Critical Analysis

    PubMed Central

    Lesiak, Maciej

    2016-01-01

    There is growing evidence that optimally performed two-stent techniques may provide similar or better results compared with the simple techniques for bifurcation lesions, with an observed trend towards improvements in clinical and/or angiographic outcomes with a two-stent strategy. Yet, provisional stenting remains the treatment of choice. Here, the author discusses the evidence – and controversies – concerning when and how to use complex techniques. PMID:29588719

  2. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
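
    To illustrate the kind of model such a tool would construct automatically, here is a minimal sketch (ours, not the proposed system): a three-state Markov model of a duplex system with illustrative failure and repair rates, solved for the probability of system failure over time.

    ```python
    import numpy as np
    from scipy.linalg import expm

    lam, mu = 1e-3, 1e-2   # per-hour failure and repair rates (illustrative)

    # States: 0 = both units good, 1 = one failed (degraded), 2 = system failed.
    # Continuous-time Markov chain generator matrix Q (rows sum to zero).
    Q = np.array([
        [-2 * lam,  2 * lam,      0.0],
        [      mu, -(mu + lam),   lam],
        [     0.0,       0.0,     0.0],   # failed state is absorbing
    ])

    p0 = np.array([1.0, 0.0, 0.0])        # start with both units healthy
    for hours in (100, 1000, 10000):
        p = p0 @ expm(Q * hours)          # state probabilities at time t
        print(f"t = {hours:5d} h  unreliability = {p[2]:.4f}")
    ```

    The bottleneck the abstract identifies is building Q for realistic systems; the envisioned tool would generate it from a top-down design description rather than by hand.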

  3. 38 CFR 1.921 - Analysis of costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... effectiveness of alternative collection techniques, establish guidelines with respect to points at which costs... Standards for Collection of Claims § 1.921 Analysis of costs. VA collection procedures should provide for...

  4. POLO: a user's guide to Probit Or LOgit analysis.

    Treesearch

    Jacqueline L. Robertson; Robert M. Russell; N.E. Savin

    1980-01-01

    This user's guide provides detailed instructions for the use of POLO (Probit Or LOgit), a computer program for the analysis of quantal response data such as that obtained from insecticide bioassays by the techniques of probit or logit analysis. Dosage-response lines may be compared for parallelism or...
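
    POLO itself is a standalone program, but the probit technique it implements is a short maximum-likelihood fit. The sketch below is our generic reimplementation on invented bioassay counts, not POLO's code; it estimates the dose-response line and the LD50.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Invented insecticide bioassay: dose, number treated, number responding.
    dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    n = np.array([50, 50, 50, 50, 50])
    r = np.array([4, 12, 26, 39, 48])
    x = np.log10(dose)  # probit lines are fit against log dose

    def neg_log_lik(params):
        a, b = params
        p = norm.cdf(a + b * x).clip(1e-9, 1 - 1e-9)  # response probability
        return -np.sum(r * np.log(p) + (n - r) * np.log(1 - p))

    fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
    a, b = fit.x
    ld50 = 10 ** (-a / b)  # dose where the fitted response probability is 0.5
    print(f"intercept a = {a:.3f}, slope b = {b:.3f}")
    print(f"estimated LD50 = {ld50:.2f} dose units")
    ```

    A logit fit differs only in replacing the normal CDF with the logistic function, which is exactly the choice the program's name refers to.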

  5. Multiple-Group Analysis Using the sem Package in the R System

    ERIC Educational Resources Information Center

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  6. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
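
    PyMVPA wraps exactly this kind of pipeline; for orientation, here is a generic classifier-based analysis sketched with scikit-learn rather than PyMVPA's own API (our toy data, not an fMRI set): a linear classifier is cross-validated on multivoxel patterns to decode a cognitive state.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    rng = np.random.default_rng(4)

    # Toy "fMRI" data: 80 trials x 500 voxels, two cognitive states A and B.
    n_trials, n_voxels = 80, 500
    y = np.repeat([0, 1], n_trials // 2)
    X = rng.normal(0, 1, (n_trials, n_voxels))
    X[y == 1, :20] += 0.5        # state B weakly modulates 20 voxels

    # Multivariate pattern classification with cross-validation: can the
    # pattern across voxels decode the cognitive state above chance?
    clf = LinearSVC(C=1.0, dual=False)
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```

    A univariate test on any single voxel would likely miss this weak distributed signal, which is the sensitivity argument the abstract makes for multivariate methods.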

  7. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561

  8. Discriminant forest classification method and system

    DOEpatents

    Chen, Barry Y.; Hanley, William G.; Lemmond, Tracy D.; Hiller, Lawrence J.; Knapp, David A.; Mugge, Marshall J.

    2012-11-06

    A hybrid machine learning methodology and system for classification that combines classical random forest (RF) methodology with discriminant analysis (DA) techniques to provide enhanced classification capability. A DA technique which uses feature measurements of an object to predict its class membership, such as linear discriminant analysis (LDA) or the Anderson-Bahadur linear discriminant technique (AB), is used to split the data at each node in each of its classification trees to train and grow the trees and the forest. When training is finished, a set of n DA-based decision trees of a discriminant forest is produced for use in predicting the classification of new samples of unknown class.

  9. Multimodal biophotonic workstation for live cell analysis.

    PubMed

    Esseling, Michael; Kemper, Björn; Antkowiak, Maciej; Stevenson, David J; Chaudet, Lionel; Neil, Mark A A; French, Paul W; von Bally, Gert; Dholakia, Kishan; Denz, Cornelia

    2012-01-01

    A reliable description and quantification of the complex physiology and reactions of living cells requires a multimodal analysis with various measurement techniques. We have investigated the integration of different techniques into a biophotonic workstation that can provide biological researchers with these capabilities. The combination of a micromanipulation tool with three different imaging principles is accomplished in a single inverted microscope which makes the results from all the techniques directly comparable. Chinese Hamster Ovary (CHO) cells were manipulated by optical tweezers while the feedback was directly analyzed by fluorescence lifetime imaging, digital holographic microscopy and dynamic phase-contrast microscopy. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide to the scientific community unprecedented analytical performances. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 µm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  11. Linear prediction and single-channel recording.

    PubMed

    Carter, A A; Oswald, R E

    1995-08-01

    The measurement of individual single-channel events arising from the gating of ion channels provides a detailed data set from which the kinetic mechanism of a channel can be deduced. In many cases, the pattern of dwells in the open and closed states is very complex, and the kinetic mechanism and parameters are not easily determined. Assuming a Markov model for channel kinetics, the probability density function for open and closed time dwells should consist of a sum of decaying exponentials. One method of approaching the kinetic analysis of such a system is to determine the number of exponentials and the corresponding parameters which comprise the open and closed dwell time distributions. These can then be compared to the relaxations predicted from the kinetic model to determine, where possible, the kinetic constants. We report here the use of a linear technique, linear prediction/singular value decomposition, to determine the number of exponentials and the exponential parameters. Using simulated distributions and comparing with standard maximum-likelihood analysis, the singular value decomposition techniques provide advantages in some situations and are a useful adjunct to other single-channel analysis techniques.
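
    The linear prediction/singular value decomposition idea can be sketched as follows, assuming NumPy and SciPy: the number of dominant singular values of a Hankel data matrix estimates the number of exponentials, and a shift-invariance (ESPRIT-style) step recovers the decay constants. The sampled distribution and all parameters below are illustrative, not taken from the paper.

        import numpy as np
        from scipy.linalg import hankel, pinv, eig

        def exponential_rates(signal, dt, n_exp):
            L = len(signal) // 2
            H = hankel(signal[:L], signal[L - 1:])      # Hankel data matrix
            U, s, Vt = np.linalg.svd(H, full_matrices=False)
            # the count of dominant singular values in s estimates n_exp
            Us = U[:, :n_exp]                           # signal subspace
            z = eig(pinv(Us[:-1]) @ Us[1:], right=False)  # shift-invariance poles
            return -dt / np.log(np.abs(z))              # time constants tau_k

        dt = 0.1
        t = np.arange(0, 50, dt)
        pdf = 0.7 * np.exp(-t / 2.0) + 0.3 * np.exp(-t / 8.0)  # two components
        print(np.sort(exponential_rates(pdf, dt, 2)))           # ~ [2.0, 8.0]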

  12. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies, including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline of how to achieve maximal tissue clearance via psPACT and mPACT with minimal compromise of structural integrity.

  13. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified, and the presence of these properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal-methods analysis technique used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  14. Harnessing psychoanalytical methods for a phenomenological neuroscience

    PubMed Central

    Cusumano, Emma P.; Raz, Amir

    2014-01-01

    Psychoanalysis proffers a wealth of phenomenological tools to advance the study of consciousness. Techniques for elucidating the structures of subjective life are sorely lacking in the cognitive sciences; as such, experiential reporting techniques must rise to meet both complex theories of brain function and increasingly sophisticated neuroimaging technologies. Analysis may offer valuable methods for bridging the gap between first-person and third-person accounts of the mind. Using both systematic observational approaches alongside unstructured narrative interactions, psychoanalysts help patients articulate their experience and bring unconscious mental contents into awareness. Similar to seasoned meditators or phenomenologists, individuals who have undergone analysis are experts in discerning and describing their subjective experience, thus making them ideal candidates for neurophenomenology. Moreover, analytic techniques may provide a means of guiding untrained experimental participants to greater awareness of their mental continuum, as well as gathering subjective reports about fundamental yet elusive aspects of experience including selfhood, temporality, and inter-subjectivity. Mining psychoanalysis for its methodological innovations provides a fresh turn for the neuropsychoanalysis movement and cognitive science as a whole – showcasing the integrity of analysis alongside the irreducibility of human experience. PMID:24808869

  15. Analysis of Protein Expression in Cell Microarrays: A Tool for Antibody-based Proteomics

    PubMed Central

    Andersson, Ann-Catrin; Strömberg, Sara; Bäckvall, Helena; Kampf, Caroline; Uhlen, Mathias; Wester, Kenneth; Pontén, Fredrik

    2006-01-01

    Tissue microarray (TMA) technology provides a possibility to explore protein expression patterns in a multitude of normal and disease tissues in a high-throughput setting. Although TMAs have been used for analysis of tissue samples, robust methods for studying in vitro cultured cell lines and cell aspirates in a TMA format have been lacking. We have adopted a technique to homogeneously distribute cells in an agarose gel matrix, creating an artificial tissue. This enables simultaneous profiling of protein expression in suspension- and adherent-grown cell samples assembled in a microarray. In addition, the present study provides an optimized strategy for the basic laboratory steps to efficiently produce TMAs. Presented modifications resulted in an improved quality of specimens and a higher section yield compared with standard TMA production protocols. Sections from the generated cell TMAs were tested for immunohistochemical staining properties using 20 well-characterized antibodies. Comparison of immunoreactivity in cultured dispersed cells and corresponding cells in tissue samples showed congruent results for all tested antibodies. We conclude that a modified TMA technique, including cell samples, provides a valuable tool for high-throughput analysis of protein expression, and that this technique can be used for global approaches to explore the human proteome. PMID:16957166

  16. Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems

    NASA Astrophysics Data System (ADS)

    Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn

    The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail, either due to incomplete specification of goal service requirements or because the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback helps guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the likely cause of composition failure and suggests recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
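
    A toy sketch of this style of failure analysis, with invented services and event names (not the MoSCoE algorithm itself): compose labeled transition systems synchronously on shared events and, when the goal is unreachable, report the deepest reachable state and trace as the diagnosis.

        from collections import deque

        def diagnose(services, inits, goal):
            """services: list of dicts state -> {event: next_state};
            goal: predicate over the tuple of current states."""
            alph = [{e for trans in s.values() for e in trans} for s in services]
            events = set().union(*alph)
            start = tuple(inits)
            seen, queue, best = {start}, deque([(start, [])]), (start, [])
            while queue:
                cur, trace = queue.popleft()
                if goal(cur):
                    return "composed", trace
                if len(trace) > len(best[1]):
                    best = (cur, trace)          # deepest progress so far
                for e in events:
                    nxt = []
                    for st, svc, a in zip(cur, services, alph):
                        if e not in a:
                            nxt.append(st)       # event outside this alphabet
                        elif e in svc.get(st, {}):
                            nxt.append(svc[st][e])
                        else:
                            break                # this service blocks event e
                    else:
                        t = tuple(nxt)
                        if t not in seen:
                            seen.add(t)
                            queue.append((t, trace + [e]))
            return "failed at %s after trace %s" % best

        # Toy e-Library-style failure: 'lend' is in the catalogue's alphabet
        # but never enabled from a reachable state, so composition fails.
        catalogue = {"idle": {"search": "found"}, "archived": {"lend": "ok"}}
        lender = {"idle": {"lend": "done"}}
        print(diagnose([catalogue, lender], ["idle", "idle"],
                       lambda s: s == ("found", "done")))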

  17. Decoding human swallowing via electroencephalography: a state-of-the-art review

    PubMed Central

    Jestrović, Iva; Coyle, James L.

    2015-01-01

    Swallowing and swallowing disorders have garnered continuing interest over the past several decades. Electroencephalography (EEG) is an inexpensive and non-invasive procedure with very high temporal resolution which enables analysis of short and fast swallowing events, as well as an analysis of the organizational and behavioral aspects of cortical motor preparation, swallowing execution and swallowing regulation. EEG is a powerful technique which can be used alone or in combination with other techniques for monitoring swallowing, detection of swallowing motor imagery for diagnostic or biofeedback purposes, or to modulate and measure the effects of swallowing rehabilitation. This paper provides a review of the existing literature which has deployed EEG in the investigation of oropharyngeal swallowing, smell, taste and texture related to swallowing, cortical pre-motor activation in swallowing, and swallowing motor imagery detection. Furthermore, this paper provides a brief review of the different modalities of brain imaging techniques used to study swallowing brain activities, as well as the EEG components of interest for studies on swallowing and on swallowing motor imagery. Lastly, this paper provides directions for future swallowing investigations using EEG. PMID:26372528

  18. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these depend heavily on the comprehensive analysis of their chemical components. With advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and of their applications to the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated MS techniques to the analysis of medicinal plants, including but not limited to one-dimensional and multi-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants, owing to its excellent separation and identification ability, high sensitivity and resolution, and wide dynamic range of detection. For high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research into both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  19. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    PubMed Central

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is part of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  20. Application of STEM characterization for investigating radiation effects in BCC Fe-based alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parish, Chad M.; Field, Kevin G.; Certain, Alicia G.

    2015-04-20

    This paper provides a general overview of advanced scanning transmission electron microscopy (STEM) techniques used for characterization of irradiated BCC Fe-based alloys. Advanced STEM methods provide the high-resolution imaging and chemical analysis necessary to understand the irradiation response of BCC Fe-based alloys. The use of STEM with energy dispersive x-ray spectroscopy (EDX) for measurement of radiation-induced segregation (RIS) is described, with an illustrated example of RIS in proton- and self-ion irradiated T91. Aberration-corrected STEM-EDX for nanocluster/nanoparticle imaging and chemical analysis is also discussed, and examples are provided from ion-irradiated oxide dispersion strengthened (ODS) alloys. Finally, STEM techniques for void, cavity, and dislocation loop imaging are described, with examples from various BCC Fe-based alloys.

  1. Micromachined patch-clamp apparatus

    DOEpatents

    Okandan, Murat

    2012-12-04

    A micromachined patch-clamp apparatus is disclosed for holding one or more cells and providing electrical, chemical, or mechanical stimulation to the cells during analysis with the patch-clamp technique for studying ion channels in cell membranes. The apparatus formed on a silicon substrate utilizes a lower chamber formed from silicon nitride using surface micromachining and an upper chamber formed from a molded polymer material. An opening in a common wall between the chambers is used to trap and hold a cell for analysis using the patch-clamp technique with sensing electrodes on each side of the cell. Some embodiments of the present invention utilize one or more electrostatic actuators formed on the substrate to provide mechanical stimulation to the cell being analyzed, or to provide information about mechanical movement of the cell in response to electrical or chemical stimulation.

  2. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    Significant preliminary results identified by the author from the Ouachita portion of the Texoma frame of data indicate many potential applications in the analysis and interpretation of ERTS data. One of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining meaningful surface observation data. Following such a procedure permits concentration on particular points of interest within the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant in demonstrating a fast turn-around analysis capability. Additionally, the analysis has shown the potential accuracy and the degree of complexity of features that can be identified and mapped using ERTS data.

  3. Monte Carlo uncertainty analysis of dose estimates in radiochromic film dosimetry with single-channel and multichannel algorithms.

    PubMed

    Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio

    2018-03-01

    To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte-Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 are exposed in two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte-Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis, such as the standard deviation and bias, are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as the reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than 4 Gy. A multi-stage model has been presented. With the aid of this model and the use of Monte-Carlo techniques, the uncertainties of dose estimates for single-channel and multichannel algorithms are estimated. The application of the model together with Monte-Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
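
    The Monte-Carlo idea can be sketched as follows: sample the inputs (scanner reading and calibration-fit parameters) from assumed distributions, push every sample through the calibration function, and summarize the resulting numerical dose distribution. The calibration form and every number below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Assumed single-channel calibration: dose = a + b*netOD + c*netOD**k
        a, b, c, k = 0.02, 8.5, 22.0, 2.5
        net_od = rng.normal(0.35, 0.004, n)   # scanner reading uncertainty
        a_s = rng.normal(a, 0.01, n)          # calibration-fit uncertainties
        b_s = rng.normal(b, 0.15, n)
        c_s = rng.normal(c, 0.8, n)

        dose = a_s + b_s * net_od + c_s * net_od ** k

        # Numerical representation of the output PDF -> uncertainty and bias
        print("mean %.3f Gy, std %.3f Gy" % (dose.mean(), dose.std()))
        print("68% interval:", np.percentile(dose, [16, 84]))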

  4. Why bundled payments could drive innovation: an example from interventional oncology.

    PubMed

    Steele, Joseph R; Jones, A Kyle; Ninan, Elizabeth P; Clarke, Ryan K; Odisio, Bruno C; Avritscher, Rony; Murthy, Ravi; Mahvash, Armeen

    2015-03-01

    Some have suggested that the current fee-for-service health care payment system in the United States stifles innovation. However, there are few published examples supporting this concept. We implemented an innovative temporary balloon occlusion technique for yttrium 90 radioembolization of nonresectable liver cancer. Although our balloon occlusion technique was associated with similar patient outcomes, lower cost, and faster procedure times compared with the standard-of-care coil embolization technique, our technique failed to gain widespread acceptance. Financial analysis revealed that because the balloon occlusion technique avoided a procedural step associated with a lucrative Current Procedural Terminology billing code, this new technique resulted in a significant decrease in hospital and physician revenue in the current fee-for-service payment system, even though the new technique would provide a revenue enhancement through cost savings in a bundled payment system. Our analysis illustrates how in a fee-for-service payment system, financial disincentives can stifle innovation and advancement of health care delivery. Copyright © 2015 by American Society of Clinical Oncology.

  5. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
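
    A minimal sketch of the first of these techniques, a multilevel model in which repeated ambulatory assessments (level 1) nest within persons (level 2), assuming pandas and statsmodels are available; the variables and data are invented.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n_people, n_obs = 40, 30
        person = np.repeat(np.arange(n_people), n_obs)
        stress = rng.normal(size=n_people * n_obs)
        slope = 0.5 + rng.normal(0, 0.2, n_people)   # person-specific effect
        craving = slope[person] * stress + rng.normal(0, 1, len(stress))
        df = pd.DataFrame({"person": person, "stress": stress,
                           "craving": craving})

        # Random intercept and random slope for stress, per person
        model = smf.mixedlm("craving ~ stress", df, groups=df["person"],
                            re_formula="~stress")
        print(model.fit().summary())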

  6. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip suitable for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques, including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis, were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  7. Forensic Discrimination of Latent Fingerprints Using Laser-Induced Breakdown Spectroscopy (LIBS) and Chemometric Approaches.

    PubMed

    Yang, Jun-Ho; Yoh, Jack J

    2018-01-01

    A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real-time analysis and high-frequency scanning, as well as data on the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints when appropriate statistical multivariate analysis is applied. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
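
    A minimal sketch of the chemometric pipeline named above (PCA for an exploratory view, then PLS-DA implemented as PLS regression against one-hot class labels), assuming scikit-learn; the spectra are synthetic stand-ins for LIBS data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n_per_class, n_wavelengths, classes = 40, 200, 4
        X = rng.normal(size=(n_per_class * classes, n_wavelengths))
        y = np.repeat(np.arange(classes), n_per_class)
        for c in range(classes):                 # class-specific emission lines
            X[y == c, c * 10:(c * 10 + 5)] += 2.0

        pca = PCA(n_components=2).fit(X)         # exploratory projection
        print("PC1+PC2 variance: %.2f" % pca.explained_variance_ratio_.sum())

        Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
        Y_onehot = np.eye(classes)[ytr]          # PLS-DA: regress on labels
        pls = PLSRegression(n_components=5).fit(Xtr, Y_onehot)
        pred = pls.predict(Xte).argmax(axis=1)   # class = largest response
        print("PLS-DA accuracy: %.2f" % (pred == yte).mean())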

  8. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O' Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or of comparing a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  9. An Explorative Learning Approach to Teaching Clinical Anatomy Using Student Generated Content

    ERIC Educational Resources Information Center

    Philip, Christo T.; Unruh, Kenneth P.; Lachman, Nirusha; Pawlina, Wojciech

    2008-01-01

    Translating basic sciences into a clinical framework has been approached through the implementation of various teaching techniques aimed at using a patient case scenario to facilitate learning. These techniques present students with a specific patient case and lead the students to discuss physiological processes through analysis of provided data…

  10. Proceedings of the 1990 IPMAAC Conference on Personnel Assessment (14th, San Diego, California, June 24-28, 1990).

    ERIC Educational Resources Information Center

    International Personnel Management Association, Washington, DC.

    Fifty-seven papers presented at the annual meeting of the International Personnel Management Association Assessment Council (IPMAAC) in 1990 are provided. Selected topics include: using the cloze technique for reading skills assessment; examining assessment techniques; job analysis; alternate strategies for assessing writing skills; assessment of…

  11. A fluctuation-induced plasma transport diagnostic based upon fast-Fourier transform spectral analysis

    NASA Technical Reports Server (NTRS)

    Powers, E. J.; Kim, Y. C.; Hong, J. Y.; Roth, J. R.; Krawczonek, W. M.

    1978-01-01

    A diagnostic, based on fast Fourier transform spectral analysis techniques, that provides experimental insight into the relationship between the experimentally observable spectral characteristics of the fluctuations and the fluctuation-induced plasma transport is described. The model upon which the diagnostic technique is based and its experimental implementation are discussed. Some characteristic results obtained during an experimental study of fluctuation-induced transport in the electric-field-dominated NASA Lewis bumpy torus plasma are presented.
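
    The spectral core of such a diagnostic can be sketched with SciPy: the cross-spectral density between two fluctuating signals gives, per frequency, the coupled power and relative phase from which fluctuation-induced transport is inferred. The signals below are synthetic.

        import numpy as np
        from scipy.signal import csd, coherence

        fs, n = 1e5, 2**14
        t = np.arange(n) / fs
        rng = np.random.default_rng(4)
        mode = np.sin(2 * np.pi * 5e3 * t)             # shared 5 kHz mode
        dens = mode + 0.5 * rng.normal(size=n)          # density fluctuations
        pot = np.sin(2 * np.pi * 5e3 * t - 0.6) + 0.5 * rng.normal(size=n)

        f, Pxy = csd(dens, pot, fs=fs, nperseg=1024)    # cross-power spectrum
        f, Cxy = coherence(dens, pot, fs=fs, nperseg=1024)
        k = np.argmax(np.abs(Pxy))
        print("peak at %.0f Hz, coherence %.2f, phase %.2f rad"
              % (f[k], Cxy[k], np.angle(Pxy[k])))      # ~0.6 rad shift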

  12. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  13. Multidimensional Analysis of Nuclear Detonations

    DTIC Science & Technology

    2015-09-17

    Features on the nuclear weapons testing films because of the expanding and emissive nature of the nuclear fireball. The use of these techniques to produce...Treaty (New Start Treaty) have reduced the acceptable margins of error. Multidimensional analysis provides the modern approach to nuclear weapon ...scientific community access to the information necessary to expand upon the knowledge of nuclear weapon effects. This data set has the potential to provide

  14. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady-state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady-state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady-state methods.

  15. Geometric Analyses of Rotational Faults.

    ERIC Educational Resources Information Center

    Schwert, Donald Peters; Peck, Wesley David

    1986-01-01

    Describes the use of analysis of rotational faults in undergraduate structural geology laboratories to provide students with applications of both orthographic and stereographic techniques. A demonstration problem is described, and an orthographic/stereographic solution and a reproducible block-model demonstration pattern are provided. (TW)

  16. Social impact analysis: monetary valuation

    USGS Publications Warehouse

    Wainger, Lisa A.; Johnston, Robert J.; Bagstad, Kenneth J.; Casey, Frank; Vegh, Tibor

    2014-01-01

    This section provides basic guidance for using and conducting economic valuation, including criteria for judging whether valuation is appropriate for supporting decisions. It provides an introduction to the economic techniques used to measure changes in social welfare and describes which methods may be most appropriate for use in valuing particular ecosystem services. Rather than providing comprehensive valuation instructions, it directs readers to additional resources. More generally, it establishes that the valuation of ecosystem services is grounded in a long history of non-market valuation and discusses how ecosystem services valuation can be conducted within established economic theory and techniques.

  17. Optimization of radio telemetry receiving systems: Chapter 5.2

    USGS Publications Warehouse

    Evans, Scott D.; Stevenson, John R.; Adams, Noah S.; Beeman, John W.; Eiler, John H.

    2012-01-01

    Telemetry provides a powerful and flexible tool for studying fish and other aquatic animals, and its use has become increasingly commonplace. However, telemetry is gear intensive and typically requires more specialized knowledge and training than many other field techniques. As with other scientific methods, collecting good data is dependent on an understanding of the underlying principles behind the approach, knowing how to use the equipment and techniques properly, and recognizing what to do with the data collected. This book provides a road map for using telemetry to study aquatic animals, and provides the basic information needed to plan, implement, and conduct a telemetry study under field conditions. Topics include acoustic or radio telemetry study design, tag implantation techniques, radio and acoustic telemetry principles and case studies, and data management and analysis.

  18. A history of telemetry in fishery research: Chapter 2

    USGS Publications Warehouse

    Hockersmith, Eric; Beeman, John W.; Adams, Noah S.; Beeman, John W.; Eiler, John H.

    2012-01-01

    Telemetry provides a powerful and flexible tool for studying fish and other aquatic animals, and its use has become increasingly commonplace. However, telemetry is gear intensive and typically requires more specialized knowledge and training than many other field techniques. As with other scientific methods, collecting good data is dependent on an understanding of the underlying principles behind the approach, knowing how to use the equipment and techniques properly, and recognizing what to do with the data collected. This book provides a road map for using telemetry to study aquatic animals, and provides the basic information needed to plan, implement, and conduct a telemetry study under field conditions. Topics include acoustic or radio telemetry study design, tag implantation techniques, radio and acoustic telemetry principles and case studies, and data management and analysis.

  19. An experiment on the dynamics of ion implantation and sputtering of surfaces

    NASA Astrophysics Data System (ADS)

    Wright, G. M.; Barnard, H. A.; Kesler, L. A.; Peterson, E. E.; Stahle, P. W.; Sullivan, R. M.; Whyte, D. G.; Woller, K. B.

    2014-02-01

    A major impediment towards a better understanding of the complex plasma-surface interaction is the limited diagnostic access to the material surface while it is undergoing plasma exposure. The Dynamics of ION Implantation and Sputtering Of Surfaces (DIONISOS) experiment overcomes this limitation by uniquely combining powerful, non-perturbing ion beam analysis techniques with a steady-state helicon plasma exposure chamber, allowing for real-time, depth-resolved in situ measurements of material compositions during plasma exposure. Design solutions are described that provide compatibility between the ion beam analysis requirements in the presence of a high-intensity helicon plasma. The three primary ion beam analysis techniques, Rutherford backscattering spectroscopy, elastic recoil detection, and nuclear reaction analysis, are successfully implemented on targets during plasma exposure in DIONISOS. These techniques measure parameters of interest for plasma-material interactions such as erosion/deposition rates of materials and the concentration of plasma fuel species in the material surface.

  20. An experiment on the dynamics of ion implantation and sputtering of surfaces.

    PubMed

    Wright, G M; Barnard, H A; Kesler, L A; Peterson, E E; Stahle, P W; Sullivan, R M; Whyte, D G; Woller, K B

    2014-02-01

    A major impediment towards a better understanding of the complex plasma-surface interaction is the limited diagnostic access to the material surface while it is undergoing plasma exposure. The Dynamics of ION Implantation and Sputtering Of Surfaces (DIONISOS) experiment overcomes this limitation by uniquely combining powerful, non-perturbing ion beam analysis techniques with a steady-state helicon plasma exposure chamber, allowing for real-time, depth-resolved in situ measurements of material compositions during plasma exposure. Design solutions are described that provide compatibility between the ion beam analysis requirements in the presence of a high-intensity helicon plasma. The three primary ion beam analysis techniques, Rutherford backscattering spectroscopy, elastic recoil detection, and nuclear reaction analysis, are successfully implemented on targets during plasma exposure in DIONISOS. These techniques measure parameters of interest for plasma-material interactions such as erosion/deposition rates of materials and the concentration of plasma fuel species in the material surface.

  1. Analysis of intracranial pressure: past, present, and future.

    PubMed

    Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D

    2013-12-01

    The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although these largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and to explore the potential of using advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique, and we propose that nonlinear analysis could be a reliable approach to describe ICP signals over time, with the fractal dimension as a potentially clinically meaningful predictive biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate the clinical utility of a tool capable of simplifying multiple variables obtained from various sensors.
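
    One of the nonlinear descriptors mentioned, the fractal dimension of a time series, can be sketched with the standard Higuchi construction; the test signal below is synthetic rather than a real ICP waveform.

        import numpy as np

        def higuchi_fd(x, k_max=10):
            """Higuchi fractal dimension of a 1-D time series."""
            x = np.asarray(x, float)
            N = len(x)
            lengths = []
            for k in range(1, k_max + 1):
                Lk = []
                for m in range(k):                     # k interleaved sub-series
                    idx = np.arange(m, N, k)
                    Lm = np.abs(np.diff(x[idx])).sum()
                    Lm *= (N - 1) / ((len(idx) - 1) * k)   # length normalization
                    Lk.append(Lm / k)
                lengths.append(np.mean(Lk))
            k = np.arange(1, k_max + 1)
            # slope of log L(k) versus log(1/k) estimates the fractal dimension
            return np.polyfit(np.log(1.0 / k), np.log(lengths), 1)[0]

        rng = np.random.default_rng(5)
        print(higuchi_fd(rng.normal(size=2000)))       # white noise: FD near 2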

  2. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate, and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find. Available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is therefore a viable approach to blend the advantages of both representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge; an expert system to capture the engineering knowledge is essential. We propose an approach here, namely, expert network analysis, which combines the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge; mixed analysis, in which only some nodes carry probabilities, is possible. The tool provides a graphics interface for input, query, and update. With this combined approach, the tool is expected to be valuable both in the design process and in capturing final design knowledge.
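
    One standard way to realize the digraph-to-tree transformation step, sketched with networkx (this is not the EDNA tool itself): collapse each directed cycle (strongly connected component) into a single super-node, yielding an acyclic model that tree-oriented algorithms can process. The fault model below is invented.

        import networkx as nx

        g = nx.DiGraph()
        g.add_edges_from([
            ("pump_fail", "low_flow"), ("low_flow", "overheat"),
            ("overheat", "pump_fail"),          # feedback loop: a directed cycle
            ("overheat", "system_trip"), ("sensor_fail", "system_trip"),
        ])

        dag = nx.condensation(g)                # each SCC becomes one super-node
        for node, data in dag.nodes(data=True):
            print(node, sorted(data["members"]))
        print("acyclic:", nx.is_directed_acyclic_graph(dag))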

  3. Use of Latent Profile Analysis in Studies of Gifted Students

    ERIC Educational Resources Information Center

    Mammadov, Sakhavat; Ward, Thomas J.; Cross, Jennifer Riedl; Cross, Tracy L.

    2016-01-01

    To date, in gifted education and related fields various conventional factor analytic and clustering techniques have been used extensively for investigation of the underlying structure of data. Latent profile analysis is a relatively new method in the field. In this article, we provide an introduction to latent profile analysis for gifted education…

  4. Application of Economic Analysis to School-Wide Positive Behavior Support (SWPBS) Programs

    ERIC Educational Resources Information Center

    Blonigen, Bruce A.; Harbaugh, William T.; Singell, Larry D.; Horner, Robert H.; Irvin, Larry K.; Smolkowski, Keith S.

    2008-01-01

    The authors discuss how to use economic techniques to evaluate educational programs and show how to apply basic cost analysis to implementation of school-wide positive behavior support (SWPBS). A description of cost analysis concepts used for economic program evaluation is provided, emphasizing the suitability of these concepts for evaluating…

  5. Pyrotechnic Shock Analysis Using Statistical Energy Analysis

    DTIC Science & Technology

    2015-10-23

    SEA subsystems. A couple of validation examples are provided to demonstrate the new approach. KEY WORDS: Peak Ratio, phase perturbation...Ballistic Shock Prediction Models and Techniques for Use in the Crusader Combat Vehicle Program," 11th Annual US Army Ground Vehicle Survivability

  6. A guide to forestry investment analysis.

    Treesearch

    Dietmar W. Rose; Charles R. Blinn; Gary J. Brand

    1988-01-01

    It is often necessary to choose between several forestry projects. This paper provides the background needed to evaluate projects from a financial perspective. The basic steps for preparing a project analysis, suggestions for dealing with uncertainty, and techniques for monitoring a project are presented.

  7. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    2000-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter begins a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.

  8. Determination of Reaction Stoichiometries by Flow Injection Analysis.

    ERIC Educational Resources Information Center

    Rios, Angel; And Others

    1986-01-01

    Describes a method of flow injection analysis intended for calculation of complex-formation and redox reaction stoichiometries based on a closed-loop configuration. The technique is suitable for use in undergraduate laboratories. Information is provided for equipment, materials, procedures, and sample results. (JM)

  9. A new technique for the characterization of chaff elements

    NASA Astrophysics Data System (ADS)

    Scholfield, David; Myat, Maung; Dauby, Jason; Fesler, Jonathon; Bright, Jonathan

    2011-07-01

    A new technique for the experimental characterization of electromagnetic chaff, based on Inverse Synthetic Aperture Radar, is presented. This technique allows for the characterization of as few as one filament of chaff in a controlled anechoic environment, providing stability and repeatability of experimental results. This approach allows for a deeper understanding of the fundamental phenomena of electromagnetic scattering from chaff through incremental analysis: chaff analysis can now begin with a single element and progress through the build-up of particles into pseudo-cloud structures. This controlled incremental approach is supported by an identical incremental modeling and validation process. Additionally, this technique has the potential to produce considerable savings in financial and schedule costs and provides a stable and repeatable experiment to aid model validation.

  10. INNOVATIONS IN SOIL SAMPLING AND DATA ANALYSIS

    EPA Science Inventory

    Successful research outcomes from the VOC-in-soils work will provide the Agency with methods and techniques that yield accurate VOC concentrations, so that decisions related to a contaminated site can be made to optimize protection of the environment and human health...

  11. Oligonucleotide arrays vs. metaphase-comparative genomic hybridisation and BAC arrays for single-cell analysis: first applications to preimplantation genetic diagnosis for Robertsonian translocation carriers.

    PubMed

    Ramos, Laia; del Rey, Javier; Daina, Gemma; García-Aragonés, Manel; Armengol, Lluís; Fernandez-Encinas, Alba; Parriego, Mònica; Boada, Montserrat; Martinez-Passarell, Olga; Martorell, Maria Rosa; Casagran, Oriol; Benet, Jordi; Navarro, Joaquima

    2014-01-01

    Comprehensive chromosome analysis techniques such as metaphase-Comparative Genomic Hybridisation (CGH) and array-CGH are available for single-cell analysis. However, while metaphase-CGH and BAC array-CGH have been widely used for Preimplantation Genetic Diagnosis, oligonucleotide array-CGH has not been used in an extensive way. A comparison between oligonucleotide array-CGH and metaphase-CGH has been performed analysing 15 single fibroblasts from aneuploid cell-lines and 18 single blastomeres from human cleavage-stage embryos. Afterwards, oligonucleotide array-CGH and BAC array-CGH were also compared analysing 16 single blastomeres from human cleavage-stage embryos. All three comprehensive analysis techniques provided broadly similar cytogenetic profiles; however, non-identical profiles appeared when extensive aneuploidies were present in a cell. Both array techniques provided an optimised analysis procedure and a higher resolution than metaphase-CGH. Moreover, oligonucleotide array-CGH was able to define extra segmental imbalances in 14.7% of the blastomeres and it better determined the specific unbalanced chromosome regions due to a higher resolution of the technique (≈ 20 kb). Applicability of oligonucleotide array-CGH for Preimplantation Genetic Diagnosis has been demonstrated in two cases of Robertsonian translocation carriers 45,XY,der(13;14)(q10;q10). Transfer of euploid embryos was performed in both cases and pregnancy was achieved by one of the couples. This is the first time that an oligonucleotide array-CGH approach has been successfully applied to Preimplantation Genetic Diagnosis for balanced chromosome rearrangement carriers.

  12. Oligonucleotide Arrays vs. Metaphase-Comparative Genomic Hybridisation and BAC Arrays for Single-Cell Analysis: First Applications to Preimplantation Genetic Diagnosis for Robertsonian Translocation Carriers

    PubMed Central

    Ramos, Laia; del Rey, Javier; Daina, Gemma; García-Aragonés, Manel; Armengol, Lluís; Fernandez-Encinas, Alba; Parriego, Mònica; Boada, Montserrat; Martinez-Passarell, Olga; Martorell, Maria Rosa; Casagran, Oriol; Benet, Jordi; Navarro, Joaquima

    2014-01-01

    Comprehensive chromosome analysis techniques such as metaphase-Comparative Genomic Hybridisation (CGH) and array-CGH are available for single-cell analysis. However, while metaphase-CGH and BAC array-CGH have been widely used for Preimplantation Genetic Diagnosis, oligonucleotide array-CGH has not been used in an extensive way. A comparison between oligonucleotide array-CGH and metaphase-CGH has been performed analysing 15 single fibroblasts from aneuploid cell-lines and 18 single blastomeres from human cleavage-stage embryos. Afterwards, oligonucleotide array-CGH and BAC array-CGH were also compared analysing 16 single blastomeres from human cleavage-stage embryos. All three comprehensive analysis techniques provided broadly similar cytogenetic profiles; however, non-identical profiles appeared when extensive aneuploidies were present in a cell. Both array techniques provided an optimised analysis procedure and a higher resolution than metaphase-CGH. Moreover, oligonucleotide array-CGH was able to define extra segmental imbalances in 14.7% of the blastomeres and it better determined the specific unbalanced chromosome regions due to a higher resolution of the technique (≈20 kb). Applicability of oligonucleotide array-CGH for Preimplantation Genetic Diagnosis has been demonstrated in two cases of Robertsonian translocation carriers 45,XY,der(13;14)(q10;q10). Transfer of euploid embryos was performed in both cases and pregnancy was achieved by one of the couples. This is the first time that an oligonucleotide array-CGH approach has been successfully applied to Preimplantation Genetic Diagnosis for balanced chromosome rearrangement carriers. PMID:25415307

  13. Development of Moire machine vision

    NASA Technical Reports Server (NTRS)

    Harding, Kevin G.

    1987-01-01

    Three-dimensional perception is essential to the development of versatile robotics systems that can handle complex manufacturing tasks in future factories and provide the high-accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and will demonstrate artificial intelligence (AI) techniques that take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three-dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high-quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation capability is developed to enable full-field range measurement and three-dimensional scene analysis.

  14. Development of Moire machine vision

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.

    1987-10-01

    Three-dimensional perception is essential to the development of versatile robotics systems that can handle complex manufacturing tasks in future factories and provide the high-accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and will demonstrate artificial intelligence (AI) techniques that take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three-dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high-quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation capability is developed to enable full-field range measurement and three-dimensional scene analysis.

  15. SeeSway - A free web-based system for analysing and exploring standing balance data.

    PubMed

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy-to-use and platform-independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace, such as center of pressure or mass displacement. This allows it to be used with systems including criterion-reference commercial force platforms and three-dimensional motion analysis, smartphones, accelerometers and low-cost technology such as the Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate between someone with neurological impairment and a healthy control. The goal of SeeSway is to provide a simple yet powerful educational and research tool to explore how standing balance is affected in aging and clinical populations. Copyright © 2018 Elsevier B.V. All rights reserved.
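
    A few of the pre-processing and analysis steps named above can be sketched in Python with SciPy: Butterworth low-pass filtering of a centre-of-pressure trace, then path length and root-mean-square amplitude; the trace, sampling rate, and cut-off are illustrative.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 100.0                                   # sampling rate (Hz)
        rng = np.random.default_rng(6)
        ap = np.cumsum(rng.normal(0, 0.05, 3000))    # anterior-posterior COP (cm)
        ml = np.cumsum(rng.normal(0, 0.05, 3000))    # medial-lateral COP (cm)

        b, a = butter(2, 10.0 / (fs / 2), "low")     # 10 Hz low-pass filter
        ap_f, ml_f = filtfilt(b, a, ap), filtfilt(b, a, ml)

        path_length = np.hypot(np.diff(ap_f), np.diff(ml_f)).sum()
        rms_ap = np.sqrt(np.mean((ap_f - ap_f.mean()) ** 2))
        rms_ml = np.sqrt(np.mean((ml_f - ml_f.mean()) ** 2))
        print("path length %.1f cm, RMS AP %.2f cm, RMS ML %.2f cm"
              % (path_length, rms_ap, rms_ml))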

  16. Reducing Wrong Patient Selection Errors: Exploring the Design Space of User Interface Techniques

    PubMed Central

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients’ identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed. PMID:25954415

  17. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    PubMed

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.

  18. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase the overall operational effectiveness of both flight- and ground-based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations - operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  20. Ultrasonic non invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity or particle size distribution of the medium where they take place, which makes their analysis possible using ultrasonic techniques. In this work, two different ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for microbiological growth detection in milk is shown. Such a system may improve quality control strategies in food production factories by decreasing the time required to detect possible contaminations in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques in combination with other conventional measurements, such as optical density, provides complementary information about the metabolism of these bacteria.

  1. Highly efficient full-wave electromagnetic analysis of 3-D arbitrarily shaped waveguide microwave devices using an integral equation technique

    NASA Astrophysics Data System (ADS)

    Vidal, A.; San-Blas, A. A.; Quesada-Pereira, F. D.; Pérez-Soler, J.; Gil, J.; Vicente, C.; Gimeno, B.; Boria, V. E.

    2015-07-01

    A novel technique for the full-wave analysis of 3-D complex waveguide devices is presented. This new formulation, based on the Boundary Integral-Resonant Mode Expansion (BI-RME) method, allows the rigorous full-wave electromagnetic characterization of 3-D arbitrarily shaped metallic structures making use of extremely low CPU resources (both time and memory). The unknown electric current density on the surface of the metallic elements is represented by means of Rao-Wilton-Glisson basis functions, and an algebraic procedure based on a singular value decomposition is applied to transform such functions into the classical solenoidal and nonsolenoidal basis functions needed by the original BI-RME technique. The developed tool also provides an accurate computation of the electromagnetic fields at an arbitrary observation point of the considered device, so it can be used for predicting high-power breakdown phenomena. In order to validate the accuracy and efficiency of this novel approach, several new designs of band-pass waveguide filters are presented. The obtained results (S-parameters and electromagnetic fields) are successfully compared both to experimental data and to numerical simulations provided by commercial software based on the finite-element technique. The results obtained show that the new technique is especially suitable for the efficient full-wave analysis of complex waveguide devices considering an integrated coaxial excitation, where the coaxial probes may be in contact with the metallic insets of the component.

  2. A comparison of solute-transport solution techniques and their effect on sensitivity analysis and inverse modeling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2001-01-01

    Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
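
    A toy illustration, not the study's actual codes: a 1-D explicit finite-difference solution of the advection-dispersion equation, the simplest of the five techniques compared. The velocity, dispersion coefficient and grid used here are invented; the upwind advection term deliberately introduces the kind of numerical dispersion the authors highlight.

      import numpy as np

      # 1-D advection-dispersion: dC/dt = D d2C/dx2 - v dC/dx (illustrative parameters)
      L, nx = 1.0, 201
      dx = L / (nx - 1)
      v, D = 1.0e-2, 1.0e-5                        # velocity (m/s), dispersion (m^2/s), assumed
      dt = 0.4 * min(dx / v, dx * dx / (2 * D))    # respect Courant and diffusion limits

      c = np.zeros(nx)
      c[0] = 1.0                                   # continuous source at the inlet

      for _ in range(300):
          adv = -v * (c[1:-1] - c[:-2]) / dx       # upwind advection (numerically dispersive)
          disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
          c[1:-1] += dt * (adv + disp)
          c[0], c[-1] = 1.0, c[-2]                 # fixed inlet, free outflow

      print("simulated concentration at mid-tank:", round(c[nx // 2], 3))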

  3. Evolutionary and Comparative Genomics to Drive Rational Drug Design, with Particular Focus on Neuropeptide Seven-Transmembrane Receptors.

    PubMed

    Furlong, Michael; Seong, Jae Young

    2017-01-01

    Seven transmembrane receptors (7TMRs), also known as G protein-coupled receptors, are popular targets of drug development, particularly 7TMR systems that are activated by peptide ligands. Although many pharmaceutical drugs have been discovered via conventional bulk analysis techniques, the increasing availability of structural and evolutionary data is facilitating a change to rational, targeted drug design. This article discusses the appeal of neuropeptide-7TMR systems as drug targets and provides an overview of concepts in the evolution of vertebrate genomes and gene families. Subsequently, methods that use evolutionary concepts and comparative analysis techniques to aid in gene discovery, gene function identification, and novel drug design are provided along with case study examples.

  4. Evolutionary and Comparative Genomics to Drive Rational Drug Design, with Particular Focus on Neuropeptide Seven-Transmembrane Receptors

    PubMed Central

    Furlong, Michael; Seong, Jae Young

    2017-01-01

    Seven transmembrane receptors (7TMRs), also known as G protein-coupled receptors, are popular targets of drug development, particularly 7TMR systems that are activated by peptide ligands. Although many pharmaceutical drugs have been discovered via conventional bulk analysis techniques, the increasing availability of structural and evolutionary data is facilitating a change to rational, targeted drug design. This article discusses the appeal of neuropeptide-7TMR systems as drug targets and provides an overview of concepts in the evolution of vertebrate genomes and gene families. Subsequently, methods that use evolutionary concepts and comparative analysis techniques to aid in gene discovery, gene function identification, and novel drug design are provided along with case study examples. PMID:28035082

  5. Ranking the strategies for Indian medical tourism sector through the integration of SWOT analysis and TOPSIS method.

    PubMed

    Ajmera, Puneeta

    2017-10-09

    Purpose: Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique which analyzes the strengths, weaknesses, opportunities and threats of an organization for taking strategic decisions, and it also provides a foundation for the formulation of strategies. The drawback of SWOT analysis, however, is that it does not quantify the importance of the individual factors affecting the organization; the individual factors are described briefly without being weighed. For this reason, SWOT analysis can be integrated with a multiple attribute decision-making (MADM) technique, such as the technique for order preference by similarity to ideal solution (TOPSIS) or the analytical hierarchy process, to evaluate the best alternative among the available strategic alternatives.

    Design/methodology/approach: In this study, SWOT analysis is integrated with the multicriteria decision-making technique TOPSIS to rank different strategies for Indian medical tourism in order of priority.

    Findings: The SO strategy (providing the best facilitation and care to medical tourists, at par with developed countries) is the strategy that best matches the four elements of S, W, O and T of the SWOT matrix and the 35 strategic indicators.

    Practical implications: This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help organizations evaluate and select strategies.

    Originality/value: Creating a new technology or administering a new strategy always meets some degree of resistance from employees. To minimize resistance, the author has used TOPSIS, as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives and an average measure of each parameter in the final decision matrix.
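
    A compact sketch of the TOPSIS ranking step that the paper couples to its SWOT factors; the four alternatives, three criteria and their weights below are invented placeholders, not the paper's 35 strategic indicators.

      import numpy as np

      # rows: strategic alternatives (e.g. SO, ST, WO, WT); columns: criteria scores (hypothetical)
      X = np.array([[7.0, 8.0, 6.0],
                    [5.0, 6.0, 7.0],
                    [6.0, 5.0, 5.0],
                    [4.0, 4.0, 6.0]])
      w = np.array([0.5, 0.3, 0.2])             # criterion weights; all benefit-type criteria assumed

      V = w * X / np.linalg.norm(X, axis=0)     # weighted, vector-normalised decision matrix
      ideal, anti = V.max(axis=0), V.min(axis=0)

      d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal solution
      d_neg = np.linalg.norm(V - anti, axis=1)    # distance to the anti-ideal solution
      closeness = d_neg / (d_pos + d_neg)         # relative closeness: higher ranks better

      print("ranking (best first):", np.argsort(-closeness))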

  6. Chapter 14: Electron Microscopy on Thin Films for Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Manuel; Abou-Ras, Daniel; Nichterwitz, Melanie

    2016-07-22

    This chapter overviews the various techniques applied in scanning electron microscopy (SEM) and transmission electron microscopy (TEM), and highlights their possibilities as well as their limitations. It describes the various imaging and analysis techniques applied on a scanning electron microscope. The chapter shows that imaging divides into techniques making use of secondary electrons (SEs) and of backscattered electrons (BSEs), resulting in different contrasts in the images and thus providing information on compositions, microstructures, and surface potentials. Whenever aiming for imaging and analyses at scales down to the angstrom range, TEM and its related techniques are the appropriate tools. In many cases, however, SEM techniques also provide access to various material properties of the individual layers, without requiring specimen preparation as time-consuming as that for TEM. Finally, the chapter is dedicated to cross-sectional specimen preparation for electron microscopy; the preparation indeed decides the quality of imaging and analyses.

  7. Measurement techniques for trace metals in coal-plant effluents: A brief review

    NASA Technical Reports Server (NTRS)

    Singh, J. J.

    1979-01-01

    The strengths and limitations of techniques for determining trace elements in aerosols emitted from coal plants are discussed. Techniques reviewed include atomic absorption spectroscopy, charged particle scattering and activation, instrumental neutron activation analysis, gas/liquid chromatography, gas chromatographic/mass spectrometric methods, X-ray fluorescence, and charged-particle-induced X-ray emission. The latter two methods are emphasized. They provide simultaneous, sensitive multielement analyses and lend themselves readily to depth profiling. It is recommended that whenever feasible, two or more complementary techniques should be used for analyzing environmental samples.

  8. Computed tomography for non-destructive evaluation of composites: Applications and correlations

    NASA Technical Reports Server (NTRS)

    Goldberg, B.; Hediger, L.; Noel, E.

    1985-01-01

    The state-of-the-art fabrication techniques for composite materials are such that stringent species-specific acceptance criteria must be generated to ensure product reliability. Non-destructive evaluation techniques including computed tomography (CT), X-ray radiography (RT), and ultrasonic scanning (UT) are investigated and compared to determine their applicability and limitations for graphite-epoxy, carbon-carbon, and carbon-phenolic materials. While the techniques appear complementary, CT is shown to provide significant, heretofore unattainable data. Finally, a correlation of NDE techniques to destructive analysis is presented.

  9. Development and evaluation of an automatic labeling technique for spring small grains

    NASA Technical Reports Server (NTRS)

    Crist, E. P.; Malila, W. A. (Principal Investigator)

    1981-01-01

    A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.

  10. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide a timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system: a nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.

  11. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow the minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for a quick signature comparison. This study will provide a timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system: a nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.

  12. Signal classification using global dynamical models, Part II: SONAR data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kremliovsky, M.; Kadtke, J.

    1996-06-01

    In Part I of this paper, we described a numerical method for nonlinear signal detection and classification which made use of techniques borrowed from dynamical systems theory. Here in Part II of the paper, we will describe an example of data analysis using this method, for data consisting of open ocean acoustic (SONAR) recordings of marine mammal transients, supplied from NUWC sources. The purpose here is two-fold: first to give a more operational description of the technique and provide rules-of-thumb for parameter choices; and second to discuss some new issues raised by the analysis of non-ideal (real-world) data sets. The particular data set considered here is quite non-stationary, relatively noisy, is not clearly localized in the background, and as such provides a difficult challenge for most detection/classification schemes. © 1996 American Institute of Physics.
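
    The dynamical-systems machinery referred to in Part I rests on phase-space reconstruction of a scalar signal; a minimal sketch of time-delay embedding follows, with the delay and embedding dimension chosen arbitrarily rather than by the paper's rules-of-thumb, and a synthetic chirp standing in for a SONAR transient.

      import numpy as np

      def delay_embed(x, dim, tau):
          # Time-delay embedding of a scalar series into dim-dimensional phase space
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

      # Synthetic stand-in for a marine mammal transient: a noisy chirp
      t = np.linspace(0.0, 1.0, 4000)
      x = np.sin(2 * np.pi * (50 + 200 * t) * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

      Y = delay_embed(x, dim=3, tau=12)   # each row is a point on the reconstructed trajectory
      print(Y.shape)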

  13. A Regional Analysis of U.S. Insurance Reimbursement Guidelines for Massage Therapy.

    PubMed

    Miccio, Robin S; Cowen, Virginia S

    2018-03-01

    Massage techniques fall within the scope of many different health care providers. Physical therapists, occupational therapists, and chiropractors receive insurance reimbursement for health care services, including massage. Although many patients pay out of pocket for massage services, it is unclear how the insurance company reimbursement policies factor provider qualifications into coverage. This project examined regional insurance reimbursement guidelines for massage therapy in relation to the role of the provider of massage services. A qualitative content analysis was used to explore guidelines for 26 health insurance policies across seven US companies providing coverage in the northeastern United States. Publicly available information relevant to massage was obtained from insurance company websites and extracted into a dataset for thematic analysis. Data obtained included practice guidelines, techniques, and provider requirements. Information from the dataset was coded and analyzed using descriptive statistics. Of the policies reviewed, 23% explicitly stated massage treatments were limited to 15-minute increments, 19% covered massage as one part of a comprehensive rehabilitation plan, and 27% required physician prescription. Massage techniques mentioned as qualifying for reimbursement included: Swedish, manual lymphatic drainage, mobilization/manipulation, myofascial release, and traction. Chiropractors, physical therapists, and occupational therapists could directly bill for massage. Massage therapists were specifically excluded as covered providers for seven (27%) policies. Although research supports massage for the treatment of a variety of conditions, the provider type has not been separately addressed. The reviewed policies that served the Northeastern states explicitly stated massage therapists could not bill insurance companies directly. The same insurance companies examined reimbursement for massage therapists in their western U.S. state policies. Other health care providers were able to bill directly for massage services to companies that did not accept direct billing by massage therapists. The specific exclusion of massage therapists as eligible providers violates the Affordable Care Act's non-discriminatory provision. Massage therapists should continue to advocate for reimbursement privileges to spur wider acceptance of massage therapy in health care.

  14. How Does One Assess the Accuracy of Academic Success Predictors? ROC Analysis Applied to University Entrance Factors

    ERIC Educational Resources Information Center

    Vivo, Juana-Maria; Franco, Manuel

    2008-01-01

    This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
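
    A minimal illustration of the ROC procedure the article applies, assuming a binary pass/fail outcome and a continuous entrance score; the data below are synthetic, not the study's.

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)
      # Hypothetical entrance scores: successful students score somewhat higher on average
      success = np.r_[np.ones(200), np.zeros(200)]
      score = np.r_[rng.normal(65, 10, 200), rng.normal(55, 10, 200)]

      fpr, tpr, thresholds = roc_curve(success, score)
      auc = roc_auc_score(success, score)

      # Youden's J picks the threshold maximising sensitivity + specificity - 1
      best = np.argmax(tpr - fpr)
      print(f"AUC = {auc:.3f}, best cut-off = {thresholds[best]:.1f}")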

  15. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…
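
    A sketch of the two data-analysis steps the experiment introduces, smoothing and spectral deconvolution, under the assumption that pure acid- and base-form spectra of the indicator are available; the Gaussian reference spectra, pH and mixing fractions are invented, and the pKa then follows from the Henderson-Hasselbalch relation.

      import numpy as np
      from scipy.signal import savgol_filter

      wl = np.linspace(400, 700, 301)                       # wavelengths (nm)
      A_acid = np.exp(-((wl - 450) / 30.0) ** 2)            # pure acid-form spectrum (hypothetical)
      A_base = np.exp(-((wl - 590) / 35.0) ** 2)            # pure base-form spectrum (hypothetical)
      pH = 7.2
      mixed = 0.35 * A_acid + 0.65 * A_base + 0.01 * np.random.default_rng(0).normal(size=wl.size)

      smoothed = savgol_filter(mixed, window_length=21, polyorder=3)   # data smoothing step

      # Deconvolution: least-squares fit of the mixed spectrum as a sum of the two pure forms
      coeffs, *_ = np.linalg.lstsq(np.column_stack([A_acid, A_base]), smoothed, rcond=None)
      f_HA, f_A = coeffs

      pKa = pH - np.log10(f_A / f_HA)    # Henderson-Hasselbalch: pH = pKa + log10([A-]/[HA])
      print(f"estimated pKa = {pKa:.2f}")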

  16. Reading: Tests and Assessment Techniques. Second Edition. United Kingdom Reading Association Teaching of Reading Monograph Series.

    ERIC Educational Resources Information Center

    Pumfrey, Peter D.

    The second edition of this British publication provides details of recent developments in the assessment of reading attainments and the analysis of reading processes. The book begins with a description of various types of reading tests and assessment techniques with consideration given to the purposes for which normative, criterion-referenced, and…

  17. A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.

    PubMed

    Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi

    2016-10-01

    We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using the median filtering technique, which efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, the Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. When compared to the existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications, such as hospital and healthcare management systems, for proper BMW disposal.
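
    A rough Python sketch of the same pipeline shape (filter, local-pattern histogram, classify) with loudly flagged stand-ins: an ordinary local binary pattern replaces the paper's MLTrP, a support vector machine replaces the RVM, and the labelled patches are random placeholders.

      import numpy as np
      from scipy.ndimage import median_filter
      from skimage.feature import local_binary_pattern
      from sklearn.svm import SVC

      def texture_features(img, P=8, R=1):
          # Median-filter the image, then return a local-pattern histogram
          filtered = median_filter(img, size=3)
          lbp = local_binary_pattern(filtered, P, R, method="uniform")
          hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
          return hist

      # Hypothetical 64x64 grayscale patches, labels 0..3 (body parts, plastics, cotton, liquids)
      rng = np.random.default_rng(1)
      patches = [(rng.random((64, 64)) * 255).astype(np.uint8) for _ in range(40)]
      X = np.array([texture_features(p) for p in patches])
      y = rng.integers(0, 4, 40)

      clf = SVC(kernel="rbf").fit(X, y)      # SVM standing in for the paper's RVM
      print("training accuracy:", clf.score(X, y))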

  18. Practical aspects of NMR signal assignment in larger and challenging proteins

    PubMed Central

    Frueh, Dominique P.

    2014-01-01

    NMR has matured into a technique routinely employed for studying proteins in near physiological conditions. However, applications to larger proteins are impeded by the complexity of the various correlation maps necessary to assign NMR signals. This article reviews the data analysis techniques traditionally employed for resonance assignment and describes alternative protocols necessary for overcoming challenges in large protein spectra. In particular, simultaneous analysis of multiple spectra may help overcome ambiguities or may reveal correlations in an indirect manner. Similarly, visualization of orthogonal planes in a multidimensional spectrum can provide alternative assignment procedures. We describe examples of such strategies for assignment of backbone, methyl, and nOe resonances. We describe experimental aspects of data acquisition for the related experiments and provide guidelines for preliminary studies. Focus is placed on large folded monomeric proteins and examples are provided for 37, 48, 53, and 81 kDa proteins. PMID:24534088

  19. 3D FISH to analyse gene domain-specific chromatin re-modeling in human cancer cell lines.

    PubMed

    Kocanova, Silvia; Goiffon, Isabelle; Bystricky, Kerstin

    2018-06-01

    Fluorescence in situ hybridization (FISH) is a common technique used to label DNA and/or RNA for detection of a genomic region of interest. However, the technique can be challenging, in particular when applied to single genes in human cancer cells. Here, we provide a step-by-step protocol for analysis of short (35 kb-300 kb) genomic regions in three dimensions (3D). We discuss the experimental design and provide practical considerations for 3D imaging and data analysis to determine chromatin folding. We demonstrate that 3D FISH using BACs (Bacterial Artificial Chromosomes) or fosmids can provide detailed information of the architecture of gene domains. More specifically, we show that mapping of specific chromatin landscapes informs on changes associated with estrogen stimulated gene activity in human breast cancer cell lines. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  1. Development of a versatile user-friendly IBA experimental chamber

    NASA Astrophysics Data System (ADS)

    Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad

    2016-03-01

    Reliable performance of Ion Beam Analysis (IBA) techniques is based on the accurate geometry of the experimental setup, employment of reliable nuclear data and implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples of different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross-section data of the 16O(d,p1)17O reaction with the data reported in the literature confirms the performance and capability of the newly developed experimental chamber.

  2. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. These applications have been motivated by the need to better understand LC/MS data generated from analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.

  3. Short-Arc Analysis of Intersatellite Tracking Data in a Gravity Mapping Mission

    NASA Technical Reports Server (NTRS)

    Rowlands, David D.; Ray, Richard D.; Chinn, Douglas S.; Lemoine, Frank G.; Smith, David E. (Technical Monitor)

    2001-01-01

    A technique for the analysis of low-low intersatellite range-rate data in a gravity mapping mission is explored. The technique is based on standard tracking data analysis for orbit determination but uses a spherical coordinate representation of the 12 epoch state parameters describing the baseline between the two satellites. This representation of the state parameters is exploited to allow the intersatellite range-rate analysis to benefit from information provided by other tracking data types without large simultaneous multiple data type solutions. The technique appears especially valuable for estimating gravity from short arcs (e.g., less than 15 minutes) of data. Gravity recovery simulations which use short arcs are compared with those using arcs a day in length. For a high-inclination orbit, the short-arc analysis recovers low-order gravity coefficients remarkably well, although higher order terms, especially sectorial terms, are less accurate. Simulations suggest that either long or short arcs of GRACE data are likely to improve parts of the geopotential spectrum by orders of magnitude.

  4. Evaluating Mixture Modeling for Clustering: Recommendations and Cautions

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2011-01-01

    This article provides a large-scale investigation into several of the properties of mixture-model clustering techniques (also referred to as latent class cluster analysis, latent profile analysis, model-based clustering, probabilistic clustering, Bayesian classification, unsupervised learning, and finite mixture models; see Vermunt & Magdison,…
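
    A brief sketch of model-based clustering of the kind investigated, using scikit-learn's Gaussian mixtures with the Bayesian information criterion (BIC) to choose the number of latent classes; the two-cluster data are synthetic.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(0.0, 1.0, (150, 2)),
                     rng.normal(4.0, 1.0, (150, 2))])   # two latent classes, synthetic

      # Fit mixtures with 1-5 components and keep the one with the lowest BIC
      models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)]
      best = min(models, key=lambda m: m.bic(X))

      labels = best.predict(X)    # hard cluster assignments from the selected mixture model
      print("selected number of components:", best.n_components)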

  5. Interest rate next-day variation prediction based on hybrid feedforward neural network, particle swarm optimization, and multiresolution techniques

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2016-02-01

    Multiresolution analysis techniques including continuous wavelet transform, empirical mode decomposition, and variational mode decomposition are tested in the context of interest rate next-day variation prediction. In particular, multiresolution analysis techniques are used to decompose the actual interest rate variation, and a feedforward neural network is used for training and prediction. The particle swarm optimization technique is adopted to optimize the network's initial weights. For comparison purposes, an autoregressive moving average model, a random walk process and the naive model are used as the main reference models. In order to show the feasibility of the presented hybrid models that combine multiresolution analysis techniques and a feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates, including Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-Month, 6-Month and 1-Year treasury bills, and the effective federal fund rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations as they provide good forecasting performance.
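
    A stripped-down sketch of one hybrid configuration: a discrete wavelet decomposition of the variation series feeding a small feedforward network. PyWavelets and scikit-learn stand in for the paper's setup, the random-walk series and wavelet/energy features are assumptions, and the particle-swarm weight initialisation is omitted entirely.

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(3)
      rate = np.cumsum(rng.normal(0, 0.02, 600))     # synthetic daily interest-rate series
      variation = np.diff(rate)

      def features(window):
          # Multiresolution features: energy of the wavelet coefficients at each level
          coeffs = pywt.wavedec(window, "db4", level=3)
          return np.array([np.sum(c ** 2) for c in coeffs])

      lag = 64
      X = np.array([features(variation[i : i + lag]) for i in range(len(variation) - lag)])
      y = variation[lag:]                            # next-day variation to predict

      model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)
      pred = model.predict(X[-5:])
      print("MAE on last 5 in-sample points:", np.mean(np.abs(pred - y[-5:])))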

  6. Integration of heterogeneous data for classification in hyperspectral satellite imagery

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Czaja, W.; Dobrosotskaya, J.; Doster, T.; Duke, K.; Gillis, D.

    2012-06-01

    As new remote sensing modalities emerge, it becomes increasingly important to find more suitable algorithms for fusion and integration of different data types for the purposes of target/anomaly detection and classification. Typical techniques that deal with this problem are based on performing detection/classification/segmentation separately in chosen modalities, and then integrating the resulting outcomes into a more complete picture. In this paper we provide a broad analysis of a new approach, based on creating fused representations of the multi-modal data, which then can be subjected to analysis by means of the state-of-the-art classifiers or detectors. In this scenario we shall consider the hyperspectral imagery combined with spatial information. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. Then, the significant eigenvectors of the derived fused graph Laplace operator form the new representation, which provides integrated features from the heterogeneous input data. We compare these fused approaches with analysis of integrated outputs of spatial and spectral graph methods.
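
    A small numpy sketch of the fused-representation idea: spectral and spatial features are concatenated into a joint affinity graph, and the leading eigenvectors of its normalised Laplacian become the integrated features. The heat-kernel bandwidth, feature scaling and synthetic data are all assumptions, not the paper's construction.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 200
      spectral = rng.random((n, 30))                 # per-pixel spectra (synthetic)
      spatial = rng.random((n, 2))                   # per-pixel coordinates (synthetic)

      # Joint data-dependent graph from the concatenated modalities
      Z = np.hstack([spectral, spatial / spatial.std()])
      d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
      W = np.exp(-d2 / np.median(d2))                # diffusion (heat) kernel affinities

      deg = W.sum(1)
      L_sym = np.eye(n) - W / np.sqrt(np.outer(deg, deg))   # normalised graph Laplacian

      vals, vecs = np.linalg.eigh(L_sym)
      fused = vecs[:, 1:11]    # smallest nontrivial eigenvectors serve as fused features
      print(fused.shape)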

  7. Chemical Differentiation of Osseous, Dental, and Non-skeletal Materials in Forensic Anthropology using Elemental Analysis.

    PubMed

    Zimmerman, Heather A; Meizel-Lambert, Cayli J; Schultz, John J; Sigman, Michael E

    2015-03-01

    Forensic anthropologists are generally able to identify skeletal materials (bone and tooth) using gross anatomical features; however, highly fragmented or taphonomically altered materials may be problematic to identify. Several chemical analysis techniques have been shown to be reliable laboratory methods that can be used to determine if questionable fragments are osseous, dental, or non-skeletal in nature. The purpose of this review is to provide a detailed background of chemical analysis techniques focusing on elemental compositions that have been assessed for use in differentiating osseous, dental, and non-skeletal materials. More recently, chemical analysis studies have also focused on using the elemental composition of osseous/dental materials to evaluate species and provide individual discrimination, but have generally been successful only in small, closed groups, limiting their use forensically. Despite significant advances incorporating a variety of instruments, including handheld devices, further research is necessary to address issues in standardization, error rates, and sample size/diversity. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  8. The analysis and forecasting of male cycling time trial records established within England and Wales.

    PubMed

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-01-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length and over time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition records was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique, a method that has not been applied to sport-based time series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest rate of record change during the 1950-1969 period. While the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
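
    A bare-bones Singular Spectrum Analysis decomposition, the first stage of the kind of forecasting used in the paper: embed the record series in a trajectory matrix, take the SVD, and reconstruct the leading component. The window length, component count and the short record series itself are arbitrary illustrative choices, not the study's data.

      import numpy as np

      def ssa_reconstruct(x, window, n_components):
          # Reconstruct a series from its leading SSA components
          n = len(x)
          k = n - window + 1
          X = np.column_stack([x[i : i + window] for i in range(k)])  # trajectory (Hankel) matrix
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(n_components))
          # Diagonal averaging (Hankelisation) back to a one-dimensional series
          out, counts = np.zeros(n), np.zeros(n)
          for col in range(k):
              out[col : col + window] += Xr[:, col]
              counts[col : col + window] += 1
          return out / counts

      record = np.array([25.8, 25.1, 24.4, 23.9, 23.3, 22.8, 22.5, 22.1,
                         21.8, 21.6, 21.3, 21.1])   # hypothetical record times
      trend = ssa_reconstruct(record, window=4, n_components=1)
      print(trend.round(2))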

  9. Mixed-venous oxygen tension by nitrogen rebreathing - A critical, theoretical analysis.

    NASA Technical Reports Server (NTRS)

    Kelman, G. R.

    1972-01-01

    There is dispute about the validity of the nitrogen rebreathing technique for the determination of mixed-venous oxygen tension. This theoretical analysis examines the circumstances under which the technique is likely to be applicable. When the plateau method is used, the probable error in mixed-venous oxygen tension is plus or minus 2.5 mm Hg at rest, and of the order of plus or minus 1 mm Hg during exercise. Provided that the rebreathing bag size is reasonably chosen, Denison's (1967) extrapolation technique gives results at least as accurate as those obtained by the plateau method. At rest, however, extrapolation should be to 30 rather than to 20 sec.

  10. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA for the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing including exhaust waveform analysis.

  11. Methods for automatically analyzing humpback song units.

    PubMed

    Rickwood, Peter; Taylor, Andrew

    2008-03-01

    This paper presents mathematical techniques for automatically extracting and analyzing bioacoustic signals. Automatic techniques are described for isolation of target signals from background noise, extraction of features from target signals and unsupervised classification (clustering) of the target signals based on these features. The only user-provided input, other than raw sound, is an initial set of signal processing and control parameters. Of particular note is that the number of signal categories is determined automatically. The techniques, applied to hydrophone recordings of humpback whales (Megaptera novaeangliae), produce promising initial results, suggesting that they may be of use in automated analysis of not only humpbacks, but possibly also in other bioacoustic settings where automated analysis is desirable.

  12. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.

  13. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not follow normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performance of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
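
    The plain SIFT workflow the paper evaluates is available in OpenCV; a minimal tie-point sketch with a Lowe ratio test follows. The image paths are placeholders, and this is standard SIFT, not the authors' auto-adaptive A² SIFT variant.

      import cv2

      img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)     # placeholder image pair
      img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

      sift = cv2.SIFT_create()
      kp1, des1 = sift.detectAndCompute(img1, None)
      kp2, des2 = sift.detectAndCompute(img2, None)

      # Nearest-neighbour matching with Lowe's ratio test to reject ambiguous matches
      bf = cv2.BFMatcher()
      matches = bf.knnMatch(des1, des2, k=2)
      good = [m for m, n in matches if m.distance < 0.75 * n.distance]

      # Matched keypoint coordinates usable as candidate tie points
      tie_points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]
      print(len(tie_points), "candidate tie points")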

  14. Genetic analysis of Asian longhorned beetle populations from Chicago, New York, and China using the RAPD technique

    Treesearch

    James M. Slavicek; Patricia De Graff

    2003-01-01

    Anoplophora glabripennis samples were collected in the Ravenswood area of Chicago, near the Mt. Zion cemetery in Queens, New York (provided by Leah Bauer) and the Gansu Province in northwest China (provided by Leah Bauer).

  15. The Study of an Integrated Rating System for Supplier Quality Performance in the Semiconductor Industry

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Cheng; Yen, Tieh-Min; Tsai, Chih-Hung

    This study provides an integrated model of Supplier Quality Performance Assessment (SQPA) activity for the semiconductor industry by introducing the ISO 9001 management framework, Importance-Performance Analysis (IPA) and Taguchi's Signal-to-Noise Ratio (S/N) techniques. This integrated model provides a SQPA methodology to create value for all members under mutual cooperation and trust in the supply chain. The method helps organizations build a complete SQPA framework, linking organizational objectives and SQPA activities and optimizing rating techniques to promote supplier quality improvement. The techniques used in SQPA activities are easily understood. A case involving a design house illustrates the model.
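
    A minimal illustration of the Taguchi signal-to-noise step, under the assumption that supplier quality ratings are treated as larger-the-better responses; the supplier names and scores are invented.

      import numpy as np

      def sn_larger_the_better(y):
          # Taguchi S/N ratio (dB) for larger-the-better responses: -10 log10(mean(1/y^2))
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))

      # Hypothetical quality ratings of three suppliers over four assessment periods
      suppliers = {"A": [8.2, 7.9, 8.4, 8.1],
                   "B": [9.0, 6.5, 8.8, 7.0],
                   "C": [7.5, 7.6, 7.4, 7.7]}
      for name, scores in suppliers.items():
          print(name, round(sn_larger_the_better(scores), 2), "dB")

    Note how supplier B, despite a high mean rating, is penalised for inconsistency relative to the steadier supplier A; that robustness-to-variation view is the point of using S/N ratios for rating.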

  16. Objective analysis of tidal fields in the Atlantic and Indian Oceans

    NASA Technical Reports Server (NTRS)

    Sanchez, B. V.; Rao, D. B.; Steenrod, S. D.

    1986-01-01

    An objective analysis technique has been developed to extrapolate tidal amplitudes and phases over entire ocean basins using existing gauge data and the altimetric measurements which are now beginning to be provided by satellite oceanography. The technique was previously tested in the Lake Superior basin. The method has now been developed and applied in the Atlantic-Indian ocean basins using a 6 deg x 6 deg grid to test its essential features. The functions used in the interpolation are the eigenfunctions of the velocity potential (Proudman functions), which are computed numerically from a knowledge of the basin's bottom topography, the horizontal plan form and the necessary boundary conditions. These functions are characteristic of the particular basin. The gravitational normal modes of the basin are computed as part of the investigation; they are used to obtain the theoretical forced solutions for the tidal constituents, which provide the simulated data for testing the method and serve as a guide in choosing the most energetic modes for the objective analysis. The results of the objective analysis of the M2 and K1 tidal constituents indicate the possibility of recovering the tidal signal with a degree of accuracy well within the error bounds of present day satellite techniques.

  17. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
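
    A short sketch of the factor-analysis starting point the patent builds on: principal component analysis of a spectral image unfolded to a (pixels x channels) matrix, yielding spectral loadings and spatial score maps. The synthetic cube is a placeholder, and the invention's spatial-simplicity constraints are not reproduced here.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      rows, cols, channels = 32, 32, 128
      cube = rng.random((rows, cols, channels))          # synthetic spectral image

      D = cube.reshape(-1, channels)                     # unfold: one spectrum per pixel
      pca = PCA(n_components=5).fit(D)

      loadings = pca.components_                         # spectral characteristics (5 x channels)
      scores = pca.transform(D).reshape(rows, cols, 5)   # abundance-like spatial maps
      print("variance explained:", pca.explained_variance_ratio_.round(3))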

  18. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  19. Radio Occultation Experiments with Venus Express and Mars Express using the Planetary Radio Interferometry and Doppler Experiment (PRIDE) Technique

    NASA Astrophysics Data System (ADS)

    Bocanegra Bahamon, T.; Gurvits, L.; Molera Calves, G.; Cimo, G.; Duev, D.; Pogrebenko, S.; Dirkx, D.; Rosenblatt, P.

    2017-12-01

    The Planetary Radio Interferometry and Doppler Experiment (PRIDE) is a technique that can be used to enhance multiple radio science experiments of planetary missions. By 'eavesdropping' on the spacecraft signal using radio telescopes from different VLBI networks around the world, the PRIDE technique provides precise open-loop Doppler and VLBI observables from which the spacecraft's orbit can be reconstructed. The application of this technique to atmospheric studies has been assessed by observing ESA's Venus Express (VEX) and Mars Express (MEX) during multiple Venus and Mars occultation events between 2012 and 2014. From these observing sessions, density, temperature and pressure profiles of the neutral atmospheres and ionospheres of Venus and Mars have been retrieved. We present an error propagation analysis in which the uncertainties of the atmospheric properties measured with this technique have been derived. These activities serve as a demonstration of the applicability of the PRIDE technique to radio occultation studies, and provide a benchmark against the traditional Doppler tracking provided by NASA's DSN and ESA's Estrack networks for these same purposes, in the framework of the upcoming ESA JUICE mission to the Jovian system.

  20. Steam generator tubing NDE performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, G.; Welty, C.S. Jr.

    1997-02-01

    Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.

  1. Circular dichroism spectroscopy: Enhancing a traditional undergraduate biochemistry laboratory experience.

    PubMed

    Lewis, Russell L; Seal, Erin L; Lorts, Aimee R; Stewart, Amanda L

    2017-11-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they begin their careers. One of the most common biochemistry protein purification experiments is the isolation and characterization of cytochrome c. Students across the country purify cytochrome c, lysozyme, or some other well-known protein to learn these common purification techniques. What this series of experiments lacks is the use of sophisticated instrumentation that is rarely available to undergraduate students. To give students a broader background in biochemical spectroscopy techniques, a new circular dichroism (CD) laboratory experiment was introduced into the biochemistry laboratory curriculum. This CD experiment provides students with a means of conceptualizing the secondary structure of their purified protein, and assessments indicate that students' understanding of the technique increased significantly. Students conducted this experiment with ease and in a short time frame, so this laboratory is conducive to merging with other data analysis techniques within a single laboratory period. © 2017 The International Union of Biochemistry and Molecular Biology, 45(6):515-520, 2017.

  2. Current process in hearing-aid fitting appointments: An analysis of audiologists' use of behaviour change techniques using the behaviour change technique taxonomy (v1).

    PubMed

    Barker, Fiona; Mackenzie, Emma; de Lusignan, Simon

    2016-11-01

    To observe and analyse the range and nature of behaviour change techniques (BCTs) employed by audiologists during hearing-aid fitting consultations to encourage and enable hearing-aid use. Non-participant observation and qualitative thematic analysis using the behaviour change technique taxonomy (version 1) (BCTTv1). Ten consultations across five English NHS audiology departments. Audiologists engage in behaviours to ensure the hearing-aid is fitted to prescription and is comfortable to wear. They provide information, equipment, and training in how to use a hearing-aid including changing batteries, cleaning, and maintenance. There is scope for audiologists to use additional BCTs: collaborating with patients to develop a behavioural plan for hearing-aid use that includes goal-setting, action-planning and problem-solving; involving significant others; providing information on the benefits of hearing-aid use or the consequences of non-use and giving advice about using prompts/cues for hearing-aid use. This observational study of audiologist behaviour in hearing-aid fitting consultations has identified opportunities to use additional behaviour change techniques that might encourage hearing-aid use. This information defines potential intervention targets for further research with the aim of improving hearing-aid use amongst adults with acquired hearing loss.

  3. Combined use of quantitative ED-EPMA, Raman microspectrometry, and ATR-FTIR imaging techniques for the analysis of individual particles.

    PubMed

    Jung, Hae-Jin; Eom, Hyo-Jin; Kang, Hyun-Woo; Moreau, Myriam; Sobanska, Sophie; Ro, Chul-Un

    2014-08-21

    In this work, quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA) (called low-Z particle EPMA), Raman microspectrometry (RMS), and attenuated total reflectance Fourier transform infrared spectroscopic (ATR-FTIR) imaging were applied in combination for the analysis of the same individual airborne particles for the first time. After examining individual particles of micrometer size by low-Z particle EPMA, consecutive examinations by RMS and ATR-FTIR imaging of the same individual particles were then performed. The relocation of the same particles on Al or Ag foils was successfully carried out among the three standalone instruments for several standard samples and an indoor airborne particle sample, resulting in the successful acquisition of quality spectral data from the three single-particle analytical techniques. The combined application of the three techniques to several different standard particles confirmed that those techniques provided consistent and complementary chemical composition information on the same individual particles. Further, it was clearly demonstrated that the three different types of spectral and imaging data from the same individual particles in an indoor aerosol sample provided richer information on physicochemical characteristics of the particle ensemble than that obtainable by the combined use of two single-particle analytical techniques.

  4. Measuring protein dynamics in live cells: protocols and practical considerations for fluorescence fluctuation microscopy

    PubMed Central

    Youker, Robert T.; Teng, Haibing

    2014-01-01

    Quantitative analysis of protein complex stoichiometries and mobilities is critical for elucidating the mechanisms that regulate cellular pathways. Fluorescence fluctuation spectroscopy (FFS) techniques can measure protein dynamics, such as diffusion coefficients and formation of complexes, with extraordinary precision and sensitivity. Complete calibration and characterization of the microscope instrument is necessary in order to avoid artifacts during data acquisition and to capitalize on the full capabilities of FFS techniques. We provide an overview of the theory behind FFS techniques, discuss calibration procedures, provide protocols, and give practical considerations for performing FFS experiments. One important parameter recovered from FFS measurements is the relative molecular brightness, which can correlate with oligomerization. Three methods for measuring molecular brightness (fluorescence correlation spectroscopy, photon-counting histogram, and number and brightness analysis) recover similar values when measuring samples under ideal conditions in vitro. However, examples are given illustrating that these different methods used for calculating molecular brightness of fluorescent molecules in cells are not always equivalent. Methods relying on spot measurements are more prone to bleaching and movement artifacts that can lead to underestimation of brightness values. We advocate for the use of multiple FFS techniques to study molecular brightnesses to overcome and complement the limitations of individual techniques. PMID:25260867
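
    As a concrete illustration of one of the three brightness methods named above, the sketch below implements a bare-bones number and brightness (N&B) calculation, with apparent brightness B = σ²/⟨I⟩ and apparent number N = ⟨I⟩²/σ² computed per pixel from an image stack. Detector offset and readout-variance corrections used in practice are omitted; the synthetic data are an assumption for demonstration.

```python
import numpy as np

def number_and_brightness(stack: np.ndarray):
    """stack: (n_frames, ny, nx) intensity time series per pixel."""
    mean = stack.mean(axis=0)
    var = stack.var(axis=0)
    brightness = var / mean      # apparent brightness, counts/molecule/frame
    number = mean ** 2 / var     # apparent number of fluctuating molecules
    return number, brightness

# Example with synthetic Poisson data: pure shot noise gives brightness ~ 1,
# so values above 1 would indicate extra fluctuations, e.g. oligomerization.
rng = np.random.default_rng(0)
stack = rng.poisson(lam=50.0, size=(200, 64, 64)).astype(float)
n, b = number_and_brightness(stack)
print(b.mean())  # ~1.0 for uncorrelated shot noise
```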

  5. Comparative evaluation of differential laser-induced perturbation spectroscopy as a technique to discriminate emerging skin pathology

    NASA Astrophysics Data System (ADS)

    Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.

    2012-06-01

    Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).

  6. Measurements of Cuspal Slope Inclination Angles in Palaeoanthropological Applications

    NASA Astrophysics Data System (ADS)

    Gaboutchian, A. V.; Knyaz, V. A.; Leybova, N. A.

    2017-05-01

    Tooth crown morphological features, studied in palaeoanthropology, provide valuable information about human evolution and the development of civilization. Tooth crown morphology represents biological and historical data of high taxonomical value, as it characterizes genetically conditioned tooth relief features that resist substantial change under environmental factors during lifetime. Palaeoanthropological studies are still based mainly on descriptive techniques and manual measurements of a limited number of morphological parameters. Feature evaluation and measurement result analysis are expert-based. The development of new methods and techniques in 3D imaging creates a background for better palaeoanthropological data processing, analysis and distribution. The goals of the presented research are to propose new features for automated odontometry and to explore their applicability to palaeoanthropological studies. A technique for automated measurement of given morphological tooth parameters needed for anthropological study is developed. It is based on an original photogrammetric system used as a teeth 3D model acquisition device and on a set of algorithms for estimating the given tooth parameters.

  7. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties, between intended and interfering subscribers, significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up-down link transmission.
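
    As a sketch of the kind of BER-versus-SNR evaluation described above, the snippet below applies the Gaussian-approximation formula commonly used in SAC-OCDMA analyses, BER = ½·erfc(√(SNR/8)). The SNR expression for the EMD code itself (its PIIN, shot-noise and thermal-noise terms) is not reproduced here; the SNR values are illustrative.

```python
import numpy as np
from scipy.special import erfc

def ber_from_snr(snr_linear: np.ndarray) -> np.ndarray:
    """Gaussian-approximation BER used in SAC-OCDMA performance analyses."""
    return 0.5 * erfc(np.sqrt(snr_linear / 8.0))

snr_db = np.arange(10, 31, 5)
for s_db, ber in zip(snr_db, ber_from_snr(10 ** (snr_db / 10))):
    print(f"SNR = {s_db:2d} dB -> BER = {ber:.2e}")
```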

  8. Electrospray Modifications for Advancing Mass Spectrometric Analysis

    PubMed Central

    Meher, Anil Kumar; Chen, Yu-Chie

    2017-01-01

    Generation of analyte ions in the gas phase is a primary requirement for mass spectrometric analysis. One of the ionization techniques that can be used to generate gas phase ions is electrospray ionization (ESI). ESI is a soft ionization method that can be used to analyze analytes ranging from small organics to large biomolecules. Numerous ionization techniques derived from ESI have been reported in the past two decades. These ion sources aim at simplicity and ease of operation. Many of these ionization methods allow the flexibility to eliminate or minimize sample preparation steps prior to mass spectrometric analysis. Such ion sources have opened up new possibilities for taking on scientific challenges that might be limited by the conventional ESI technique. Thus, the number of ESI variants continues to increase. This review provides an overview of ionization techniques based on the use of electrospray reported in recent years. A brief discussion of the instrumentation, underlying processes, and selected applications is also presented. PMID:28573082

  9. An evaluation of machine processing techniques of ERTS-1 data for user applications. [urban land use and soil association mapping in Indiana

    NASA Technical Reports Server (NTRS)

    Landgrebe, D.

    1974-01-01

    A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping, together with previously reported results in general earth surface feature identification and crop species classification, a profile of the general applicability of this procedure is beginning to emerge. Put in the hands of a user who knows the information needed from the data and is familiar with the region to be analyzed, these methods appear capable of generating significantly useful information. When supported by preprocessing techniques such as geometric correction and temporal registration, final products readily usable by user agencies appear possible. In parallel with application, further research holds much potential for developing these techniques, both toward higher performance and in new situations not yet studied.

  10. Use of communication techniques by Maryland dentists.

    PubMed

    Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V

    2013-12-01

    Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. The significance level was set at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.

  11. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon counting processor for low level dual scatter laser velocimeter (LV) signals, capable of both the first order measurements of mean flow and turbulence intensity and the second order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate and subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
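
    A minimal sketch of the second-order statistic at the heart of such a processor: an (unnormalized) autocorrelation of photon counts accumulated in fixed time bins. Real photon correlators operate on photon arrival times in hardware; binned counts and synthetic Poisson data are simplifying assumptions here.

```python
import numpy as np

def autocorrelation(counts: np.ndarray, max_lag: int) -> np.ndarray:
    """Unnormalized autocovariance of binned photon counts vs. lag."""
    counts = counts - counts.mean()
    n = len(counts)
    return np.array([np.dot(counts[: n - k], counts[k:]) / (n - k)
                     for k in range(max_lag)])

rng = np.random.default_rng(1)
counts = rng.poisson(2.0, size=10_000)   # simulated photon counts per bin
acf = autocorrelation(counts, max_lag=50)
print(acf[:5])  # lag 0 ~ variance; higher lags ~ 0 for uncorrelated counts
```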

  12. Reliability/safety analysis of a fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goddman, H. A.

    1980-01-01

    An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.

  13. Market basket analysis visualization on a spherical surface

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Hsu, Meichun; Dayal, Umeshwar; Wei, Shu F.; Sprenger, Thomas; Holenstein, Thomas

    2001-05-01

    This paper discusses the visualization of relationships in e-commerce transactions. To date, many practical research projects have shown the usefulness of a physics-based mass-spring technique to lay out data items with close relationships on a graph. We describe a market basket analysis visualization system using this technique. The system (1) integrates a physics-based engine into a visual data mining platform; (2) uses a 3D spherical surface to visualize clusters of related data items; and (3) for large volumes of transactions, uses hidden structures to unclutter the display. Several examples of market basket analysis are also provided.
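
    A minimal sketch of the core mass-spring idea, under the assumption that association strength is turned into a spring rest length (stronger association, shorter spring) and positions are relaxed by simple force steps. The paper's engine maps the result onto a 3D spherical surface; this 2D version only illustrates the force iteration.

```python
import numpy as np

def spring_layout(weights: np.ndarray, n_iter: int = 500, lr: float = 0.01) -> np.ndarray:
    """Relax items joined by springs; weights[i, j] = association strength."""
    n = weights.shape[0]
    rng = np.random.default_rng(0)
    pos = rng.normal(size=(n, 2))
    rest = 1.0 / (weights + 0.1)            # stronger link -> shorter spring
    for _ in range(n_iter):
        diff = pos[:, None, :] - pos[None, :, :]      # pairwise displacements
        dist = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dist, 1.0)                   # avoid divide-by-zero
        stretch = dist - rest
        np.fill_diagonal(stretch, 0.0)                # no self-force
        force = -(stretch / dist)[..., None] * diff   # Hooke pull/push
        pos += lr * force.sum(axis=1)
    return pos

# Example: item pairs 0-1 and 1-2 are co-bought often, 0-2 rarely
w = np.array([[0.0, 3.0, 0.1],
              [3.0, 0.0, 2.0],
              [0.1, 2.0, 0.0]])
print(spring_layout(w).round(2))  # strongly associated items end up closer
```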

  14. Applications of liquid-based separation in conjunction with mass spectrometry to the analysis of forensic evidence.

    PubMed

    Moini, Mehdi

    2018-05-01

    In the past few years, there has been a significant effort by the forensic science community to develop new scientific techniques for the analysis of forensic evidence. Forensic chemists have spearheaded efforts to develop information-rich confirmatory technologies and techniques and apply them to a broad array of forensic challenges. The purpose of these confirmatory techniques is to provide alternatives to presumptive techniques that rely on data such as color changes, pattern matching, or retention time alone, which are prone to more false positives. To this end, the application of separation techniques in conjunction with mass spectrometry has played an important role in the analysis of forensic evidence. Moreover, in the past few years the role of liquid separation techniques, such as liquid chromatography and capillary electrophoresis in conjunction with mass spectrometry, has gained significant traction, and these techniques have been applied to a wide range of chemicals, from small molecules such as drugs and explosives, to large molecules such as proteins. For example, proteomics and peptidomics have been used for identification of humans, organs, and bodily fluids. A wide range of HPLC techniques including reversed phase, hydrophilic interaction, mixed-mode, supercritical fluid, multidimensional chromatography, and nanoLC, as well as several modes of capillary electrophoresis mass spectrometry, including capillary zone electrophoresis, partial filling, full filling, and micellar electrokinetic chromatography, have been applied to the analysis of drugs, explosives, and questioned documents. In this article, we review recent (2015-2017) applications of liquid separation in conjunction with mass spectrometry to the analysis of forensic evidence. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Fault Tree Analysis: Its Implications for Use in Education.

    ERIC Educational Resources Information Center

    Barker, Bruce O.

    This study introduces the concept of Fault Tree Analysis as a systems tool and examines the implications of Fault Tree Analysis (FTA) as a technique for isolating failure modes in educational systems. A definition of FTA and discussion of its history, as it relates to education, are provided. The step by step process for implementation and use of…

  16. What School Psychologists Need to Know about Factor Analysis

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Dombrowski, Stefan C.

    2017-01-01

    Factor analysis is a versatile class of psychometric techniques used by researchers to provide insight into the psychological dimensions (factors) that may account for the relationships among variables in a given dataset. The primary goal of a factor analysis is to determine a more parsimonious set of variables (i.e., fewer than the number of…

  17. Physics and Analysis at a Hadron Collider - An Introduction (1/3)

    ScienceCinema

    None

    2018-05-11

    This is the first lecture of three which together discuss the physics of hadron colliders with an emphasis on experimental techniques used for data analysis. This first lecture provides a brief introduction to hadron collider physics and collider detector experiments as well as offers some analysis guidelines. The lectures are aimed at graduate students.

  18. The R-package eseis - A toolbox to weld geomorphic, seismologic, spatial, and time series analysis

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2017-04-01

    Environmental seismology is the science of investigating the seismic signals that are emitted by Earth surface processes. This emerging field provides unique opportunities to identify, locate, track and inspect a wide range of the processes that shape our planet. Modern broadband seismometers are sensitive enough to detect signals from sources as weak as wind interacting with the ground and as powerful as collapsing mountains. This places the field of environmental seismology at the seams of many geoscientific disciplines and requires integration of a series of specialised analysis techniques. R provides the perfect environment for this challenge. The package eseis uses the foundations laid by a series of existing packages and data types tailored to solve specialised problems (e.g., signal, sp, rgdal, Rcpp, matrixStats) and thus provides access to efficiently handling large streams of seismic data (> 300 million samples per station and day). It supports standard data formats (mseed, sac), preparation techniques (deconvolution, filtering, rotation), processing methods (spectra, spectrograms, event picking, migration for localisation) and data visualisation. Thus, eseis provides a seamless approach to the entire workflow of environmental seismology and passes the output to related analysis fields with temporal, spatial and modelling focus in R.
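
    eseis itself is an R package; as a language-neutral illustration of one processing step it supports (a spectrogram of a seismic trace), the sketch below uses Python's scipy. The synthetic trace and parameters are stand-ins for real miniSEED data.

```python
import numpy as np
from scipy import signal

fs = 200.0                                # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)              # one minute of data
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.normal(size=t.size)

# Power spectral density over sliding windows, the basis of event screening
f, tt, sxx = signal.spectrogram(trace, fs=fs, nperseg=1024, noverlap=512)
print(sxx.shape)  # (frequency bins, time windows)
```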

  19. A comparison between DART-MS and DSA-MS in the forensic analysis of writing inks.

    PubMed

    Drury, Nicholas; Ramotowski, Robert; Moini, Mehdi

    2018-05-23

    Ambient ionization mass spectrometry is gaining momentum in forensic science laboratories because of its high speed of analysis, minimal sample preparation, and information-rich results. One such application of ambient ionization methodology includes the analysis of writing inks from questioned documents, where colorants of interest may not be soluble in common solvents, rendering thin layer chromatography (TLC) and separation-mass spectrometry methods such as LC/MS (-MS) impractical. Ambient ionization mass spectrometry uses a variety of ionization techniques, such as Penning ionization in Direct Analysis in Real Time (DART), atmospheric pressure chemical ionization in Direct Sample Analysis (DSA), and electrospray ionization in Desorption Electrospray Ionization (DESI). In this manuscript, two of the commonly used ambient ionization techniques are compared: Perkin Elmer DSA-MS and IonSense DART in conjunction with a JEOL AccuTOF MS. Both technologies were equally successful in analyzing writing inks and produced similar spectra. DSA-MS produced less background signal, likely because of its closed source configuration; however, the open source configuration of DART-MS provided more flexibility in sample positioning for optimum sensitivity, thereby allowing smaller pieces of paper containing writing ink to be analyzed. Under these conditions, the minimum sample required for DART-MS was 1 mm strokes of ink on paper, whereas DSA-MS required a minimum of 3 mm. Moreover, both techniques showed comparable repeatability. Evaluation of the analytical figures of merit, including sensitivity, linear dynamic range, and repeatability, for DSA-MS and DART-MS analysis is provided. To place the technique in a forensic context, DART-MS was applied to the analysis of United States Secret Service ink samples directly on a sampling mesh, and the results were compared with DSA-MS of the same inks on paper. Unlike analysis using separation mass spectrometry, which requires sample preparation, both DART-MS and DSA-MS successfully analyzed writing inks with minimal sample preparation. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Metabolic changes associated with papillary thyroid carcinoma: A nuclear magnetic resonance-based metabolomics study.

    PubMed

    Li, Yanyun; Chen, Minjian; Liu, Cuiping; Xia, Yankai; Xu, Bo; Hu, Yanhui; Chen, Ting; Shen, Meiping; Tang, Wei

    2018-05-01

    Papillary thyroid carcinoma (PTC) is the most common thyroid cancer. The nuclear magnetic resonance (NMR)-based metabolomic technique is the gold standard in metabolite structural elucidation and can provide different coverage of information compared with other metabolomic techniques. Here, we conducted the first NMR-based metabolomics study of the detailed metabolic changes, especially the metabolic pathway changes, related to PTC pathogenesis. The 1H NMR-based metabolomic technique was adopted in conjunction with multivariate analysis to analyze matched tumor and normal thyroid tissues obtained from 16 patients. The results were further annotated with the Kyoto Encyclopedia of Genes and Genomes (KEGG) and the Human Metabolome Database, and then analyzed using the pathway analysis and enrichment analysis modules of MetaboAnalyst 3.0. Based on these analytical techniques, we established models of principal component analysis (PCA), partial least squares-discriminant analysis (PLS-DA), and orthogonal partial least-squares discriminant analysis (OPLS-DA) which could discriminate PTC from normal thyroid tissue, and found 15 robust differentiated metabolites from the two OPLS-DA models. We identified 8 KEGG pathways and 3 pathways of the Small Molecule Pathway Database that were significantly related to PTC by using pathway analysis and enrichment analysis, respectively, through which we identified metabolisms related to PTC including branched chain amino acid metabolism (leucine and valine), other amino acid metabolism (glycine and taurine), glycolysis (lactate), the tricarboxylic acid cycle (citrate), choline metabolism (choline, ethanolamine and glycerophosphocholine) and lipid metabolism (very-low-density lipoprotein and low-density lipoprotein). In conclusion, PTC was characterized by increased glycolysis and an inhibited tricarboxylic acid cycle, increased oncogenic amino acids, and abnormal choline and lipid metabolism. The findings in this study provide new insights into the detailed metabolic changes of PTC and hold great potential for the treatment of PTC.
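
    A hedged sketch of the multivariate step named above: unsupervised PCA followed by PLS-DA, here approximated as a PLS regression against a 0/1 class vector using scikit-learn. The synthetic matrix stands in for binned 1H NMR spectra; it is not the study's data or its exact validation workflow.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 200))       # 32 tissue spectra x 200 spectral bins
y = np.repeat([0, 1], 16)            # 0 = normal thyroid, 1 = PTC tumor
X[y == 1, :10] += 1.0                # inject a synthetic class difference

scores = PCA(n_components=2).fit_transform(X)    # unsupervised overview
plsda = PLSRegression(n_components=2).fit(X, y)  # PLS-DA via 0/1 regression
print(scores.shape, plsda.score(X, y))           # PCA scores; training R^2
```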

  1. Laser Ablation-Aerosol Mass Spectrometry-Chemical Ionization Mass Spectrometry for Ambient Surface Imaging

    DOE PAGES

    Berry, Jennifer L.; Day, Douglas A.; Elseberg, Tim; ...

    2018-02-20

    Mass spectrometry imaging is becoming an increasingly common analytical technique due to its ability to provide spatially resolved chemical information. In this paper, we report a novel imaging approach combining laser ablation with two mass spectrometric techniques, aerosol mass spectrometry and chemical ionization mass spectrometry, separately and in parallel. Both mass spectrometric methods provide the fast response, rapid data acquisition, low detection limits, and high-resolution peak separation desirable for imaging complex samples. Additionally, the two techniques provide complementary information, with aerosol mass spectrometry providing near universal detection of all aerosol molecules and chemical ionization mass spectrometry with a heated inlet providing molecular-level detail of both gases and aerosols. The two techniques operate with atmospheric pressure interfaces and require no matrix addition for ionization, allowing samples to be investigated in their native state under ambient pressure conditions. We demonstrate the ability of laser ablation-aerosol mass spectrometry-chemical ionization mass spectrometry (LA-AMS-CIMS) to create 2D images of both standard compounds and complex mixtures. Finally, the results suggest that LA-AMS-CIMS, particularly when combined with advanced data analysis methods, could have broad applications in mass spectrometry imaging.

  2. Laser Ablation-Aerosol Mass Spectrometry-Chemical Ionization Mass Spectrometry for Ambient Surface Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Jennifer L.; Day, Douglas A.; Elseberg, Tim

    Mass spectrometry imaging is becoming an increasingly common analytical technique due to its ability to provide spatially resolved chemical information. In this paper, we report a novel imaging approach combining laser ablation with two mass spectrometric techniques, aerosol mass spectrometry and chemical ionization mass spectrometry, separately and in parallel. Both mass spectrometric methods provide the fast response, rapid data acquisition, low detection limits, and high-resolution peak separation desirable for imaging complex samples. Additionally, the two techniques provide complementary information, with aerosol mass spectrometry providing near universal detection of all aerosol molecules and chemical ionization mass spectrometry with a heated inlet providing molecular-level detail of both gases and aerosols. The two techniques operate with atmospheric pressure interfaces and require no matrix addition for ionization, allowing samples to be investigated in their native state under ambient pressure conditions. We demonstrate the ability of laser ablation-aerosol mass spectrometry-chemical ionization mass spectrometry (LA-AMS-CIMS) to create 2D images of both standard compounds and complex mixtures. Finally, the results suggest that LA-AMS-CIMS, particularly when combined with advanced data analysis methods, could have broad applications in mass spectrometry imaging.

  3. Exploring the Spatiotemporal Organization of Membrane Proteins in Living Plant Cells.

    PubMed

    Wang, Li; Xue, Yiqun; Xing, Jingjing; Song, Kai; Lin, Jinxing

    2018-04-29

    Plasma membrane proteins have important roles in transport and signal transduction. Deciphering the spatiotemporal organization of these proteins provides crucial information for elucidating the links between the behaviors of different molecules. However, monitoring membrane proteins without disrupting their membrane environment remains difficult. Over the past decade, many studies have developed single-molecule techniques, opening avenues for probing the stoichiometry and interactions of membrane proteins in their native environment by providing nanometer-scale spatial information and nanosecond-scale temporal information. In this review, we assess recent progress in the development of labeling and imaging technology for membrane protein analysis. We focus in particular on several single-molecule techniques for quantifying the dynamics and assembly of membrane proteins. Finally, we provide examples of how these new techniques are advancing our understanding of the complex biological functions of membrane proteins.

  4. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper presents an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
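
    As a sketch of the two measurement ideas described above, the snippet below computes one common form of normalized contrast evolution, C(t) = (I_defect(t) − I_ref(t))/I_ref(t), and a half-max width of a spatial contrast profile. The IR Contrast software's exact normalization and its calibration against flat-bottom holes are not reproduced; the data are synthetic.

```python
import numpy as np

def contrast_evolution(defect_px: np.ndarray, ref_px: np.ndarray) -> np.ndarray:
    """Normalized contrast vs. frame for a defect pixel and a sound reference."""
    return (defect_px - ref_px) / ref_px

def half_max_width(profile: np.ndarray, dx: float = 1.0) -> float:
    """Width of a spatial contrast profile at half of its peak value."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    return (above[-1] - above[0]) * dx

# Synthetic Gaussian indication profile, 0.1 units between pixels
profile = np.exp(-np.linspace(-3, 3, 61) ** 2)
print(half_max_width(profile, dx=0.1))  # ~1.6, the discretized FWHM
```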

  5. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  6. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
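
    A minimal sketch of the probabilistic ingredient of such a model: Monte Carlo sampling of a fatigue-life distribution to estimate the risk associated with a return-to-service interval. The lognormal form and its parameters are illustrative assumptions, not the program's actual fracture-mechanics model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Assumed scatter in fatigue life, median 20,000 cycles
life_cycles = rng.lognormal(mean=np.log(20_000), sigma=0.5, size=n)

for interval in (5_000, 10_000, 15_000):
    risk = np.mean(life_cycles < interval)   # fraction failing before inspection
    print(f"inspection interval {interval:>6} cycles -> P(failure) ~ {risk:.2e}")
```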

  7. Utility of correlation techniques in gravity and magnetic interpretation

    NASA Technical Reports Server (NTRS)

    Chandler, V. W.; Koski, J. S.; Braile, L. W.; Hinze, W. J.

    1977-01-01

    Two methods of quantitative combined analysis, internal correspondence and clustering, are presented. Model studies are used to illustrate implementation and interpretation procedures for these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows that they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province-size zones, validating long-wavelength anomalies, and isolating geomagnetic field removal problems. Thus, these techniques are useful in considering regional data acquired by satellites.

  8. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  9. Evaluation of a Modified Single-Enzyme Amplified-Fragment Length Polymorphism Technique for Fingerprinting and Differentiating of Mycobacterium kansasii Type I Isolates

    PubMed Central

    Gaafar, Ayman; Josebe Unzaga, M.; Cisterna, Ramón; Clavo, Felicitas Elena; Urra, Elena; Ayarza, Rafael; Martín, Gloria

    2003-01-01

    The usefulness of single-enzyme amplified-fragment length polymorphism (AFLP) analysis for the subtyping of Mycobacterium kansasii type I isolates was evaluated. This simplified technique classified 253 type I strains into 12 distinct clusters. The discriminating power of this technique was high, and the technique easily distinguished between the epidemiologically unrelated control strains and our clinical isolates. Overall, the technique was relatively rapid and technically simple, yet it gave reproducible and discriminatory results. This technique provides a powerful typing tool which may be helpful in solving many questions concerning the reservoirs, pathogenicities, and modes of transmission of these isolates. PMID:12904399

  10. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  11. Trace analysis of energetic materials via direct analyte-probed nanoextraction coupled to direct analysis in real time mass spectrometry.

    PubMed

    Clemons, Kristina; Dake, Jeffrey; Sisco, Edward; Verbeck, Guido F

    2013-09-10

    Direct analysis in real time mass spectrometry (DART-MS) has proven to be a useful forensic tool for the trace analysis of energetic materials. While other techniques for detecting trace amounts of explosives involve extraction, derivatization, solvent exchange, or sample clean-up, DART-MS requires none of these. Typical DART-MS analyses directly from a solid sample or from a swab have been quite successful; however, these methods may not always be an optimal sampling technique in a forensic setting. For example, if the sample were only located in an area which included a latent fingerprint of interest, direct DART-MS analysis or the use of a swab would almost certainly destroy the print. To avoid ruining such potentially invaluable evidence, another method has been developed which will leave the fingerprint virtually untouched. Direct analyte-probed nanoextraction coupled to nanospray ionization-mass spectrometry (DAPNe-NSI-MS) has demonstrated excellent sensitivity and repeatability in forensic analyses of trace amounts of illicit drugs from various types of surfaces. This technique employs a nanomanipulator in conjunction with bright-field microscopy to extract single particles from a surface of interest and has provided a limit of detection of 300 attograms for caffeine. Combining DAPNe with DART-MS provides another level of flexibility in forensic analysis, and has proven to be a sufficient detection method for trinitrotoluene (TNT), RDX, and 1-methylaminoanthraquinone (MAAQ). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Validating Human Behavioral Models for Combat Simulations Using Techniques for the Evaluation of Human Performance

    DTIC Science & Technology

    2004-01-01

    As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base... capabilities and intent. Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a... Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.

  13. Project Sell, Title VII: Final Evaluation 1970-1971.

    ERIC Educational Resources Information Center

    Condon, Elaine C.; And Others

    This evaluative report consists of two parts. The first is a narrative report which represents a summary by the evaluation team and recommendations regarding project activities; the second part provides a statistical analysis of project achievements. Details are provided on evaluation techniques, staff, management, instructional materials,…

  14. When continuous observations just won't do: developing accurate and efficient sampling strategies for the laying hen.

    PubMed

    Daigle, Courtney L; Siegford, Janice M

    2014-03-01

    Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
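
    A hedged sketch of the comparison described above: a time budget computed from continuous (2-s) observations versus one computed from scan samples at a fixed interval, with a correlation between the two. The behaviour codes and proportions are synthetic assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 27_000                                  # 15 h of observations every 2 s
behaviour = rng.choice(3, size=n, p=[0.6, 0.3, 0.1])  # 0=rest, 1=feed, 2=walk

def time_budget(codes: np.ndarray) -> np.ndarray:
    """Proportion of observations spent in each behaviour."""
    return np.bincount(codes, minlength=3) / len(codes)

continuous = time_budget(behaviour)
scan_30min = time_budget(behaviour[:: (30 * 60) // 2])  # one scan per 30 min
print(continuous, scan_30min)
print(np.corrcoef(continuous, scan_30min)[0, 1])  # agreement between methods
```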

  15. In vivo quantification of the shear modulus of the human Achilles tendon during passive loading using shear wave dispersion analysis.

    PubMed

    Helfenstein-Didier, C; Andrade, R J; Brum, J; Hug, F; Tanter, M; Nordez, A; Gennisson, J-L

    2016-03-21

    The shear wave velocity dispersion was analyzed in the Achilles tendon (AT) during passive dorsiflexion using a phase velocity method in order to obtain the tendon shear modulus (C55). Based on this analysis, the aims of the present study were (i) to assess the reproducibility of the shear modulus for different ankle angles, (ii) to assess the effect of the probe locations, and (iii) to compare results with elasticity values obtained with the supersonic shear imaging (SSI) technique. The AT shear modulus (C55) consistently increased with the ankle dorsiflexion (N = 10, p < 0.05). Furthermore, the technique showed a very good reproducibility (all standard error of the mean values <10.7 kPa and all coefficient of variation (CV) values ⩽ 0.05%). In addition, independently from the ankle dorsiflexion, the shear modulus was significantly higher in the proximal location compared to the more distal one. The shear modulus provided by SSI was always lower than C55, and the difference increased with the ankle dorsiflexion. However, shear modulus values provided by both methods were highly correlated (R = 0.84), indicating that the conventional shear wave elastography technique (SSI technique) can be used to compare tendon mechanical properties across populations. Future studies should determine the clinical relevance of the shear wave dispersion analysis, for instance in the case of tendinopathy or tendon tear.
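
    As a worked illustration of the elasticity relation underlying both methods, the snippet below converts shear-wave speed to a shear modulus via μ = ρv², assuming the usual soft-tissue density of about 1000 kg/m³. The guided-wave (waveguide) corrections that the dispersion analysis applies in the actual study are omitted; the speeds are illustrative.

```python
RHO = 1000.0  # assumed soft-tissue density, kg/m^3

def shear_modulus_kpa(speed_m_s: float) -> float:
    """mu = rho * v^2, reported in kPa."""
    return RHO * speed_m_s ** 2 / 1e3

for v in (8.0, 12.0, 16.0):   # illustrative tendon shear-wave speeds, m/s
    print(f"v = {v:4.1f} m/s -> mu ~ {shear_modulus_kpa(v):6.1f} kPa")
```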

  16. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine

    Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8-13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.

  17. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE PAGES

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine; ...

    2017-07-18

    Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8-13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.
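
    A minimal sketch of the colorimetric step behind such an assay: a linear Beer-Lambert calibration, A = εlc, fitted from standards and inverted for an unknown. The concentrations and absorbances below are synthetic stand-ins, not the paper's calibration data.

```python
import numpy as np

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])  # standards, mgU/g
absorb = 0.021 * conc + 0.002                       # synthetic absorbances

slope, intercept = np.polyfit(conc, absorb, deg=1)  # linear calibration fit
unknown_abs = 0.150
print((unknown_abs - intercept) / slope)            # ~7.05 mgU/g
```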

  18. Transcriptomics in cancer diagnostics: developments in technology, clinical research and commercialization.

    PubMed

    Sager, Monica; Yeat, Nai Chien; Pajaro-Van der Stadt, Stefan; Lin, Charlotte; Ren, Qiuyin; Lin, Jimmy

    2015-01-01

    Transcriptomic technologies are evolving to diagnose cancer earlier and more accurately to provide greater predictive and prognostic utility to oncologists and patients. Digital techniques such as RNA sequencing are replacing still-imaging techniques to provide more detailed analysis of the transcriptome and aberrant expression that causes oncogenesis, while companion diagnostics are developing to determine the likely effectiveness of targeted treatments. This article examines recent advancements in molecular profiling research and technology as applied to cancer diagnosis, clinical applications and predictions for the future of personalized medicine in oncology.

  19. The origin, composition and history of cometary ices from spectroscopic studies

    NASA Technical Reports Server (NTRS)

    Allamandola, L. J.

    1989-01-01

    The spectroscopic analysis of pristine cometary material provides a very important probe of the chemical identity of the material as well as of the physical and chemical conditions which prevailed during the comet's history. Concerning classical spectroscopy, the spectral regions which will most likely prove most useful are the infrared, the visible and ultraviolet. Newer spectroscopic techniques which have the potential to provide equally important information include nuclear magnetic resonance (NMR) and electron spin resonance (ESR). Each technique is summarized with emphasis placed on the kind of information which can be obtained.

  20. Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques

    PubMed Central

    Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun

    2017-01-01

    Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is to therefore provide updates and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156

  1. Analysis of Multi-Antenna GNSS Receiver Performance under Jamming Attacks.

    PubMed

    Vagle, Niranjana; Broumandan, Ali; Lachapelle, Gérard

    2016-11-17

    Although antenna array-based Global Navigation Satellite System (GNSS) receivers can be used to mitigate both narrowband and wideband electronic interference sources, the measurement distortions induced by array processing methods make them unsuitable for high precision applications. The measurement distortions have an adverse effect on carrier phase ambiguity resolution, affecting the navigation solution. Depending on the availability of array attitude information and calibration parameters, different spatial processing methods can be implemented, although they distort carrier phase measurements in some cases. This paper provides a detailed investigation of the effect of different array processing techniques on array-based GNSS receiver measurements and navigation performance. The main novelty of the paper is to provide a thorough analysis of array-based GNSS receivers employing different beamforming techniques, from tracking to the navigation solution. Two beamforming techniques, namely Power Minimization (PM) and Minimum Power Distortionless Response (MPDR), are investigated. In the tracking domain, the carrier Doppler, Phase Lock Indicator (PLI), and Carrier-to-Noise Ratio (C/N₀) are analyzed. Pseudorange and carrier phase measurement distortions and carrier phase position performance are also evaluated. Results of performance analyses from simulated GNSS signals and field tests are provided.
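
    As a sketch of one of the two beamformers investigated, the snippet below forms MPDR weights w = R⁻¹a/(aᴴR⁻¹a) from an array covariance R and a steering vector a, then checks that a strong interferer is suppressed while the look direction stays distortionless. The array geometry, jammer scenario and signal model are illustrative assumptions.

```python
import numpy as np

def mpdr_weights(R: np.ndarray, a: np.ndarray) -> np.ndarray:
    """MPDR beamformer: w = R^-1 a / (a^H R^-1 a)."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj().T @ Ri_a)

# 4-element half-wavelength array; synthetic noise plus one strong jammer
rng = np.random.default_rng(0)
n_snap, n_ant = 1000, 4
jam_dir = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(np.deg2rad(40)))
snapshots = (rng.normal(size=(n_snap, n_ant))
             + 1j * rng.normal(size=(n_snap, n_ant))
             + 10 * rng.normal(size=(n_snap, 1)) * jam_dir)
R = snapshots.conj().T @ snapshots / n_snap     # sample covariance
a = np.ones(n_ant, dtype=complex)               # boresight steering vector

w = mpdr_weights(R, a)
print(abs(w.conj() @ jam_dir))  # small: jammer direction suppressed
print(abs(w.conj() @ a))        # ~1: distortionless toward the satellite
```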

  2. Analysis and correction of ground reflection effects in measured narrowband sound spectra using cepstral techniques

    NASA Technical Reports Server (NTRS)

    Miles, J. H.; Stevens, G. H.; Leininger, G. G.

    1975-01-01

    Ground reflections generate undesirable effects on acoustic measurements such as those conducted outdoors for jet noise research, aircraft certification, and motor vehicle regulation. Cepstral techniques developed in speech processing are adapted to identify echo delay times and to correct for ground reflection effects. A sample result is presented using an actual narrowband sound pressure level spectrum. The technique can readily be adapted to existing fast Fourier transform type spectrum measurement instrumentation to provide field measurements of echo time delays.
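
    A minimal sketch of the cepstral idea used here: an echo with delay τ adds a periodic ripple to the log spectrum, which appears as a peak at quefrency τ in the real cepstrum c = IFFT(log|FFT(x)|). The sampling rate, delay and echo strength below are illustrative.

```python
import numpy as np

fs = 10_000.0
rng = np.random.default_rng(0)
x = rng.normal(size=8192)              # direct-path broadband signal
delay = int(0.010 * fs)                # 10 ms ground-reflection echo
sig = x.copy()
sig[delay:] += 0.6 * x[:-delay]        # add attenuated, delayed copy

# Real cepstrum; the echo shows up as a peak at quefrency = delay
cep = np.fft.irfft(np.log(np.abs(np.fft.rfft(sig)) + 1e-12))
peak = np.argmax(cep[20:1000]) + 20    # skip the low-quefrency region
print(peak / fs)                       # ~0.010 s, the recovered echo delay
```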

  3. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide the electric power requirements of remote villages. The techniques utilized involve a definition of the power system's natural and operating environment, the use of design criteria and analysis techniques, an awareness of potential problems via inherent-reliability and FMEA methods, and the use of a fail-safe and planned-spare-parts engineering philosophy.

  4. Segmentation of Unstructured Datasets

    NASA Technical Reports Server (NTRS)

    Bhat, Smitha

    1996-01-01

    Datasets generated by computer simulations and experiments in Computational Fluid Dynamics tend to be extremely large and complex. It is difficult to visualize these datasets using standard techniques like Volume Rendering and Ray Casting. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This thesis explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and from Finite Element Analysis.

  5. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC).

  6. Doing That Thing That Scientists Do: A Discovery-Driven Module on Protein Purification and Characterization for the Undergraduate Biochemistry Laboratory Classroom

    ERIC Educational Resources Information Center

    Garrett, Teresa A.; Osmundson, Joseph; Isaacson, Marisa; Herrera, Jennifer

    2015-01-01

    In traditional introductory biochemistry laboratory classes students learn techniques for protein purification and analysis by following provided, established, step-by-step procedures. Students are exposed to a variety of biochemical techniques but are often not developing procedures or collecting new, original data. In this laboratory module,…

  7. The Use of Recommended Communication Techniques by Maryland Family Physicians and Pediatricians

    PubMed Central

    Weatherspoon, Darien J.; Horowitz, Alice M.; Kleinman, Dushanka V.; Wang, Min Qi

    2015-01-01

    Background: Health literacy experts and the American Medical Association have developed recommended communication techniques for healthcare providers, given that effective communication has been shown to greatly improve health outcomes. The purpose of this study was to determine the number and types of communication techniques routinely used by Maryland physicians.

    Methods: In 2010, a 30-item survey was mailed to a random sample of 1,472 Maryland family physicians and pediatricians, with 294 surveys being returned and usable. The survey contained questions about provider and practice characteristics and 17 items related to communication techniques, including seven basic communication techniques. Physicians' use of recommended communication techniques was analyzed using descriptive statistics, analysis of variance, and ordinary least squares regression.

    Results: Family physicians routinely used an average of 6.6 of the 17 total techniques and 3.3 of the seven basic techniques, whereas pediatricians routinely used 6.4 and 3.2 techniques, respectively. The use of simple language was the only technique that nearly all physicians routinely utilized (family physicians, 91%; pediatricians, 93%). Physicians who had taken a communications course used significantly more techniques than those who had not. Physicians with a low percentage of patients on Medicaid were significantly less likely to use the recommended communication techniques than providers with a high proportion of their patient population on Medicaid.

    Conclusions: Overall, the use of recommended communication techniques was low. Additionally, many physicians were unsure of the effectiveness of several of the recommended techniques, which could suggest that physicians are unaware of valuable skills that could enhance their communication. The findings of this study suggest that communications training should be given a higher priority in the medical training process in the United States. PMID:25856371

  8. XRF analysis to identify historical photographic processes: The case of some Interguglielmi Jr.'s images from the Palermo Municipal Archive

    NASA Astrophysics Data System (ADS)

    Modica, A.; Alberghina, M. F.; Brai, M.; Bruno, M.; Di Bella, M.; Fontana, D.; Tranchina, L.

    2017-06-01

    In the early period, even though professional photographers worked with similar techniques and products, their artistic and commercial aims determined different choices and led them to follow different, often personal, recipes. For this reason, identification of the techniques through the date and name of the photographer, or through visual features like the colour, tonality, and surface of the image layer, often needs further investigation to be proved. Chemical characterization, carried out in a non- or micro-destructive way, can be crucial in providing useful information about the original composition, degradation processes, and realization technique, in obtaining an indirect dating of the photograph, and/or in choosing the most appropriate conservation treatment. In our case, X-ray fluorescence (XRF) analysis was used to confirm the chemical composition of eleven historical photographs dated between the end of the 19th century and the beginning of the 20th, shot in Palermo (Sicily) by a renowned photographer of the time and pasted on their original cardboards. The elemental identification, obtained with a non-destructive approach, provided important information to distinguish among different photographic techniques in terms of the distribution and characterization of chemical element markers on the photographic surface.

  9. Body composition analysis techniques in adult and pediatric patients: how reliable are they? How useful are they clinically?

    PubMed

    Woodrow, Graham

    2007-06-01

    Complex abnormalities of body composition occur in peritoneal dialysis (PD). These abnormalities reflect changes in hydration, nutrition, and body fat, and they are of major clinical significance. Clinical assessment of these body compartments is insensitive and inaccurate. Frequently, simultaneous changes of hydration, wasting, and body fat content can occur, confounding clinical assessment of each component. Body composition can be described by models of varying complexity that use one or more measurement techniques. "Gold standard" methods provide accurate and precise data, but are not practical for routine clinical use. Dual energy X-ray absorptiometry allows for measurement of regional as well as whole-body composition, which can provide further information of clinical relevance. Simpler techniques such as anthropometry and bioelectrical impedance analysis are suited to routine use in clinic or at the bedside, but may be less accurate. Body composition methodology sometimes makes assumptions regarding relationships between components, particularly in regard to hydration, which may be invalid in pathologic states. Uncritical application of these methods to the PD patient may result in erroneous interpretation of results. Understanding the foundations and limitations of body composition techniques allows for optimal application in clinical practice.

  10. Direct analysis in real time--high resolution mass spectrometry as a valuable tool for the pharmaceutical drug development.

    PubMed

    Srbek, Jan; Klejdus, Bořivoj; Douša, Michal; Břicháč, Jiří; Stasiak, Pawel; Reitmajer, Josef; Nováková, Lucie

    2014-12-01

    In this study, direct analysis in real time-mass spectrometry (DART-MS) was assessed for the analysis of various pharmaceutical formulations, with the intention of summarizing possible applications for routine pharmaceutical development. As DART is an ambient ionization technique, it allows direct analysis of pharmaceutical samples in solid or liquid form without complex sample preparation, which is often the most time-consuming part of the analytical method. This makes the technique suitable for many application fields, including pharmaceutical drug development. DART mass spectra of more than twenty selected tablets and other common pharmaceutical formulations, i.e., injection solutions, ointments, and suppositories, developed in the pharmaceutical industry during several recent years are presented. Moreover, as thin-layer chromatography (TLC) is still very popular for monitoring reactions in synthetic chemistry, several substances were analyzed directly from TLC plates to demonstrate the simplicity of the technique. Pure substance solutions were spotted onto a TLC plate and then analyzed with DART without separation. This was the first DART-MS study of pharmaceutical dosage forms using the DART-Orbitrap combination. The duration of sample analysis by the DART-MS technique was several seconds, allowing enough time to collect a sufficient number of data points for compound identification. The experimental setup provided excellent mass accuracy and high resolution of the mass spectra, which allowed unambiguous identification of the compounds of interest. Finally, DART mass spectrometry was also used for monitoring the distribution of a selected impurity in atorvastatin tablets. These measurements demonstrated DART to be a robust ionization technique that provides easy-to-interpret mass spectra for a broad range of compounds. DART has high-throughput potential for various types of pharmaceutical analyses and therefore eliminates the time needed for sample cleanup and chromatographic separation. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Marketing Youth Services.

    ERIC Educational Resources Information Center

    Dimick, Barbara

    1995-01-01

    Marketing techniques in youth services are useful for designing programs, collections, and services and for determining customer needs. The marketing mix--product, place, price, and practice--provides a framework for service analysis. (AEF)

  12. Nuts and Bolts - Techniques for Genesis Sample Curation

    NASA Technical Reports Server (NTRS)

    Burkett, Patti J.; Rodriquez, M. C.; Allton, J. H.

    2011-01-01

    The Genesis curation staff at NASA Johnson Space Center provides samples and data for analysis to the scientific community, following allocation approval by the Genesis Oversight Committee, a sub-committee of CAPTEM (Curation Analysis Planning Team for Extraterrestrial Materials). We are often asked by investigators within the scientific community how we choose samples to best fit the requirements of the request. Here we demonstrate our techniques for characterizing samples and satisfying allocation requests. Even with a systematic approach, every allocation is unique. We also provide an updated status of the cataloging and characterization of solar wind collectors as of January 2011. The collection consists of 3721 inventoried samples, each comprising a single fragment or multiple fragments, pressed between post-it notes or containerized in jars or vials of various sizes.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types, from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.

  14. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug safe-handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and provide a shorter response time than the classical analytical techniques now in use.

  15. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.

  16. Procedures for analysis of debris relative to Space Shuttle systems

    NASA Technical Reports Server (NTRS)

    Kim, Hae Soo; Cummings, Virginia J.

    1993-01-01

    Debris samples collected from various Space Shuttle systems have been submitted to the Microchemical Analysis Branch. This investigation was initiated to develop optimal techniques for the analysis of debris. Optical microscopy provides information about the morphology and size of crystallites, particle sizes, amorphous phases, glass phases, and poorly crystallized materials. Scanning electron microscopy with energy dispersive spectrometry is utilized for information on surface morphology and qualitative elemental content of debris. Analytical electron microscopy with wavelength dispersive spectrometry provides information on the quantitative elemental content of debris.

  17. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held from August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  18. Isolation and Analysis of Essential Oils from Spices

    ERIC Educational Resources Information Center

    O'Shea, Stephen K.; Von Riesen, Daniel D.; Rossi, Lauren L.

    2012-01-01

    Natural product isolation and analysis provide an opportunity to present a variety of experimental techniques to undergraduate students in introductory organic chemistry. Eugenol, anethole, and carvone were extracted from six common spices using steam-distillation and diethyl ether as the extraction solvent. Students assessed the purity of their…

  19. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  20. Optical analysis of crystal growth

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Passeur, Andrea; Harper, Sabrina

    1994-01-01

    Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.

  1. Cost Benefit Analysis and Other Fun and Games.

    ERIC Educational Resources Information Center

    White, Herbert S.

    1985-01-01

    Discussion of application of cost benefit analysis (CBA) accounting techniques to libraries highlights user willingness to be charged for services provided, reasons why CBA will not work in library settings, libraries and budgets, cost distribution on basis of presumed or expected use, implementation of information-seeking behavior control, and…

  2. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  3. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
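
    The packages reviewed above are R tools; for readers who just want the core arithmetic NMA builds on, the sketch below shows the classic indirect-comparison (Bucher) step in Python. The effect sizes and standard errors are made-up illustrative numbers, and the full NMA machinery (heterogeneity modeling, consistency checking) is deliberately omitted.

    ```python
    import math

    # Direct estimates (e.g., log odds ratios) from two sets of trials:
    d_AC, se_AC = -0.50, 0.15   # treatment A vs. common comparator C
    d_BC, se_BC = -0.20, 0.18   # treatment B vs. common comparator C

    # Indirect estimate of A vs. B via the common comparator C:
    # the effects subtract, and the variances add.
    d_AB = d_AC - d_BC
    se_AB = math.sqrt(se_AC ** 2 + se_BC ** 2)

    lo, hi = d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB
    print(f"indirect A vs B: {d_AB:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```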

  4. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  5. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    NASA Technical Reports Server (NTRS)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

    In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on the use of subset selection techniques developed in the linear algebra and statistical community to study linear dependencies in high dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. Also, we applied the technique to study the information provided by the different levels in which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications for band selection in sensor design and for the number and location of levels in the analysis process.
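
    One standard way to implement this kind of subset selection is rank-revealing QR with column pivoting, sketched below on a synthetic weighting-function matrix. The matrix construction, noise level, and rank threshold are illustrative assumptions; the paper's actual AIRS/IASI data and procedure are not reproduced.

    ```python
    import numpy as np
    from scipy.linalg import qr

    rng = np.random.default_rng(1)
    n_levels, n_channels = 40, 100

    # Build channels as mixtures of a few independent vertical profiles,
    # so most columns are nearly linearly dependent on the rest.
    basis = rng.standard_normal((n_levels, 6))
    mix = rng.standard_normal((6, n_channels))
    W = basis @ mix + 0.01 * rng.standard_normal((n_levels, n_channels))

    # QR with column pivoting orders columns by how much new (independent)
    # information each contributes; the leading pivots are the channels
    # worth keeping.
    _, R, piv = qr(W, pivoting=True)

    # Heuristic effective rank: diagonal entries of R above 5% of the largest.
    k = int((np.abs(np.diag(R)) > 0.05 * abs(R[0, 0])).sum())
    print(f"effective rank ~ {k}; most informative channels: {piv[:k]}")
    ```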

  6. Solar prediction analysis

    NASA Technical Reports Server (NTRS)

    Smith, Jesse B.

    1992-01-01

    Solar activity prediction is essential to the definition of orbital design and operational environments for space flight. This task provides the necessary research to better understand solar predictions being generated by the solar community and to develop improved solar prediction models. The contractor shall provide the necessary manpower and facilities to perform the following tasks: (1) review, evaluate, and assess the time evolution of the solar cycle to provide probable limits of solar cycle behavior near maximum and during the decline of solar cycle 22, the forecasts being provided by the solar community, and the techniques being used to generate these forecasts; and (2) develop and refine prediction techniques for short-term solar behavior and flare prediction within solar active regions, with special emphasis on the correlation of magnetic shear with flare occurrence.

  7. Staging research of human lung cancer tissues by high-resolution magic angle spinning proton nuclear magnetic resonance spectroscopy (HRMAS ¹H NMR) and multivariate data analysis.

    PubMed

    Chen, Wenxue; Lu, Shaohua; Wang, Guifang; Chen, Fener; Bai, Chunxue

    2017-10-01

    High-resolution magic-angle spinning proton nuclear magnetic resonance (HRMAS ¹H NMR) spectroscopy was employed to analyze the metabonomic characteristics of lung cancer tissues in the hope of identifying potential diagnostic biomarkers for malignancy detection and staging of lung tissues. HRMAS ¹H NMR spectroscopy can rapidly provide important information for accurate diagnosis and staging of cancer tissues owing to its noninvasive nature and limited sample requirements, and it has thus been acknowledged as an excellent tool to investigate tissue metabolism and provide a more realistic insight into the metabonomics of tissues when combined with multivariate data analysis (MVDA) such as component analysis and, in particular, orthogonal partial least squares-discriminant analysis. HRMAS ¹H NMR spectra displayed the metabonomic differences of 32 lung cancer tissues at different stages from 32 patients. Significant changes (P < 0.05) of some important metabolites, such as lipids, aspartate, and choline-containing compounds, in cancer tissues at the different stages were identified. Furthermore, the combination of HRMAS ¹H NMR spectroscopy and MVDA potentially provides high sensitivity, specificity, and prediction accuracy in identifying the stage of cancer tissues when compared against clinical pathological data. This study highlighted the potential of metabonomics in clinical settings so that the techniques might be further exploited for the diagnosis and staging prediction of lung cancer in the future. © 2016 John Wiley & Sons Australia, Ltd.

  8. Description of MSFC engineering photographic analysis

    NASA Technical Reports Server (NTRS)

    Earle, Jim; Williams, Frank

    1988-01-01

    Utilizing a background that includes the development of basic launch and test photographic coverage and analysis procedures, the MSFC Photographic Evaluation Group has built a body of experience that enables it to effectively satisfy MSFC's engineering photographic analysis needs. Combining the basic soundness of reliable, proven techniques of the past with newer technical advances in computers and computer-related devices, the MSFC Photo Evaluation Group is in a position to continue providing photo and video analysis services center-wide and NASA-wide, to supply an improving photo analysis product that meets the photo evaluation needs of the future, and to provide new standards in the state of the art of photo analysis of dynamic events.

  9. Verification of Orthogrid Finite Element Modeling Techniques

    NASA Technical Reports Server (NTRS)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, shell, and mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  10. Parametric Robust Control and System Identification: Unified Approach

    NASA Technical Reports Server (NTRS)

    Keel, L. H.

    1996-01-01

    During the period of this support, a new control system design and analysis method has been studied. This approach deals with control systems containing uncertainties that are represented in terms of transfer function parameters. Such a representation of the control system is common, and many physical parameter variations fall into this type of uncertainty. Techniques developed here are capable of providing nonconservative analysis of such control systems with parameter variations. We have also developed techniques to deal with control systems when their state space representations are given rather than transfer functions. In this case, the plant parameters will appear as entries of the state space matrices. Finally, a system modeling technique to construct such systems from raw input-output frequency domain data has been developed.
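
    As one concrete example of nonconservative robustness analysis for interval parameter uncertainty, the sketch below checks robust stability of a polynomial with interval coefficients using Kharitonov's four polynomials, a standard tool in the parametric robust control literature (not necessarily the report's own method). The coefficient intervals are illustrative.

    ```python
    import numpy as np

    # Coefficient intervals (lo, hi) in ascending powers: a0 + a1*s + a2*s^2 + ...
    intervals = [(1.0, 2.0), (3.0, 4.0), (5.0, 6.0), (1.0, 1.5)]

    def kharitonov(intervals):
        """Return the four Kharitonov polynomials (ascending coefficients)."""
        los = [lo for lo, _ in intervals]
        his = [hi for _, hi in intervals]
        # Each polynomial picks lower/upper bounds in a repeating period-4 pattern.
        patterns = ["llhh", "hhll", "lhhl", "hllh"]
        return [
            [los[i] if pat[i % 4] == "l" else his[i] for i in range(len(intervals))]
            for pat in patterns
        ]

    def hurwitz_stable(asc_coeffs):
        """All roots strictly in the open left half-plane."""
        roots = np.roots(asc_coeffs[::-1])  # np.roots expects descending order
        return bool(np.all(roots.real < 0))

    # Kharitonov's theorem: the whole interval family is Hurwitz stable
    # if and only if these four corner polynomials are.
    stable = all(hurwitz_stable(k) for k in kharitonov(intervals))
    print("robustly stable for every coefficient choice:", stable)
    ```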

  11. A comparison of partial order technique with three methods of multi-criteria analysis for ranking of chemical substances.

    PubMed

    Lerche, Dorte; Brüggemann, Rainer; Sørensen, Peter; Carlsen, Lars; Nielsen, Ole John

    2002-01-01

    An alternative to the often cumbersome and time-consuming risk assessments of chemical substances could be more reliable and advanced priority-setting methods. An elaboration of the simple scoring methods is provided by the Hasse Diagram Technique (HDT) and/or Multi-Criteria Analysis (MCA). The present study provides an in-depth evaluation of HDT relative to three MCA techniques. The new and main methodological step in the comparison is the use of probability concepts based on mathematical tools such as linear extensions of partially ordered sets and Monte Carlo simulations. A data set consisting of 12 High Production Volume Chemicals (HPVCs) is used for illustration. It is a paradigm in this investigation that the need for external input (often subjective weightings of criteria) should be minimized and that transparency should be maximized in any multicriteria prioritisation. The study illustrates that HDT needs the least external input, is the most transparent, and is the least subjective. However, HDT has some weaknesses if there are criteria that exclude each other; then weighting is needed. Multi-Criteria Analysis (i.e., the utility function approach, PROMETHEE, and concordance analysis) can deal with such mutual exclusions because their formalisms for quantifying preferences allow, e.g., the weighting of criteria. Consequently, MCA methods include more subjectivity and lose transparency. The recommendation arising from this study is that the first step in decision making is to run HDT, and the second, possibly, is to run one of the MCA algorithms.
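
    The linear-extension machinery mentioned above can be made concrete with a toy example: enumerate every linear extension of a small partial order and average each object's rank across them. Exact enumeration only works for tiny posets; real HDT applications fall back on Monte Carlo sampling, as the abstract notes. The five objects and cover relations below are invented for illustration.

    ```python
    from itertools import permutations

    objects = ["A", "B", "C", "D", "E"]
    # Cover relations x < y: x is dominated by y on all criteria
    # (E is incomparable to everything).
    less_than = {("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")}

    def respects(order):
        """True if the total order is consistent with the partial order."""
        pos = {o: i for i, o in enumerate(order)}
        return all(pos[x] < pos[y] for x, y in less_than)

    # Enumerate all linear extensions (feasible for tiny posets only).
    extensions = [p for p in permutations(objects) if respects(p)]

    # Average rank of each object across extensions (1 = lowest priority).
    avg_rank = {o: sum(ext.index(o) + 1 for ext in extensions) / len(extensions)
                for o in objects}
    for o, r in sorted(avg_rank.items(), key=lambda kv: kv[1]):
        print(f"{o}: average rank {r:.2f} over {len(extensions)} extensions")
    ```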

  12. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  13. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  14. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility of finding the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and how they can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility of finding the method procedural steps with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  15. Nanomanipulation-Coupled Matrix-Assisted Laser Desorption/ Ionization-Direct Organelle Mass Spectrometry: A Technique for the Detailed Analysis of Single Organelles

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.

    2016-02-01

    We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for a rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized for both extraction of organelle content and precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique as it enabled longer mass analysis time, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS capabilities of localized lipid content in comparison to NSI-DOMS. This method provides selective organellar resolution, which complements current biochemical analyses and prompts for subsequent subcellular studies to be performed where limited samples and analyte volume are of concern.

  16. Predicting Gilthead Sea Bream (Sparus aurata) Freshness by a Novel Combined Technique of 3D Imaging and SW-NIR Spectral Analysis.

    PubMed

    Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J; Grau, Raúl; Barat, José M

    2016-10-19

    A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0-6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of fish, and an automatic method locates gilthead's pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is done. The best region is the pupil with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with the hyperspectral analysis offers plenty of potential and is a very promising technique to non-destructively predict gilthead freshness.
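
    The multivariate step in studies like this is typically a latent-variable regression of a freshness index on the spectra; the sketch below uses partial least squares from scikit-learn on synthetic spectra. The data generation and model choice are assumptions for illustration, not the study's actual chemometric pipeline.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n_samples, n_bands = 60, 120
    days = rng.uniform(0, 6, n_samples)      # storage day as "freshness" index

    # Synthetic spectra drift systematically with storage day, plus noise.
    band_effect = rng.standard_normal(n_bands)
    X = days[:, None] * band_effect[None, :] + rng.standard_normal((n_samples, n_bands))

    # Fit PLS on a training split and report held-out predictive R^2.
    X_tr, X_te, y_tr, y_te = train_test_split(X, days, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
    print(f"R^2 on held-out samples: {pls.score(X_te, y_te):.2f}")
    ```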

  17. Predicting Gilthead Sea Bream (Sparus aurata) Freshness by a Novel Combined Technique of 3D Imaging and SW-NIR Spectral Analysis

    PubMed Central

    Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J.; Grau, Raúl; Barat, José M.

    2016-01-01

    A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0–6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of fish, and an automatic method locates gilthead’s pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is done. The best region is the pupil with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with the hyperspectral analysis offers plenty of potential and is a very promising technique to non-destructively predict gilthead freshness. PMID:27775556

  18. Surface and Thin Film Analysis during Metal Organic Vapour Phase Epitaxial Growth

    NASA Astrophysics Data System (ADS)

    Richter, Wolfgang

    2007-06-01

    In-situ analysis of epitaxial growth is the essential ingredient in order to understand the growth process, to optimize growth and last but not least to monitor or even control the epitaxial growth on a microscopic scale. In MBE (molecular beam epitaxy) in-situ analysis tools existed right from the beginning because this technique developed from Surface Science technology with all its electron based analysis tools (LEED, RHEED, PES etc). Vapour Phase Epitaxy, in contrast, remained for a long time in an empirical stage ("alchemy") because only post growth characterisations like photoluminescence, Hall effect and electrical conductivity were available. Within the last two decades, however, optical techniques were developed which provide similar capabilities as in MBE for Vapour Phase growth. I will discuss in this paper the potential of Reflectance Anisotropy Spectroscopy (RAS) and Spectroscopic Ellipsometry (SE) for the growth of thin epitaxial semiconductor layers with zincblende (GaAs etc) and wurtzite structure (GaN etc). Other techniques and materials will be also mentioned.

  19. Recent trends in analytical methods and separation techniques for drugs of abuse in hair.

    PubMed

    Baciu, T; Borrull, F; Aguilar, C; Calull, M

    2015-01-26

    Hair analysis of drugs of abuse has been a subject of growing interest from a clinical, social and forensic perspective for years because of the broad time detection window after intake in comparison to urine and blood analysis. Over the last few years, hair analysis has gained increasing attention and recognition for the retrospective investigation of drug abuse in a wide variety of contexts, shown by the large number of applications developed. This review aims to provide an overview of the state of the art and the latest trends used in the literature from 2005 to the present in the analysis of drugs of abuse in hair, with a special focus on separation analytical techniques and their hyphenation with mass spectrometry detection. The most recently introduced sample preparation techniques are also addressed in this paper. The main strengths and weaknesses of all of these approaches are critically discussed by means of relevant applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Space shuttle/food system. Volume 2, Appendix C: Food cooling techniques analysis. Appendix D: Package and stowage: Alternate concepts analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The relative penalties associated with various techniques for providing an onboard cold environment for storage of perishable food items, and for the development of packaging and vehicle stowage parameters were investigated in terms of the overall food system design analysis of space shuttle. The degrees of capability for maintaining both a 40 F to 45 F refrigerated temperature and a 0 F and 20 F frozen environment were assessed for the following cooling techniques: (1) phase change (heat sink) concept; (2) thermoelectric concept; (3) vapor cycle concept; and (4) expendable ammonia concept. The parameters considered in the analysis were weight, volume, and spacecraft power restrictions. Data were also produced for packaging and vehicle stowage parameters which are compatible with vehicle weight and volume specifications. Certain assumptions were made for food packaging sizes based on previously generated space shuttle menus. The results of the study are shown, along with the range of meal choices considered.

  1. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  2. Combined spectral-domain optical coherence tomography and hyperspectral imaging applied for tissue analysis: Preliminary results

    NASA Astrophysics Data System (ADS)

    Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.

    2017-09-01

    In recent years, many optoelectronic techniques have been developed for the improvement and development of devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a new medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot offer biochemical information about the tissue. These data can be obtained with hyperspectral imaging, a non-invasive, sensitive, and real-time technique. In the present study, we have combined Spectral-Domain Optical Coherence Tomography (SD-OCT) with Hyperspectral Imaging (HSI) for tissue analysis; these two methods have demonstrated significant potential in this context. Preliminary results on different tissues have highlighted the capabilities of this combined technique.

  3. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.

  4. Enabling Community Through Social Media

    PubMed Central

    Haythornthwaite, Caroline

    2013-01-01

    Background: Social network analysis provides a perspective and method for inquiring into the structures that comprise online groups and communities. Traces from interaction via social media provide the opportunity for understanding how a community is formed and maintained online.

    Objective: The paper aims to demonstrate how social network analysis provides a vocabulary and set of techniques for examining interaction patterns via social media. Using the case of the #hcsmca online discussion forum, this paper highlights what has been and can be gained by approaching online community from a social network perspective, as well as providing an inside look at the structure of the #hcsmca community.

    Methods: Social network analysis was used to examine structures in a 1-month sample of Twitter messages with the hashtag #hcsmca (3871 tweets, 486 unique posters), which is the tag associated with the social media-supported group Health Care Social Media Canada. Network connections were considered present if the individual was mentioned, replied to, or had a post retweeted.

    Results: Network analyses revealed patterns of interaction that characterized the community as comprising one component, with a set of core participants prominent in the network due to their connections with others. Analysis showed the social media health content providers were the most influential group based on in-degree centrality. However, there was no preferential attachment among people in the same professional group, indicating that the formation of connections among community members was not constrained by professional status.

    Conclusions: Network analysis and visualizations provide techniques and a vocabulary for understanding online interaction, as well as insights that can help in understanding what, and who, comprises and sustains a network, and whether community emerges from a network of online interactions. PMID:24176835
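
    A minimal sketch of the network measures reported above, assuming the networkx library and a toy mention/reply/retweet edge list (the actual #hcsmca data are not reproduced): in-degree centrality identifies who receives the most attention, and a component count checks whether the graph forms one connected community.

    ```python
    import networkx as nx

    # Directed edge (u, v): u mentioned, replied to, or retweeted v.
    edges = [
        ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
        ("bob", "alice"), ("erin", "carol"), ("dave", "carol"),
    ]
    G = nx.DiGraph(edges)

    # In-degree centrality: fraction of other users pointing at each user.
    for user, c in sorted(nx.in_degree_centrality(G).items(),
                          key=lambda kv: -kv[1]):
        print(f"{user}: {c:.2f}")

    # One weakly connected component = a single connected community.
    print("components:", nx.number_weakly_connected_components(G))
    ```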

  5. 40 CFR 141.33 - Record maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... laboratory reports may be kept, or data may be transferred to tabular summaries, provided that the following...) Laboratory and person responsible for performing analysis; (5) The analytical technique/method used; and (6...

  6. Multiobjective Decision Analysis With Engineering and Business Applications

    NASA Astrophysics Data System (ADS)

    Wood, Eric

    The last 15 years have witnessed the development of a large number of multiobjective decision techniques. Applying these techniques to environmental, engineering, and business problems has become well accepted. Multiobjective Decision Analysis With Engineering and Business Applications attempts to cover the main multiobjective techniques both in their mathematical treatment and in their application to real-world problems. The book is divided into 12 chapters plus three appendices. The main portion of the book is represented by chapters 3-6, where the various approaches are identified, classified, and reviewed. Chapter 3 covers methods for generating nondominated solutions; chapter 4, continuous methods with prior preference articulation; chapter 5, discrete methods with prior preference articulation; and chapter 6, methods of progressive articulation of preferences. In these four chapters, close to 20 techniques are discussed with over 20 illustrative examples. This is both a strength and a weakness; the breadth of techniques and examples provides comprehensive coverage, but in a style too mathematically compact for most readers. By my count, the presentation of the 20 techniques in chapters 3-6 covered 85 pages, an average of about 4.5 pages each; therefore, a sound basis in linear algebra and linear programming is required if the reader hopes to follow the material. Chapter 2, "Concepts in Multiobjective Analysis," also assumes such a background.

  7. Operationalizing strategic marketing.

    PubMed

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  8. Technique for Early Reliability Prediction of Software Components Using Behaviour Models

    PubMed Central

    Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad

    2016-01-01

    Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
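
    A toy version of architecture-based reliability over a component dependency graph is sketched below: each node carries a component reliability, each edge a transition probability, and the system reliability accumulates over paths to the terminal state. The graph, numbers, and memoized-recursion formulation are illustrative assumptions standing in for the paper's stack-based CPDG algorithm.

    ```python
    from functools import lru_cache

    # Per-component reliabilities (illustrative values).
    reliability = {"ui": 0.999, "ctrl": 0.995, "motor": 0.99, "stop": 1.0}

    # edges[u] = [(v, transition probability)]; probabilities out of each
    # non-terminal node sum to 1.
    edges = {
        "ui":    [("ctrl", 1.0)],
        "ctrl":  [("motor", 0.9), ("stop", 0.1)],
        "motor": [("stop", 1.0)],
        "stop":  [],
    }

    @lru_cache(maxsize=None)
    def system_reliability(node):
        """Probability of reaching 'stop' failure-free, starting at node."""
        r = reliability[node]
        if node == "stop":
            return r
        # Weight each successor's reliability by the transition probability.
        return r * sum(p * system_reliability(v) for v, p in edges[node])

    print(f"system reliability: {system_reliability('ui'):.4f}")
    ```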

  9. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  10. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
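
    The merging criterion is easy to illustrate in one dimension: neighbouring elements are joined while their signals agree within the quoted errors. The sketch below is a simplified 1-D analogue with an assumed 3-sigma consistency threshold; BaTMAn itself operates on multi-images and datacubes with its own statistical test.

```python
import numpy as np

# Illustrative 1-D analogue of the merging criterion described above:
# adjacent bins are merged while their mean signals agree within errors.
# The toy data and the 3-sigma threshold are assumptions for illustration.

def merge_bins(signal, error, n_sigma=3.0):
    bins = [[i] for i in range(len(signal))]
    merged = True
    while merged:
        merged = False
        for i in range(len(bins) - 1):
            a, b = bins[i], bins[i + 1]
            ma, mb = np.average(signal[a]), np.average(signal[b])
            # combined uncertainty of the two bin means
            ea = np.sqrt(np.sum(np.square(error[a]))) / len(a)
            eb = np.sqrt(np.sum(np.square(error[b]))) / len(b)
            if abs(ma - mb) < n_sigma * np.hypot(ea, eb):
                bins[i:i + 2] = [a + b]     # statistically consistent: merge
                merged = True
                break
    return bins

signal = np.array([1.0, 1.1, 0.9, 5.0, 5.2])
error = np.array([0.2] * 5)
print(merge_bins(signal, error))   # -> [[0, 1, 2], [3, 4]]
```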

  11. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
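
    As a schematic illustration of how IF-THEN rules can be searched to trace failure effects, the toy forward-chaining sketch below propagates a failure mode through invented rules; the actual system's frame-based knowledge base and inference engine are far richer.

```python
# Toy illustration of IF-THEN failure-propagation rules of the kind the
# abstract describes; the rule format and the circuit are invented here.
rules = [
    ("resistor_open",     "no_current_to_led"),
    ("no_current_to_led", "led_dark"),
    ("supply_failure",    "no_current_to_led"),
]

def effects_of(failure_mode):
    """Chain IF-THEN rules forward to collect all downstream effects."""
    effects, frontier = set(), [failure_mode]
    while frontier:
        cause = frontier.pop()
        for if_part, then_part in rules:
            if if_part == cause and then_part not in effects:
                effects.add(then_part)
                frontier.append(then_part)
    return effects

print(effects_of("resistor_open"))  # {'no_current_to_led', 'led_dark'}
```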

  12. A Cross-Cultural Analysis of Personality Structure Through the Lens of the HEXACO Model.

    PubMed

    Ion, Andrei; Iliescu, Dragos; Aldhafri, Said; Rana, Neeti; Ratanadilok, Kattiya; Widyanti, Ari; Nedelcea, Cătălin

    2017-01-01

    Across 5 different samples, totaling more than 1,600 participants from India, Indonesia, Oman, Romania, and Thailand, the authors address the question of cross-cultural replicability of a personality structure, while exploring the utility of exploratory structural equation modeling (ESEM) as a data analysis technique in cross-cultural personality research. Personality was measured with an alternative, non-Five-Factor Model (FFM) personality framework, provided by the HEXACO-PI (Lee & Ashton, 2004). The results show that the HEXACO framework was replicated in some of the investigated cultures. The ESEM data analysis technique proved to be especially useful in investigating the between-group measurement equivalence of broad personality measures across different cultures.

  13. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  14. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    PubMed

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.
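
    One common decision step in such pipelines can be sketched concretely: classifying on the cup-to-disc ratio (CDR) computed from segmented fundus images. In the sketch below the segmentation is stubbed out with hypothetical pixel measurements, and the 0.6 cut-off is a commonly quoted screening threshold, not a value taken from this review.

```python
# Schematic CDR-based screening step; the pixel measurements are invented
# stand-ins for the output of an optic disc/cup segmentation stage.

def cup_to_disc_ratio(cup_diameter_px, disc_diameter_px):
    return cup_diameter_px / disc_diameter_px

def screen(cdr, threshold=0.6):
    # 0.6 is one commonly used screening threshold, assumed here.
    return "glaucoma suspect" if cdr >= threshold else "likely normal"

# Hypothetical measurements from a segmented fundus image:
cdr = cup_to_disc_ratio(cup_diameter_px=140, disc_diameter_px=200)
print(cdr, screen(cdr))  # 0.7 -> glaucoma suspect
```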

  15. Molecular diagnosis of bloodstream infections: planning to (physically) reach the bedside.

    PubMed

    Leggieri, N; Rida, A; François, P; Schrenzel, Jacques

    2010-08-01

    Faster identification of infecting microorganisms and treatment options is a first-ranking priority in the infectious disease area, in order to prevent inappropriate treatment and overuse of broad-spectrum antibiotics. Standard bacterial identification is intrinsically time-consuming, and very recently there has been a burst in the number of commercially available nonphenotype-based techniques and in the documentation of their possible clinical impact. Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) is now a standard diagnostic procedure on cultures and holds promise for spiked blood. Meanwhile, commercial PCR-based techniques have improved with the use of bacterial DNA enrichment methods and the diversity of amplicon analysis techniques (melting curve analysis, microarrays, gel electrophoresis, sequencing and analysis by mass spectrometry), leading to the ability to challenge bacterial culture as the gold standard by providing earlier diagnosis with a better 'clinical' sensitivity and additional prognostic information. Laboratory practice has already changed with MALDI-TOF MS, but a change in clinical practice, driven by emergent nucleic acid-based techniques, will need the demonstration of real-life applicability as well as robust clinical-impact-oriented studies.

  16. New X-Ray Technique to Characterize Nanoscale Precipitates in Aged Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Sitdikov, V. D.; Murashkin, M. Yu.; Valiev, R. Z.

    2017-10-01

    This paper puts forward a new technique for the measurement of x-ray patterns, which makes it possible to identify and determine precipitates (nanoscale phases) in metallic alloys of the matrix type. The minimum detection limit of precipitates in the matrix of the base material provided by this technique is as low as 1%. The identification of precipitates in x-ray patterns and their analysis are implemented in a transmission mode with a larger radiation area, longer holding time and higher diffractometer resolution compared to the conventional reflection mode. The presented technique has been successfully employed to identify and quantitatively describe precipitates formed in an Al alloy of the Al-Mg-Si system as a result of artificial aging. For the first time, x-ray phase analysis has been used to identify and measure precipitates formed during artificial aging of the alloy.

  17. TOPICAL REVIEW: Human soft tissue analysis using x-ray or gamma-ray techniques

    NASA Astrophysics Data System (ADS)

    Theodorakou, C.; Farquharson, M. J.

    2008-06-01

    This topical review is intended to describe the x-ray techniques used for human soft tissue analysis. X-ray techniques have been applied to human soft tissue characterization and interesting results have been presented over the last few decades. The motivation behind such studies is to provide improved patient outcome by using the data obtained to better understand a disease process and improve diagnosis. An overview of theoretical background as well as a complete set of references is presented. For each study, a brief summary of the methodology and results is given. The x-ray techniques include x-ray diffraction, x-ray fluorescence, Compton scattering, Compton to coherent scattering ratio and attenuation measurements. The soft tissues that have been classified using x-rays or gamma rays include brain, breast, colon, fat, kidney, liver, lung, muscle, prostate, skin, thyroid and uterus.

  18. A manual for inexpensive methods of analyzing and utilizing remote sensor data

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barr, D. J.

    1978-01-01

    Instructions are provided for inexpensive methods of using remote sensor data to help meet the need to observe the earth's surface. When possible, relative costs are included. Equipment needed for analysis of remote sensor data is described, along with methods of using the individual items and the advantages and disadvantages of each. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, landcover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, along with simple techniques for creating an enhanced data item. Techniques described include additive and subtractive (Diazo process) color techniques and enlargement of photos or images. Applications of these processes, including mapping of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.

  19. Amorphous and liquid samples structure and density measurements at high pressure - high temperature using diffraction and imaging techniques

    NASA Astrophysics Data System (ADS)

    Guignot, N.; King, A.; Clark, A. N.; Perrillat, J. P.; Boulard, E.; Morard, G.; Deslandes, J. P.; Itié, J. P.; Ritter, X.; Sanchez-Valle, C.

    2016-12-01

    Determination of the density and structure of liquids such as iron alloys, silicates and carbonates is key to understanding deep Earth structure and dynamics. X-ray diffraction at large synchrotron facilities gives excellent results as long as the signal scattered from the sample can be isolated from its environment. Different techniques already exist; we present here the implementation and first results of the combined angle- and energy-dispersive structural analysis and refinement (CAESAR) technique introduced by Wang et al. in 2004, which has never been used in this context. It has several advantages in the study of liquids: (1) the standard energy-dispersive technique (EDX), fast and compatible with large multi-anvil press frames, is used for rapid analysis free of signal pollution from the sample environment; (2) some limitations of the EDX technique (sample homogeneity, low resolution) are irrelevant for liquid signals, and others (incorrect intensities, escape-peak artifacts, background subtraction) are solved by the CAESAR technique; (3) high-Q data (up to 15 Å-1 and more) can be obtained in a few hours (usually less than 2). We present the facilities available on the PSICHE beamline (SOLEIL synchrotron, France) and a few results obtained using a Paris-Edinburgh (PE) press and a 1200-ton load capacity multi-anvil press with a (100) DIA compression module. X-ray microtomography, used in conjunction with a PE press featuring rotating anvils (RotoPEc, Philippe et al., 2013), is also very effective: simply measuring the 3D volume of glass or liquid spheres at high pressure and high temperature provides the density. This can be done in conjunction with the CAESAR technique, and we illustrate this point. Finally, absorption profiles can be obtained via imaging techniques, providing another independent way to measure the density of these materials. References: Y. Wang et al., A new technique for angle-dispersive powder diffraction using an energy-dispersive setup and synchrotron radiation, J. Appl. Cryst. 37, 947-956 (2004); J. Philippe, Y. Le Godec, F. Bergame and M. Morand, Patent INPI 11 62335 (2013).

  20. TH-EF-BRC-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2016-06-15

    This hands-on workshop focuses on providing participants with experience with the principal tools of TG 100 and hence starts to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  1. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
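
    A standard first response to the high dimensionality of hyperspectral data is to project each pixel's spectrum onto a few principal components. The sketch below, using a random cube as a stand-in for a real image, is only a generic illustration of this dimensionality-reduction step, not a specific algorithm from the paper:

```python
import numpy as np
from sklearn.decomposition import PCA

# Reduce a (rows, cols, bands) hyperspectral cube to a few components.
cube = np.random.default_rng(0).normal(size=(64, 64, 200))  # synthetic cube
pixels = cube.reshape(-1, 200)               # one 200-band spectrum per pixel

pca = PCA(n_components=10).fit(pixels)
reduced = pca.transform(pixels).reshape(64, 64, 10)
print(reduced.shape, f"variance retained: {pca.explained_variance_ratio_.sum():.2f}")
```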

  2. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

    Hot structures testing has been going on since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing lab test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and provided data that promoted the correlation of test data with results from analytical codes. In Nov. 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. Limited material is drawn from the workshop, and more formal documentation is provided of topics that focus on hot structures test techniques used at NASA-Ames-Dryden. Topics covered include data acquisition and control of testing, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the lab.

  3. Cost considerations in using simulations for medical training.

    PubMed

    Fletcher, J D; Wind, Alexander P

    2013-10-01

    This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
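
    The transfer effectiveness ratio mentioned above has a simple classic form: the savings in real-task training per unit of simulator training. The sketch below uses that textbook definition with invented numbers; the article's own assessment combines such measures with cost analyses.

```python
# Hedged sketch of the transfer effectiveness ratio (TER) in its classic
# form; the trial counts below are hypothetical, not from the article.

def transfer_effectiveness_ratio(control_trials, experimental_trials,
                                 simulator_trials):
    """Savings in real-task training per unit of simulator training."""
    return (control_trials - experimental_trials) / simulator_trials

# Hypothetical data: criterion reached in 20 live trials without the
# simulator, versus 12 live trials after 10 simulator trials.
print(transfer_effectiveness_ratio(20, 12, 10))  # -> 0.8
```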

  4. Three-dimensional kinematic estimation of mobile-bearing total knee arthroplasty from x-ray fluoroscopic images

    NASA Astrophysics Data System (ADS)

    Yamazaki, Takaharu; Futai, Kazuma; Tomita, Tetsuya; Sato, Yoshinobu; Yoshikawa, Hideki; Tamura, Shinichi; Sugamoto, Kazuomi

    2011-03-01

    To achieve 3D kinematic analysis of total knee arthroplasty (TKA), 2D/3D registration techniques, which use X-ray fluoroscopic images and computer-aided design (CAD) models of the knee implant, have attracted attention in recent years. These techniques can provide information regarding the movement of the radiopaque femoral and tibial components but not of the radiolucent polyethylene insert, because the insert silhouette does not appear clearly on X-ray images. It has therefore been difficult to obtain the 3D kinematics of the polyethylene insert, particularly of mobile-bearing inserts that move on the tibial component. This study presents a technique for 3D kinematic analysis of the mobile-bearing insert in TKA using X-ray fluoroscopy, reports its accuracy, and finally describes clinical applications. For 3D pose estimation of the mobile-bearing insert, tantalum beads and a CAD model containing those beads are utilized, and the 3D pose of the insert model is estimated using a feature-based 2D/3D registration technique. In order to validate the accuracy of the present technique, experiments including a computer simulation test were performed. The results showed that the pose estimation accuracy was sufficient for analyzing mobile-bearing TKA kinematics (RMS error: about 1.0 mm, 1.0 degree). In the clinical applications, seven patients with mobile-bearing TKA were studied and analyzed during deep knee bending motion. Consequently, the present technique enables us to better understand mobile-bearing TKA kinematics, and this type of evaluation is thought to be helpful for improving implant design and optimizing TKA surgical techniques.

  5. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    PubMed

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  6. Fully Integrated Microfluidic Device for Direct Sample-to-Answer Genetic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Grodzinski, Piotr

    Integration of microfluidics technology with DNA microarrays enables building complete sample-to-answer systems that are useful in many applications such as clinical diagnostics. In this chapter, a fully integrated microfluidic device [1] that consists of microfluidic mixers, valves, pumps, channels, chambers, heaters, and a DNA microarray sensor to perform DNA analysis of complex biological sample solutions is presented. This device can perform on-chip sample preparation (including magnetic bead-based cell capture, cell preconcentration and purification, and cell lysis) of complex biological sample solutions (such as whole blood), polymerase chain reaction, DNA hybridization, and electrochemical detection. A few novel microfluidic techniques were developed and employed. A micromixing technique based on a cavitation microstreaming principle was implemented to enhance target cell capture from whole blood samples using immunomagnetic beads. This technique was also employed to accelerate the DNA hybridization reaction. Thermally actuated paraffin-based microvalves were developed to regulate flows. Electrochemical pumps and thermopneumatic pumps were integrated on the chip to provide pumping of liquid solutions. The device is completely self-contained: no external pressure sources, fluid storage, mechanical pumps, or valves are necessary for fluid manipulation, thus eliminating possible sample contamination and simplifying device operation. Pathogenic bacteria detection from ~mL whole blood samples and single-nucleotide polymorphism analysis directly from diluted blood were demonstrated. The device provides a cost-effective solution to direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.

  7. Localized analysis of paint-coat drying using dynamic speckle interferometry

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel

    2018-07-01

    Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery and appliances. These paint-coatings must comply with high quality standards; for this reason, evaluation techniques for paint-coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry, which allows for the temporal activity evaluation of the paint-coating drying process, providing localized information on drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indication of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
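
    The idea of a localized temporal activity measure can be sketched simply: for each pixel, average the absolute intensity change between successive speckle frames, so regions that are still drying remain "active". The simple difference measure below is an illustrative stand-in for the paper's own definition of the temporal history:

```python
import numpy as np

# Toy per-pixel speckle activity map: mean absolute frame-to-frame
# intensity change. The random frame stack is a synthetic stand-in
# for real speckle recordings of a drying paint-coat.

def activity_map(frames):
    """frames: array of shape (n_frames, height, width)."""
    return np.mean(np.abs(np.diff(frames, axis=0)), axis=0)

frames = np.random.default_rng(0).random((50, 32, 32))  # synthetic stack
amap = activity_map(frames)
print(amap.shape, f"mean activity: {amap.mean():.3f}")
```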

  8. Cost-Effectiveness Research in Neurosurgery: We Can and We Must.

    PubMed

    Stein, Sherman C

    2018-01-05

    Rapid advancement of medical and surgical therapies, coupled with the recent preoccupation with limiting healthcare costs, makes a collision of the 2 objectives imminent. This article explains the value of cost-effectiveness analysis (CEA) in reconciling the 2 competing goals, and provides a brief introduction to evidence-based CEA techniques. The historical role of CEA in determining whether new neurosurgical strategies provide value for cost is summarized briefly, as are the limitations of the technique. Finally, the unique ability of the neurosurgical community to provide input to the CEA process is emphasized, as are the potential risks of leaving these important decisions in the hands of others. Copyright © 2018 by the Congress of Neurological Surgeons.
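
    The core CEA quantity is the incremental cost-effectiveness ratio (ICER): the extra cost of a new strategy per extra unit of health effect. The sketch below uses hypothetical costs and quality-adjusted life years (QALYs), not figures from the article:

```python
# Illustration of the incremental cost-effectiveness ratio (ICER);
# all values below are invented for demonstration.

def icer(cost_new, cost_old, effect_new, effect_old):
    """Extra cost per extra unit of effect (e.g., per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: new strategy costs $60,000 vs $40,000 and yields
# 5.5 vs 5.0 quality-adjusted life years.
print(icer(60_000, 40_000, 5.5, 5.0))  # -> 40000.0 ($ per QALY)
```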

  9. Possibilities of LA-ICP-MS technique for the spatial elemental analysis of the recent fish scales: Line scan vs. depth profiling

    NASA Astrophysics Data System (ADS)

    Holá, Markéta; Kalvoda, Jiří; Nováková, Hana; Škoda, Radek; Kanický, Viktor

    2011-01-01

    LA-ICP-MS and solution-based ICP-MS in combination with electron microprobe analysis are presented as methods for determining the elemental spatial distribution in fish scales, which represent an example of a heterogeneous layered bone structure. Two different LA-ICP-MS techniques were tested on recent common carp (Cyprinus carpio) scales: a line scan through the whole fish scale perpendicular to the growth rings, and depth profiling using spot analysis. The ablation crater of 55 μm width and 50 μm depth allowed analysis of the elemental distribution in the external layer, while suitable ablation conditions providing a deeper ablation crater gave average values from the external HAP layer and the collagen basal plate. Depth profiling using spot analysis was tested in fish scales for the first time; spot analysis allows information to be obtained about the depth profile of the elements at the selected position on the sample. The combination of all mentioned laser ablation techniques provides complete information about the elemental distribution in the fish scale samples. The results were compared with the solution-based ICP-MS and EMP analyses. The fact that the results of depth profiling are in good agreement both with the EMP and PIXE results and with the assumed ways of incorporation of the studied elements in the HAP structure suggests a very good potential for this method.

  10. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances

    PubMed Central

    Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268
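
    The heartbeat-classification step surveyed in the first section typically reduces to extracting per-beat features and training a classifier. The sketch below uses generic scikit-learn tools on synthetic features as a stand-in; it illustrates the workflow only, not a specific method from the review:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic per-beat feature vectors (e.g., RR intervals, QRS morphology)
# with invented labels: 0 = normal beat, 1 = abnormal beat.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```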

  11. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  12. Biomedical surface analysis: Evolution and future directions (Review)

    PubMed Central

    Castner, David G.

    2017-01-01

    This review describes some of the major advances made in biomedical surface analysis over the past 30–40 years. Starting from a single technique analysis of homogeneous surfaces, it has been developed into a complementary, multitechnique approach for obtaining detailed, comprehensive information about a wide range of surfaces and interfaces of interest to the biomedical community. Significant advances have been made in each surface analysis technique, as well as how the techniques are combined to provide detailed information about biological surfaces and interfaces. The driving force for these advances has been that the surface of a biomaterial is the interface between the biological environment and the biomaterial, and so, the state-of-the-art in instrumentation, experimental protocols, and data analysis methods need to be developed so that the detailed surface structure and composition of biomedical devices can be determined and related to their biological performance. Examples of these advances, as well as areas for future developments, are described for immobilized proteins, complex biomedical surfaces, nanoparticles, and 2D/3D imaging of biological materials. PMID:28438024

  13. Sensory Recovery Outcome after Digital Nerve Repair in Relation to Different Reconstructive Techniques: Meta-Analysis and Systematic Review

    PubMed Central

    Wolf, Petra; Harder, Yves; Kern, Yasmin; Paprottka, Philipp M.; Machens, Hans-Günther; Lohmeyer, Jörn A.

    2013-01-01

    Good clinical outcome after digital nerve repair is highly relevant for proper hand function and has a significant socioeconomic impact. However, the level of evidence for competing surgical techniques is low. The aim is to summarize and compare the outcomes of digital nerve repair with different methods (end-to-end and end-to-side coaptations, nerve grafts, artificial conduit, vein, muscle, and muscle-in-vein reconstructions, and replantations) to provide an aid for choosing an individual technique of nerve reconstruction and to create reference values of standard repair for nonrandomized clinical studies. 87 publications including 2,997 nerve repairs were suitable for a precise evaluation. For digital nerve repairs, practically no particular technique proved superior to another. Only end-to-side coaptation had an inferior two-point discrimination in comparison to end-to-end coaptation or nerve grafting. Furthermore, this meta-analysis showed that youth was associated with an improved sensory recovery outcome in patients who underwent digital replantation. For end-to-end coaptations, recent publications had significantly better sensory recovery outcomes than older ones. Given minor differences in outcome, the main criteria in choosing an adequate surgical technique should be gap length and donor site morbidity caused by graft material harvesting. Our clinical experience was used to provide a decision tree for digital nerve repair.

  14. Intraosseous repair of the inferior alveolar nerve in rats: an experimental model.

    PubMed

    Curtis, N J; Trickett, R I; Owen, E; Lanzetta, M

    1998-08-01

    A reliable method of exposure of the inferior alveolar nerve in Wistar rats has been developed, allowing intraosseous repair with two microsurgical techniques under halothane inhalational anaesthesia. The microsuturing technique involves anastomosis with 10-0 nylon sutures; the laser-weld technique uses an albumin-based solder containing indocyanine green, plus an infrared (810 nm wavelength) diode laser. Seven animals had left inferior alveolar nerve repairs performed with the microsuture and laser-weld techniques. Controls were provided by the unoperated nerves in the repaired cases. Histochemical analysis was performed utilizing neuron counts and horseradish peroxidase (HRP) tracer uptake in the mandibular division of the trigeminal ganglion, following sacrifice and staining of frozen sections with cresyl violet and diaminobenzidine. The results of this analysis showed similar mean neuron counts and mean HRP uptake by neurons for the unoperated controls and both the microsuture and laser-weld groups. This new technique of intraosseous exposure of the inferior alveolar nerve in rats allows reliable and reproducible microsurgical repairs using both microsuture and laser-weld techniques.

  15. Magnetic fabric constraints of the emplacement of igneous intrusions

    NASA Astrophysics Data System (ADS)

    Maes, Stephanie M.

    Fabric analysis is critical to evaluating the history, kinematics, and dynamics of geological deformation. This is particularly true of igneous intrusions, where the development of fabric is used to constrain magmatic flow and emplacement mechanisms. Fabric analysis was applied to three mafic intrusions, with different tectonic and petrogenetic histories, to study emplacement and magma flow: the Insizwa sill (Mesozoic Karoo Large Igneous Province, South Africa), Sonju Lake intrusion (Proterozoic Midcontinent Rift, Minnesota, USA), and Palisades sill (Mesozoic rift basin, New Jersey, USA). Multiple fabric analysis techniques were used to define the fabric in each intrusive body. Using digital image analysis techniques on multiple thin sections, the three-dimensional shape-preferred orientation (SPO) of populations of mineral phases were calculated. Low-field anisotropy of magnetic susceptibility (AMS) measurements were used as a proxy for the mineral fabric of the ferromagnetic phases (e.g., magnetite). In addition, a new technique---high-field AMS---was used to isolate the paramagnetic component of the fabric (e.g., silicate fabric). Each fabric analysis technique was then compared to observable field fabrics as a framework for interpretation. In the Insizwa sill, magnetic properties were used to corroborate vertical petrologic zonation and distinguish sub-units within lithologically defined units. Abrupt variation in magnetic properties provides evidence supporting the formation of the Insizwa sill by separate magma intrusions. Low-field AMS fabrics in the Sonju Lake intrusion exhibit consistent SW-plunging lineations and SW-dipping foliations. These fabric orientations provide evidence that the cumulate layers in the intrusion were deposited in a dynamic environment, and indicate magma flowed from southwest to northeast, parallel to the pre-existing rift structures. In the Palisades sill, the magnetite SPO and low-field AMS lineation have developed orthogonal to the plagioclase SPO and high-field AMS lineation. Magma flow in the Palisades magmatic system is interpreted to have originated from a point source feeder. Low-field AMS records the flow direction, whereas high-field AMS records extension within the igneous sheet. The multiple fabric analysis techniques presented in this dissertation have advanced our understanding of the development of fabric and its relationship to internal structure, emplacement, and magma dynamics in mafic igneous systems.

  16. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules.
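
    The t-invariants at the heart of this approach are the non-trivial, non-negative solutions x of Cx = 0, where C is the place-by-transition incidence matrix. The brute-force sketch below, on an invented three-transition cycle rather than the pheromone pathway model, illustrates the definition; real tools use dedicated algorithms to find the minimal invariants:

```python
import itertools
import numpy as np

# Toy illustration of t-invariants (Cx = 0, x >= 0, x != 0) by bounded
# brute force. The 3-place, 3-transition net below is invented.

C = np.array([            # incidence matrix: rows = places, cols = transitions
    [-1,  0,  1],         # t0 consumes p0, t2 produces p0
    [ 1, -1,  0],         # t0 produces p1, t1 consumes p1
    [ 0,  1, -1],         # t1 produces p2, t2 consumes p2
])

def t_invariants(C, bound=2):
    n = C.shape[1]
    for x in itertools.product(range(bound + 1), repeat=n):
        if any(x) and not np.any(C @ np.array(x)):
            yield x

print(list(t_invariants(C)))   # -> [(1, 1, 1), (2, 2, 2)]
```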

  18. Application of data fusion techniques and technologies for wearable health monitoring.

    PubMed

    King, Rachel C; Villeneuve, Emma; White, Ruth J; Sherratt, R Simon; Holderbaum, William; Harwin, William S

    2017-04-01

    Technological advances in sensors and communications have enabled discrete integration into everyday objects, both in the home and about the person. Information gathered by monitoring physiological, behavioural, and social aspects of our lives can be used to achieve a positive impact on quality of life, health, and well-being. Wearable sensors are at the cusp of becoming truly pervasive, and could be woven into the clothes and accessories that we wear such that they become ubiquitous and transparent. To interpret the complex multidimensional information provided by these sensors, data fusion techniques are employed to provide a meaningful representation of the sensor outputs. This paper is intended to provide a short overview of data fusion techniques and algorithms that can be used to interpret wearable sensor data in the context of health monitoring applications. The application of these techniques is then described in the context of healthcare, including activity and ambulatory monitoring, gait analysis, fall detection, and biometric monitoring. A snapshot of current commercially available sensors is also provided, focusing on their sensing capability, together with a commentary on the gaps that need to be bridged to bring research to market. Copyright © 2017. Published by Elsevier Ltd.
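
    One elementary fusion technique in this space is the complementary filter, which blends a gyroscope's short-term angle integration with an accelerometer's long-term tilt reference. The 0.98 weight and the synthetic readings below are assumptions for illustration; the paper surveys a much broader family of methods:

```python
# Minimal complementary-filter sketch for tilt estimation; the blending
# weight and the sample readings are invented for illustration.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Trust the gyro short-term, the accelerometer long-term."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
samples = [(0.5, 2.0), (0.4, 2.1), (0.6, 2.3)]  # (gyro deg/s, accel deg)
for gyro, acc in samples:
    angle = complementary_filter(angle, gyro, acc, dt=0.01)
print(f"fused tilt estimate: {angle:.3f} deg")
```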

  19. Combined elemental and microstructural analysis of genuine and fake copper-alloy coins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoli, L; Agresti, J; Mascalchi, M

    2011-07-31

    Innovative noninvasive material analysis techniques are applied to determine the archaeometallurgical characteristics of copper-alloy coins from Florence's National Museum of Archaeology. Three supposedly authentic Roman coins and three hypothetically fraudulent imitations are thoroughly investigated using laser-induced plasma spectroscopy and time-of-flight neutron diffraction, along with 3D videomicroscopy and electron microscopy. The material analyses are aimed at collecting data allowing for objective discrimination between genuine Roman productions and late fakes. The results show that the mentioned techniques provide quantitative compositional and textural data, which are strictly related to the manufacturing processes and aging of copper alloys.

  20. Zymography Methods to Simultaneously Analyze Superoxide Dismutase and Catalase Activities: Novel Application for Yeast Species Identification.

    PubMed

    Gamero-Sandemetrio, Esther; Gómez-Pastor, Rocío; Matallana, Emilia

    2017-01-01

    We provide an optimized protocol for a double staining technique to analyze superoxide dismutase enzymatic isoforms Cu-Zn SOD (Sod1) and Mn-SOD (Sod2) and catalase in the same polyacrylamide gel. The use of NaCN, which specifically inhibits the yeast Sod1 isoform, allows the analysis of the Sod2 isoform, while the use of H2O2 allows the analysis of catalase. The identification of a different zymography profiling of SOD and catalase isoforms in different yeast species allowed us to propose this technique as a novel yeast identification and classification strategy.

  1. Temporal Methods to Detect Content-Based Anomalies in Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.

    Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
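
    The first technique can be sketched concretely: score each time window by the Kullback-Leibler divergence of its term distribution from an overall baseline, flagging windows with unusually high scores. The vocabulary and counts below are invented:

```python
import math

# KL-divergence anomaly score for a time window's term distribution,
# compared against a "normal" baseline; data are hypothetical.

def kl_divergence(p, q, eps=1e-9):
    """D(P||Q) over a shared vocabulary, with smoothing to avoid log 0."""
    return sum(pi * math.log((pi + eps) / (q.get(t, 0.0) + eps))
               for t, pi in p.items() if pi > 0)

def normalize(counts):
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

baseline = normalize({"game": 50, "vote": 50, "rain": 50})
window = normalize({"game": 10, "vote": 90, "rain": 5})
print(f"anomaly score: {kl_divergence(window, baseline):.3f}")
```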

  2. The interactive astronomical data analysis facility - image enhancement techniques to Comet Halley

    NASA Astrophysics Data System (ADS)

    Klinglesmith, D. A.

    1981-10-01

    A PDP 11/40 computer is at the heart of a general purpose interactive data analysis facility designed to permit easy access to data in both visual imagery and graphic representations. The major components consist of: the 11/40 CPU and 256 K bytes of 16-bit memory; two TU10 tape drives; 20 million bytes of disk storage; three user terminals; and the COMTAL image processing display system. The application of image enhancement techniques to two sequences of photographs of Comet Halley taken in Egypt in 1910 provides evidence for eruptions from the comet's nucleus.

  3. Surface diagnostics for scale analysis.

    PubMed

    Dunn, S; Impey, S; Kimpton, C; Parsons, S A; Doyle, J; Jefferson, B

    2004-01-01

    Stainless steel, polymethylmethacrylate and polytetrafluoroethylene coupons were analysed for surface topographical and adhesion force characteristics using tapping mode atomic force microscopy and force-distance microscopy techniques. The two polymer materials were surface modified by polishing with silicon carbide papers of known grade. The struvite scaling rate was determined for each coupon and related to the data gained from the surface analysis. The scaling rate correlated well with adhesion force measurements indicating that lower energy materials scale at a lower rate. The techniques outlined in the paper provide a method for the rapid screening of materials in potential scaling applications.

  4. Using Structural Equation Models with Latent Variables to Study Student Growth and Development.

    ERIC Educational Resources Information Center

    Pike, Gary R.

    1991-01-01

    Analysis of data on freshman-to-senior developmental gains in 722 University of Tennessee-Knoxville students provides evidence of the advantages of structural equation modeling with latent variables and suggests that the group differences identified by traditional analysis of variance and covariance techniques may be an artifact of measurement…

  5. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  6. Analysis of Computer Teachers' Online Discussion Forum Messages about Their Occupational Problems

    ERIC Educational Resources Information Center

    Deryakulu, Deniz; Olkun, Sinan

    2007-01-01

    This study, using content analysis technique, examined the types of job-related problems that the Turkish computer teachers experienced and the types of social support provided by reciprocal discussions in an online forum. Results indicated that role conflict, inadequate teacher induction policies, lack of required technological infrastructure and…

  7. Merging Traditional Technique Vocabularies with Democratic Teaching Perspectives in Dance Education: A Consideration of Aesthetic Values and Their Sociopolitical Contexts

    ERIC Educational Resources Information Center

    Dyer, Becky

    2009-01-01

    This article suggests how movement analysis from a socially contextualized perspective can inform understanding about the significance of sociopolitical contexts and aesthetic values in Western dance training. Perspectives of movement analysis provide groundwork for discussing perceivable ways to address discrepancies between democratic and…

  8. A Selected Annotated Bibliography on the Analysis of Water Resources System, Volume 2.

    ERIC Educational Resources Information Center

    Kriss, Carol; And Others

    Presented is an annotated bibliography of some recent selected publications pertaining to the application of systems analysis techniques for defining and evaluating alternative solutions to water resource problems. Both subject and author indices are provided. Keywords are listed at the end of each abstract. The abstracted material emphasizes the…

  9. Protocol Analysis as a Method for Analyzing Conversational Data.

    ERIC Educational Resources Information Center

    Aleman, Carlos G.; Vangelisti, Anita L.

    Protocol analysis, a technique that uses people's verbal reports about their cognitions as they engage in an assigned task, has been used in a number of applications to provide insight into how people mentally plan, assess, and carry out those assignments. Using a system of networked computers where actors communicate with each other over…

  10. Space crew radiation exposure analysis system based on a commercial stand-alone CAD system

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.

    1992-01-01

    Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.
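
    The ray-tracing idea is straightforward to sketch: from a dose point, sample directions and accumulate the areal density (g/cm^2) of material each ray crosses. The toy geometry below, a simple direction-dependent wall thickness, is an invented stand-in for the CAD model intersection test:

```python
import numpy as np

# Toy analogue of a shielding ray trace: isotropic ray directions from a
# dose point, each scored by the areal density of material it crosses.

ALUMINUM_DENSITY = 2.7  # g/cm^3

def wall_thickness_cm(direction):
    # Invented geometry: thicker shielding toward -z (e.g., equipment racks).
    x, y, z = direction
    return 1.0 + 2.0 * max(0.0, -z)

def shielding_distribution(n_rays=10_000, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n_rays, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # isotropic directions
    return np.array([wall_thickness_cm(d) * ALUMINUM_DENSITY for d in v])

areal = shielding_distribution()
print(f"mean shielding: {areal.mean():.2f} g/cm^2, "
      f"min: {areal.min():.2f}, max: {areal.max():.2f}")
```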

  11. Recent Development in Optical Chemical Sensors Coupling with Flow Injection Analysis

    PubMed Central

    Ojeda, Catalina Bosch; Rojas, Fuensanta Sánchez

    2006-01-01

    Optical techniques for chemical analysis are well established, and sensors based on these techniques are now attracting considerable attention because of their importance in applications such as environmental monitoring, biomedical sensing, and industrial process control. On the other hand, flow injection analysis (FIA) is advisable for the rapid analysis of microliter-volume samples and can be interfaced directly to the chemical process. FIA has become a widespread automatic analytical method for many reasons, mainly the simplicity and low cost of the setups, their versatility, and ease of assembly. In this paper, an overview of flow injection determinations using optical chemical sensors is provided, and instrumentation, sensor design, and applications are discussed. This work summarizes the most relevant manuscripts from 1980 to date on analysis using optical chemical sensors in FIA.

  12. Load and dynamic assessment of B-52B-008 carrier aircraft for finned configuration 1 space shuttle solid rocket booster decelerator subsystem drop test vehicle. Volume 2: Airplane flutter and load analysis results

    NASA Technical Reports Server (NTRS)

    Quade, D. A.

    1978-01-01

    The airplane flutter and maneuver-gust load analysis results obtained during evaluation of the B-52B drop test vehicle configuration (with fins) are presented. These data are presented as supplementary data to that given in Volume 1 of this document. A brief mathematical description of airspeed notation and gust load factor criteria is provided as a help to the user. References are cited which provide mathematical descriptions of the airplane flutter and load analysis techniques. Airspeed-load factor diagrams are provided for the airplane weight configurations reanalyzed for the finned drop test vehicle configuration.

  13. Spacecraft Charging Calculations: NASCAP-2K and SEE Spacecraft Charging Handbook

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Neergaard, L. F.; Mandell, M. J.; Katz, I.; Gardner, B. M.; Hilton, J. M.; Minor, J.

    2002-01-01

For fifteen years, the NASA/Air Force Charging Analyzer Program for Geosynchronous Orbits (NASCAP/GEO) has been the workhorse of spacecraft charging calculations. Two new tools, the Space Environment and Effects (SEE) Spacecraft Charging Handbook (recently released) and Nascap-2K (under development), use improved numerical techniques and modern user interfaces to tackle the same problem. The SEE Spacecraft Charging Handbook provides first-order, lower-resolution solutions, while Nascap-2K provides higher-resolution results appropriate for detailed analysis. This paper illustrates how the improvements in the numerical techniques affect the results.
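    The physics underneath all such codes is a current balance: the surface floats to the potential at which the collected electron and ion currents cancel. A minimal sketch under simplifying assumptions (Maxwellian plasma, orbit-limited ion collection, secondary and photoelectron emission neglected, illustrative current densities); this is not the NASCAP algorithm itself.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    Te, Ti = 1.0e4, 1.0e4      # plasma temperatures in eV, so phi/Te is dimensionless
    je0, ji0 = 1.0e-6, 2.3e-8  # thermal current densities, A/m^2 (illustrative)

    def net_current(phi):
        """Net current to a surface at potential phi (volts, phi < 0).

        Electrons are repelled (Boltzmann factor); ions are attracted
        (orbit-limited enhancement for a small spherical body).
        """
        je = je0 * np.exp(phi / Te)
        ji = ji0 * (1.0 - phi / Ti)
        return je - ji

    phi_eq = brentq(net_current, -1.0e5, 0.0)   # root of the current balance
    print(f"equilibrium potential ~ {phi_eq:.0f} V")
    ```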

  14. Interviewing a Silent (Radioactive) Witness through Nuclear Forensic Analysis.

    PubMed

    Mayer, Klaus; Wallenius, Maria; Varga, Zsolt

    2015-12-01

    Nuclear forensics is a relatively young discipline in science which aims at providing information on nuclear material of unknown origin. The determination of characteristic parameters through tailored analytical techniques enables establishing linkages to the material's processing history and hence provides hints on its place and date of production and on the intended use.

  15. Techniques for the Analysis of Spectral and Orbital Congestion in Space Systems.

    DTIC Science & Technology

    1984-03-01

Appendix 29 gives the appropriate equations for the two cases, and provides algorithms for polarization isolation, topocentric and geocentric ... The PDP form is maintained by MITRE Dept. D97, which provides services to run the program when staffing permits. NASA Lewis has used the results in a ...

  16. User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases

    ERIC Educational Resources Information Center

    Hartley, Roger; Almuhaidib, Saud M. Y.

    2007-01-01

    Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…

  17. Discourse analysis: towards an understanding of its place in nursing.

    PubMed

    Crowe, Marie

    2005-07-01

This paper describes how discourse analysis, and in particular critical discourse analysis, can be used in nursing research, and provides an example to illustrate the techniques involved. Discourse analysis rose to prominence in the 1980s and 1990s in disciplines such as the social sciences, literary theory and cultural studies, and is increasingly used in nursing. This paper investigates discourse analysis as a useful methodology for conducting nursing research. Effective clinical reasoning relies on employing several different kinds of knowledge and research that draw on different perspectives, methodologies and techniques to generate breadth of knowledge and depth of understanding of clinical practices and patients' experiences of those practices. The steps in a discourse analysis include: choosing the text; identifying the explicit purpose of the text; examining the processes used for claiming authority and the connections to other discourses; and analysing the construction of major concepts, the processes of naming and categorizing, the construction of subject positions, the construction of reality and social relations, and the implications for the practice of nursing. The limitations of discourse analysis, its relationship to other qualitative approaches and questions for evaluating the rigour of research using discourse analysis are also explored. The example of discourse analysis shows how a text influences the practice of nursing by shaping knowledge, values and beliefs. Discourse analysis can make a contribution to the development of nursing knowledge by providing a research strategy to examine dominant discourses that influence nursing practice.

  18. Heterodyne laser spectroscopy system

    DOEpatents

    Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas

    1990-01-01

A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for laser isotope separation spectroscopy, vapor diagnostics, and the production of precise laser frequency offsets from a reference frequency, and provides spectral analysis of a laser beam.

  19. A Marketing Case History Profile

    ERIC Educational Resources Information Center

    Weirick, Margaret C.

    1978-01-01

    A current marketing plan from Temple University illustrates many marketing techniques, including those dealing with enrollment objectives, market objectives, demographic characteristics of Temple students, market share analysis, and the marketing plan. Specific guidelines are provided. (LBH)

  20. Applications of decision analysis and related techniques to industrial engineering problems at KSC

    NASA Technical Reports Server (NTRS)

    Evans, Gerald W.

    1995-01-01

    This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).

  1. The support-control continuum: An investigation of staff perspectives on factors influencing the success or failure of de-escalation techniques for the management of violence and aggression in mental health settings.

    PubMed

    Price, Owen; Baker, John; Bee, Penny; Lovell, Karina

    2018-01-01

    De-escalation techniques are recommended to manage violence and aggression in mental health settings yet restrictive practices continue to be frequently used. Barriers and enablers to the implementation and effectiveness of de-escalation techniques in practice are not well understood. To obtain staff descriptions of de-escalation techniques currently used in mental health settings and explore factors perceived to influence their implementation and effectiveness. Qualitative, semi-structured interviews and Framework Analysis. Five in-patient wards including three male psychiatric intensive care units, one female acute ward and one male acute ward in three UK Mental Health NHS Trusts. 20 ward-based clinical staff. Individual semi-structured interviews were digitally recorded, transcribed verbatim and analysed using a qualitative data analysis software package. Participants described 14 techniques used in response to escalated aggression applied on a continuum between support and control. Techniques along the support-control continuum could be classified in three groups: 'support' (e.g. problem-solving, distraction, reassurance) 'non-physical control' (e.g. reprimands, deterrents, instruction) and 'physical control' (e.g. physical restraint and seclusion). Charting the reasoning staff provided for technique selection against the described behavioural outcome enabled a preliminary understanding of staff, patient and environmental influences on de-escalation success or failure. Importantly, the more coercive 'non-physical control' techniques are currently conceptualised by staff as a feature of de-escalation techniques, yet, there was evidence of a link between these and increased aggression/use of restrictive practices. Risk was not a consistent factor in decisions to adopt more controlling techniques. Moral judgements regarding the function of the aggression; trial-and-error; ingrained local custom (especially around instruction to low stimulus areas); knowledge of the patient; time-efficiency and staff anxiety had a key role in escalating intervention. This paper provides a new model for understanding staff intervention in response to escalated aggression, a continuum between support and control. It further provides a preliminary explanatory framework for understanding the relationship between patient behaviour, staff response and environmental influences on de-escalation success and failure. This framework reveals potentially important behaviour change targets for interventions seeking to reduce violence and use of restrictive practices through enhanced de-escalation techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. A Hitchhiker's Guide to Functional Magnetic Resonance Imaging

    PubMed Central

    Soares, José M.; Magalhães, Ricardo; Moreira, Pedro S.; Sousa, Alexandre; Ganz, Edward; Sampaio, Adriana; Alves, Victor; Marques, Paulo; Sousa, Nuno

    2016-01-01

Functional Magnetic Resonance Imaging (fMRI) studies have become increasingly popular both with clinicians and researchers as they are capable of providing unique insights into brain functions. However, multiple technical considerations (ranging from specifics of paradigm design to imaging artifacts, complex protocol definition, and the multitude of processing and analysis methods, as well as intrinsic methodological limitations) must be considered and addressed in order to optimize fMRI analysis and to arrive at the most accurate and grounded interpretation of the data. In practice, the researcher/clinician must choose, from many available options, the most suitable software tool for each stage of the fMRI analysis pipeline. Herein we provide a straightforward guide designed to address, for each of the major stages, the techniques and tools involved in the process. We have developed this guide both to help those new to the technique overcome the most critical difficulties in its use, and to serve as a resource for the neuroimaging community. PMID:27891073

  3. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally sought in order to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential due to their suitability for in-field use. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  4. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems that give aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
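    For a majority-voted redundant channel set, combinatorial reliability equations of the kind referred to reduce to binomial sums. A toy sketch with assumed, illustrative failure probabilities (the actual F-8 fault tree is far more detailed):

    ```python
    from math import comb

    def k_of_n_failure(p_fail, k, n):
        """Probability that fewer than k of n independent channels survive.

        The system works while at least k channels work, e.g. 2-of-3
        majority voting for a triplex digital flight control computer.
        """
        p_ok = 1.0 - p_fail
        p_system_ok = sum(comb(n, m) * p_ok**m * p_fail**(n - m)
                          for m in range(k, n + 1))
        return 1.0 - p_system_ok

    # Illustrative numbers: triplex computers at 1e-4 per flight each, with
    # an independent analog bypass at 1e-3.
    p_digital = k_of_n_failure(1e-4, 2, 3)   # ~3e-8: primary function lost
    print(p_digital, p_digital * 1e-3)       # second value: aircraft loss
    ```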

  5. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles.

    PubMed

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny

    2017-10-21

Modern molecular-dynamics-based techniques are extremely powerful for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
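    For readers who want to experiment, a generic spectral coarse-graining in the spirit of Perron Cluster Cluster Analysis can be sketched in a few lines: estimate a transition matrix from a discrete state trajectory, then cluster states on its dominant right eigenvectors. This is a simplified stand-in, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.linalg import eig
    from sklearn.cluster import KMeans

    def coarse_grain(traj, n_states, n_clusters):
        """Group states of an integer trajectory into metastable clusters."""
        C = np.zeros((n_states, n_states))
        np.add.at(C, (traj[:-1], traj[1:]), 1.0)            # transition counts
        T = (C + 1e-9) / (C + 1e-9).sum(axis=1, keepdims=True)
        w, v = eig(T)                                        # right eigenvectors
        slow = v[:, np.argsort(-w.real)[:n_clusters]].real   # slow subspace
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(slow)

    # Toy two-well trajectory: states 0-2 and 3-5 interconvert only once.
    rng = np.random.default_rng(0)
    traj = np.concatenate([rng.integers(0, 3, 5000), rng.integers(3, 6, 5000)])
    print(coarse_grain(traj, 6, 2))   # two groups, e.g. [0 0 0 1 1 1]
    ```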

  6. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles

    NASA Astrophysics Data System (ADS)

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F.; Perez, Danny

    2017-10-01

Modern molecular-dynamics-based techniques are extremely powerful for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.

  7. Reliable transformation system for Microbotryum lychnidis-dioicae informed by genome and transcriptome project.

    PubMed

    Toh, Su San; Treves, David S; Barati, Michelle T; Perlin, Michael H

    2016-10-01

    Microbotryum lychnidis-dioicae is a member of a species complex infecting host plants in the Caryophyllaceae. It is used as a model system in many areas of research, but attempts to make this organism tractable for reverse genetic approaches have not been fruitful. Here, we exploited the recently obtained genome sequence and transcriptome analysis to inform our design of constructs for use in Agrobacterium-mediated transformation techniques currently available for other fungi. Reproducible transformation was demonstrated at the genomic, transcriptional and functional levels. Moreover, these initial proof-of-principle experiments provide evidence that supports the findings from initial global transcriptome analysis regarding expression from the respective promoters under different growth conditions of the fungus. The technique thus provides for the first time the ability to stably introduce transgenes and over-express target M. lychnidis-dioicae genes.

  8. MicroV Technology to Improve Transcranial Color Coded Doppler Examinations.

    PubMed

    Malferrari, Giovanni; Pulito, Giuseppe; Pizzini, Attilia Maria; Carraro, Nicola; Meneghetti, Giorgio; Sanzaro, Enzo; Prati, Patrizio; Siniscalchi, Antonio; Monaco, Daniela

    2018-05-04

The purpose of this review is to provide an update on technology related to Transcranial Color Coded Doppler examinations. Microvascularization (MicroV) is an emerging Power Doppler technology that can visualize low and weak blood flows even at great depths, making it a suitable technique for transcranial ultrasound analysis. With MicroV, reconstruction of the vessel shape can be improved without any overestimation. Furthermore, by analyzing the Doppler signal, MicroV allows a global image of the Circle of Willis. Transcranial Doppler was originally developed for the velocimetric analysis of intracranial vessels, in particular for the detection of stenoses and the assessment of collateral circulation. Doppler velocimetric analysis was then compared to other neuroimaging techniques, thus providing cut-off thresholds. Transcranial Color Coded Doppler sonography allowed the characterization of vessel morphology. In both Color Doppler and Power Doppler, the signal overestimated the shape of the intracranial vessels, especially for thin vessels and at large insonation depths. In further neurosonology development efforts, attempts have been made to address morphology issues and overcome technical limitations. The use of contrast agents has helped in this regard by introducing harmonics and subtraction software, which allowed better morphological studies of vessels due to the increased signal-to-noise ratio. Because it is not limited by a learning curve, examination time, or the need for contrast agents, and because of its high signal-to-noise ratio, MicroV has shown great potential to provide the best morphological definition. Copyright © 2018 by the American Society of Neuroimaging.

  9. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data, as a means of data analysis and method sharing. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  10. A comb-sampling method for enhanced mass analysis in linear electrostatic ion traps.

    PubMed

    Greenwood, J B; Kelly, O; Calvert, C R; Duffy, M J; King, R B; Belshaw, L; Graham, L; Alexander, J D; Williams, I D; Bryan, W A; Turcu, I C E; Cacho, C M; Springate, E

    2011-04-01

In this paper an algorithm for extracting spectral information from signals containing a series of narrow periodic impulses is presented. Such signals can typically be acquired by pickup detectors from the image-charge of ion bunches oscillating in a linear electrostatic ion trap, where frequency analysis provides a scheme for high-resolution mass spectrometry. To provide an improved technique for such frequency analysis, we introduce the CHIMERA algorithm (Comb-sampling for High-resolution IMpulse-train frequency ExtRAction). This algorithm utilizes a comb function to generate frequency coefficients, rather than using sinusoids via a Fourier transform, since the comb provides a superior match to the data. This new technique is developed theoretically, applied to synthetic data, and then used to perform high-resolution mass spectrometry on real data from an ion trap. If the ions are generated at a localized point in time and space, and the data are simultaneously acquired with multiple pickup rings, the method is shown to be a significant improvement on Fourier analysis. The mass spectra generated typically have an order of magnitude higher resolution than those obtained from fundamental Fourier frequencies, and are free of large contributions from harmonic frequency components. © 2011 American Institute of Physics.
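    The idea can be illustrated compactly: instead of projecting the signal onto sinusoids, project it onto a train of narrow teeth and scan the tooth spacing. A toy reconstruction under stated assumptions (rectangular teeth, brute-force phase search), not the published CHIMERA code:

    ```python
    import numpy as np

    def comb_score(signal, period, tooth_width=3):
        """Signal magnitude collected under a periodic comb of narrow teeth,
        maximized over the comb's phase."""
        n, best = len(signal), 0.0
        for phase in range(int(period)):
            centers = np.arange(phase, n, period, dtype=int)
            idx = (centers[:, None] + np.arange(tooth_width)).ravel()
            best = max(best, np.abs(signal[idx[idx < n]]).sum())
        return best

    # Synthetic pickup signal: one impulse every 250 samples plus noise.
    rng = np.random.default_rng(1)
    sig = rng.normal(0.0, 0.2, 10000)
    sig[np.arange(5, 10000, 250)] += 5.0

    periods = np.arange(200, 300)
    scores = [comb_score(sig, p) for p in periods]
    print("recovered period:", periods[int(np.argmax(scores))])   # 250
    ```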

  11. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    PubMed

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD) imaging, MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  12. Neutron beam measurement of industrial polymer materials for composition and bulk integrity

    NASA Astrophysics Data System (ADS)

    Rogante, M.; Rosta, L.; Heaton, M. E.

    2013-10-01

Neutron beam techniques, among other non-destructive diagnostics, are particularly valuable in the complete analysis of industrial materials and components, supplying fundamental information. In this paper, nanoscale small-angle neutron scattering analysis and prompt gamma activation analysis for the characterization of industrial polymers are considered. The basic theoretical aspects are briefly introduced and some applications are presented. The investigations of the SU-8 polymer in axial airflow microturbines—i.e. microelectromechanical systems—are presented foremost. Also presented are full and feasibility studies on polyurethanes, composites based on cross-linked polymers reinforced by carbon fibres, and polymer cement concrete. The obtained results have provided a substantial contribution to the improvement of the considered materials, and have confirmed the industrial applicability of the adopted techniques in the analysis of polymers.

  13. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equal to the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additionally, new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
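    As an example of the stereology mentioned above: if a sphere of radius R is cut by a plane whose distance d from the centre is uniform on [0, R], the observed section radius r = sqrt(R^2 - d^2) has density f(r) = r / (R sqrt(R^2 - r^2)), so near-equatorial circles dominate. A quick Monte Carlo check of that closed form:

    ```python
    import numpy as np

    R = 1.0
    rng = np.random.default_rng(2)
    d = rng.uniform(0.0, R, 100_000)       # random plane offsets
    r = np.sqrt(R**2 - d**2)               # observed section radii

    hist, edges = np.histogram(r, bins=20, range=(0.0, R), density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    analytic = mid / (R * np.sqrt(R**2 - mid**2))
    # Close agreement away from the integrable singularity at r = R:
    print(np.abs(hist - analytic)[:15].max())
    ```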

  14. Time-frequency analysis of pediatric murmurs

    NASA Astrophysics Data System (ADS)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, Ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large, and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that, when applied to heart sounds, can provide low-cost screening for pathologic conditions. The short duration and transient nature of these signals require processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.
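    Of the three tools named, the short-time Fourier transform is the simplest to demonstrate; the sketch below localizes a synthetic murmur-like burst in both time and frequency. All signal parameters are invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import stft

    fs = 4000                                            # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    x = np.sin(2 * np.pi * 40 * t)                       # low-frequency rumble
    burst = (t > 0.3) & (t < 0.45)
    x[burst] += 0.5 * np.sin(2 * np.pi * 150 * t[burst]) # murmur-like burst

    f, tt, Z = stft(x, fs=fs, nperseg=256)               # 64 ms windows
    hi = np.abs(Z[f > 100])                              # look above the rumble
    row, col = np.unravel_index(np.argmax(hi), hi.shape)
    print(f"murmur energy near t = {tt[col]:.2f} s")     # within 0.3-0.45 s
    ```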

  15. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    NASA Astrophysics Data System (ADS)

    Madden, Amanda C.

The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material, and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides better detection than either single species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrument alarm thresholds as well as identify the presence of a source, and analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps detect and identify the presence of a source through model comparisons, and helps create background-corrected count spectra for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis outlines the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and describes how these techniques can be used for superior detection of radioactive and fissile materials.
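    A ROC curve of the kind described is computed by sweeping an alarm threshold over detector scores recorded with and without a source present. The sketch below uses toy Poisson counts rather than NSPECT data:

    ```python
    import numpy as np

    def roc_curve(scores_bkg, scores_src):
        """Empirical false-alarm and detection probabilities vs threshold."""
        thresholds = np.sort(np.concatenate([scores_bkg, scores_src]))
        pfa = np.array([(scores_bkg >= th).mean() for th in thresholds])
        pd = np.array([(scores_src >= th).mean() for th in thresholds])
        return pfa, pd

    # Toy counts: Poisson background vs background plus a weak source.
    rng = np.random.default_rng(3)
    pfa, pd = roc_curve(rng.poisson(20, 2000), rng.poisson(26, 2000))

    i = np.argmin(np.abs(pfa - 0.01))        # threshold at 1% false alarms
    print(f"Pd = {pd[i]:.2f} at Pfa = {pfa[i]:.3f}")
    ```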

  16. Microstructural characterization of multiphase chocolate using X-ray microtomography.

    PubMed

    Frisullo, Pierangelo; Licciardello, Fabio; Muratore, Giuseppe; Del Nobile, Matteo Alessandro

    2010-09-01

In this study, X-ray microtomography (μCT) was used for the image analysis of the microstructure of 12 types of Italian aerated chocolate chosen to exhibit variability in terms of cocoa mass content. Appropriate quantitative 3-dimensional parameters describing the microstructure were calculated, for example, the structure thickness (ST), object structure volume ratio (OSVR), and the percentage object volume (POV). Chemical analysis was also performed to correlate the microstructural data to the chemical composition of the samples. Correlation between the μCT parameters acquired for the pore microstructure evaluation and the chemical analysis revealed that the sugar crystal content does not influence the pore structure and content. On the other hand, it revealed that there is a strong correlation between the POV and the sugar content obtained by chemical analysis. The results from this study show that μCT is a suitable technique for the microstructural analysis of confectionery products such as chocolates; not only does it provide an accurate analysis of the pores and microstructure, but the data obtained could also be used to aid in the assessment of product composition and consistency with label specifications. X-ray microtomography (μCT) is a noninvasive and nondestructive 3-D imaging technique that has several advantages over other methods, including the ability to image low-moisture materials. Given the enormous success of μCT in medical applications, material science, chemical engineering, geology, and biology, it is not surprising that in recent years much attention has been focused on extending this imaging technique to food science as a useful technique to aid in the study of food microstructure. X-ray microtomography provides in-depth information on the microstructure of the food product being tested and therefore a better understanding of its physical structure; from an engineering perspective, knowledge about the microstructure of foods can be used to identify the important processing parameters that affect product quality.

  17. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system generated using structured techniques. The requirements definition starts from initially performing a mission analysis to identify the high level control system requirements and functions necessary to satisfy the mission flight. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies and in particular design-for-validation philosophies.

  18. Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing

    PubMed Central

    Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko

    2015-01-01

    The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523

  19. Design of three-dimensional scramjet inlets for hypersonic propulsion

    NASA Technical Reports Server (NTRS)

    Simmons, J. M.; Weidner, E. H.

    1986-01-01

The paper outlines an approach to the design of three-dimensional inlets for scramjet engines. The basis of the techniques used is the method of streamline tracing through an inviscid axisymmetric flow field. A technique is described for making a smooth change of cross-section shape from rectangular to circular. A feature is the considerable use of computer graphics to provide a 'user-oriented' procedure which can produce promising design configurations for subsequent analysis with CFD codes. An example is given to demonstrate the capabilities of the design techniques.
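    Streamline tracing itself is a small numerical kernel: step the position along the local flow direction, here with classical fourth-order Runge-Kutta in arc length. The velocity field below is an invented toy; a real design would evaluate the inviscid axisymmetric solution.

    ```python
    import numpy as np

    def trace_streamline(vel, x0, ds=1e-2, n_steps=2000):
        """Trace a streamline of the steady field `vel(x)` with RK4."""
        def unit(x):
            v = vel(x)
            return v / np.linalg.norm(v)

        pts = [np.asarray(x0, dtype=float)]
        for _ in range(n_steps):
            x = pts[-1]
            k1 = unit(x)
            k2 = unit(x + 0.5 * ds * k1)
            k3 = unit(x + 0.5 * ds * k2)
            k4 = unit(x + ds * k3)
            pts.append(x + (ds / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
        return np.array(pts)

    # Toy axisymmetric field in (x, r): uniform axial flow plus gentle
    # radial inflow, loosely mimicking compression toward the axis.
    field = lambda p: np.array([1.0, -0.1 * p[1]])
    line = trace_streamline(field, [0.0, 1.0])
    print(line[-1])   # moves downstream while drifting toward the axis
    ```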

  20. Nanomanipulation-coupled nanospray mass spectrometry as an approach for single cell analysis

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy; Hamilton, Jason; Verbeck, Guido F.

    2014-12-01

Electrospray mass spectrometry is now a widely used technique for observing the cell content of various biological tissues. However, electrospray techniques (liquid chromatography and direct infusion) often involve lysing a group of cells and extracting the biomolecules of interest, rather than observing local chemistry with a sensitive, individual-cell method. Presented here is an approach combining a nanomanipulator workstation with nanospray mass spectrometry, which allows for extraction of a single cell followed by rapid mass analysis that can provide a detailed metabolic profile. Triacylglycerol content was profiled with this tool coupled to mass spectrometry to investigate heterogeneity between healthy and tumorous tissues, as well as lipid-droplet-containing adipocytes in vitro, as proof of concept. This selective approach provides cellular resolution and complements existing bioanalytical techniques with minimal invasion to samples. In addition, the coupling of nanomanipulation and mass spectrometry holds the potential to be used in a great number of applications for individual organelles, diseased tissues, and in vitro cell cultures, for observing heterogeneity even amongst cells and organelles of the same tissue.

  1. Automated Identification and Shape Analysis of Chorus Elements in the Van Allen Radiation Belts

    NASA Astrophysics Data System (ADS)

    Sen Gupta, Ananya; Kletzing, Craig; Howk, Robin; Kurth, William; Matheny, Morgan

    2017-12-01

An important goal of the Van Allen Probes mission is to understand wave-particle interactions driven by chorus emissions in the terrestrial Van Allen radiation belts. To test models, statistical characterization of chorus properties, such as amplitude variation and sweep rates, is an important scientific goal. The Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrumentation suite provides measurements of wave electric and magnetic fields as well as DC magnetic fields for the Van Allen Probes mission. However, manual inspection across terabytes of EMFISIS data is not feasible and is prone to human confirmation bias. We present signal processing techniques for automated identification, shape analysis, and sweep rate characterization of high-amplitude whistler-mode chorus elements in the Van Allen radiation belts. Specifically, we develop signal processing techniques based on the radon transform that disambiguate chorus elements with a dominant sweep rate from hiss-like chorus. We present representative results validating our techniques and also provide statistical characterization of detected chorus elements across a case study of a 6 s epoch.
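    The radon transform maps a straight ridge in a spectrogram to a sharp peak whose angle encodes the sweep rate, which is what separates swept chorus elements from diffuse hiss. A minimal sketch on a synthetic spectrogram using scikit-image (scales invented; not the authors' pipeline):

    ```python
    import numpy as np
    from skimage.transform import radon

    # Synthetic spectrogram patch: a rising chorus-like ridge over noise.
    img = np.random.default_rng(4).normal(0.0, 0.1, (64, 64))
    for k in range(64):
        img[63 - k, k // 2 + 10] += 2.0        # slanted coherent ridge

    theta = np.linspace(0.0, 180.0, 181)
    sino = radon(img, theta=theta, circle=False)
    angle = theta[np.argmax(sino.max(axis=0))]
    print(f"dominant ridge orientation ~ {angle:.0f} deg")
    # Converting this angle to a sweep rate (Hz/s) requires the
    # spectrogram's Hz-per-row and s-per-column scales.
    ```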

  2. Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.

    2013-05-01

    Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, E.; Engebrecht-Metzger, C.; Horowitz, S.

As Building America (BA) has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, E.; Engebrecht, C. Metzger; Horowitz, S.

    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  5. Geomagnetic field models for satellite angular motion studies

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, M. Yu.; Penkov, V. I.; Roldugin, D. S.; Pichuzhkina, A. V.

    2018-03-01

Four geomagnetic field models are discussed: IGRF and the inclined, direct, and simplified dipole models. Geomagnetic induction vector expressions are provided in different reference frames, and the behavior of the induction vector is compared across models. The models' applicability to the analysis of satellite motion is studied from theoretical and engineering perspectives. Relevant satellite dynamics analysis cases using analytical and numerical techniques are provided; these cases demonstrate the benefit of particular models for specific dynamics studies. Recommendations for model usage are summarized at the end.
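    The direct dipole is compact enough to state in a few lines: the induction vector at position r is B = (B0/r^3) * (3 (m·rhat) rhat - m), with m the unit dipole axis and B0 approximately 8e15 Wb·m for Earth. A minimal sketch of the axis-aligned case (the inclined-dipole model would tilt the axis by roughly 11 degrees and rotate it with Earth):

    ```python
    import numpy as np

    B0 = 8.0e15   # Earth's dipole strength, Wb*m (approximate)

    def dipole_field(r_vec, m_hat=np.array([0.0, 0.0, -1.0])):
        """Direct-dipole geomagnetic induction vector (Tesla) at r_vec (m)."""
        r = np.linalg.norm(r_vec)
        r_hat = r_vec / r
        return (B0 / r**3) * (3.0 * np.dot(m_hat, r_hat) * r_hat - m_hat)

    # Field magnitude at 500 km altitude over the equator:
    r = np.array([6.871e6, 0.0, 0.0])
    print(np.linalg.norm(dipole_field(r)))   # ~2.5e-5 T (0.25 gauss)
    ```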

  6. Neutron imaging data processing using the Mantid framework

    NASA Astrophysics Data System (ADS)

    Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried

    2016-09-01

    Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.

  7. Noise distribution and denoising of current density images

    PubMed Central

    Beheshti, Mohammadali; Foomany, Farbod H.; Magtibay, Karl; Jaffray, David A.; Krishnan, Sridhar; Nanthakumar, Kumaraswamy; Umapathy, Karthikeyan

    2015-01-01

Current density imaging (CDI) is a magnetic resonance (MR) imaging technique that can be used to study current pathways inside tissue. The current distribution is measured indirectly as phase changes. The inherent noise in the MR imaging technique degrades the accuracy of phase measurements, leading to imprecise current variations. The outcome can be affected significantly, especially at a low signal-to-noise ratio (SNR). We have shown that the residual noise distribution of the phase is Gaussian-like and that the noise in CDI images can be approximated as Gaussian. This finding matches experimental results. We further investigated this finding by performing a comparative analysis with denoising techniques, using two CDI datasets with two different currents (20 and 45 mA). We found that the block-matching and 3D filtering (BM3D) technique outperforms the other techniques when applied to the current density (J). The minimum gain in noise power by BM3D applied to J, compared with the next best technique in the analysis, was found to be around 2 dB per pixel. We characterize the noise profile in CDI images and provide insights on the performance of different denoising techniques when applied at two different stages of current density reconstruction. PMID:26158100

  8. Multiple-wavelength neutron holography with pulsed neutrons

    PubMed Central

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-01-01

    Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering—that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique. PMID:28835917

  9. Textual data compression in computational biology: a synopsis.

    PubMed

    Giancarlo, Raffaele; Scaturro, Davide; Utro, Filippo

    2009-07-01

Textual data compression, and the associated techniques coming from information theory, are often perceived as being of interest for data communication and storage. However, they are also deeply related to classification and to data mining and analysis. In recent years, a substantial effort has been made towards the application of textual data compression techniques to various computational biology tasks, ranging from storage and indexing of large datasets to comparison and reverse engineering of biological networks. The main focus of this review is a systematic presentation of the key areas of bioinformatics and computational biology where compression has been used. When possible, a unifying organization of the main ideas and techniques is also provided. It goes without saying that most of the research results reviewed here offer software prototypes to the bioinformatics community. The Supplementary Material provides pointers to software and benchmark datasets for a range of applications of broad interest. In addition to providing references to software, the Supplementary Material also gives a brief presentation of some fundamental results and techniques related to this paper. It is available at: http://www.math.unipa.it/~raffaele/suppMaterial/compReview/
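    A classic example of compression doubling as a similarity measure is the normalized compression distance, which needs nothing more than a general-purpose compressor. A minimal sketch with zlib and invented toy sequences:

    ```python
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized Compression Distance: smaller = more shared structure.

        NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), with C the
        compressed length; an alignment-free similarity for sequences.
        """
        cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
        return (len(zlib.compress(x + y)) - min(cx, cy)) / max(cx, cy)

    a = b"ACGTACGTACGTACGTGGA" * 50
    near = b"ACGTACGTACGTACGTGGC" * 50     # near-identical repeat unit
    far = b"TTGCAGGACCTTAAGTGCA" * 50      # unrelated repeat unit
    print(ncd(a, near), ncd(a, far))       # first value markedly smaller
    ```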

  10. Multiple-wavelength neutron holography with pulsed neutrons.

    PubMed

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-08-01

Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering—that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique.

  11. Corrosion resistance of zirconium oxynitride coatings deposited via DC unbalanced magnetron sputtering and spray pyrolysis-nitriding

    NASA Astrophysics Data System (ADS)

    Cubillos, G. I.; Bethencourt, M.; Olaya, J. J.

    2015-02-01

    ZrOxNy/ZrO2 thin films were deposited on stainless steel using two different methods: ultrasonic spray pyrolysis-nitriding (SPY-N) and the DC unbalanced magnetron sputtering technique (UBMS). Using the first method, ZrO2 was initially deposited and subsequently nitrided in an anhydrous ammonia atmosphere at 1023 K at atmospheric pressure. For UBMS, the film was deposited in an atmosphere of air/argon with a Φair/ΦAr flow ratio of 3.0. Structural analysis was carried out through X-ray diffraction (XRD), and morphological analysis was done through scanning electron microscopy (SEM) and atomic force microscopy (AFM). Chemical analysis was carried out using X-ray photoelectron spectroscopy (XPS). ZrOxNy rhombohedral polycrystalline film was produced with spray pyrolysis-nitriding, whereas using the UBMS technique, the oxynitride films grew with cubic Zr2ON2 crystalline structures preferentially oriented along the (2 2 2) plane. Upon chemical analysis of the surface, the coatings exhibited spectral lines of Zr3d, O1s, and N1s, characteristic of zirconium oxynitride/zirconia. SEM analysis showed the homogeneity of the films, and AFM showed morphological differences according to the deposition technique of the coatings. Zirconium oxynitride films enhanced the stainless steel's resistance to corrosion using both techniques. The protective efficacy was evaluated using electrochemical techniques based on linear polarization (LP). The results indicated that the layers provide good resistance to corrosion when exposed to chloride-containing media.

  12. Constellation Coverage Analysis

    NASA Technical Reports Server (NTRS)

    Lo, Martin W. (Compiler)

    1997-01-01

The design of satellite constellations requires an understanding of the dynamic global coverage provided by the constellations. Even for a small constellation with a simple circular orbit propagator, the combinatorial nature of the analysis frequently renders the problem intractable. Particularly in the initial design phase, where the orbital parameters are still fluid and undetermined, coverage information is crucial for evaluating the performance of the constellation design. We have developed a fast and simple algorithm for dynamically determining global constellation coverage using image processing techniques. This approach provides a powerful yet simple method for the analysis of global constellation coverage.
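    The image-processing view reduces coverage to rasterization: each satellite paints a spherical cap onto a latitude/longitude grid, and constellation coverage is the union of the masks. A minimal sketch with hypothetical sub-satellite points and cap size:

    ```python
    import numpy as np

    def coverage_mask(lat_g, lon_g, sat_lat, sat_lon, half_angle_deg):
        """Grid points within a spherical cap around the sub-satellite point."""
        cos_c = (np.sin(lat_g) * np.sin(sat_lat) +
                 np.cos(lat_g) * np.cos(sat_lat) * np.cos(lon_g - sat_lon))
        return cos_c >= np.cos(np.radians(half_angle_deg))

    lat = np.radians(np.linspace(-90, 90, 181))[:, None]
    lon = np.radians(np.linspace(-180, 180, 361))[None, :]

    sats = [(0.0, 0.0), (0.0, 2.0), (0.8, -1.0)]   # (lat, lon) in radians
    covered = np.zeros((181, 361), dtype=bool)
    for slat, slon in sats:
        covered |= coverage_mask(lat, lon, slat, slon, 25.0)

    w = np.cos(lat) * np.ones_like(lon)            # area weights
    frac = (covered * w).sum() / w.sum()
    print(f"instantaneous global coverage: {frac:.1%}")
    ```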

  13. Molecular Analysis of Date Palm Genetic Diversity Using Random Amplified Polymorphic DNA (RAPD) and Inter-Simple Sequence Repeats (ISSRs).

    PubMed

    El Sharabasy, Sherif F; Soliman, Khaled A

    2017-01-01

The date palm is an ancient domesticated plant with great diversity that has been cultivated in the Middle East and North Africa for at least 5000 years. Date palm cultivars are classified, based on the fruit moisture content, as dry, semidry, and soft dates. There are a number of biochemical and molecular techniques available for characterization of date palm variation. This chapter focuses on the DNA-based marker techniques random amplified polymorphic DNA (RAPD) and inter-simple sequence repeats (ISSR), in addition to biochemical markers based on isozyme analysis. These techniques, coupled with appropriate statistical tools, have proved useful for determining phylogenetic relationships among date palm cultivars and provide information resources for date palm gene banks.

  14. The Translational Role of Diffusion Tensor Image Analysis in Animal Models of Developmental Pathologies

    PubMed Central

    Oguz, Ipek; McMurray, Matthew S.; Styner, Martin; Johns, Josephine M.

    2013-01-01

Diffusion Tensor Magnetic Resonance Imaging (DTI) has proven itself a powerful technique for clinical investigation of the neurobiological targets and mechanisms underlying developmental pathologies. The success of DTI in clinical studies has demonstrated its great potential for understanding translational animal models of clinical disorders, and preclinical animal researchers are beginning to embrace this new technology to study developmental pathologies. In animal models, genetics can be effectively controlled, drugs consistently administered, subject compliance ensured, and image acquisition times dramatically increased to reduce between-subject variability and improve image quality. When pairing these strengths with the many positive attributes of DTI, such as the ability to investigate microstructural brain organization and connectivity, it becomes possible to delve deeper into the study of both normal and abnormal development. The purpose of this review is to provide new preclinical investigators with an introductory source of information about the analysis of data resulting from small animal DTI studies to facilitate the translation of these studies to clinical data. In addition to an in-depth review of translational analysis techniques, we present a number of relevant clinical and animal studies using DTI to investigate developmental insults in order to further illustrate techniques and to highlight where small animal DTI could potentially provide a wealth of translational data to inform clinical researchers. PMID:22627095

  15. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  16. Measuring MERCI: exploring data mining techniques for examining the neurologic outcomes of stroke patients undergoing endo-vascular therapy at Erlanger Southeast Stroke Center.

    PubMed

    McNabb, Matthew; Cao, Yu; Devlin, Thomas; Baxter, Blaise; Thornton, Albert

    2012-01-01

    Mechanical Embolus Removal in Cerebral Ischemia (MERCI) has been supported by medical trials as an improved method of treating ischemic stroke past the safe window of time for administering clot-busting drugs, and was released for medical use in 2004. Analyzing real-world data collected from MERCI clinical trials is key to providing insights on the effectiveness of MERCI. Most of the existing data analysis on MERCI results has thus far employed conventional statistical analysis techniques. To the best of our knowledge, advanced data analytics and data mining techniques have not yet been systematically applied. To address this issue, in this thesis we conduct a comprehensive study on employing state-of-the-art machine learning algorithms to generate prediction criteria for the outcome of MERCI patients. Specifically, we investigate the issue of how to choose the most significant attributes of a data set with limited instance examples. We propose a few search algorithms to identify the significant attributes, followed by a thorough performance analysis for each algorithm. Finally, we apply our proposed approach to the real-world, de-identified patient data provided by Erlanger Southeast Regional Stroke Center, Chattanooga, TN. Our experimental results have demonstrated that our proposed approach performs well.
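
    A minimal sketch of the kind of attribute-search approach the abstract describes (not the authors' actual algorithms): greedy forward selection with cross-validation, one standard way to pick significant attributes when instance examples are limited. The data set below is synthetic and all names are hypothetical.

    ```python
    # Hypothetical stand-in for a small clinical data set:
    # 60 patients, 15 candidate attributes, binary outcome.
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 15))
    y = (X[:, 2] + 0.5 * X[:, 7] + rng.normal(scale=0.5, size=60)) > 0

    # Greedy forward search: add the attribute that most improves CV score.
    clf = LogisticRegression(max_iter=1000)
    selector = SequentialFeatureSelector(clf, n_features_to_select=3,
                                         direction="forward", cv=5)
    selector.fit(X, y)
    chosen = selector.get_support()
    print("selected attributes:", np.flatnonzero(chosen))
    print("CV accuracy:", cross_val_score(clf, X[:, chosen], y, cv=5).mean())
    ```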

  17. Isoquinoline alkaloids and their binding with DNA: calorimetry and thermal analysis applications.

    PubMed

    Bhadra, Kakali; Kumar, Gopinatha Suresh

    2010-11-01

    Alkaloids are a group of natural products with unmatched chemical diversity and biological relevance forming potential quality pools in drug screening. The molecular aspects of their interaction with many cellular macromolecules like DNA, RNA and proteins are currently being investigated in order to evolve the structure-activity relationship. Isoquinolines constitute an important group of alkaloids. They have extensive utility in cancer therapy and a large volume of data is now emerging in the literature on their mode, mechanism and specificity of binding to DNA. Thermodynamic characterization of the binding of these alkaloids to DNA may offer key insights into the molecular aspects that drive complex formation and these data can provide valuable information about the balance of driving forces. Various thermal techniques have been conveniently used for this purpose and modern calorimetric instrumentation provides direct and quick estimation of thermodynamic parameters. Thermal melting studies and calorimetric techniques like isothermal titration calorimetry and differential scanning calorimetry have further advanced the field by providing authentic, reliable and sensitive data on various aspects of temperature-dependent structural analysis of the interaction. In this review we present the application of various thermal techniques, viz. isothermal titration calorimetry, differential scanning calorimetry and optical melting studies, in the characterization of drug-DNA interactions with particular emphasis on isoquinoline alkaloid-DNA interaction.

  18. Precision cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Fendt, William Ashton, Jr.

    2009-09-01

    Experimental efforts of the last few decades have brought a golden age to mankind's endeavor to understand the physical properties of the Universe throughout its history. Recent measurements of the cosmic microwave background (CMB) provide strong confirmation of the standard big bang paradigm, as well as introducing new mysteries as yet unexplained by current physical models. In the following decades, even more ambitious scientific endeavours will begin to shed light on the new physics by looking at the detailed structure of the Universe at both very early and recent times. Modern data have allowed us to begin to test inflationary models of the early Universe, and the near future will bring higher precision data and much stronger tests. Cracking the codes hidden in these cosmological observables is a difficult and computationally intensive problem. The challenges will continue to increase as future experiments bring larger and more precise data sets. Because of the complexity of the problem, we are forced to use approximate techniques and make simplifying assumptions to ease the computational workload. While this has been reasonably sufficient until now, hints of the limitations of our techniques have begun to come to light. For example, the likelihood approximation used for analysis of CMB data from the Wilkinson Microwave Anisotropy Probe (WMAP) satellite was shown to have shortfalls, leading to pre-emptive conclusions drawn about current cosmological theories. It can also be shown that an approximate method used by all current analysis codes to describe the recombination history of the Universe will not be sufficiently accurate for future experiments. With a new CMB satellite scheduled for launch in the coming months, it is vital that we develop techniques to improve the analysis of cosmological data. This work develops a novel technique that both avoids the use of approximate computational codes and allows the application of new, more precise analysis methods. These techniques will help in the understanding of new physics contained in current and future data sets and will benefit the research efforts of the cosmology community. Our idea is to shift the computationally intensive pieces of the parameter estimation framework to a parallel training step. We then provide a machine learning code that uses this training set to learn the relationship between the underlying cosmological parameters and the function we wish to compute. This code is very accurate and simple to evaluate, and it can provide incredible speed-ups of parameter estimation codes. For some applications this provides the convenience of obtaining results faster, while in other cases it allows the use of codes that would be impossible to apply in the brute-force setting. In this thesis we provide several examples where our method allows more accurate computation of functions important for data analysis than is currently possible. As the techniques developed in this work are very general, there are no doubt a wide array of applications both inside and outside of cosmology. We have already seen this interest as other scientists have presented ideas for using our algorithm to improve their computational work, indicating its importance as modern experiments push forward. In fact, our algorithm will play an important role in the parameter analysis of Planck, the next generation CMB space mission.
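
    The core idea, shifting expensive computation into an offline training step and replacing it with a fast learned surrogate, can be illustrated compactly. The sketch below is not the author's code; the two-parameter toy function expensive_observable stands in for a costly Boltzmann-code computation, and all numbers are invented.

    ```python
    # Train a regressor offline on parameter/output pairs, then use it as a
    # cheap, accurate surrogate inside a parameter-estimation loop.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def expensive_observable(theta):
        """Toy stand-in for an expensive cosmological computation."""
        return np.sin(3 * theta[:, 0]) * np.exp(-theta[:, 1] ** 2)

    rng = np.random.default_rng(1)
    theta_train = rng.uniform(-1, 1, size=(2000, 2))   # parallel training step
    y_train = expensive_observable(theta_train)

    emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                            random_state=1)
    emulator.fit(theta_train, y_train)

    theta_new = rng.uniform(-1, 1, size=(3, 2))
    print("emulated:", emulator.predict(theta_new))
    print("exact:   ", expensive_observable(theta_new))
    ```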

  19. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
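
    A minimal sketch of the configuration the study reports as optimal (PCA against feature redundancy, SMOTE against class imbalance, a Random Forest classifier), assuming a hypothetical feature matrix X and endpoint y rather than the study's actual CT radiomic features.

    ```python
    import numpy as np
    from imblearn.over_sampling import SMOTE
    from imblearn.pipeline import Pipeline
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(112, 400))          # 112 patients, 400 radiomic features
    y = rng.binomial(1, 0.25, size=112)      # unbalanced clinical endpoint

    pipe = Pipeline([
        ("pca", PCA(n_components=20)),       # collapse redundant features
        ("smote", SMOTE(random_state=0)),    # oversample the minority class
        ("rf", RandomForestClassifier(n_estimators=500, random_state=0)),
    ])
    print("CV AUC:", cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean())
    ```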

  20. TH-EF-BRC-04: Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yorke, E.

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation, followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  1. TH-EF-BRC-00: TG-100 Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation, followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  2. TH-EF-BRC-02: FMEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation, followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation, followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  4. Comments on "Failures in detecting volcanic ash from a satellite-based technique"

    USGS Publications Warehouse

    Prata, F.; Bluth, G.; Rose, B.; Schneider, D.; Tupper, A.

    2001-01-01

    The recent paper by Simpson et al. [Remote Sens. Environ. 72 (2000) 191.] on failures to detect volcanic ash using the 'reverse' absorption technique provides a timely reminder of the danger that volcanic ash presents to aviation and the urgent need for some form of effective remote detection. The paper unfortunately suffers from a fundamental flaw in its methodology and numerous errors of fact and interpretation. For the moment, the 'reverse' absorption technique provides the best means for discriminating volcanic ash clouds from meteorological clouds. The purpose of our comment is not to defend any particular algorithm; rather, we point out some problems with Simpson et al.'s analysis and re-state the conditions under which the 'reverse' absorption algorithm is likely to succeed. © 2001 Elsevier Science Inc. All rights reserved.
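
    For context, the 'reverse' absorption technique rests on the fact that silicate ash absorbs more strongly near 11 μm than near 12 μm, the reverse of water and ice cloud, so the 11-minus-12 μm brightness temperature difference (BTD) turns negative over ash. A minimal illustration with hypothetical brightness temperatures:

    ```python
    import numpy as np

    t11 = np.array([[265.0, 248.2], [252.7, 270.1]])  # 11 um brightness temps (K)
    t12 = np.array([[263.5, 249.9], [254.0, 269.0]])  # 12 um brightness temps (K)

    btd = t11 - t12
    ash_mask = btd < 0.0   # negative BTD: likely volcanic ash, not weather cloud
    print("BTD:\n", btd)
    print("ash pixels:\n", ash_mask)
    ```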

  5. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Effect size calculation in meta-analyses of psychotherapy outcome research.

    PubMed

    Hoyt, William T; Del Re, A C

    2018-05-01

    Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
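
    A minimal sketch of the SMD computation from group means and SDs, using the standard small-sample (Hedges) correction to Cohen's d to obtain an approximately unbiased estimate; the group statistics below are hypothetical.

    ```python
    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Bias-corrected SMD (treatment minus comparison)."""
        df = n1 + n2 - 2
        s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
        d = (m1 - m2) / s_pooled              # Cohen's d
        j = 1.0 - 3.0 / (4.0 * df - 1.0)      # small-sample correction factor
        return j * d

    # e.g. treatment group vs. wait-list control (hypothetical values)
    print(round(hedges_g(24.1, 6.2, 30, 19.5, 5.8, 32), 3))
    ```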

  7. Component-Level Electronic-Assembly Repair (CLEAR) Spacecraft Circuit Diagnostics by Analog and Complex Signature Analysis

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Wade, Raymond P.; Izadnegahdar, Alain

    2011-01-01

    The Component-Level Electronic-Assembly Repair (CLEAR) project at the NASA Glenn Research Center is aimed at developing technologies that will enable space-flight crews to perform in situ component-level repair of electronics on Moon and Mars outposts, where there is no existing infrastructure for logistics spares. These technologies must provide effective repair capabilities yet meet the payload and operational constraints of space facilities. Effective repair depends on a diagnostic capability that is versatile but easy to use by crew members that have limited training in electronics. CLEAR studied two techniques that involve extensive precharacterization of "known good" circuits to produce graphical signatures that provide an easy-to-use comparison method to quickly identify faulty components. Analog Signature Analysis (ASA) allows relatively rapid diagnostics of complex electronics by technicians with limited experience. Because of frequency limits and the growing dependence on broadband technologies, ASA must be augmented with other capabilities. To meet this challenge while preserving ease of use, CLEAR proposed an alternative called Complex Signature Analysis (CSA). Tests of ASA and CSA were used to compare capabilities and to determine if the techniques provided an overlapping or complementary capability. The results showed that the methods are complementary.

  8. Next-Generation Sequencing in the Mycology Lab.

    PubMed

    Zoll, Jan; Snelders, Eveline; Verweij, Paul E; Melchers, Willem J G

    New state-of-the-art sequencing techniques offer valuable tools both for the detection of mycobiota and for understanding the molecular mechanisms of virulence and of resistance against antifungal compounds. The introduction of new sequencing platforms with enhanced capacity and reduced costs for sequence analysis provides a potentially powerful tool for mycological diagnosis and research. In this review, we summarize the applications of next-generation sequencing techniques in mycology.

  9. NMR of thin layers using a meanderline surface coil

    DOEpatents

    Cowgill, Donald F.

    2001-01-01

    A miniature meanderline sensor coil which extends the capabilities of nuclear magnetic resonance (NMR) to provide analysis of thin planar samples and surface layer geometries. The sensor coil allows standard NMR techniques to be used to examine thin planar (or curved) layers, extending NMR's utility to many problems of modern interest. This technique can be used to examine contact layers, non-destructively depth profile into films, or image multiple layers in a 3-dimensional sense. It lends itself to high-resolution NMR techniques of magic angle spinning and thus can be used to examine the bonding and electronic structure in layered materials or to observe the chemistry associated with aging coatings. Coupling this sensor coil technology with an arrangement of small magnets will produce a penetrator probe for remote in-situ chemical analysis of groundwater or contaminant sediments. Alternatively, the sensor coil can be further miniaturized to provide sub-micron depth resolution within thin films or to orthoscopically examine living tissue. This thin-layer NMR technique using a stationary meanderline coil in a series-resonant circuit has been demonstrated, and it has been determined that the flat meanderline geometry has about the same detection sensitivity as a solenoidal coil but is specifically tailored to examine planar material layers, while avoiding signals from the bulk.

  10. Heterodyne laser spectroscopy system

    DOEpatents

    Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas

    1989-01-01

    A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for purposes of laser isotope separation spectroscopy, vapor diagnostics, processing of precise laser frequency offsets from a reference frequency and the like, and provides spectral analysis of a laser beam.

  11. Lubrication Flows.

    ERIC Educational Resources Information Center

    Papanastasiou, Tasos C.

    1989-01-01

    Discusses fluid mechanics for undergraduates including the differential Navier-Stokes equations, dimensional analysis and simplified dimensionless numbers, control volume principles, the Reynolds lubrication equation for confined and free surface flows, capillary pressure, and simplified perturbation techniques. Provides a vertical dip coating…

  12. Mixed Stationary Liquid Phases for Gas-Liquid Chromatography.

    ERIC Educational Resources Information Center

    Koury, Albert M.; Parcher, Jon F.

    1979-01-01

    Describes a laboratory technique for use in an undergraduate instrumental analysis course that, using the interpretation of window diagrams, prepares a mixed liquid phase column for gas-liquid chromatography. A detailed procedure is provided. (BT)

  13. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
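
    At its simplest, the control-charting component reduces to flagging readings that fall outside limits estimated from in-control data. A rough Python sketch of that idea follows (NDMAS itself is SAS-based, and all numbers here are hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    baseline = rng.normal(1000.0, 5.0, size=200)       # in-control TC readings
    center, sigma = baseline.mean(), baseline.std(ddof=1)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma  # 3-sigma control limits

    new_readings = np.array([1002.1, 998.7, 1021.4, 997.3, 974.9])
    flagged = (new_readings > ucl) | (new_readings < lcl)
    print("limits:", round(lcl, 1), round(ucl, 1))
    print("possible thermocouple failures:", new_readings[flagged])
    ```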

  14. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
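
    To make the time-delayed association idea concrete, the simplified sketch below scans candidate delays and keeps the best-aligned correlation. The real (e)LSA algorithm searches over subintervals with dynamic programming and assesses significance by permutation; this illustration, on hypothetical data, covers only the delay-scanning intuition.

    ```python
    import numpy as np

    def delayed_similarity(x, y, max_delay=3):
        """Best delay of y relative to x, and the correlation achieved there."""
        best = (0, -np.inf)
        for d in range(-max_delay, max_delay + 1):
            if d >= 0:
                r = np.corrcoef(x[:len(x) - d], y[d:])[0, 1]
            else:
                r = np.corrcoef(x[-d:], y[:len(y) + d])[0, 1]
            if r > best[1]:
                best = (d, r)
        return best

    rng = np.random.default_rng(4)
    a = rng.normal(size=50)
    b = np.roll(a, 2) + 0.3 * rng.normal(size=50)   # b lags a by two steps
    print(delayed_similarity(a, b))                 # expect a delay near +2
    ```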

  15. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

    Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa. PMID:22784572

  16. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for supporting drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. For most clinical conditions, many interventions are available on the market and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over recent years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, assumptions, and steps for performing the analysis. PMID:28503228
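
    The simplest building block of such indirect evidence is the Bucher adjusted indirect comparison: two contrasts sharing a common comparator are differenced, and their variances add. A minimal sketch with hypothetical log odds ratios:

    ```python
    import math

    d_ab, se_ab = -0.40, 0.12   # log OR, B vs A (from A-B trials)
    d_ac, se_ac = -0.15, 0.10   # log OR, C vs A (from A-C trials)

    d_bc = d_ac - d_ab                       # indirect contrast, C vs B
    se_bc = math.sqrt(se_ab**2 + se_ac**2)   # variances of the two parts add
    lo, hi = d_bc - 1.96 * se_bc, d_bc + 1.96 * se_bc
    print(f"OR(C vs B) = {math.exp(d_bc):.2f} "
          f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
    ```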

  17. Biomechanical stability of intramedullary technique for fixation of joint depressed calcaneus fracture.

    PubMed

    Nelson, Joshua D; McIff, Terence E; Moodie, Patrick G; Iverson, Jamey L; Horton, Greg A

    2010-03-01

    Internal fixation of the os calcis is often complicated by prolonged soft tissue management and posterior facet disruption. An ideal calcaneal construct would include minimal hardware prominence, sturdy posterior facet fixation and nominal soft tissue disruption. The purpose of this study was to develop such a construct and provide a biomechanical analysis comparing our technique to a standard internal fixation technique. Twenty fresh-frozen cadaver calcanei were used to create a reproducible Sanders type-IIB calcaneal fracture pattern. One calcaneus of each pair was randomly selected to be fixed using our compressive headless screw technique. The contralateral matched calcaneus was fixed with a nonlocking calcaneal plate in a traditional fashion. Each calcaneus was cyclically loaded at a frequency of 1 Hz for 4000 cycles using an increasing force from 250 N to 1000 N. An Optotrak motion capturing system was used to detect relative motion of the three fracture fragments at eight different points along the fracture lines. Horizontal separation and vertical displacement at the fracture lines was recorded, as well as relative rotation at the primary fracture line. When the data were averaged, there was more horizontal displacement at the primary fracture line of the plate and screw construct compared to the headless screw construct. The headless screw construct also had less vertical displacement at the primary fracture line at every load. On average those fractures fixed with the headless screw technique had less rotation than those fixed with the side plate technique. A new headless screw technique for calcaneus fracture fixation was shown to provide stability as good as, or better than, a standard side plating technique under the axial loading conditions of our model. Although further testing is needed, the stability of the proposed technique is similar to that typically provided by intramedullary fixation. This fixation technique provides a biomechanically stable construct with the potential for a minimally invasive approach and improved post-operative soft tissue healing.

  18. Characterisation of the PXIE Allison-type emittance scanner

    DOE PAGES

    D'Arcy, R.; Alvarez, M.; Gaynier, J.; ...

    2016-01-26

    An Allison-type emittance scanner has been designed for PXIE at FNAL with the goal of providing fast and accurate phase space reconstruction. The device has been modified from previous LBNL/SNS designs to operate in both pulsed and DC modes with the addition of water-cooled front slits. Extensive calibration techniques and error analysis allowed confinement of uncertainty to the <5% level (with known caveats). With a 16-bit, 1 MHz electronics scheme the device is able to analyse a pulse with a resolution of 1 μs, allowing for analysis of neutralisation effects. As a result, this paper describes a detailed breakdown of the R&D, as well as post-run analysis techniques.

  19. A Fault Tree Approach to Analysis of Behavioral Systems: An Overview.

    ERIC Educational Resources Information Center

    Stephens, Kent G.

    Developed at Brigham Young University, Fault Tree Analysis (FTA) is a technique for enhancing the probability of success in any system by analyzing the most likely modes of failure that could occur. It provides a logical, step-by-step description of possible failure events within a system and their interaction--the combinations of potential…

  20. Analysis of thrips distribution: application of spatial statistics and Kriging

    Treesearch

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
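
    A minimal sketch of kriging-style spatial prediction, here via Gaussian process regression (closely related to ordinary kriging); the field coordinates and thrips counts below are hypothetical.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    coords = np.array([[0, 0], [0, 10], [10, 0], [10, 10], [5, 5]], float)
    counts = np.array([12.0, 7.0, 15.0, 9.0, 11.0])   # thrips per soil sample

    kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(coords, counts)

    grid = np.array([[2.5, 2.5], [7.5, 7.5]])         # unsampled locations
    pred, sd = gp.predict(grid, return_std=True)      # prediction + uncertainty
    print(pred, sd)
    ```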
