Sample records for development analysis method

  1. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  2. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    ERIC Educational Resources Information Center

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  3. Global/local methods research using the CSM testbed

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.

    1990-01-01

    Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  4. What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context.

    PubMed

    Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Real Bird, Sloane; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen

    2017-07-01

    Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis.

  5. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  6. What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context

    PubMed Central

    Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Bird, Sloane Real; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen

    2017-01-01

    Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis. PMID:27659019

  7. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
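    The MC side of the comparison can be sketched with a toy capacity-sizing problem. Everything below (the NPV model, the cost and price constants, the reserve distribution) is invented for illustration and is not taken from the paper:

```python
import random

random.seed(42)

# Hypothetical toy model (not from the paper): choose a gas-plant capacity
# that maximizes expected NPV when the recoverable volume is uncertain.
CAPEX_PER_UNIT = 2.0   # assumed cost per unit of installed capacity
PRICE = 5.0            # assumed revenue per unit actually produced

def npv(capacity, reserve):
    """Produce min(capacity, reserve); pay capex on the full capacity."""
    return PRICE * min(capacity, reserve) - CAPEX_PER_UNIT * capacity

def expected_npv(capacity, n_samples=20_000):
    # Monte Carlo: sample the uncertain reserve and average the outcome.
    total = 0.0
    for _ in range(n_samples):
        reserve = random.gauss(10.0, 2.0)   # assumed reserve distribution
        total += npv(capacity, max(reserve, 0.0))
    return total / n_samples

# Scan candidate designs and keep the one with the best expected value.
candidates = [c / 2 for c in range(10, 31)]   # capacities 5.0 .. 15.0
best = max(candidates, key=expected_npv)
print(f"MC-optimal capacity ~ {best}")
```

    A stochastic-programming formulation of the same toy problem would instead optimize over a fixed scenario set inside one mathematical program, which is why it typically reaches the optimum with less computation but yields less distributional information.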

  8. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  9. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  10. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; a wing- and fuselage-fitted curvilinear grid; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse method as an extension of previous methods for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in supercritical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  11. Computational Methods for Structural Mechanics and Dynamics, part 1

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)

    1989-01-01

    The structural analysis methods research has several goals. One goal is to develop analysis methods that are general. This goal of generality leads naturally to finite-element methods, but the research will also include other structural analysis methods. Another goal is that the methods be amenable to error analysis; that is, given a physical problem and a mathematical model of that problem, an analyst would like to know the probable error in predicting a given response quantity. The ultimate objective is to specify the error tolerances and to use automated logic to adjust the mathematical model or solution strategy to obtain that accuracy. A third goal is to develop structural analysis methods that can exploit parallel processing computers. The structural analysis methods research will focus initially on three types of problems: local/global nonlinear stress analysis, nonlinear transient dynamics, and tire modeling.

  12. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    PubMed

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  13. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  14. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes recent developments in rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, and portable gas chromatographs. Additionally, applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references for establishing selective, precise and quantitative rapid detection methods in food security analysis.

  15. A method for determining spiral-bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face-milled spiral bevel gears. The method uses the basic gear design parameters in conjunction with the kinematical aspects of spiral bevel gear manufacturing machinery. A computer program, SURFACE, was developed. The computer program calculates the surface coordinates and outputs 3-D model data that can be used for finite element analysis. Development of the modeling method and an example case are presented. This analysis method could also find application for gear inspection and near-net-shape gear forging die design.

  16. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.
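    The uncoupled ("zooming") variant of global/local analysis can be illustrated with a one-dimensional bar: a coarse global solve provides displacement boundary conditions that drive a refined local model. The bar problem, meshes, and solver below are an invented minimal sketch, not the NASA testbed code:

```python
def assemble(n_elem, h, ea=1.0):
    """Stiffness matrix of a uniform bar with linear two-node elements."""
    n = n_elem + 1
    K = [[0.0] * n for _ in range(n)]
    k = ea / h
    for e in range(n_elem):
        K[e][e] += k;     K[e][e + 1] -= k
        K[e + 1][e] -= k; K[e + 1][e + 1] += k
    return K

def solve(K, f, prescribed):
    """Solve K u = f with prescribed displacements {dof: value}."""
    n = len(f)
    K = [row[:] for row in K]; f = f[:]
    for dof, val in prescribed.items():
        for i in range(n):
            if i != dof:
                f[i] -= K[i][dof] * val
                K[i][dof] = 0.0
                K[dof][i] = 0.0
        K[dof][dof] = 1.0
        f[dof] = val
    for i in range(n):                       # naive Gaussian elimination
        for j in range(i + 1, n):
            m = K[j][i] / K[i][i]
            for c in range(i, n):
                K[j][c] -= m * K[i][c]
            f[j] -= m * f[i]
    u = [0.0] * n
    for i in reversed(range(n)):             # back-substitution
        u[i] = (f[i] - sum(K[i][c] * u[c] for c in range(i + 1, n))) / K[i][i]
    return u

# Global model: unit bar, fixed at x = 0, unit end load, 4 coarse elements.
f_glob = [0.0] * 5; f_glob[4] = 1.0
u_glob = solve(assemble(4, 0.25), f_glob, {0: 0.0})

# Local model: region [0.25, 0.75] refined into 8 elements, driven by
# boundary displacements taken from the global solution.
u_loc = solve(assemble(8, 0.0625), [0.0] * 9, {0: u_glob[1], 8: u_glob[3]})
print(u_glob[2], u_loc[4])
```

    Incompatible global and local meshes, as in the paper, would add an interpolation step when transferring the boundary displacements; here the coarse nodes happen to coincide with the local boundary.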

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a completed work.

  18. Applied Cognitive Task Analysis (ACTA) Methodology

    DTIC Science & Technology

    1997-11-01

    experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need... We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates... developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the

  19. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

    This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce of C_l = -1.377 in comparison to C_l = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag of C_d = 0.115 in comparison to C_d = 0.143 for the original wing.
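    The iterative coupling scheme can be illustrated in miniature with a single-degree-of-freedom analogue: a torsionally flexible wing mount whose twist changes the aerodynamic load, iterated with under-relaxation until load and deflection are mutually consistent. All constants and the linear aero/structure models below are invented for illustration; the thesis couples full RANS and finite element solvers:

```python
# Toy static aeroelastic fixed-point loop (illustrative only).
Q = 1.2        # assumed dynamic pressure * reference area
A = 5.0        # assumed lift-curve slope
E = 0.1        # assumed moment arm of lift about the spring axis
K = 4.0        # assumed torsional stiffness of the mount
ALPHA0 = 0.05  # assumed rigid incidence (rad)

def lift(theta):
    """'Aerodynamic solver': load grows with elastic twist."""
    return Q * A * (ALPHA0 + theta)

def twist(load):
    """'Structural solver': twist of a linear torsional spring."""
    return load * E / K

theta, relax = 0.0, 0.5
for it in range(200):
    # Under-relaxed update, as in partitioned aeroelastic iteration.
    theta_new = (1 - relax) * theta + relax * twist(lift(theta))
    if abs(theta_new - theta) < 1e-12:
        theta = theta_new
        break
    theta = theta_new

# Closed-form fixed point of the same linear model, for comparison.
theta_exact = Q * A * E * ALPHA0 / (K - Q * A * E)
print(it, theta, theta_exact)
```

    The loop converges because the aeroelastic "gain" Q*A*E/K is below one; static divergence of the real wing corresponds to that gain approaching unity.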

  20. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
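    A frequency-domain check of this kind can be sketched as follows: excite a (here simulated) thrust signal with a sinusoid, then correlate the response against sine and cosine at the test frequency to extract gain and phase lag. The first-order lag model and all constants are assumptions for illustration, not the report's instrumentation:

```python
import math

F = 0.5             # assumed test frequency, Hz
TAU = 0.2           # assumed first-order model/instrumentation lag, s
DT, T_END = 0.01, 20.0

n = int(T_END / DT)
t = [i * DT for i in range(n)]
cmd = [math.sin(2 * math.pi * F * ti) for ti in t]

# Simulate the responding signal with a first-order lag (Euler integration).
resp, y = [], 0.0
for c in cmd:
    y += DT * (c - y) / TAU
    resp.append(y)

# Correlate against sin/cos to get the Fourier coefficients at F.
a = 2 / n * sum(r * math.sin(2 * math.pi * F * ti) for r, ti in zip(resp, t))
b = 2 / n * sum(r * math.cos(2 * math.pi * F * ti) for r, ti in zip(resp, t))
gain = math.hypot(a, b)
phase_deg = math.degrees(math.atan2(b, a))
# Should land near the first-order theory: gain 1/sqrt(1+(2*pi*F*TAU)^2),
# phase -atan(2*pi*F*TAU), i.e. roughly 0.85 and -32 degrees here.
print(f"gain {gain:.3f}, phase {phase_deg:.1f} deg")
```

    Repeating this at each frequency of a throttle sweep yields the gain/phase curves that expose model error and instrumentation response limits.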

  1. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  2. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
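    The paired t-test used to compare the two quantification methods can be reproduced in a few lines. The concentration values below are fabricated purely to illustrate the calculation; the critical value 2.365 is the standard two-sided 5% point for 7 degrees of freedom:

```python
from statistics import mean, stdev

# Hypothetical gamma-oryzanol results (mg/g) for the same 8 oil samples
# measured by the two methods; values are made up for illustration.
densitometric = [2.91, 3.05, 2.87, 3.10, 2.99, 3.02, 2.95, 3.08]
image_based   = [2.94, 3.01, 2.90, 3.12, 2.96, 3.04, 2.93, 3.05]

# Paired t-test: work with the per-sample differences.
diffs = [a - b for a, b in zip(densitometric, image_based)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / n ** 0.5)

T_CRIT = 2.365   # two-sided 5% critical value, df = n - 1 = 7
significant = abs(t_stat) > T_CRIT
print(f"t = {t_stat:.3f}, significant difference: {significant}")
```

    With paired data, testing the differences removes the sample-to-sample variation that would otherwise mask a small systematic offset between the methods.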

  3. Batch mode grid generation: An endangered species

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    1992-01-01

    Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.

  4. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method.

  5. Steroid hormones in environmental matrices: extraction method comparison.

    PubMed

    Andaluri, Gangadhar; Suri, Rominder P S; Graham, Kendon

    2017-11-09

    The U.S. Environmental Protection Agency (EPA) has developed methods for the analysis of steroid hormones in water, soil, sediment, and municipal biosolids by HRGC/HRMS (EPA Method 1698). Following the guidelines provided in US-EPA Method 1698, the extraction methods were validated with reagent water and applied to municipal wastewater, surface water, and municipal biosolids using GC/MS/MS for the analysis of the nine most commonly detected steroid hormones. This is the first reported comparison of the separatory funnel extraction (SFE), continuous liquid-liquid extraction (CLLE), and Soxhlet extraction methods developed by the U.S. EPA. Furthermore, a solid phase extraction (SPE) method was also developed in-house for the extraction of steroid hormones from aquatic environmental samples. This study provides valuable information regarding the robustness of the different extraction methods. Statistical analysis of the data showed that SPE-based methods provided better recovery efficiencies and lower variability for the steroid hormones, followed by SFE. The analytical methods developed in-house for extraction of biosolids showed a wide recovery range; however, the variability was low (≤ 7% RSD). Soxhlet extraction and CLLE are lengthy procedures and have been shown to provide highly variable recovery efficiencies. The results of this study provide guidance for better sample preparation strategies in analytical methods for steroid hormone analysis, and SPE broadens the choice of methods for environmental sample analysis.
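    Recovery efficiency and its variability (%RSD) of the kind reported above can be computed as follows; the spike-recovery percentages are invented to mimic the qualitative pattern reported (SPE best and tightest, Soxhlet most variable):

```python
from statistics import mean, stdev

# Made-up spike-recovery data (%) for one hormone under three
# hypothetical extraction methods (not the study's measurements).
recoveries = {
    "SPE":     [96.2, 94.8, 97.1, 95.5, 96.0],
    "SFE":     [88.4, 91.2, 86.9, 90.1, 89.3],
    "Soxhlet": [72.5, 95.8, 60.3, 88.1, 79.4],
}

stats = {}
for method, vals in recoveries.items():
    m = mean(vals)
    rsd = 100 * stdev(vals) / m   # relative standard deviation, %
    stats[method] = (m, rsd)
    print(f"{method:8s} mean recovery {m:5.1f}%  RSD {rsd:4.1f}%")
```

    Mean recovery measures accuracy of the extraction; %RSD measures its repeatability, which is why the two are reported together when ranking methods.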

  6. Computer Graphics-aided systems analysis: application to well completion design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.E.; Sarma, M.P.

    1985-03-01

    The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related with oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of "Systems Analysis" with the techniques of "Computer Graphics". The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis problems. The method can be extended to other design and analysis aspects of oil and gas wells.

  7. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  8. Coupled Electro-Magneto-Mechanical-Acoustic Analysis Method Developed by Using 2D Finite Element Method for Flat Panel Speaker Driven by Magnetostrictive-Material-Based Actuator

    NASA Astrophysics Data System (ADS)

    Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou

    In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by a flat panel speaker driven by a GMM actuator depends on the vibration of the flat panel, which results from the magnetostriction of the GMM. In this case, to predict the sound pressure level (SPL) in the audio-frequency range, it is necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method is presented; this method was developed by using the finite element method (FEM). This analysis method is used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with measurements of a prototype speaker.

  9. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  10. Nuclear Forensics Applications of Principal Component Analysis on Micro X-ray Fluorescence Images

    DTIC Science & Technology

    analysis on quantified micro X-ray fluorescence intensity values. This method is then applied to address goals of nuclear forensics. The first... researchers in the development and validation of nuclear forensics methods. A method for determining material homogeneity is developed and demonstrated

  11. A Relational Metric, Its Application to Domain Analysis, and an Example Analysis and Model of a Remote Sensing Domain

    DOT National Transportation Integrated Search

    1995-07-01

    An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, ...

  12. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    PubMed

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  13. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  14. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances that play a crucial role in controlling plant development, growth, and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used approach for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, including acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extraction materials, and derivatization reagents for sample preparation in phytohormone analysis; in particular, some related work by our group is included. Finally, future developments in this field are discussed.

  15. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  16. Rapid Radiochemical Method for Radium-226 in Building ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Radium-226 in building materials Method Selected for: SAM lists this method for qualitative analysis of radium-226 in concrete or brick building materials. Summary of subject analytical method, which will be posted to the SAM website to allow access to the method.

  17. Rapid Radiochemical Method for Americium-241 in Building ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241 in building materials Method Selected for: SAM lists this method for qualitative analysis of americium-241 in concrete or brick building materials. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.

  18. A Method for Cognitive Task Analysis

    DTIC Science & Technology

    1992-07-01

    A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the ... model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis, Knowledge, Strategies.

  19. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  20. I. Developing Methods for the Analysis of Chemistry Students' Inscriptions, II. Exploring the Regioselectivity of 1,3-Dipolar Cycloadditions of Munchnones, III. Stereochemical Investigations of C-H Activation Reactions Involving Germylene and Stannylene/Aryl Iodide Reagents

    ERIC Educational Resources Information Center

    Kiste, Alan L.

    2009-01-01

    I. Analyzing and comparing student-generated inscriptions in chemistry is crucial to gaining insight into students' understanding about chemistry concepts. Thus, we developed two methods of analyzing student-generated inscriptions: features analysis and thematic analysis. We have also demonstrated how these methods are able to discern differences…

  1. HEATED PURGE AND TRAP METHOD DEVELOPMENT AND TESTING

    EPA Science Inventory

    The goal of the research was to develop a heated purge and trap method that could be used in conjunction with SW-846 method 8240 for the analysis of volatile, water soluble Appendix VIII analytes. The developed method was validated according to a partial single laboratory method ...

  2. Asymptotic approximation method of force reconstruction: Application and analysis of stationary random forces

    NASA Astrophysics Data System (ADS)

    Sanchez, J.

    2018-06-01

    In this paper, the asymptotic approximation method of force reconstruction is applied to a single degree-of-freedom system and analyzed. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.

  3. Summary of Technical Operations, 1991

    DTIC Science & Technology

    1992-01-01

    exploit commonality. The project is using the Feature-Oriented Domain Analysis (FODA) method, developed by the project in 1990, to perform this ... the development of new movement control software. The analysis will also serve as a means of improving the FODA method. The results of this analysis ... STARS environment. The NASA Program Office has officially decided to expand the use of Rate Monotonic Analysis (RMA), which was originally isolated to

  4. Forestry sector analysis for developing countries: issues and methods.

    Treesearch

    R.W. Haynes

    1993-01-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...

  5. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  6. Heuristic Analysis Model of Nitrided Layers’ Formation Consisting of the Image Processing and Analysis and Elements of Artificial Intelligence

    PubMed Central

    Wójcicki, Tomasz; Nowicki, Michał

    2016-01-01

    The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparing nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving image quality, segmentation, morphological transformations, and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. A validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities for its practical use, are listed. PMID:28773389

  7. Rapid Radiochemical Method for Total Radiostrontium (Sr-90) ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Beta counting Method Developed for: Strontium-89 and strontium-90 in building materials Method Selected for: SAM lists this method for qualitative analysis of strontium-89 and strontium-90 in concrete or brick building materials. Summary of subject analytical method, which will be posted to the SAM website to allow access to the method.

  8. Development of the mathematical model for design and verification of acoustic modal analysis methods

    NASA Astrophysics Data System (ADS)

    Siner, Alexander; Startseva, Maria

    2016-10-01

    To reduce turbofan noise, it is necessary to develop methods for analyzing the sound field generated by the blade machinery; such methods are called modal analysis. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model is presented that allows single modes to be set in the channel and the generated sound field to be analyzed. Modal analysis of the sound generated by a ring array of point sound sources is performed. A comparison of experimental and numerical modal analysis results is also presented.
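
    The kind of modal analysis described, recovering azimuthal modes from a ring of measurement points, can be sketched as a DFT over a circular array. The array size and mode number below are assumptions for illustration, not the paper's setup.

```python
import math
import cmath

M = 16          # microphones evenly spaced on a ring (assumed)
m0 = 3          # azimuthal mode excited by the source ring (assumed)

# Complex pressure at each microphone for a single spinning mode
# p(phi) = A * exp(i * m0 * phi), with unit amplitude A = 1.
pressures = [cmath.exp(1j * m0 * 2 * math.pi * k / M) for k in range(M)]

def mode_amplitudes(p):
    """Modal analysis step: DFT over the microphone angles gives the
    amplitude of each azimuthal mode m (resolvable for |m| < M/2)."""
    n = len(p)
    return [sum(p[k] * cmath.exp(-1j * m * 2 * math.pi * k / n)
                for k in range(n)) / n
            for m in range(n)]

amps = mode_amplitudes(pressures)
dominant = max(range(M), key=lambda m: abs(amps[m]))
# By DFT orthogonality, only mode m0 has nonzero amplitude here.
```

    Setting a single mode in the "channel" and checking that the analysis recovers exactly that mode is the cheap numerical test the abstract argues for, as opposed to full-scale measurement.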

  9. How to determine spiral bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face milled spiral bevel gears. The method combines the basic gear design parameters with the kinematical aspects for spiral bevel gear manufacturing. A computer program was developed to calculate the surface coordinates. From this data a 3-D model for finite element analysis can be determined. Development of the modeling method and an example case are presented.

  10. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  11. Learning challenges and sustainable development: A methodological perspective.

    PubMed

    Seppänen, Laura

    2017-01-01

    Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in everyday work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method, linking local and subjective need expressions with general historical analysis, is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid overly simplistic presumptions about sustainable development.

  12. Method Development for the Analysis of 1,4-Dioxane in Drinking Water Using Solid Phase Extraction and Gas Chromatography/Mass Spectrometry

    EPA Science Inventory

    1,4-Dioxane has been identified as a probable human carcinogen and an emerging contaminant in drinking water. The National Exposure Research Laboratory (NERL) has developed a method for the analysis of 1,4-dioxane in drinking water at ng/L concentrations. The method consists of...

  13. Laser-based methods for the analysis of low molecular weight compounds in biological matrices.

    PubMed

    Kiss, András; Hopfgartner, Gérard

    2016-07-15

    Laser-based desorption and/or ionization methods play an important role in the analysis of low-molecular-weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements: in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization, laser diode thermal desorption, and atmospheric pressure chemical ionization; and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). The combination of ion mobility separation with laser-based ionization methods is also gaining popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic approaches to quantification. This review presents these new developments in laser-based methods for the MS analysis of low-molecular-weight compounds, along with several potential applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Methods for assessing the stability of slopes during earthquakes-A retrospective

    USGS Publications Warehouse

    Jibson, R.W.

    2011-01-01

    During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
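
    A minimal sketch of Newmark's sliding-block idea, assuming a rigid block, one-directional (downslope) sliding, and a synthetic acceleration pulse rather than a recorded ground motion:

```python
import math

def newmark_displacement(accel, dt, a_crit):
    """Rigid sliding-block (Newmark) displacement for a ground
    acceleration history `accel` (m/s^2) sampled every `dt` seconds,
    given a critical (yield) acceleration `a_crit` (m/s^2).

    The block accelerates relative to the slope whenever ground
    acceleration exceeds a_crit; the relative velocity is integrated
    until it returns to zero, accumulating permanent displacement.
    """
    v = 0.0   # relative velocity of the block (m/s)
    d = 0.0   # accumulated downslope displacement (m)
    for a in accel:
        if a > a_crit or v > 0.0:
            v += (a - a_crit) * dt
            v = max(v, 0.0)      # one-directional: no upslope sliding
            d += v * dt
    return d

# Synthetic single-cycle sine pulse: 0.5 g peak, 1 s period (assumed).
g = 9.81
dt = 0.001
accel = [0.5 * g * math.sin(2 * math.pi * i * dt) for i in range(1000)]
d = newmark_displacement(accel, dt, a_crit=0.1 * g)
# d > 0: permanent displacement accumulates only while the shaking
# exceeds the 0.1 g yield acceleration, which is why the result is an
# index of performance rather than a full deformation analysis.
```
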

  15. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
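
    The variance-decomposition idea underlying this method can be illustrated with a pick-freeze Monte Carlo estimator of first-order Sobol' indices. The toy additive model below is an assumption chosen so the indices are known analytically; it is not the Hanford flow-and-transport model.

```python
import random

def first_order_sobol(model, d, n, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol' indices
    for a model with d independent U(0,1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(d)] for _ in range(n)]
    B = [[rng.random() for _ in range(d)] for _ in range(n)]
    yA = [model(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(d):
        # A_B^i: column i taken from A, all other columns from B, so
        # each pair (yA, yABi) shares only input X_i.
        yABi = [model([A[k][i] if j == i else B[k][j] for j in range(d)])
                for k in range(n)]
        # Estimates V_i = Var(E[Y | X_i]) = E[yA * yABi] - mean^2.
        vi = sum(yA[k] * (yABi[k] - mean) for k in range(n)) / n
        indices.append(vi / var)
    return indices

# Toy additive model Y = X1 + 2*X2: Var(Xi) = 1/12, Var(Y) = 5/12,
# so the exact indices are S1 = 1/5 and S2 = 4/5.
S = first_order_sobol(lambda x: x[0] + 2 * x[1], d=2, n=20000)
```

    Grouping correlated inputs into layers, as the abstract describes, reduces the dimension d of exactly this kind of estimator and hence the number of model evaluations required.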

  16. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.

  17. Systematic Engine Uprate Technology Development and Deployment for Pipeline Compressor Engines through Increased Torque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis Schmitt; Daniel Olsen

    2005-09-30

    Three methods were utilized to analyze key components of slow-speed, large-bore, natural gas integral engines: computational fluid dynamics (CFD), dynamic modal analysis using finite element analysis (FEA), and a stress analysis method also using FEA. The CFD analysis focuses primarily on fuel mixing in the combustion chamber of a TLA engine. Results indicate a significant increase in the homogeneity of the air and fuel using high-pressure fuel injection (HPFI) instead of a standard low-pressure mechanical gas admission valve (MGAV). A modal analysis of three engine crankshafts (TLA-6, HBA-6, and GMV-10) is developed and presented. Results indicate that each crankshaft has a natural frequency and corresponding speed that is well away from the typical engine operating speed. A frame stress analysis method is also developed and presented. Two different crankcases are examined. A TLA-6 crankcase is modeled and a stress analysis is performed. The method of dynamic load determination, the model setup, and the results from the stress analysis are discussed. Preliminary results indicate a 10%-15% maximum increase in frame stress due to a 20% increase in HP; however, the high stress regions were localized. A new hydraulically actuated mechanical fuel valve is also developed and presented. This valve provides high-energy (supersonic) fuel injection comparable to a HPFI system at 1/5th of the natural gas fuel pressure. The valve was developed in cooperation with the Dresser-Rand Corporation.

  18. Dynamic analysis of nonlinear rotor-housing systems

    NASA Technical Reports Server (NTRS)

    Noah, Sherif T.

    1988-01-01

    Nonlinear analysis methods are developed which will enable the reliable prediction of the dynamic behavior of the space shuttle main engine (SSME) turbopumps in the presence of bearing clearances and other local nonlinearities. A computationally efficient convolution method, based on discretized Duhamel and transition matrix integral formulations, is developed for the transient analysis. In the formulation, the coupling forces due to the nonlinearities are treated as external forces acting on the coupled subsystems. Iteration is utilized to determine their magnitudes at each time increment. The method is applied to a nonlinear generic model of the high pressure oxygen turbopump (HPOTP). Compared to fourth-order Runge-Kutta numerical integration, the convolution approach proved to be more accurate and more efficient. For determining the nonlinear, steady-state periodic responses, an incremental harmonic balance method was also developed. The method was successfully used to determine dominantly harmonic and subharmonic responses of the HPOTP generic model with bearing clearances. A reduction method similar to the impedance formulation utilized with linear systems is used to reduce the housing-rotor models to their coordinates at the bearing clearances. Recommendations are included for further development of the method, for extending the analysis to aperiodic and chaotic regimes, and for conducting critical parametric studies of the nonlinear response of the current SSME turbopumps.
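
    The discretized Duhamel (convolution) formulation can be sketched for a single linear oscillator. This is a generic illustration of the linear-subsystem step only, omitting the paper's iteration on nonlinear coupling forces; all parameter values are assumed.

```python
import math

def duhamel_response(force, dt, m, k, zeta):
    """Displacement response of a damped SDOF oscillator via a
    discretized Duhamel (convolution) integral, zero initial conditions.

    x(t_i) = sum_j h(t_i - t_j) * F(t_j) * dt, with unit-impulse
    response h(t) = exp(-zeta*wn*t) * sin(wd*t) / (m*wd).
    """
    wn = math.sqrt(k / m)                   # natural frequency
    wd = wn * math.sqrt(1.0 - zeta ** 2)    # damped natural frequency
    n = len(force)
    h = [math.exp(-zeta * wn * i * dt) * math.sin(wd * i * dt) / (m * wd)
         for i in range(n)]
    return [sum(h[i - j] * force[j] for j in range(i + 1)) * dt
            for i in range(n)]

# Step-force check: after the transient decays, the response should
# settle at the static deflection F0 / k (parameters illustrative).
m, k, zeta, F0, dt = 1.0, 100.0, 0.1, 50.0, 0.005
x = duhamel_response([F0] * 1000, dt, m, k, zeta)
```

    Because the impulse response is evaluated exactly at each step, the discretization error comes only from the quadrature of the convolution sum, which is one reason such formulations can outperform step-by-step Runge-Kutta integration for linear subsystems.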

  19. Multifidelity Analysis and Optimization for Supersonic Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory

    2010-01-01

    Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework, and include four analysis routines to estimate the lift and drag of a supersonic airfoil, plus a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.

  20. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at the NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  1. Using Work Action Analysis to Identify Web-Portal Requirements for a Professional Development Program

    ERIC Educational Resources Information Center

    Nickles, George

    2007-01-01

    This article describes using Work Action Analysis (WAA) as a method for identifying requirements for a web-based portal that supports a professional development program. WAA is a cognitive systems engineering method for modeling multi-agent systems to support design and evaluation. A WAA model of the professional development program of the…

  2. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller design. With these methods, worst case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system, and of a non-collocated mass-spring system, show the added information provided by this hybrid analysis.
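
    The contrast drawn above between interval bounds and probabilistic analysis can be illustrated with plain Monte Carlo propagation of one uncertain parameter through a closed-form response metric. This is a generic sketch, not the paper's hybrid reliability method; the damping-ratio distribution is assumed.

```python
import math
import random

def overshoot(zeta):
    """Percent overshoot of a standard second-order step response."""
    return 100.0 * math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta ** 2))

# Uncertain damping ratio: uniform on [0.4, 0.6] (assumed distribution).
rng = random.Random(1)
samples = [overshoot(rng.uniform(0.4, 0.6)) for _ in range(50000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)

# Interval analysis would report only the bounds
# [overshoot(0.6), overshoot(0.4)] ~ [9.5%, 25.4%]; the Monte Carlo
# samples additionally give the likelihood of each response level,
# e.g. the probability that overshoot exceeds 20%.
p_exceed = sum(s > 20.0 for s in samples) / len(samples)
```

    The bounds alone would weight the worst case (25.4% overshoot) as heavily as the typical case, which is exactly the conservatism the probabilistic view avoids.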

  3. Analysis and development of adjoint-based h-adaptive direct discontinuous Galerkin method for the compressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang

    2018-06-01

    In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)), and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) and SDDG methods can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of the three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in the applications of adjoint-based adaptation for simulating compressible flows.

  4. Advanced composites structural concepts and materials technologies for primary aircraft structures: Structural response and failure analysis

    NASA Technical Reports Server (NTRS)

    Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    Non-linear analysis methods were adapted and incorporated in the finite element based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading, and include the arc-length method and a target-point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive; its direct application is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal, or load) failure procedure and Hashin's failure criteria provides added capability in failure prediction. Interactive Stiffened Panel Analysis modules were developed as interactive pre- and post-processors. Each module provides the means of performing self-initiated finite element analysis of primary structures such as a flat or curved stiffened panel, a corrugated flat sandwich panel, and a curved geodesic fuselage panel. These modules bring finite element analysis into the design of composite structures without requiring the user to know much about the techniques and procedures needed to perform a finite element analysis from scratch. An interactive finite element code was also developed to predict bolted joint strength considering material and geometric non-linearity; it conducts an ultimate-strength failure analysis using a set of material degradation models.
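    The arc-length method mentioned above advances a fixed distance along the equilibrium path in combined displacement-load space, which lets it traverse limit points where plain load-controlled Newton iteration diverges. The sketch below applies a spherical (Crisfield-type) arc-length scheme to a hypothetical single-degree-of-freedom snap-through problem; the internal-force polynomial and step size are invented for the example and have no connection to the DIAL code.

```python
def f_int(u):
    """Hypothetical internal force with two limit points (snap-through)."""
    return u**3 - 3.0*u**2 + 2.5*u

def df_int(u):
    return 3.0*u**2 - 6.0*u + 2.5

def arc_length_trace(ds=0.1, steps=60, psi=1.0):
    """Spherical arc-length continuation for the one-DOF residual
    R(u, lam) = lam - f_int(u): each step moves a fixed distance ds
    along the path, so the load parameter lam may rise or fall."""
    u, lam = 0.0, 0.0
    du_p, dlam_p = 1.0, 1.0                  # previous step direction
    path = [(u, lam)]
    for _ in range(steps):
        norm = (du_p**2 + (psi*dlam_p)**2) ** 0.5
        un, ln = u + ds*du_p/norm, lam + ds*dlam_p/norm      # predictor
        for _ in range(50):                  # Newton corrector on 2x2 system
            R = ln - f_int(un)                               # equilibrium
            C = (un-u)**2 + (psi*(ln-lam))**2 - ds**2        # arc constraint
            if abs(R) < 1e-12 and abs(C) < 1e-12:
                break
            a, b = -df_int(un), 1.0          # dR/du, dR/dlam
            cc, d = 2.0*(un-u), 2.0*(psi**2)*(ln-lam)        # dC/du, dC/dlam
            det = a*d - b*cc
            if abs(det) < 1e-14:
                break
            un += (-R*d + b*C) / det         # Cramer solve of J*delta = -[R,C]
            ln += (R*cc - a*C) / det
        du_p, dlam_p = un - u, ln - lam
        u, lam = un, ln
        path.append((u, lam))
    return path
```

Tracing the path shows the load first rising, then falling through the snap-through region, then rising again, exactly the behavior load control cannot follow.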

  5. Software development for teleroentgenogram analysis

    NASA Astrophysics Data System (ADS)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates teleroentgenograms by an original method developed in this department, and it allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning (neural networks) into the software; this will make the calculation of teleroentgenograms easier because methodological points will be placed automatically.

  6. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach that assists software developers and safety analysts with cost-effective methods for software safety. They provide guidance on implementing the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.

  7. A Review of Flow Analysis Methods for Determination of Radionuclides in Nuclear Wastes and Nuclear Reactor Coolants

    DOE PAGES

    Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.

    2018-02-13

    Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides reported to date are primarily focused on environmental applications. The benefits of applying flow methods both in monitoring of nuclear wastes and in process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) with β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming sample-processing operations. Compared to radiometric detection, mass spectrometry (MS) detection enables multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  9. A review of flow analysis methods for determination of radionuclides in nuclear wastes and nuclear reactor coolants.

    PubMed

    Trojanowicz, Marek; Kołacińska, Kamila; Grate, Jay W

    2018-06-01

    The safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides reported to date are primarily focused on environmental applications. The benefits of applying flow methods both in monitoring of nuclear wastes and in process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) with β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming sample-processing operations. Compared to radiometric detection, mass spectrometry (MS) detection enables multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components is presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress, and strain histories throughout a complete flight mission.

  11. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three-dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three-dimensional elastic solids, and numerical implementation of this method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three-dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  12. Aeroelastic stability and response of rotating structures

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.

    1993-01-01

    A summary of the work performed during the progress period is presented. Analysis methods for predicting loads and instabilities of wind turbines were developed. Three new areas of research to aid the Advanced Turboprop Project (ATP) were initiated and developed. These three areas of research are aeroelastic analysis methods for cascades including blade and disk flexibility; stall flutter analysis; and computational aeroelasticity.

  13. Validated chromatographic and spectrophotometric methods for analysis of some amoebicide drugs in their combined pharmaceutical preparation.

    PubMed

    Abdelaleem, Eglal Adelhamid; Abdelwahab, Nada Sayed

    2013-01-01

    This work is concerned with the development and validation of spectrophotometric and RP-HPLC (UV detection) methods for the analysis of mebeverine HCl (MEH), diloxanide furoate (DF), and metronidazole (MET) in Dimetrol® tablets. The developed spectrophotometric methods determine MEH and DF in the combined dosage form using the successive derivative ratio spectra method, which derivatizes the obtained ratio spectra in two steps using methanol as a solvent and measures MEH at 226.4-232.2 nm (peak to peak) and DF at 260.6-264.8 nm (peak to peak), while MET concentrations are determined using the first derivative (1D) at λ = 327 nm in the same solvent. The chromatographic method is based on HPLC separation on an ODS column with a mobile phase consisting of water:methanol:triethylamine (25:75:0.5, by volume; adjusted with orthophosphoric acid to pH 4), pumped at 0.7 mL min-1 with UV detection at 230 nm. Factors affecting the developed methods were studied and optimized; moreover, the methods were validated per ICH guidelines, and the results demonstrated that they are reproducible, reliable, and suitable for routine use with short analysis times. Statistical comparison of the two developed methods using F- and Student's t-tests showed no significant difference.
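    The first-derivative (1D) measurement used for MET rests on a general principle: differentiating the spectrum turns a band maximum into a zero crossing, so an analyte can be read at a wavelength where co-formulated components contribute no derivative signal. A minimal sketch on a synthetic Gaussian band follows; all band parameters (center, width, absorptivity) are hypothetical, not the Dimetrol® values.

```python
import math

def absorbance(lam, conc, center=340.0, width=8.0, eps=0.9):
    """Synthetic Gaussian absorption band; amplitude is linear in
    concentration (Beer's law). All band parameters are hypothetical."""
    return conc * eps * math.exp(-((lam - center) / width) ** 2)

def first_derivative(lams, a):
    """Central-difference first-derivative (1D) spectrum."""
    return [(a[i + 1] - a[i - 1]) / (lams[i + 1] - lams[i - 1])
            for i in range(1, len(a) - 1)]

lams = [300.0 + i for i in range(81)]        # 300-380 nm wavelength grid
spec = [absorbance(l, conc=1.0) for l in lams]
d1 = first_derivative(lams, spec)
d1_lams = lams[1:-1]
# The 1D trace is zero at the band maximum, positive on the rising flank,
# and negative on the falling flank; because the amplitude is linear in
# concentration, a 1D reading at a fixed wavelength calibrates linearly.
```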

  14. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  15. Frequency analysis of uncertain structures using imprecise probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modares, Mehdi; Bergerson, Joshua

    2015-01-01

    Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure; for each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods, along with discussions of their computational efficiency.
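    The interval Monte-Carlo idea behind IMFA can be sketched for the simplest possible case: a single-degree-of-freedom oscillator with interval stiffness and sampled mass. Because ω = √(k/m) is monotone in k, each sample needs only the two interval endpoints, and the sorted lower/upper samples bound the frequency distribution (a p-box). All numbers below are hypothetical and the model is far simpler than the paper's finite element setting.

```python
import math
import random

def interval_mc_frequency(k_lo=900.0, k_hi=1100.0, n=5000, seed=7):
    """Interval Monte-Carlo sketch for omega = sqrt(k/m): stiffness k is
    an interval, mass m is sampled from an assumed distribution, giving
    per-sample frequency intervals whose sorted endpoints bound the CDF.
    Returns bounds on the median natural circular frequency."""
    rng = random.Random(seed)
    lows, highs = [], []
    for _ in range(n):
        m = max(rng.gauss(10.0, 0.5), 1e-6)   # hypothetical mass distribution
        lows.append(math.sqrt(k_lo / m))      # monotone in k: endpoint eval
        highs.append(math.sqrt(k_hi / m))
    lows.sort()
    highs.sort()
    return lows[n // 2], highs[n // 2]
```

Repeating the endpoint evaluation at other order statistics gives frequency bounds at any probability level, which is the p-box output the abstract describes.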

  16. Analysis of the nature and cause of turbulence upset using airline flight records

    NASA Technical Reports Server (NTRS)

    Parks, E. K.; Bach, R. E., Jr.; Wingrove, R. C.

    1982-01-01

    The development and application of methods for determining aircraft motions and related winds, using data normally recorded during airline flight operations, are described. The methods are being developed, in cooperation with the National Transportation Safety Board, to aid in the analysis and understanding of circumstances associated with aircraft accidents or incidents. Data from a recent DC-10 encounter with severe, high-altitude turbulence are used to illustrate the methods. The analysis of this encounter shows the turbulence to be a series of equally spaced horizontal swirls known as 'cat's eyes' vortices. The use of flight-data analysis methods to identify this type of turbulence phenomenon is presented for the first time.

  17. GMDD: a database of GMO detection methods.

    PubMed

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification; however, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information will be released soon after being checked. GMDD contains comprehensive information on GMO detection methods, and will make GMO analysis much easier.

  18. Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection

    PubMed Central

    Matisová, Eva; Hrouzková, Svetlana

    2012-01-01

    Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing; this also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Consequently, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric (MS) detection have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is put on the separation method, the mode of MS detection and ionization, and the obtained limits of detection and quantification. Analysis time is one of the most important aspects to consider in the choice of analytical methods for routine analysis; therefore, the benefits of the developed fast GC methods are important. PMID:23202677

  19. Rapid Radiochemical Method for Isotopic Uranium in Building ...

    EPA Pesticide Factsheets

    Technical Fact Sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: uranium-234, uranium-235, and uranium-238 in concrete and brick samples. Method selected for: SAM lists this method for qualitative analysis of uranium-234, uranium-235, and uranium-238 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.

  20. Structural system reliability calculation using a probabilistic fault tree analysis method

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA method.
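    The importance-sampling step can be illustrated on a toy fault tree, TOP = (A AND B) OR C, with rare bottom events: sampling each event at a biased rate and reweighting by the likelihood ratio yields an unbiased estimate of the small top-event probability with far fewer samples than plain Monte Carlo. The gate structure and probabilities are invented for the example, and the fixed bias used here stands in for the paper's adaptive scheme.

```python
import random

P = {"A": 0.01, "B": 0.02, "C": 0.001}   # hypothetical bottom-event probabilities
Q = {"A": 0.5, "B": 0.5, "C": 0.5}       # biased sampling probabilities

def top_event(x):
    """Fault tree logic: TOP = (A AND B) OR C."""
    return (x["A"] and x["B"]) or x["C"]

def importance_sampling_estimate(n=200000, seed=3):
    """Sample bottom events under Q, weight each realization by the
    likelihood ratio P/Q, and average the weighted top-event indicator."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x, w = {}, 1.0
        for e in P:
            hit = rng.random() < Q[e]
            x[e] = hit
            w *= P[e] / Q[e] if hit else (1.0 - P[e]) / (1.0 - Q[e])
        if top_event(x):
            total += w
    return total / n

# Closed-form check for this small tree (inclusion-exclusion):
exact = P["A"] * P["B"] + P["C"] - P["A"] * P["B"] * P["C"]
```

For trees too large for a closed form, the same weighted average applies; only the `top_event` logic and the sampling sequence change.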

  1. Integrating Cognitive Task Analysis into Instructional Systems Development.

    ERIC Educational Resources Information Center

    Ryder, Joan M.; Redding, Richard E.

    1993-01-01

    Discussion of instructional systems development (ISD) focuses on recent developments in cognitive task analysis and describes the Integrated Task Analysis Model, a framework for integrating cognitive and behavioral task analysis methods within the ISD model. Three components of expertise are analyzed: skills, knowledge, and mental models. (96…

  2. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  3. The International College of Neuropsychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 1: Background and Methods of the Development of Guidelines

    PubMed Central

    Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried

    2017-01-01

    Abstract Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414

  4. Toward a Method for Exposing and Elucidating Ethical Issues with Human Cognitive Enhancement Technologies.

    PubMed

    Hofmann, Bjørn

    2017-04-01

    To develop a method for exposing and elucidating ethical issues with human cognitive enhancement (HCE). The intended use of the method is to support and facilitate open and transparent deliberation and decision making with respect to this emerging technology with great potential formative implications for individuals and society. Literature search to identify relevant approaches. Conventional content analysis of the identified papers and methods in order to assess their suitability for assessing HCE according to four selection criteria. Method development. Amendment after pilot testing on smart-glasses. Based on three existing approaches in health technology assessment a method for exposing and elucidating ethical issues in the assessment of HCE technologies was developed. Based on a pilot test for smart-glasses, the method was amended. The method consists of six steps and a guiding list of 43 questions. A method for exposing and elucidating ethical issues in the assessment of HCE was developed. The method provides the ground work for context specific ethical assessment and analysis. Widespread use, amendments, and further developments of the method are encouraged.

  5. [Development of selective determination methods for quinones with fluorescence and chemiluminescence detection and their application to environmental and biological samples].

    PubMed

    Kishikawa, Naoya

    2010-10-01

    Quinones are compounds that have various characteristics such as a biological electron transporter, an industrial product and a harmful environmental pollutant. Therefore, an effective determination method for quinones is required in many fields. This review describes the development of sensitive and selective determination methods for quinones based on some detection principles and their application to analyses in environmental, pharmaceutical and biological samples. Firstly, a fluorescence method was developed based on fluorogenic derivatization of quinones and applied to environmental analysis. Secondly, a luminol chemiluminescence method was developed based on generation of reactive oxygen species through the redox cycle of quinone and applied to pharmaceutical analysis. Thirdly, a photo-induced chemiluminescence method was developed based on formation of reactive oxygen species and fluorophore or chemiluminescence enhancer by the photoreaction of quinones and applied to biological and environmental analyses.

  6. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken; these applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude); (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events); (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem; and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis; (2) the mathematically consistent treatment of uncertainties; and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production-interruption losses of a nuclear power plant during its residual lifetime.

  7. David W. Templeton | NREL

    Science.gov Websites

    Research interests include algal biomass analysis methods and applications of these methods to different processes, including an internally funded research project to develop microalgal compositional analysis methods and work on closing mass and component balances around pretreatment, saccharification, and fermentation unit operations.

  8. Review of analytical methods for the quantification of iodine in complex matrices.

    PubMed

    Shelor, C Phillip; Dasgupta, Purnendu K

    2011-09-19

    Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently in routine use for its analysis. These methods either require expensive instrumentation and qualified personnel (inductively coupled plasma-mass spectrometry, instrumental nuclear activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff ~75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce(4+) and As(3+). No available technique fully fits the needs of developing countries; research into inexpensive, reliable methods and instrumentation is needed. There have been multiple reviews of methods used for epidemiological studies and of specific techniques, but a general review of iodine determination in a wide-ranging set of complex matrices has not been available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method. Copyright © 2011 Elsevier B.V. All rights reserved.
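    The S-K determination is kinetic: iodide catalyzes the Ce(4+)/As(3+) reaction, so the observed pseudo-first-order decay rate of the Ce(4+) signal grows with iodide concentration, and a rate-versus-concentration calibration line recovers an unknown. The sketch below uses invented, noise-free rate parameters purely to show the calibration arithmetic; real S-K kinetics and sensitivities differ.

```python
import math

K0, SLOPE = 0.002, 0.08   # hypothetical rate parameters: s^-1 and s^-1 per (ug/L)

def decay_rate(iodide_ugL):
    """Observed pseudo-first-order Ce(4+) decay rate; iodide catalysis is
    modeled here as a simple linear increase with concentration."""
    return K0 + SLOPE * iodide_ugL

def measure_rate(iodide_ugL, t=(0.0, 30.0, 60.0, 90.0), a0=1.0):
    """Simulate absorbance readings A(t) = A0*exp(-k t) and recover k
    from the slope of the log-linear least-squares fit."""
    k = decay_rate(iodide_ugL)
    logs = [math.log(a0 * math.exp(-k * ti)) for ti in t]
    n = len(t)
    tbar, ybar = sum(t) / n, sum(logs) / n
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, logs)) / \
            sum((ti - tbar) ** 2 for ti in t)
    return -slope

def calibrate_and_predict(unknown=0.35):
    """Build a rate-vs-concentration calibration line from standards and
    read back the concentration of an 'unknown' sample."""
    standards = [0.0, 0.2, 0.4, 0.6, 0.8]        # ug/L iodide standards
    rates = [measure_rate(c) for c in standards]
    n = len(standards)
    cbar, rbar = sum(standards) / n, sum(rates) / n
    b = sum((c - cbar) * (r - rbar) for c, r in zip(standards, rates)) / \
        sum((c - cbar) ** 2 for c in standards)
    a = rbar - b * cbar
    return (measure_rate(unknown) - a) / b
```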

  9. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (third year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for the visualization, analysis, and archiving of flow data. We report first on our work on the development of numerical methods for tangent curve computation.

  10. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    NASA Technical Reports Server (NTRS)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  11. [Development and application of morphological analysis method in Aspergillus niger fermentation].

    PubMed

    Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2015-02-01

    Filamentous fungi are widely used in industrial fermentation, and particular fungal morphologies serve as critical indices of a successful fermentation. To break the bottleneck of morphological analysis, we developed a reliable method for fungal morphological analysis with which hundreds of pellet samples can be prepared simultaneously and quantitative morphological information obtained quickly at large scale. The method greatly increases the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. The morphological response patterns of A. niger to these conditions were quantitatively demonstrated, laying a solid foundation for further scale-up.

  12. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
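The M-estimation idea in this abstract, full weight for well-fitting points and reduced weight for outliers, can be illustrated on a simple regression. This is a hypothetical pure-Python sketch, not the authors' documented R program; the Huber cutoff c = 1.345 and the MAD-based scale estimate are common conventions assumed here, and the moderation model is reduced to a single predictor for brevity.

```python
# Sketch of an M-estimator with Huber-type weights for y = a + b*x, fitted by
# iteratively reweighted least squares (IRLS). Illustrative only; not the
# robust two-level moderation estimator described in the abstract.

def wls(x, y, w):
    """Closed-form weighted least squares fit of y = a + b*x."""
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    b = sxy / sxx
    return yb - b * xb, b

def huber_fit(x, y, c=1.345, iters=50):
    w = [1.0] * len(x)                     # first pass is ordinary least squares
    for _ in range(iters):
        a, b = wls(x, y, w)
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        mad = sorted(abs(ri) for ri in r)[len(r) // 2]
        s = mad / 0.6745 or 1e-12          # robust scale from the MAD
        # Huber weights: 1 for small residuals, downweight large ones
        w = [1.0 if abs(ri) / s <= c else c * s / abs(ri) for ri in r]
    return wls(x, y, w)

x = list(range(10))
y = [1 + 2 * xi for xi in x]
y[9] = 100                                 # one gross outlier
a_rob, b_rob = huber_fit(x, y)
a_ols, b_ols = wls(x, y, [1.0] * 10)
# The robust slope stays near the true value 2, while OLS is pulled far away.
```

The design choice mirrors the abstract's point: when errors are heavy-tailed, downweighting extreme residuals keeps the slope estimate stable where normal-theory least squares is badly misled.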

  13. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  14. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  15. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020
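As a minimal illustration of the kind of statistical synthesis the Group oversees, a fixed-effect inverse-variance meta-analysis pools study estimates by weighting each one with the reciprocal of its variance. This is generic textbook meta-analysis, not a specific Cochrane implementation, and the two studies below are invented.

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Fixed-effect inverse-variance meta-analysis: weight each study
    estimate by 1/SE^2. Returns (pooled effect, pooled standard error)."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = 1.0 / math.sqrt(sum(weights))
    return pooled, pooled_se

# Two hypothetical studies: the more precise one (SE 0.1) dominates the pool.
effect, se = fixed_effect_pool([0.5, 0.7], [0.1, 0.2])
```

Because weights are inverse variances, the pooled standard error is always smaller than that of the most precise single study, which is the statistical motivation for combining trials at all.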

  16. Simulation of multi-element multispectral UV radiation source for optical-electronic system of minerals luminescence analysis

    NASA Astrophysics Data System (ADS)

    Peretyagin, Vladimir S.; Korolev, Timofey K.; Chertov, Aleksandr N.

    2017-02-01

    Problems of solid-mineral dressability attract the attention of specialists wherever the extraction of mineral raw materials is a significant sector of the economy. Among the many ore dressability methods, radiometric methods are currently considered the most promising. One radiometric method is photoluminescence, which is based on spectral analysis of the amplitude and kinetic parameters of mineral luminescence (under UV radiation), as well as the color parameters of the emitted radiation. The absence of developed scientific and methodological approaches to analyzing the area irradiated with UV radiation, as well as the absence of suitable radiation sources, has hindered the development and use of the photoluminescence method. The present work is devoted to the development of a multi-element UV radiation source designed to solve the problem of analyzing and sorting minerals by their selective luminescence. The article presents a method for the theoretical modeling of radiation devices based on UV LEDs; the models consider factors such as the spectral components and the spatial and energy parameters of the LEDs. The article also presents the results of experimental studies of some mineral samples.

  17. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  18. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis, and have further extended the method to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for

  19. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    PubMed

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  20. Operations planning and analysis handbook for NASA/MSFC phase B development projects

    NASA Technical Reports Server (NTRS)

    Batson, Robert C.

    1986-01-01

    Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) operations personnel support Program Development (PD) task teams. The close relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate and allocate such criteria as reliability, maintainability, and operations and support cost.

  1. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As part of the validation process, this paper describes an analysis method for determining a reliable flight regime within the flight envelope, in which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use this method, a linear fractional transformation (LFT) model of the transport aircraft's longitudinal dynamics is developed over the flight envelope using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope through the uncertainty block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for the transport aircraft closed-loop system.

  2. Evaluation of qPCR curve analysis methods for reliable biomarker discovery: bias, resolution, precision, and implications.

    PubMed

    Ruijter, Jan M; Pfaffl, Michael W; Zhao, Sheng; Spiess, Andrej N; Boggy, Gregory; Blom, Jochen; Rutledge, Robert G; Sisti, Davide; Lievens, Antoon; De Preter, Katleen; Derveaux, Stefaan; Hellemans, Jan; Vandesompele, Jo

    2013-01-01

    RNA transcripts such as mRNA or microRNA are frequently used as biomarkers to determine disease state or response to therapy. Reverse transcription (RT) in combination with quantitative PCR (qPCR) has become the method of choice to quantify small amounts of such RNA molecules. In parallel with the democratization of RT-qPCR and its increasing use in biomedical research or biomarker discovery, we witnessed a growth in the number of gene expression data analysis methods. Most of these methods are based on the principle that the position of the amplification curve with respect to the cycle-axis is a measure for the initial target quantity: the later the curve, the lower the target quantity. However, most methods differ in the mathematical algorithms used to determine this position, as well as in the way the efficiency of the PCR reaction (the fold increase of product per cycle) is determined and applied in the calculations. Moreover, there is dispute about whether the PCR efficiency is constant or continuously decreasing. Together, this has led to the development of different methods to analyze amplification curves. In published comparisons of these methods, available algorithms were typically applied in a restricted or outdated way, which does not do them justice. Therefore, we aimed to develop a framework for robust and unbiased assessment of curve analysis performance whereby various publicly available curve analysis methods were thoroughly compared using a previously published large clinical data set (Vermeulen et al., 2009) [11]. The original developers of these methods applied their algorithms and are co-authors of this study. We assessed the curve analysis methods' impact on transcriptional biomarker identification in terms of expression level, statistical significance, and patient-classification accuracy.
The concentration series per gene, together with data sets from unpublished technical performance experiments, were analyzed in order to assess the algorithms' precision, bias, and resolution. While large differences exist between methods when considering the technical performance experiments, most methods perform relatively well on the biomarker data. The data and the analysis results per method are made available to serve as benchmark for further development and evaluation of qPCR curve analysis methods (http://qPCRDataMethods.hfrc.nl). Copyright © 2012 Elsevier Inc. All rights reserved.
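The principle the abstract describes, a later amplification curve meaning a lower starting quantity with the PCR efficiency E giving the fold increase per cycle, reduces to N0 = Nq / E^Cq. A hypothetical sketch of that relation (function and variable names are illustrative, not from any of the compared algorithms):

```python
def initial_quantity(n_q, cq, efficiency=2.0):
    """Starting target quantity N0, given the quantity at the quantification
    threshold (n_q), the quantification cycle (cq), and the PCR efficiency
    expressed as fold increase per cycle (2.0 = perfect doubling)."""
    return n_q / efficiency ** cq

def fold_difference(cq_ref, cq_target, efficiency=2.0):
    """Relative starting quantity of target vs. reference from their Cq values."""
    return efficiency ** (cq_ref - cq_target)

# A target whose curve crosses the threshold 3 cycles earlier than the
# reference started with 8x as much template (at perfect efficiency).
ratio = fold_difference(25.0, 22.0)
```

The dispute the abstract mentions matters precisely here: a small error in the assumed efficiency is raised to the power Cq, so different efficiency estimates can yield very different quantities from the same curve.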

  3. RESEARCH METHOD FOR SAMPLING AND ANALYSIS OF FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    NRMRL hosted a meeting on July 17-18, 2003 entitled, "Analytical Method for Bulk Analysis of Vermiculite." The purpose of this effort was to produce an interim research method for use by U.S. EPA's Office of Research and Development (ORD) for the analysis of bulk vermiculite for...

  4. A Study on Multi-Swing Stability Analysis of Power System using Damping Rate Inversion

    NASA Astrophysics Data System (ADS)

    Tsuji, Takao; Morii, Yuki; Oyama, Tsutomu; Hashiguchi, Takuhei; Goda, Tadahiro; Nomiyama, Fumitoshi; Kosugi, Narifumi

    In recent years, much attention has been paid to nonlinear analysis methods in the field of power system stability analysis. For multi-swing stability analysis in particular, the unstable limit cycle has an important meaning as a stability margin. A high-speed method for calculating the multi-swing stability boundary is required, because real-time calculation of ATC is necessary to realize flexible wheeling trades. The authors have therefore developed a new method that calculates the unstable limit cycle based on damping rate inversion. Using the unstable limit cycle, it is possible to predict multi-swing stability at the time when the faulted transmission line is reclosed. The proposed method is tested on the Lorenz equations, a single-machine infinite-bus system model, and the IEEJ WEST10 system model.

  5. Lab-on-a-chip nucleic-acid analysis towards point-of-care applications

    NASA Astrophysics Data System (ADS)

    Kopparthy, Varun Lingaiah

    Recent infectious disease outbreaks, such as Ebola in 2013, highlight the need for fast and accurate diagnostic tools to combat the global spread of disease. Detection and identification of the disease-causing viruses and bacteria at the genetic level is required for accurate diagnosis. Nucleic acid analysis systems have shown promise in identifying diseases such as HIV, anthrax, and Ebola. However, conventional nucleic acid analysis systems are still time consuming and are not suitable for point-of-care applications, and miniaturized nucleic acid systems, despite great promise for rapid analysis, have not been commercialized due to several factors such as footprint, complexity, portability, and power consumption. This dissertation presents the development of technologies and methods for lab-on-a-chip nucleic acid analysis towards point-of-care applications. An oscillatory-flow PCR methodology in a thermal gradient is developed that provides real-time analysis of nucleic-acid samples: oscillating-flow PCR was performed in the microfluidic device under a thermal gradient in 40 minutes, and reverse transcription PCR (RT-PCR) was achieved in the system without an additional heating element for the reverse transcription incubation step. A novel method is developed for the simultaneous patterning and bonding of all-glass microfluidic devices in a microwave oven; glass microfluidic devices were fabricated in less than 4 minutes. Towards an integrated system for the detection of amplified products, a thermal sensing method is studied for the optimization of the sensor output. The calorimetric sensing method is characterized to identify design considerations and optimal parameters, such as placement of the sensor, steady-state response, and flow velocity, for improved performance. An understanding of these developed technologies and methods will facilitate the development of lab-on-a-chip systems for point-of-care analysis.

  6. GMDD: a database of GMO detection methods

    PubMed Central

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-01-01

    Background Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information for the harmonization and standardization of GMO analysis methods at the global level is needed. Results The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of the exogenous integration, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information will be released soon after being checked. Conclusion GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier. PMID:18522755

  7. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Methods were developed for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code PARCS.

  8. RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...

  9. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  10. Identification or Development of Chemical Analysis Methods for Plants and Animal Tissues

    DTIC Science & Technology

    1981-01-01

    Report No. DRXTH-TE-CR-80086. Final report: Identification or Development of Chemical Analysis Methods for Plants and Animal Tissues.

  11. Fractal analysis of GPS time series for early detection of disastrous seismic events

    NASA Astrophysics Data System (ADS)

    Filatov, Denis M.; Lyubushin, Alexey A.

    2017-03-01

    A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start from analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known due to numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
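The conventional DFA technique that this abstract modifies can be sketched in a few steps: integrate the mean-subtracted series, detrend it piecewise over windows of increasing size, and read the scaling exponent off the log-log slope of the fluctuation function. The sketch below is standard first-order DFA under those textbook conventions, not the authors' modification.

```python
import math
import random

def dfa_alpha(x, scales):
    """First-order detrended fluctuation analysis. Returns the scaling
    exponent alpha (~0.5 for white noise, ~1.5 for a random walk)."""
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:                          # integrate the mean-subtracted series
        s += v - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in scales:
        sq_sum, count = 0.0, 0
        for k in range(len(profile) // n):
            seg = profile[k * n:(k + 1) * n]
            # linear detrend of the segment (closed-form least squares)
            tb, sb = (n - 1) / 2.0, sum(seg) / n
            stt = sum((t - tb) ** 2 for t in range(n))
            b = sum((t - tb) * (v - sb) for t, v in enumerate(seg)) / stt
            a = sb - b * tb
            sq_sum += sum((v - (a + b * t)) ** 2 for t, v in enumerate(seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq_sum / count))   # log of RMS fluctuation
    # alpha is the slope of log F(n) versus log n
    nb, fb = sum(log_n) / len(log_n), sum(log_f) / len(log_f)
    return (sum((u - nb) * (v - fb) for u, v in zip(log_n, log_f))
            / sum((u - nb) ** 2 for u in log_n))

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_alpha(noise, [8, 16, 32, 64, 128])   # expected near 0.5
```

For GPS displacement series the same exponent serves as the chaoticity measure the abstract refers to: departures from the uncorrelated-noise value signal changing long-range correlation in crustal motion.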

  12. RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2

    EPA Science Inventory

    The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA)method 552.2, that improves the saftey and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...

  13. A method for studying decision-making by guideline development groups.

    PubMed

    Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan

    2009-08-05

    Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best suited to capturing influences on GDG decision-making. A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence, and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. The method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health, and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other healthcare and decision-making groups.

  14. A needs analysis method for land-use planning of illegal dumping sites: a case study in Aomori-Iwate, Japan.

    PubMed

    Ishii, Kazuei; Furuichi, Toru; Nagao, Yukari

    2013-02-01

    Land use at contaminated sites, following remediation, is often needed for regional redevelopment. However, few methods exist for developing economically and socially feasible land-use plans based on regional needs, because of the wide variety of land-use requirements. This study proposes a new needs analysis method for the conceptual land-use planning of contaminated sites and illustrates the method with a case study of an illegal dumping site for hazardous waste. In this method, planning factors consisting of land-use attributes and related facilities are extracted from the potential needs of the residents through a preliminary questionnaire. Using the extracted attributes and related facilities, land-use cases are designed for selection-based conjoint analysis. A second questionnaire, administered to respondents to the first who indicated an interest in further participation, is conducted for the conjoint analysis to determine the utility function and marginal cost of each attribute, in order to prioritize the planning factors and develop a quantitative, economically and socially feasible land-use plan. Based on the results, site-specific land-use alternatives are developed and evaluated using the utility function obtained from the conjoint analysis. In this case study, the uses preferred as part of a conceptual land-use plan following remediation of the site were (1) agricultural land and a biogas plant designed to recover energy from biomass or (2) a park with a welfare facility and an athletic field. Our needs analysis method with conjoint analysis is applicable to the development of conceptual land-use plans for similar sites following remediation, particularly when added value is considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Atomistic cluster alignment method for local order mining in liquids and glasses

    NASA Astrophysics Data System (ADS)

    Fang, X. W.; Wang, C. Z.; Yao, Y. X.; Ding, Z. J.; Ho, K. M.

    2010-11-01

    An atomistic cluster alignment method is developed to identify and characterize the local atomic structural order in liquids and glasses. With the “order mining” idea for structurally disordered systems, the method can detect the presence of any type of local order in the system and can quantify the structural similarity between a given set of templates and the aligned clusters in a systematic and unbiased manner. Moreover, population analysis can also be carried out for various types of clusters in the system. The advantages of the method in comparison with other previously developed analysis methods are illustrated by performing the structural analysis for four prototype systems (i.e., pure Al, pure Zr, Zr35Cu65 , and Zr36Ni64 ). The results show that the cluster alignment method can identify various types of short-range orders (SROs) in these systems correctly while some of these SROs are difficult to capture by most of the currently available analysis methods (e.g., Voronoi tessellation method). Such a full three-dimensional atomistic analysis method is generic and can be applied to describe the magnitude and nature of noncrystalline ordering in many disordered systems.

  16. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

    Assessment of the variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, different methods exist that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging: some methods require advanced mathematical computations preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach that involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of graphic interpretation of the results. The method has been tested on a variety of building and development projects.

  17. Development and Validation of HPLC-DAD and UHPLC-DAD Methods for the Simultaneous Determination of Guanylhydrazone Derivatives Employing a Factorial Design.

    PubMed

    Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula

    2017-08-30

    Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool for the optimization of a chromatographic method, because it allows factors such as temperature, mobile phase composition, mobile phase pH, and column length, among others, to be changed quickly in order to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity, employing experimental design. Precise, accurate, linear, and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14, and LQM17. The UHPLC method was more economical, with four times lower solvent consumption and a 20 times smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development to the DoE approach employed in the UHPLC method development, we can conclude that the factorial design made method development faster, more practical, and more rational. The resulting methods can be employed in the analysis, evaluation, and quality control of these new synthetic guanylhydrazones.
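The enumeration of conditions in a full factorial design like the one used for such method optimization can be sketched in a few lines; the factors and levels below are hypothetical placeholders, not the conditions of this study.

```python
from itertools import product

# Hypothetical two-level factors for a 2^3 full factorial screening design.
factors = {
    "temperature_C": [25, 40],
    "mobile_phase_pH": [3.0, 5.0],
    "acetonitrile_pct": [30, 50],
}

names = list(factors)
# One run per combination of factor levels.
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")

print(len(runs))  # 2^3 = 8 experimental runs
```

A response (e.g., resolution or peak area) measured for each run can then be regressed on the coded factor levels to estimate main effects and interactions.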

  18. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
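The core of the analysis of variance used in such an evaluation is the one-way F statistic, the ratio of between-group to within-group variance. A minimal sketch from first principles, with invented measurements (not SNAP 10A data):

```python
import numpy as np

def one_way_anova(groups):
    """One-way ANOVA: F statistic from between- and within-group variation."""
    all_data = np.concatenate([np.asarray(g, float) for g in groups])
    grand_mean = all_data.mean()
    # Between-group sum of squares (effect of the processing variant)
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (residual variation)
    ss_within = sum(((np.asarray(g, float) - np.mean(g)) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented measurements for three hypothetical process variants.
a = [210.0, 212.0, 208.0, 211.0]
b = [215.0, 216.0, 214.0, 217.0]
c = [209.0, 210.0, 211.0, 208.0]
F = one_way_anova([a, b, c])
print(round(F, 2))  # → 20.52
```

A large F relative to the F(2, 9) critical value would indicate that the processing variants differ significantly, which is the kind of evidence used to drop unnecessary or detrimental steps.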

  19. On 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.

    1986-01-01

    Accomplishments are described for the 2-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades, and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iteration techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element, and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  20. The 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.

    1992-01-01

    A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iteration techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  1. Network Analysis: Applications for the Developing Brain

    PubMed Central

    Chu-Shore, Catherine J.; Kramer, Mark A.; Bianchi, Matt T.; Caviness, Verne S.; Cash, Sydney S.

    2011-01-01

    Development of the human brain follows a complex trajectory of age-specific anatomical and physiological changes. The application of network analysis provides an illuminating perspective on the dynamic interregional and global properties of this intricate and complex system. Here, we provide a critical synopsis of methods of network analysis with a focus on developing brain networks. After discussing basic concepts and approaches to network analysis, we explore the primary events of anatomical cortical development from gestation through adolescence. Upon this framework, we describe early work revealing the evolution of age-specific functional brain networks in normal neurodevelopment. Finally, we review how these relationships can be altered in disease and perhaps even rectified with treatment. While this method of description and inquiry remains in early form, there is already substantial evidence that the application of network models and analysis to understanding normal and abnormal human neural development holds tremendous promise for future discovery. PMID:21303762

  2. Development of a method for efficient cost-effective screening of Aspergillus niger mutants having increased production of glucoamylase.

    PubMed

    Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping

    2017-05-01

    To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
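The agreement check between the new Methyl Orange assay and the conventional substrate assay reduces to a correlation coefficient over paired measurements. A sketch with invented paired activities (the 0.96 reported above comes from the study's own data, not from these numbers):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between paired assay readings."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

# Invented paired glucoamylase activities: Methyl Orange assay vs. pNPG assay.
methyl_orange = [1.1, 1.9, 3.2, 3.9, 5.1]
pnpg          = [1.0, 2.1, 2.9, 4.2, 4.8]
print(round(pearson_r(methyl_orange, pnpg), 3))  # → 0.984
```

A coefficient near 1 across many strains is what justifies replacing the slower conventional assay with the high-throughput one for screening.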

  3. Analysis of the influencing factors of global energy interconnection development

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; He, Yongxiu; Ge, Sifan; Liu, Lin

    2018-04-01

    Against the background of building a global energy interconnection and achieving green, low-carbon development, this paper takes up the new round of energy restructuring and the trend of energy technology change. Based on the present situation of global energy interconnection development worldwide and in China, an index system of the factors influencing global energy interconnection development is established. Subjective and objective weights of the influencing factors are computed separately, by network-level analysis and by the entropy method, and the weights are then combined by additive integration, yielding comprehensive weights for the influencing factors and a ranking of their influence.
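The entropy-weighting step of such an analysis, and the additive integration of subjective and objective weights, can be sketched as follows; the decision matrix, the subjective weights, and the integration coefficient are invented for illustration.

```python
import numpy as np

def entropy_weights(X):
    """Objective weights from the entropy method.

    X is an (alternatives x criteria) matrix of strictly positive scores.
    Criteria whose scores vary more across alternatives get larger weights.
    """
    P = X / X.sum(axis=0)                 # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)  # entropy per criterion
    d = 1.0 - E                           # degree of diversification
    return d / d.sum()

# Invented scores: 4 alternatives evaluated on 3 influencing factors.
X = np.array([[7.0, 4.0, 9.0],
              [6.0, 5.0, 7.0],
              [8.0, 3.0, 8.0],
              [5.0, 6.0, 6.0]])

w_obj = entropy_weights(X)
w_subj = np.array([0.5, 0.3, 0.2])  # illustrative subjective weights (expert judgment)
alpha = 0.5                          # additive-integration coefficient
w = alpha * w_subj + (1 - alpha) * w_obj
print(w.round(3))  # combined weights, summing to 1
```

The comprehensive weights `w` can then be sorted to rank the influencing factors, as done in the paper.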

  4. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184

  5. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. 
Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.

  6. Quality Analysis of Chlorogenic Acid and Hyperoside in Crataegi fructus

    PubMed Central

    Weon, Jin Bae; Jung, Youn Sik; Ma, Choong Je

    2016-01-01

    Background: Crataegi fructus is a herbal medicine used for strengthening the stomach, sterilization, and alcohol detoxification. Chlorogenic acid and hyperoside are the major compounds in Crataegi fructus. Objective: In this study, we established a novel high-performance liquid chromatography (HPLC)-diode array detection analysis method for chlorogenic acid and hyperoside for the quality control of Crataegi fructus. Materials and Methods: HPLC analysis was achieved on a reverse-phase C18 column (5 μm, 4.6 mm × 250 mm) using water and acetonitrile as the mobile phase with a gradient system. The method was validated for linearity, precision, and accuracy. Thirty-one batches of Crataegi fructus samples collected from Korea and China were analyzed using the HPLC fingerprint of the developed method. The contents of chlorogenic acid and hyperoside were then compared for quality evaluation of Crataegi fructus. Results: The average contents (w/w %) of chlorogenic acid and hyperoside in Crataegi fructus collected from Korea were 0.0438% and 0.0416%, respectively, and those in samples collected from China were 0.0399% and 0.0325%, respectively. Conclusion: The established HPLC analysis method was stable and could provide efficient quality evaluation for monitoring of commercial Crataegi fructus. SUMMARY: A quantitative analysis method for chlorogenic acid and hyperoside in Crataegi fructus was developed by high-performance liquid chromatography (HPLC)-diode array detection. The established HPLC analysis method was validated for linearity, precision, and accuracy. The developed method was successfully applied to the quantitative analysis of Crataegi fructus samples collected from Korea and China. Abbreviations used: HPLC: High-performance liquid chromatography; GC: Gas chromatography; MS: Mass spectrometer; LOD: Limit of detection; LOQ: Limit of quantification; RSD: Relative standard deviation; RRT: Relative retention time; RPA: Relative peak area. PMID:27076744

  7. Quad-Tree Visual-Calculus Analysis of Satellite Coverage

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.; Hockney, George; Kwan, Bruce

    2003-01-01

    An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.

  8. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.

  9. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface of medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  10. DEVELOPMENT OF A NOVEL METHOD FOR ANALYSIS OF TRANSCRIPTIONAL CHANGES IN TRANSITIONAL EPITHELIUM FROM URINARY BLADDERS OF RATS EXPOSED TO DRINKING WATER DISINFECTION BY-PRODUCTS

    EPA Science Inventory


    Development of a Novel Method for Analysis of Transcriptional Changes in Transitional Epithelium from Urinary Bladders of Rats Exposed to Drinking Water Disinfection By- products.

    Epidemiologic studies in human populations that drink chemically disinfected drinking wa...

  11. Development of the High-Order Decoupled Direct Method in Three Dimensions for Particulate Matter: Enabling Advanced Sensitivity Analysis in Air Quality Models

    EPA Science Inventory

    The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity...

  12. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    PubMed Central

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859

  13. Situational Analysis for Complex Systems: Methodological Development in Public Health Research.

    PubMed

    Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie

    2016-01-01

    Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs but also the development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems, and as such, new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and to highlight the need for further methodological development.

  14. Development of a Predictive Corrosion Model Using Locality-Specific Corrosion Indices

    DTIC Science & Technology

    2017-09-12

    3.2.1 Statistical data analysis methods ... 3.2.2 Algorithm development method ... components, and method) were compiled into an executable program that uses mathematical models of materials degradation and statistical calculations.

  15. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
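Under the commonly used multiplicative model, the epistasis score in such an analysis is the deviation of the normalized double-perturbation phenotype from the product of the normalized single-perturbation phenotypes. A sketch with invented measurements (not data from this study):

```python
def epistasis_score(wt, single_x, single_y, double_xy):
    """Multiplicative-model epistasis: eps = W_xy - W_x * W_y,
    with each phenotype normalized to the wild-type value."""
    w_x = single_x / wt
    w_y = single_y / wt
    w_xy = double_xy / wt
    return w_xy - w_x * w_y

# Invented body-length-like measurements: wild type, two single
# perturbations, and the double perturbation.
eps = epistasis_score(wt=1.0, single_x=0.8, single_y=0.9, double_xy=0.5)
print(round(eps, 3))  # → -0.22  (negative: aggravating interaction)
```

A score near zero means the two genes act independently; significantly negative (aggravating) or positive (alleviating) scores flag a genetic interaction worth follow-up.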

  16. Development of Novel Noninvasive Methods of Stress Assessment in Baleen Whales

    DTIC Science & Technology

    2015-09-30

    large whales. Few methods exist for assessment of physiological stress levels of free-swimming cetaceans (Amaral 2010, ONR 2010, Hunt et al. 2013...adrenal hormone aldosterone. Our aim in this project is to further develop both techniques - respiratory hormone analysis and fecal hormone analysis...development of a noninvasive aldosterone assay (for both feces and blow) that can be used as an alternative measure of adrenal gland activation relative to

  17. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  18. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of the information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review aims to provide pharmaceutical industry professionals with an introduction to multivariate analysis and to how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used, principal component analysis and partial least squares regression, along with their advantages, common pitfalls, and the requirements for their effective use. This is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to the definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve the manufacture of existing commercial products.
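The principal component analysis step mentioned here projects mean-centered data onto a few orthogonal directions of maximal variance, conveniently computed from the singular value decomposition. A minimal sketch on invented process data (variable names and dimensions are illustrative):

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the mean-centered data matrix.

    Returns scores, loadings, and the explained-variance ratio of the
    retained components.
    """
    Xc = X - X.mean(axis=0)                    # mean-center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    loadings = Vt[:n_components]
    explained = (S ** 2) / (S ** 2).sum()
    return scores, loadings, explained[:n_components]

# Invented data: 20 samples x 5 correlated process variables driven by one
# latent factor plus small noise.
rng = np.random.default_rng(1)
t = rng.normal(size=(20, 1))
X = t @ rng.normal(size=(1, 5)) + 0.05 * rng.normal(size=(20, 5))

scores, loadings, explained = pca(X, 2)
print(scores.shape, loadings.shape)  # (20, 2) (2, 5)
print(explained)                     # the first component dominates
```

In a PAT setting, the scores plot is typically monitored for drifting batches, while the loadings reveal which process variables drive each component.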

  19. Analysis of drugs in human tissues by supercritical fluid extraction/immunoassay

    NASA Astrophysics Data System (ADS)

    Furton, Kenneth G.; Sabucedo, Alberta; Rein, Joseph; Hearn, W. L.

    1997-02-01

    A rapid, readily automated method has been developed for the quantitative analysis of phenobarbital from human liver tissues based on supercritical carbon dioxide extraction followed by fluorescence enzyme immunoassay. The method developed significantly reduces sample handling and utilizes the entire liver homogenate. The current method yields comparable recoveries and precision and does not require the use of an internal standard, although traditional GC/MS confirmation can still be performed on sample extracts. Additionally, the proposed method uses non-toxic, inexpensive carbon dioxide, thus eliminating the use of halogenated organic solvents.

  20. Commercial transport aircraft composite structures

    NASA Technical Reports Server (NTRS)

    Mccarty, J. E.

    1983-01-01

    The role that analysis plays in the development, production, and substantiation of aircraft structures is discussed, including the types, elements, and applications of analysis that are used and needed; the current application of analysis methods to commercial aircraft advanced composite structures, along with a projection of future needs; and some personal thoughts on analysis development goals and the elements of an approach to analysis development.

  1. Development of achiral and chiral 2D HPLC methods for analysis of albendazole metabolites in microsomal fractions using multivariate analysis for the in vitro metabolism.

    PubMed

    Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V

    2013-08-01

    In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for the direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied to the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. The development of two fully automated achiral-chiral and chiral-chiral high-performance liquid chromatography (HPLC) methods is described for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2), and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted-access-media bovine serum albumin column for the sample clean-up, followed by an achiral phenyl column (15.0 × 0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0 × 0.46 cm I.D.). The chiral 2D HPLC method was applied to the development of a compromise condition for the in vitro metabolism of ABZ by means of an experimental design involving multivariate analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. An Improved Manual Method for NOx Emission Measurement.

    ERIC Educational Resources Information Center

    Dee, L. A.; And Others

    The current manual NO(x) sampling and analysis method was evaluated. Improved time-integrated sampling and rapid analysis methods were developed. In the new method, the sample gas is drawn through a heated bed of uniquely active, crystalline PbO2, where NO(x) is quantitatively absorbed. Nitrate ion is later extracted with water and the…

  3. Quantitative Determination of Cannabinoids in Cannabis and Cannabis Products Using Ultra-High-Performance Supercritical Fluid Chromatography and Diode Array/Mass Spectrometric Detection.

    PubMed

    Wang, Mei; Wang, Yan-Hong; Avula, Bharathi; Radwan, Mohamed M; Wanas, Amira S; Mehmedic, Zlatko; van Antwerp, John; ElSohly, Mahmoud A; Khan, Ikhlas A

    2017-05-01

    Ultra-high-performance supercritical fluid chromatography (UHPSFC) is an efficient analytical technique that has not been fully employed for the analysis of cannabis. Here, a novel method was developed for the analysis of 30 cannabis plant extracts and preparations using UHPSFC/PDA-MS. Nine of the most abundant cannabinoids, viz. CBD, ∆8-THC, THCV, ∆9-THC, CBN, CBG, THCA-A, CBDA, and CBGA, were quantitatively determined (RSDs < 6.9%). Unlike GC methods, no derivatization or decarboxylation was required prior to UHPSFC analysis. The UHPSFC chromatographic separation of cannabinoids displayed an inverse elution order compared to UHPLC. Combined with PDA-MS, this orthogonality is valuable for the discrimination of cannabinoids in complex matrices. The developed method was validated, and the quantification results were compared with a standard UHPLC method. The RSDs of the two methods were within ±13.0%. Finally, chemometric analyses including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) were used to differentiate between cannabis samples. © 2016 American Academy of Forensic Sciences.

  4. The International College of Neuropsychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 1: Background and Methods of the Development of Guidelines.

    PubMed

    Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried

    2017-02-01

    This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.

  5. Heterogeneity of Metazoan Cells and Beyond: To Integrative Analysis of Cellular Populations at Single-Cell Level.

    PubMed

    Barteneva, Natasha S; Vorobjev, Ivan A

    2018-01-01

    In this paper, we review some of the recent advances in cellular heterogeneity and single-cell analysis methods. In modern research on cellular heterogeneity, there are four major approaches: analysis of pooled samples, single-cell analysis, high-throughput single-cell analysis, and, lately, integrated analysis of cellular populations at the single-cell level. Recently developed high-throughput single-cell genetic analysis methods such as RNA-Seq require a purification step and destruction of the analyzed cell, often providing only a snapshot of the investigated cell without spatiotemporal context. Correlative analysis of multiparameter morphological, functional, and molecular information is important for differentiating more uniform groups within the spectrum of different cell types. Simplified distributions (histograms and 2D plots) can underrepresent biologically significant subpopulations. Future directions may include the development of nondestructive methods for dissecting molecular events in intact cells, simultaneous correlative cellular analysis of phenotypic and molecular features by hybrid technologies such as imaging flow cytometry, and further progress in supervised and unsupervised statistical analysis algorithms.

  6. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open loop model analysis technique. This method considers the effects of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot in the loop analysis procedure that considers several closed loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.

  7. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  8. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
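
    One of the developments mentioned above, mediation analysis, is often operationalized as the product of two regression coefficients with a first-order (Sobel) standard error. A minimal sketch under invented, illustrative path coefficients (a, b, and their standard errors are assumptions, not values from the paper):

    ```python
    import math

    def sobel(a, se_a, b, se_b):
        """Indirect (mediated) effect a*b and its first-order Sobel standard error."""
        indirect = a * b
        se = math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
        z = indirect / se  # approximate z-statistic for the indirect effect
        return indirect, se, z

    # Illustrative paths: a = program -> mediator, b = mediator -> outcome
    indirect, se, z = sobel(a=0.40, se_a=0.10, b=0.50, se_b=0.12)
    ```

    In practice, confidence intervals for a*b are often obtained by bootstrapping instead, since the product is not normally distributed in small samples.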

  9. Emergy analysis of an industrial park: the case of Dalian, China.

    PubMed

    Geng, Yong; Zhang, Pan; Ulgiati, Sergio; Sarkis, Joseph

    2010-10-15

    With the rapid development of eco-industrial park projects in China, evaluating their overall eco-efficiency is becoming both an important need and a considerable academic challenge. Developing ecologically conscious industrial park management requires analysis of both industrial and ecological systems. Traditional evaluation methods based on neoclassical economics and on embodied energy and exergy analyses have certain limitations because they treat environmental issues as secondary to the maximization of economic and technical objectives. Such methods focus primarily on the environmental impact of emissions and their economic consequences. These approaches ignore the contribution of ecological products and services, as well as the load placed on environmental systems and the related carrying-capacity problems of economic and industrial development. This paper presents a new method based upon emergy analysis and synthesis. Such a method links economic and ecological systems together, highlighting the internal relations among the different subsystems and components. The emergy-based method provides insight into the environmental performance and sustainability of an industrial park. This paper describes the methodology of emergy analysis at the industrial park level and provides a series of emergy-based indices. A case study is investigated and discussed in order to show the emergy method's practical potential. Results from the DEDZ (Dalian Economic Development Zone) case show the potential of the emergy synthesis method at the industrial park level for environmental policy making. Its advantages and limitations are also discussed, and avenues for future research are identified. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. A New Method for Non-destructive Measurement of Biomass, Growth Rates, Vertical Biomass Distribution and Dry Matter Content Based on Digital Image Analysis

    PubMed Central

    Tackenberg, Oliver

    2007-01-01

    Background and Aims Biomass is an important trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive. Thus, they do not allow the development of individual plants to be followed, and they require many individuals to be cultivated for repeated measurements. Non-destructive methods do not have these limitations. Here, a non-destructive method based on digital image analysis is presented, addressing not only above-ground fresh biomass (FBM) and oven-dried biomass (DBM), but also vertical biomass distribution as well as dry matter content (DMC) and growth rates. Methods Scaled digital images of the plants' silhouettes were taken for 582 individuals of 27 grass species (Poaceae). Above-ground biomass and DMC were measured using destructive methods. With the image analysis software Zeiss KS 300, the projected area and the proportion of greenish pixels were calculated, and generalized linear models (GLMs) were developed with destructively measured parameters as dependent variables and parameters derived from image analysis as independent variables. A bootstrap analysis was performed to assess the number of individuals required for re-calibration of the models. Key Results The results of the developed models showed no systematic errors compared with traditionally measured values and explained most of their variance (R2 ≥ 0.85 for all models). The presented models can be directly applied to herbaceous grasses without further calibration. Applying the models to other growth forms might require a re-calibration, which can be based on only 10–20 individuals for FBM or DMC and on 40–50 individuals for DBM. Conclusions The methods presented are time and cost effective compared with traditional methods, especially if development or growth rates are to be measured repeatedly.
Hence, they offer an alternative way of determining biomass, especially as they are non-destructive and address not only FBM and DBM, but also vertical biomass distribution and DMC. PMID:17353204
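
    The calibration idea described above (predicting biomass from image-derived predictors, with a bootstrap to gauge how stable a re-calibration would be) can be sketched as follows. This is a simplified stand-in: an ordinary least-squares line instead of the paper's GLMs, and invented area/biomass values.

    ```python
    import random

    def fit_line(xs, ys):
        """Ordinary least-squares fit y = intercept + slope * x; returns (intercept, slope)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        slope = sxy / sxx
        return my - slope * mx, slope

    # Invented calibration data: projected silhouette area (cm^2) vs. fresh biomass (g)
    areas = [12.0, 25.0, 40.0, 55.0, 70.0, 90.0, 110.0, 130.0]
    biomass = [2.1, 4.0, 6.3, 8.2, 10.9, 13.5, 16.8, 19.4]

    # Bootstrap: resample individuals with replacement and refit the calibration line
    random.seed(42)
    slopes = []
    for _ in range(500):
        idx = [random.randrange(len(areas)) for _ in range(len(areas))]
        bx = [areas[i] for i in idx]
        by = [biomass[i] for i in idx]
        if len(set(bx)) > 1:  # skip degenerate resamples with no x-spread
            slopes.append(fit_line(bx, by)[1])
    mean_slope = sum(slopes) / len(slopes)
    ```

    The spread of the bootstrapped slopes gives a rough sense of how many individuals a re-calibration needs before the fitted relationship stabilizes.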

  11. Quantitative analysis of unconjugated and total bisphenol A in human urine using solid-phase extraction and UPLC-MS/MS: method implementation, method qualification and troubleshooting.

    PubMed

    Buscher, Brigitte; van de Lagemaat, Dick; Gries, Wolfgang; Beyer, Dieter; Markham, Dan A; Budinsky, Robert A; Dimond, Stephen S; Nath, Rajesh V; Snyder, Stephanie A; Hentges, Steven G

    2015-11-15

    The aim of the presented investigation was to document challenges encountered during implementation and qualification of a method for bisphenol A (BPA) analysis and to develop and discuss precautions taken to avoid and to monitor contamination with BPA during sample handling and analysis. Previously developed and published HPLC-MS/MS methods for the determination of unconjugated BPA (Markham et al., Journal of Analytical Toxicology, 34 (2010) 293-303) [17] and total BPA (Markham et al., Journal of Analytical Toxicology, 38 (2014) 194-203) [20] in human urine were combined and transferred into another laboratory. The initial method for unconjugated BPA was developed and evaluated in two independent laboratories simultaneously. The second method for total BPA was developed and evaluated in one of these laboratories to conserve resources. Accurate analysis of BPA at sub-ppb levels is a challenging task as BPA is a widely used material and is ubiquitous in the environment at trace concentrations. Propensity for contamination of biological samples with BPA is reported in the literature during sample collection, storage, and/or analysis. Contamination by trace levels of BPA is so pervasive that even with extraordinary care, it is difficult to completely exclude the introduction of BPA into biological samples and, consequently, contamination might have an impact on BPA biomonitoring data. The applied UPLC-MS/MS method was calibrated from 0.05 to 25 ng/ml. The limit of quantification was 0.1 ng/ml for unconjugated BPA and 0.2 ng/ml for total BPA, respectively, in human urine. Finally, the method was applied to urine samples derived from 20 volunteers. Overall, BPA can be analyzed in human urine with acceptable recovery and repeatability if sufficient measures are taken to avoid contamination throughout the procedure from sample collection until UPLC-MS/MS analysis. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  13. A relational metric, its application to domain analysis, and an example analysis and model of a remote sensing domain

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1995-01-01

    An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, and other text documents whose original source is people who are knowledgeable about, and participate in, the domain in question. To test the method, it is applied here to a report describing a remote sensing project within the scope of the Earth Observing System (EOS). The method has the potential to improve the designs of domain-related computer systems and software by quickly providing developers with explicit and objective models of the domain in a form which is useful for design. Results of the analysis include a network model of the domain and an object-oriented relational analysis report which describes the nodes and relationships in the network model. Other products include a database of relationships in the domain and an interactive concordance. The analysis method utilizes a newly developed relational metric, a proximity-weighted frequency of co-occurrence. The metric is applied to relations between the most frequently occurring terms (words or multiword entities) in the domain text and the terms found within the contexts of these terms. Contextual scope is selectable. Because of the discriminating power of the metric, data reduction from the association matrix to the network is simple. In addition to their value for design, the models produced by the method are also useful for understanding the domains themselves. They can, for example, be interpreted as models of presence in the domain.
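
    The relational metric described above is a proximity-weighted frequency of co-occurrence with a selectable contextual scope. The exact weighting is not specified in this summary, so the sketch below assumes one plausible form: each co-occurrence of two terms within a context window contributes the inverse of their token distance.

    ```python
    def proximity_weighted_cooc(tokens, term_a, term_b, window=5):
        """Sum 1/d over all occurrence pairs of term_a and term_b within `window` tokens."""
        pos_a = [i for i, t in enumerate(tokens) if t == term_a]
        pos_b = [i for i, t in enumerate(tokens) if t == term_b]
        score = 0.0
        for i in pos_a:
            for j in pos_b:
                d = abs(i - j)
                if 0 < d <= window:
                    score += 1.0 / d  # closer pairs contribute more
        return score

    # Toy domain text (invented)
    tokens = "the sensor returns image data when the sensor scans".split()
    score = proximity_weighted_cooc(tokens, "sensor", "data", window=5)
    ```

    Computing this score for every frequent term pair yields an association matrix from which the network model can be thresholded.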

  14. Development of high performance liquid chromatography method for miconazole analysis in powder sample

    NASA Astrophysics Data System (ADS)

    Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.

    2017-02-01

    A simple high performance liquid chromatography (HPLC) method has been developed in this study for the analysis of miconazole, an antifungal drug, in powder sample. The optimized HPLC system using C8 column was achieved using mobile phase composition containing methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range from 10 to 50 mg/L with r 2 of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) obtained were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable for the determination of miconazole in the powder sample with a recovery of 101.28 % (RSD = 0.96%, n = 3). The developed HPLC method provides short analysis time, high reproducibility and high sensitivity.
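
    The figures reported above (linear calibration, LOD, and LOQ) follow standard chromatographic practice. As a hedged sketch, the following Python code fits a least-squares calibration line and estimates LOD and LOQ from the residual scatter using the common ICH-style 3.3σ/S and 10σ/S rules; the peak-area data are invented, not the paper's.

    ```python
    import math

    def calibrate(concs, signals):
        """Least-squares calibration line plus LOD/LOQ from residual scatter.

        LOD = 3.3 * s_res / slope and LOQ = 10 * s_res / slope, where s_res is
        the residual standard deviation of the fit (n - 2 degrees of freedom).
        """
        n = len(concs)
        mx, my = sum(concs) / n, sum(signals) / n
        sxx = sum((x - mx) ** 2 for x in concs)
        slope = sum((x - mx) * (y - my) for x, y in zip(concs, signals)) / sxx
        intercept = my - slope * mx
        residuals = [y - (intercept + slope * x) for x, y in zip(concs, signals)]
        s_res = math.sqrt(sum(r * r for r in residuals) / (n - 2))
        return slope, intercept, 3.3 * s_res / slope, 10 * s_res / slope

    # Invented peak-area data over the 10-50 mg/L range reported above
    concs = [10.0, 20.0, 30.0, 40.0, 50.0]
    signals = [21.5, 41.0, 62.0, 81.5, 101.0]
    slope, intercept, lod, loq = calibrate(concs, signals)
    ```

    The residual-based estimates here are one of several accepted ways to derive LOD/LOQ; blank-based signal-to-noise estimates are an alternative.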

  15. Methods for High-Order Multi-Scale and Stochastic Problems Analysis, Algorithms, and Applications

    DTIC Science & Technology

    2016-10-17

    finite volume schemes, discontinuous Galerkin finite element method, and related methods, for solving computational fluid dynamics (CFD) problems and...approximation for finite element methods. (3) The development of methods of simulation and analysis for the study of large scale stochastic systems of...laws, finite element method, Bernstein-Bezier finite elements, weakly interacting particle systems, accelerated Monte Carlo, stochastic networks

  16. Structured Case Analysis: Developing Critical Thinking Skills in a Marketing Case Course

    ERIC Educational Resources Information Center

    Klebba, Joanne M.; Hamilton, Janet G.

    2007-01-01

    Structured case analysis is a hybrid pedagogy that flexibly combines diverse instructional methods with comprehensive case analysis as a mechanism to develop critical thinking skills. An incremental learning framework is proposed that allows instructors to develop and monitor content-specific theory and the corresponding critical thinking skills.…

  17. Developments in Cylindrical Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Starnes, James H., Jr.

    1998-01-01

    Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.

  18. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  19. Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1996-01-01

    In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.

  20. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

    The novel probabilistic analysis method presented for assessing structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.

  1. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    NASA Technical Reports Server (NTRS)

    Graham, J.

    1993-01-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.

  2. Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Plutonium-238 and plutonium-239 in water and air filters Method Selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • Rapid methods for acid or fusion digestion • Rapid Radiochemical Method for Plutonium-238 and Plutonium 239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.

  3. History and Development of the Schmidt-Hunter Meta-Analysis Methods

    ERIC Educational Resources Information Center

    Schmidt, Frank L.

    2015-01-01

    In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…

  4. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    ERIC Educational Resources Information Center

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate science and technology pre-service teachers' opinions about the methods that develop reflective thinking and determine their level of reflective thinking. This study is a descriptive study. Open-ended questions were used to determine the views of pre-service teachers. Questions used in the statistical analysis of…

  5. Analysis of health trait data from on-farm computer systems in the U.S. II: Comparison of genomic analyses including two-stage and single-step methods

    USDA-ARS?s Scientific Manuscript database

    The development of genomic selection methodology, with accompanying substantial gains in reliability for low-heritability traits, may dramatically improve the feasibility of genetic improvement of dairy cow health. Many methods for genomic analysis have now been developed, including the “Bayesian Al...

  6. District nursing workforce planning: a review of the methods.

    PubMed

    Reid, Bernie; Kane, Kay; Curran, Carol

    2008-11-01

    District nursing services in Northern Ireland face increasing demands and challenges which may be addressed by effective and efficient workforce planning and development. The aim of this paper is to critically analyse district nursing workforce planning and development methods, in an attempt to find a suitable method for Northern Ireland. A systematic analysis of the literature reveals four methods: professional judgement; population-based health needs; caseload analysis and dependency-acuity. Each method has strengths and weaknesses. Professional judgement offers a 'belt and braces' approach but lacks sensitivity to fluctuating patient numbers. Population-based health needs methods develop staffing algorithms that reflect deprivation and geographical spread, but are poorly understood by district nurses. Caseload analysis promotes equitable workloads but poorly performing district nursing localities may continue if benchmarking processes only consider local data. Dependency-acuity methods provide a means of equalizing and prioritizing workload but are prone to district nurses overstating factors in patient dependency or understating carers' capability. In summary, a mixed-method approach is advocated to evaluate and adjust the size and mix of district nursing teams, using empirically determined patient dependency and activity-based variables based on the population's health needs.

  7. Integrated analysis and design of thick composite structures for optimal passive damping characteristics

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.

    1993-01-01

    The development of novel composite mechanics for the analysis of damping in composite laminates and structures and the more significant results of this effort are summarized. Laminate mechanics based on piecewise continuous in-plane displacement fields are described that can represent both intralaminar stresses and interlaminar shear stresses and the associated effects on the stiffness and damping characteristics of a composite laminate. Among other features, the mechanics can accurately model the static and damped dynamic response of either thin or thick composite laminates, as well as specialty laminates with embedded compliant damping layers. The discrete laminate damping theory is further incorporated into structural analysis methods. In this context, an exact semi-analytical method for the simulation of the damped dynamic response of composite plates was developed. A finite element based method and a specialty four-node plate element were also developed for the analysis of composite structures of variable shape and boundary conditions. Numerous evaluations and applications demonstrate the quality and superiority of the mechanics in predicting the damped dynamic characteristics of composite structures. Finally, additional effort focused on the development of optimal tailoring methods for the design of thick composite structures based on the developed analytical capability. Applications on composite plates illustrated the influence of composite mechanics in the optimal design of composites and the potential for significant deviations in the resultant designs when more simplified (classical) laminate theories are used.

  8. Analyzing the Structure and Content of Public Health Messages

    PubMed Central

    Morrison, Frances P.; Kukafka, Rita; Johnson, Stephen B.

    2005-01-01

    Background Health messages are crucial to the field of public health in effecting behavior change, but little research is available to assist writers in composing the overall structure of a message. In order to develop software to assist individuals in constructing effective messages, the structure of existing health messages must be understood, and an appropriate method for analyzing health message structure developed. Methods 72 messages from expert sources were used for development of the method, which was then tested for reproducibility using ten randomly selected health messages. Four raters analyzed the messages and inter-coder agreement was calculated. Results A method for analyzing the structure of the messages was developed using sublanguage analysis and discourse analysis. Overall kappa between four coders was 0.69. Conclusion A novel framework for characterizing health message structure and a method for analyzing messages appears to be reproducible and potentially useful for creating an authoring tool. PMID:16779098
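
    Inter-coder agreement of the kind reported above is commonly quantified with Cohen's kappa (defined for a pair of raters; the paper's overall figure pools four raters). A minimal sketch with invented message codes:

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters assigning categorical codes to the same items."""
        n = len(r1)
        observed = sum(a == b for a, b in zip(r1, r2)) / n
        c1, c2 = Counter(r1), Counter(r2)
        # Chance agreement: product of each rater's marginal category proportions
        expected = sum(c1[k] * c2.get(k, 0) for k in c1) / (n * n)
        return (observed - expected) / (1 - expected)

    # Invented codes from two raters for eight message segments
    rater1 = ["claim", "advice", "claim", "claim", "advice", "fact", "fact", "claim"]
    rater2 = ["claim", "advice", "claim", "fact", "advice", "fact", "claim", "claim"]
    kappa = cohens_kappa(rater1, rater2)
    ```

    Kappa corrects raw percent agreement for agreement expected by chance, which is why it is preferred over simple overlap for coding studies.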

  9. Industrial application of green chromatography - II. Separation and analysis of preservatives in skincare products using subcritical water chromatography.

    PubMed

    Yang, Y; Kapalavavi, B; Gujjar, L; Hadrous, S; Marple, R; Gamsky, C

    2012-10-01

    Several high-temperature liquid chromatography (HTLC) and subcritical water chromatography (SBWC) methods have been successfully developed in this study for separation and analysis of preservatives contained in Olay skincare creams. Efficient separation and quantitative analysis of preservatives have been achieved on four commercially available ZirChrom and Waters XBridge columns at temperatures ranging from 100 to 200°C. The quantification results obtained by both HTLC and SBWC methods developed for preservatives analysis are accurate and reproducible. A large number of replicate HTLC and SBWC runs also indicated no significant system build-up or interference for skincare cream analysis. Compared with traditional HPLC separation carried out at ambient temperature, the HTLC methods can save up to 90% of the methanol required in the HPLC mobile phase. However, the SBWC methods developed in this project completely eliminated the use of toxic organic solvents required in the HPLC mobile phase, thus saving a significant amount of money and making the environment greener. Although both homemade and commercial systems can accomplish SBWC separations, the SBWC methods using the commercial system for preservative analysis are recommended for industrial applications because they can be directly applied in industrial plant settings. © 2012 The Authors ICS © 2012 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  10. ELEMENTAL COMPOSITION OF FRESHLY NUCLEATED PARTICLES

    EPA Science Inventory

    The main objective of this work is to develop a method for real-time sampling and analysis of individual airborne nanoparticles in the 5 - 20 nm diameter range. The size range covered by this method is much smaller than existing single particle methods for chemical analysis. S...

  11. FUZZY DECISION ANALYSIS FOR INTEGRATED ENVIRONMENTAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION

    EPA Science Inventory


    A fuzzy decision analysis method for integrating ecological indicators is developed. It is a combination of a fuzzy ranking method and the Analytic Hierarchy Process (AHP). The method is capable of ranking ecosystems in terms of environmental conditions and suggesting cumula...

  12. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of papaya genome sequences. Transgenic construct- and event-specific sequences were identified in a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled the identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.
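
    Screening sequencing reads for a construct-specific junction, as described above, can be sketched as a simple substring search on both strands. The junction sequence and reads below are invented toy data, not the actual event-specific sequence from the paper.

    ```python
    def reverse_complement(seq):
        """Reverse complement of a DNA sequence (A<->T, C<->G)."""
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def reads_hitting_junction(reads, junction):
        """Indices of reads containing the junction sequence on either strand."""
        rc = reverse_complement(junction)
        return [i for i, r in enumerate(reads) if junction in r or rc in r]

    # Invented toy reads and an invented construct/genome junction sequence
    junction = "GGTACCGAGCTCGGATCC"
    reads = [
        "TTTT" + junction + "AAAA",                       # forward-strand hit
        "CCCC" + reverse_complement(junction) + "GGGG",   # reverse-strand hit
        "ACGTACGTACGTACGTACGTACGT",                       # no hit
    ]
    hits = reads_hitting_junction(reads, junction)
    ```

    Real pipelines would use a read mapper with mismatch tolerance rather than exact matching, but the strand-aware search illustrates the detection principle.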

  13. Rapid Method for Sodium Hydroxide Fusion of Concrete and ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete and brick samples Method Selected for: SAM lists this method for qualitative analysis of americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete or brick building materials. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.

  14. Diffraction as a Method of Critical Policy Analysis

    ERIC Educational Resources Information Center

    Ulmer, Jasmine B.

    2016-01-01

    Recent developments in critical policy analysis have occurred alongside the new materialisms in qualitative research. These lines of scholarship have unfolded along two separate, but related, tracks. In particular, the new materialist method of "diffraction" aligns with many elements of critical policy analysis. Both involve critical…

  15. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    NASA Technical Reports Server (NTRS)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft is established. An assessment is made concerning more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. Capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method is demonstrated by comparison of computed results to experimental data.

  16. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.

  17. Fourier analysis and signal processing by use of the Moebius inversion formula

    NASA Technical Reports Server (NTRS)

    Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.

    1990-01-01

    A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
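
    The number-theoretic identity underlying the approach above is Moebius inversion of a divisor sum: if g(n) = Σ_{d|n} f(d), then f(n) = Σ_{d|n} μ(d)·g(n/d). The sketch below illustrates only this identity, not the authors' full Fourier algorithm:

```python
def mobius(n):
    """Moebius function mu(n): 0 if n has a squared prime factor,
    otherwise (-1)^(number of distinct prime factors)."""
    if n == 1:
        return 1
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0            # squared prime factor
            result = -result
        p += 1
    if n > 1:                       # leftover prime factor
        result = -result
    return result

def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

# Forward divisor sum: g(n) = sum of f(d) over divisors d of n.
f = {n: n * n for n in range(1, 13)}            # arbitrary test function
g = {n: sum(f[d] for d in divisors(n)) for n in f}

# Moebius inversion recovers f exactly from g.
f_rec = {n: sum(mobius(d) * g[n // d] for d in divisors(n)) for n in f}
print(f_rec == f)  # True
```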

  18. Aeroelastic Stability and Response of Rotating Structures

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Reddy, Tondapu

    2004-01-01

    A summary of the work performed under the NASA grant is presented. More details can be found in the cited references. This grant led to the development of relatively faster aeroelastic analysis methods for predicting flutter and forced response in fans, compressors, and turbines using computational fluid dynamic (CFD) methods. These methods are based on linearized two- and three-dimensional, unsteady, nonlinear aerodynamic equations. During the period of the grant, aeroelastic analysis that includes the effects of uncertainties in the design variables has also been developed.

  19. Analysis and characterization of heparin impurities.

    PubMed

    Beni, Szabolcs; Limtiaco, John F K; Larive, Cynthia K

    2011-01-01

    This review discusses recent developments in analytical methods available for the sensitive separation, detection and structural characterization of heparin contaminants. The adulteration of raw heparin with oversulfated chondroitin sulfate (OSCS) in 2007-2008 spawned a global crisis resulting in extensive revisions to the pharmacopeia monographs on heparin and prompting the FDA to recommend the development of additional physicochemical methods for the analysis of heparin purity. The analytical chemistry community quickly responded to this challenge, developing a wide variety of innovative approaches, several of which are reported in this special issue. This review provides an overview of methods of heparin isolation and digestion, discusses known heparin contaminants, including OSCS, and summarizes recent publications on heparin impurity analysis using sensors, near-IR, Raman, and NMR spectroscopy, as well as electrophoretic and chromatographic separations.

  20. Development of Measurement Methods for Detection of Special Nuclear Materials using D-D Pulsed Neutron Source

    NASA Astrophysics Data System (ADS)

    Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki

    2015-10-01

    For detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source and a neutron detection system. In the detection scheme, we have adopted two new measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method has been experimentally studied in the Kyoto University Critical Assembly (KUCA), and the method was applied to a cargo container inspection system by simulation.

  1. Implementation of the Veder contact method in daily nursing home care for people with dementia: a process analysis according to the RE-AIM framework.

    PubMed

    Boersma, Petra; van Weert, Julia C M; van Meijel, Berno; Dröes, Rose-Marie

    2017-02-01

    To perform a process analysis of the implementation of the Veder contact method for gaining insight into factors that influence successful implementation. Research showed that the original Veder method, which is a 'living-room theatre performance' provided by actors, positively influenced mood and quality of life of people with dementia. Training caregivers to execute such 'performances' and accomplish the same effects as actors proved difficult. However, key elements of the method were considered suitable for application in daily care, resulting in the development of a modified version of the method, named the Veder contact method. The Veder contact method combines elements from existing psychosocial interventions, e.g. reminiscence, validation and neuro-linguistic programming, with theatrical, poetic and musical communication, and applies them in daily care. For this process analysis a multiple case study design was used with the nursing home ward (n = 6) as the unit of analysis. Eight focus groups with caregivers (n = 42) and 12 interviews with stakeholders were held. Using the Reach, Effectiveness, Adoption, Implementation, Maintenance framework, a thematic analysis was conducted. The reach of the intervention (43-86%) and aspects of implementation-effectiveness (e.g. increased experienced reciprocity in contact with residents) facilitated implementation. For adoption and implementation, both facilitators (e.g. development of competences, feasibility of the Veder contact method without requiring extra time investment) and barriers (e.g. insufficient support of management, resistance of caregivers against the Veder contact method, organisational problems) were identified. Little effort was put into maintenance: only one nursing home developed a long-term implementation strategy. The Veder contact method can be applied in daily care without additional time investment. Although adopted by many caregivers, some were reluctant to use the Veder contact method. Organisational factors (e.g. staffing and management changes, budget cuts) impeded long-term implementation. The findings from this study can be used for the development of successful implementation strategies for the Veder contact method and other person-centred care methods. © 2016 John Wiley & Sons Ltd.

  2. Evaluation of prognostic models developed using standardised image features from different PET automated segmentation methods.

    PubMed

    Parkinson, Craig; Foley, Kieran; Whybra, Philip; Hills, Robert; Roberts, Ashley; Marshall, Chris; Staffurth, John; Spezi, Emiliano

    2018-04-11

    Prognosis in oesophageal cancer (OC) is poor. The 5-year overall survival (OS) rate is approximately 15%. Personalised medicine is hoped to increase the 5- and 10-year OS rates. Quantitative analysis of PET is gaining substantial interest in prognostic research but requires the accurate definition of the metabolic tumour volume. This study compares prognostic models developed in the same patient cohort using individual PET segmentation algorithms and assesses the impact on patient risk stratification. Consecutive patients (n = 427) with biopsy-proven OC were included in final analysis. All patients were staged with PET/CT between September 2010 and July 2016. Nine automatic PET segmentation methods were studied. All tumour contours were subjectively analysed for accuracy, and segmentation methods with < 90% accuracy were excluded. Standardised image features were calculated, and a series of prognostic models were developed using identical clinical data. The proportion of patients changing risk classification group were calculated. Out of nine PET segmentation methods studied, clustering means (KM2), general clustering means (GCM3), adaptive thresholding (AT) and watershed thresholding (WT) methods were included for analysis. Known clinical prognostic factors (age, treatment and staging) were significant in all of the developed prognostic models. AT and KM2 segmentation methods developed identical prognostic models. Patient risk stratification was dependent on the segmentation method used to develop the prognostic model with up to 73 patients (17.1%) changing risk stratification group. Prognostic models incorporating quantitative image features are dependent on the method used to delineate the primary tumour. This has a subsequent effect on risk stratification, with patients changing groups depending on the image segmentation method used.

  3. Ammonia Analysis by Gas Chromatograph/Infrared Detector (GC/IRD)

    NASA Technical Reports Server (NTRS)

    Scott, Joseph P.; Whitfield, Steve W.

    2003-01-01

    Methods are being developed at Marshall Space Flight Center's Toxicity Lab on a GC/IRD system that will be used to detect ammonia at low part per million (ppm) levels. These methods will allow analysis of gas samples by syringe injections. The GC is equipped with a unique cryogenic-cooled inlet system that will enable our lab to make large injections of a gas sample. Although the initial focus of the work will be analysis of ammonia, this instrument could identify other compounds on a molecular level. If proper methods can be developed, the IRD could work as a powerful addition to our offgassing capabilities.

  4. Computer-assisted techniques to evaluate fringe patterns

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-01-01

    Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.

  5. An overview of computational simulation methods for composite structures failure and life analysis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1993-01-01

    Three parallel computational simulation methods are being developed at the LeRC Structural Mechanics Branch (SMB) for composite structures failure and life analysis: progressive fracture CODSTRAN; hierarchical methods for high-temperature composites; and probabilistic evaluation. Results to date demonstrate that these methods are effective in simulating composite structures failure/life/reliability.

  6. Evaluation of Saltzman and phenoldisulfonic acid methods for determining NOx in engine exhaust gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, R.H.; Calabro, D.S.

    1969-11-01

    The two methods normally used for the analysis of NOx are the Saltzman and the phenoldisulfonic acid technique. This paper describes an evaluation of these wet chemical methods to determine their practical application to engine exhaust gas analysis. Parameters considered for the Saltzman method included bubbler collection efficiency, NO to NO2 conversion efficiency, the masking effect of other contaminants usually present in exhaust gases, and the time-temperature effect of these contaminants on stored developed solutions. Collection efficiency and the effects of contaminants were also considered for the phenoldisulfonic acid method. Test results indicated satisfactory collection and conversion efficiencies for the Saltzman method, but contaminants seriously affected the measurement accuracy, particularly if the developed solution was stored for a number of hours at room temperature before analysis. Storage at 32°F minimized this effect. The standard procedure for the phenoldisulfonic acid method gave good results, but the process was found to be too time consuming for routine analysis and measured only total NOx. 3 references, 9 tables.

  7. Face Gear Drive With Helical Involute Pinion: Geometry, Generation by a Shaper and a Worm, Avoidance of Singularities and Stress Analysis

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Fuentes, Alfonso; Gonzalez-Perez, Ignacio; Piscopo, Alessandro; Ruzziconi, Paolo

    2005-01-01

    A new type of face-gear drive with intersected axes of rotation formed by a helical involute pinion and conjugated face-gear has been investigated. Generation of face-gears by a shaper free of undercutting and pointing has been investigated. A new method of grinding or cutting of face-gears by a worm of special shape has been developed. A computerized design procedure has been developed to avoid undercutting and pointing by a shaper or by a generating worm. Also, a method to determine the limitations of the helix angle magnitude has been developed. The method provides a localization of the bearing contact to reduce the shift of bearing contact caused by misalignment. The analytical method provides a simulation of the meshing and contact of misaligned gear drives. An automatic mesh generation method has been developed and used to conduct a 3D contact stress analysis of several teeth. The theory developed is illustrated with several examples.

  8. Flight Test Results of a GPS-Based Pitot-Static Calibration Method Using Output-Error Optimization for a Light Twin-Engine Airplane

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Kiszely, Paul; Foster, John V.

    2011-01-01

    As part of the NASA Aviation Safety Program (AvSP), a novel pitot-static calibration method was developed to allow rapid in-flight calibration for subscale aircraft while flying within confined test areas. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. This method has been demonstrated in subscale flight tests and has shown small 2-sigma error bounds with significant reduction in test time compared to other methods. The current research was motivated by the desire to further evaluate and develop this method for full-scale aircraft. A goal of this research was to develop an accurate calibration method that enables reductions in test equipment and flight time, thus reducing costs. The approach involved analysis of data acquisition requirements, development of efficient flight patterns, and analysis of pressure error models based on system identification methods. Flight tests were conducted at The University of Tennessee Space Institute (UTSI) utilizing an instrumented Piper Navajo research aircraft. In addition, the UTSI engineering flight simulator was used to investigate test maneuver requirements and handling qualities issues associated with this technique. This paper provides a summary of piloted simulation and flight test results that illustrates the performance and capabilities of the NASA calibration method. Discussion of maneuver requirements and data analysis methods is included as well as recommendations for piloting technique.

  9. A New View of Earthquake Ground Motion Data: The Hilbert Spectral Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Norden; Busalacchi, Antonio J. (Technical Monitor)

    2000-01-01

    A brief description of the newly developed Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) method will be given. The decomposition is adaptive and can be applied to both nonlinear and nonstationary data. An example of the method applied to a sample earthquake record will be given. The results indicate that low frequency components, totally missed by Fourier analysis, are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show the new method offers much better temporal and frequency resolution.
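
    The Hilbert step of HSA extracts instantaneous amplitude and frequency from the analytic signal of each decomposed component. A minimal FFT-based sketch of that step on a synthetic tone (the EMD sifting itself is omitted):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (discrete Hilbert transform)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)                 # spectral weights: keep DC, double positives
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# A 5 Hz test tone sampled at 200 Hz for 2 s (an exact number of periods).
fs, f0 = 200.0, 5.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.cos(2 * np.pi * f0 * t)

z = analytic_signal(x)
amplitude = np.abs(z)                              # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)      # instantaneous frequency, Hz
print(np.round(np.median(inst_freq), 2))  # 5.0, the tone frequency
```

    For a nonstationary record, the same calculation applied to each EMD mode yields the time-frequency-amplitude distribution that the abstract contrasts with Fourier analysis.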

  10. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
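
    Step (vi), item analysis, typically computes an item's difficulty as the proportion of correct responses and its discrimination as the item-rest correlation. A minimal sketch on a hypothetical simulated response matrix (not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 0/1 response matrix: 200 respondents x 10 knowledge items,
# with ability-driven responses so the items actually discriminate.
ability = rng.normal(size=(200, 1))
item_hardness = np.linspace(-1.5, 1.5, 10)          # easy -> hard
responses = (ability + rng.normal(size=(200, 10)) > item_hardness).astype(int)

# Item difficulty index: proportion answering correctly (higher = easier item).
difficulty = responses.mean(axis=0)

# Item discrimination: correlation of each item with the rest-score
# (total score excluding that item, to avoid self-correlation inflation).
rest = responses.sum(axis=1, keepdims=True) - responses
discrimination = np.array([
    np.corrcoef(responses[:, j], rest[:, j])[0, 1] for j in range(10)
])
print(difficulty.round(2))
print(discrimination.round(2))
```

    Items with extreme difficulty or near-zero discrimination would be candidates for removal during scale purification.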

  11. A gradient method for the quantitative analysis of cell movement and tissue flow and its application to the analysis of multicellular Dictyostelium development.

    PubMed

    Siegert, F; Weijer, C J; Nomura, A; Miike, H

    1994-01-01

    We describe the application of a novel image processing method, which allows quantitative analysis of cell and tissue movement in a series of digitized video images. The result is a vector velocity field showing average direction and velocity of movement for every pixel in the frame. We apply this method to the analysis of cell movement during different stages of the Dictyostelium developmental cycle. We analysed time-lapse video recordings of cell movement in single cells, mounds and slugs. The program can correctly assess the speed and direction of movement of either unlabelled or labelled cells in a time series of video images depending on the illumination conditions. Our analysis of cell movement during multicellular development shows that the entire morphogenesis of Dictyostelium is characterized by rotational cell movement. The analysis of cell and tissue movement by the velocity field method should be applicable to the analysis of morphogenetic processes in other systems such as gastrulation and neurulation in vertebrate embryos.
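
    Gradient-based velocity estimation of the kind described rests on the brightness-constancy equation Ix·u + Iy·v = -It, solved by least squares over a local window. The sketch below applies that idea to a synthetic translating blob; it is a Lucas-Kanade-style simplification for illustration, not the authors' program:

```python
import numpy as np

def gradient_flow(frame1, frame2, win=7):
    """Least-squares velocity (u, v) at the image centre from the
    brightness-constancy equation Ix*u + Iy*v = -It."""
    Iy, Ix = np.gradient(frame1)           # spatial gradients (rows, cols)
    It = frame2 - frame1                   # temporal gradient
    c = frame1.shape[0] // 2
    s = slice(c - win, c + win + 1)        # square window about the centre
    A = np.stack([Ix[s, s].ravel(), Iy[s, s].ravel()], axis=1)
    b = -It[s, s].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic test: a smooth blob translated one pixel to the right.
yy, xx = np.mgrid[0:64, 0:64]
blob = lambda x0, y0: np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / 50.0)
u, v = gradient_flow(blob(30, 32), blob(31, 32))
print(round(u, 2), round(v, 2))  # u close to 1 (rightward), v close to 0
```

    Repeating the solve at every pixel, as the abstract describes, turns this pointwise estimate into a full vector velocity field.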

  12. Application of selected ion monitoring to the analysis of triacylglycerols in olive oil by high temperature-gas chromatography/mass spectrometry.

    PubMed

    Ruiz-Samblás, C; González-Casado, A; Cuadros-Rodríguez, L; García, F P Rodríguez

    2010-06-30

    The analysis of the triacylglycerol (TAG) composition of oils is a very challenging task, since the TAGs have very similar physico-chemical properties. In this work, a high temperature-gas chromatography method coupled to electron ionization-mass spectrometry (HT-GC/EI-MS), operated in Selected Ion Monitoring (SIM) mode, was developed for the analysis of TAGs in olive oil; this is a method suitable for routine analysis. This method was developed using commercially available standard TAGs. The TAGs studied were separated according to their equivalent carbon number and degree of unsaturation. The peak assignment was carried out by locating the characteristic fragment ions having the same retention time on the SIM profile, such as the [RCO+74](+) and [RCO+128](+) ions, due to the fatty acyl residues on the sn-1, sn-2 and sn-3 positions of the TAG molecule, and the [M-OCOR](+) ions corresponding to the acyl ions. The developed method was very useful for eliminating the interferences that appear in the electron ionization mass spectrum, which can prevent satisfactory interpretation of spectra. Copyright 2010 Elsevier B.V. All rights reserved.

  13. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    PubMed

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the grade of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods of different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer assisted image interpretation, modeling and simulation as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients and will gain importance in the diagnostics and therapy of the future. From a methodical point of view the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.

  14. Analysis of complex decisionmaking processes. [with application to jet engine development

    NASA Technical Reports Server (NTRS)

    Hill, J. D.; Ollila, R. G.

    1978-01-01

    The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.

  15. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    ERIC Educational Resources Information Center

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  17. Preconcentration for Improved Long-Term Monitoring of Contaminants in Groundwater: Sorbent Development

    DTIC Science & Technology

    2013-02-11

    calibration curves was ±5%. Ion chromatography (IC) was used for analysis of perchlorate and other ionic targets. Analysis was carried out on a... The methods utilize liquid or gas chromatography, techniques that do not lend themselves well to portable devices and methods. Portable methods are...

  18. Differential DNA Methylation Analysis without a Reference Genome.

    PubMed

    Klughammer, Johanna; Datlinger, Paul; Printz, Dieter; Sheffield, Nathan C; Farlik, Matthias; Hadler, Johanna; Fritsch, Gerhard; Bock, Christoph

    2015-12-22

    Genome-wide DNA methylation mapping uncovers epigenetic changes associated with animal development, environmental adaptation, and species evolution. To address the lack of high-throughput methods for DNA methylation analysis in non-model organisms, we developed an integrated approach for studying DNA methylation differences independent of a reference genome. Experimentally, our method relies on an optimized 96-well protocol for reduced representation bisulfite sequencing (RRBS), which we have validated in nine species (human, mouse, rat, cow, dog, chicken, carp, sea bass, and zebrafish). Bioinformatically, we developed the RefFreeDMA software to deduce ad hoc genomes directly from RRBS reads and to pinpoint differentially methylated regions between samples or groups of individuals (http://RefFreeDMA.computational-epigenetics.org). The identified regions are interpreted using motif enrichment analysis and/or cross-mapping to annotated genomes. We validated our method by reference-free analysis of cell-type-specific DNA methylation in the blood of human, cow, and carp. In summary, we present a cost-effective method for epigenome analysis in ecology and evolution, which enables epigenome-wide association studies in natural populations and species without a reference genome. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Fast comprehensive analysis of vitamin D and triacylglycerols in dietary supplements using multiple parallel mass spectrometers

    USDA-ARS?s Scientific Manuscript database

    New, faster methods have been developed for analysis of vitamin D and triacylglycerols that eliminate hours of wet chemistry and preparative chromatography, while providing more information than classical methods for analysis. Unprecedented detail is provided by combining liquid chromatography with ...

  20. MATRIX DISCRIMINANT ANALYSIS WITH APPLICATION TO COLORIMETRIC SENSOR ARRAY DATA

    PubMed Central

    Suslick, Kenneth S.

    2014-01-01

    With the rapid development of nanotechnology, a “colorimetric sensor array” (CSA), referred to as an optical electronic nose, has been developed for the identification of toxicants. Unlike traditional sensors which rely on a single chemical interaction, CSA can measure multiple chemical interactions by using chemo-responsive dyes. The color changes of the chemo-responsive dyes are recorded before and after exposure to toxicants and serve as a template for classification. The color changes are digitized in the form of a matrix with rows representing dye effects and columns representing the spectrum of colors. Thus, matrix-classification methods are highly desirable. In this article, we develop a novel classification method, matrix discriminant analysis (MDA), which is a generalization of linear discriminant analysis (LDA) for data in matrix form. By incorporating the intrinsic matrix structure of the data in discriminant analysis, the proposed method can improve CSA’s sensitivity and, more importantly, specificity. A penalized MDA method, PMDA, is also introduced to further incorporate sparsity structure in the discriminant function. Numerical studies suggest that the proposed MDA and PMDA methods outperform LDA and other competing discriminant methods for matrix predictors. The asymptotic consistency of MDA is also established. R code and data are available online as supplementary material. PMID:26783371
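
    For contrast with MDA, classical LDA treats each dye-by-color matrix as a flattened vector. A minimal two-class Fisher LDA sketch on simulated CSA-like data (illustrative only; the paper's MDA exploits the matrix structure that this baseline discards):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated CSA-like data: each sample is a 6x4 color-change matrix,
# flattened to a 24-vector; class 1 has a shifted mean response.
n, shape = 60, (6, 4)
X0 = rng.normal(0.0, 1.0, (n, *shape)).reshape(n, -1)   # class 0 (e.g. blank)
X1 = rng.normal(1.0, 1.0, (n, *shape)).reshape(n, -1)   # class 1 (e.g. toxicant)

def lda_fit(X0, X1):
    """Fisher discriminant: direction w = Sw^{-1} (m1 - m0), midpoint threshold."""
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    thresh = w @ (m0 + m1) / 2.0
    return w, thresh

w, thresh = lda_fit(X0, X1)
err0 = (X0 @ w > thresh).mean()   # fraction of class 0 misclassified
acc1 = (X1 @ w > thresh).mean()   # fraction of class 1 correctly classified
print(err0, acc1)
```

    When the predictor is genuinely matrix-valued, vectorization inflates the parameter count (here 24 dimensions from a 6x4 matrix), which is the motivation the abstract gives for a matrix-structured discriminant.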

  1. Test/semi-empirical analysis of a carbon/epoxy fabric stiffened panel

    NASA Technical Reports Server (NTRS)

    Spier, E. E.; Anderson, J. A.

    1990-01-01

    The purpose of this work-in-progress is to present a semi-empirical analysis method developed to predict the buckling and crippling loads of carbon/epoxy fabric blade-stiffened panels in compression. This is a hand-analysis method comprising well-known, accepted techniques, sound engineering judgment, and experimental data, resulting in conservative solutions. To verify this method, a stiffened panel was fabricated and tested. Both the test and analysis results are presented.

  2. System of Systems Analytic Workbench - 2017

    DTIC Science & Technology

    2017-08-31

    and transitional activities with key collaborators. The tools include: System Operational Dependency Analysis/System Developmental Dependency Analysis...in the methods of the SoS-AWB involve the following: 1. System Operability Dependency Analysis (SODA)/System Development Dependency Analysis...available f. Development of standard dependencies with combinations of low-medium-high parameters Report No. SERC-2017-TR-111

  3. Neutron activation analysis: trends in developments and applications

    NASA Astrophysics Data System (ADS)

    de Goeij, J. J.; Bode, P.

    1995-03-01

    New developments in instrumentation for, and methodology of, Instrumental Neutron Activation Analysis (INAA) may lead to new niches for this method of elemental analysis. This paper describes the possibilities of advanced detectors, automated irradiation and counting stations, and very large sample analysis. An overview is given of some typical new fields of application.

  4. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
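    The moment-based statistical classification mentioned above can be illustrated with a short sketch that computes the first four standardized moments of a waveform; this feature set is a plausible stand-in, not the report's exact implementation.

```python
import numpy as np

def waveform_moments(x):
    # mean, variance, skewness, and kurtosis of a dynamic signal --
    # simple moment features for characterizing measurement waveforms
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    d = x - mu
    m2 = (d ** 2).mean()                  # variance
    skew = (d ** 3).mean() / m2 ** 1.5    # standardized 3rd moment
    kurt = (d ** 4).mean() / m2 ** 2      # standardized 4th moment
    return mu, m2, skew, kurt
```

A spiky, impulsive signal (as from an incipient bearing defect) shows elevated kurtosis relative to a smooth sinusoid, which is one reason such moments are useful as classification features.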

  5. Pulse analysis of acoustic emission signals. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Houghton, J. R.

    1976-01-01

    A method for the signature analysis of pulses in the frequency domain and the time domain is presented. The Fourier spectrum, Fourier transfer function, shock spectrum, and shock spectrum ratio are examined in the frequency-domain analysis, and pulse-shape deconvolution is developed for use in the time-domain analysis. To demonstrate the relative sensitivity of each method to small changes in pulse shape, signatures of computer-modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources, including acoustic emissions associated with: (1) crack propagation, (2) a ball dropping on a plate, (3) spark discharge, and (4) defective and good ball bearings.

  6. Development of synthetic nuclear melt glass for forensic analysis.

    PubMed

    Molgaard, Joshua J; Auxier, John D; Giminaro, Andrew V; Oldham, C J; Cook, Matthew T; Young, Stephen A; Hall, Howard L

    A method for producing synthetic debris similar to the melt glass produced by nuclear surface testing is demonstrated. Melt glass from the first nuclear weapon test (commonly referred to as trinitite) is used as the benchmark for this study. These surrogates can be used to simulate a variety of scenarios and will serve as a tool for developing and validating forensic analysis methods.

  7. METHOD DEVELOPMENT FOR THE ANALYSIS OF N-NITROSODIMETHYLAMINE AND OTHER N-NITROSAMINES IN DRINKING WATER AT LOW NANOGRAM/LITER CONCENTRATIONS USING SOLID PHASE EXTRACTION AND GAS CHROMATOGRAPHY WITH CHEMICAL IONIZATION TANDEM MASS SPECTROMETRY

    EPA Science Inventory

    N-Nitrosodimethylamine (NDMA) is a probable human carcinogen that has been identified as a drinking water contaminant of concern. United States Environmental Protection Agency (USEPA) Method 521 has been developed for the analysis of NDMA and six additional N-nitrosamines in dri...

  8. Platforms for Single-Cell Collection and Analysis.

    PubMed

    Valihrach, Lukas; Androvic, Peter; Kubista, Mikael

    2018-03-11

    Single-cell analysis has become an established method to study cell heterogeneity and for rare cell characterization. Despite the high cost and technical constraints, applications are increasing every year in all fields of biology. Following the trend, there is a tremendous development of tools for single-cell analysis, especially in the RNA sequencing field. Every improvement increases sensitivity and throughput. Collecting a large amount of data also stimulates the development of new approaches for bioinformatic analysis and interpretation. However, the essential requirement for any analysis is the collection of single cells of high quality. The single-cell isolation must be fast, effective, and gentle to maintain the native expression profiles. Classical methods for single-cell isolation are micromanipulation, microdissection, and fluorescence-activated cell sorting (FACS). In the last decade several new and highly efficient approaches have been developed, which not just supplement but may fully replace the traditional ones. These new techniques are based on microfluidic chips, droplets, micro-well plates, and automatic collection of cells using capillaries, magnets, an electric field, or a punching probe. In this review we summarize the current methods and developments in this field. We discuss the advantages of the different commercially available platforms and their applicability, and also provide remarks on future developments.

  9. Platforms for Single-Cell Collection and Analysis

    PubMed Central

    Valihrach, Lukas; Androvic, Peter; Kubista, Mikael

    2018-01-01

    Single-cell analysis has become an established method to study cell heterogeneity and for rare cell characterization. Despite the high cost and technical constraints, applications are increasing every year in all fields of biology. Following the trend, there is a tremendous development of tools for single-cell analysis, especially in the RNA sequencing field. Every improvement increases sensitivity and throughput. Collecting a large amount of data also stimulates the development of new approaches for bioinformatic analysis and interpretation. However, the essential requirement for any analysis is the collection of single cells of high quality. The single-cell isolation must be fast, effective, and gentle to maintain the native expression profiles. Classical methods for single-cell isolation are micromanipulation, microdissection, and fluorescence-activated cell sorting (FACS). In the last decade several new and highly efficient approaches have been developed, which not just supplement but may fully replace the traditional ones. These new techniques are based on microfluidic chips, droplets, micro-well plates, and automatic collection of cells using capillaries, magnets, an electric field, or a punching probe. In this review we summarize the current methods and developments in this field. We discuss the advantages of the different commercially available platforms and their applicability, and also provide remarks on future developments. PMID:29534489

  10. Rapid analysis of charge variants of monoclonal antibodies using non-linear salt gradient in cation-exchange high performance liquid chromatography.

    PubMed

    Joshi, Varsha; Kumar, Vijesh; Rathore, Anurag S

    2015-08-07

    A method is proposed for rapid development of a short, analytical cation-exchange high performance liquid chromatography method for analysis of charge heterogeneity in monoclonal antibody products. The parameters investigated and optimized include pH, the shape of the elution gradient, and the length of the column. The most important parameter for developing a shorter method is found to be the shape of the elution gradient. In this paper, we propose a step-by-step approach to developing a non-linear sigmoidal gradient for analysis of charge heterogeneity in two different monoclonal antibody products. This gradient not only shortens the run time from more than 40 min for the conventional method to 4 min but also retains resolution. Superiority of the phosphate gradient over the sodium chloride gradient for elution of mAbs is also observed. The method has been successfully evaluated for specificity, sensitivity, linearity, limit of detection, and limit of quantification. Application of this method as a potential at-line process analytical technology tool is suggested.
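    The shape of a sigmoidal elution gradient can be sketched with a logistic ramp. This is illustrative only: the 4-minute run length matches the abstract, but the steepness and endpoint values are assumptions, not the paper's optimized profile.

```python
import math

def sigmoidal_gradient(t, t_total=4.0, b_start=0.0, b_end=100.0, k=3.0):
    # Percent strong eluent at time t (min): a logistic ramp centred
    # mid-run. k controls steepness; b_start/b_end are the gradient
    # endpoints. All parameter values here are illustrative.
    mid = t_total / 2.0
    return b_start + (b_end - b_start) / (1.0 + math.exp(-k * (t - mid)))
```

Compared with a linear ramp, the sigmoidal profile keeps shallow slopes at the start and end of the run and concentrates the steep change mid-run, which is what allows the elution window to be compressed without sacrificing resolution.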

  11. PFOA and PFOS: Analytics

    EPA Science Inventory

    EPA Method 537 was developed for the analysis of perfluoroalkyl acids (PFAAs) in drinking water to address the occurrence monitoring needs under EPA’s Unregulated Contaminant Monitoring Regulation (UCMR). The method employs solid-phase extraction with analysis by liquid chr...

  12. Analysis of the methods for assessing socio-economic development level of urban areas

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Bogacheva, Elena

    2017-01-01

    The present paper provides a targeted analysis of current approaches (ratings) to assessing the socio-economic development of urban areas. The survey focuses on identifying standardized methodologies for forming area-assessment techniques, with the aim of developing a system for intelligent monitoring, dispatching, building management, scheduling, and effective management of an administrative-territorial unit. Such a system is characterized by a complex hierarchical structure including tangible and intangible properties (parameters, attributes). Investigating these methods should increase an administrative-territorial unit's attractiveness for investors and residents. The research aims at studying methods for evaluating the socio-economic development level of territories of the Russian Federation. Experimental and theoretical territory-estimating methods were reviewed. A complex analysis of the characteristics of the areas was carried out and evaluation parameters were determined. Integral indicators (resulting rating criteria values) as well as the overall rankings (parameters, characteristics) were analyzed. An inventory of the most widely used partial indicators (parameters, characteristics) of urban areas was compiled. The homogeneity of the resulting rating criteria values was verified and confirmed by determining the root mean square deviation, i.e., the divergence of indices. The principal shortcomings of the assessment methodologies were revealed, and assessment methods with enhanced effectiveness and homogeneity were proposed.

  13. A hybrid-stress finite element approach for stress and vibration analysis in linear anisotropic elasticity

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.

    1987-01-01

    A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for determining the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two-dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three-dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.

  14. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for estimation of large-return-period events.
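    For background, the first two sample L-moments that underlie the L-moment approach (Hosking's unbiased estimators) can be computed directly from the order statistics; the PL-moment censoring of low values described in the paper is not reproduced here.

```python
import numpy as np

def sample_l_moments(x):
    # First two sample L-moments from the probability-weighted
    # moments b0 and b1 of the sorted sample.
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    l1 = b0             # L-location (equals the sample mean)
    l2 = 2 * b1 - b0    # L-scale
    return l1, l2
```

Being linear combinations of order statistics, L-moments are less sensitive to outliers than conventional moments, which is why they are favored in regional frequency analysis of extremes.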

  15. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
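    The sub-model idea (full-range model for a first-pass estimate, range-restricted sub-models for refinement) can be sketched with a toy implementation. This is a hedged illustration: ordinary least squares stands in for the PLS regression used by ChemCam, and the simple 50/50 blend weight is an assumption, not the published blending scheme.

```python
import numpy as np

def fit_lstsq(X, y):
    # ordinary least squares with intercept (stand-in for PLS)
    A = np.column_stack([X, np.ones(len(X))])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def predict_lstsq(w, X):
    return np.column_stack([X, np.ones(len(X))]) @ w

def submodel_blend(X_train, y_train, X_new, ranges):
    # 1. full-range model gives a first-pass composition estimate
    w_full = fit_lstsq(X_train, y_train)
    first = predict_lstsq(w_full, X_new)
    # 2. sub-models trained on limited composition ranges
    subs = []
    for lo, hi in ranges:
        m = (y_train >= lo) & (y_train <= hi)
        subs.append(((lo, hi), fit_lstsq(X_train[m], y_train[m])))
    # 3. blend: refine each first-pass estimate with the sub-model
    #    whose training range contains it (else keep the full model)
    out = first.copy()
    for i, p in enumerate(first):
        for (lo, hi), w in subs:
            if lo <= p <= hi:
                out[i] = 0.5 * (p + predict_lstsq(w, X_new[i:i + 1])[0])
                break
    return out
```

The design point is that each sub-model only has to fit a narrow composition range, where matrix effects are more nearly constant, so its local predictions can be more accurate than a single global model's.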

  16. Resilience Metrics for the Electric Power System: A Performance-Based Approach.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, Eric D.; Castillo, Andrea R; Silva-Monroy, Cesar Augusto

    Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low-probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. Demonstration of the metrics and methods is shown through a set of illustrative use cases.

  17. Dynamic Analysis With Stress Mode Animation by the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1997-01-01

    Dynamic animation of stresses and displacements, which complement each other, can be a useful tool in the analysis and design of structural components. At the present time only displacement-mode animation is available through the popular stiffness formulation. This paper attempts to complete this valuable visualization tool by augmenting the existing art with stress mode animation. The reformulated method of forces, which in the literature is known as the integrated force method (IFM), became the analyzer of choice for the development of stress mode animation because stresses are the primary unknowns of its dynamic analysis. Animation of stresses and displacements, which have been developed successfully through the IFM analyzers, is illustrated in several examples along with a brief introduction to IFM dynamic analysis. The usefulness of animation in design optimization is illustrated considering the spacer structure component of the International Space Station as an example. An overview of the integrated force method analysis code (IFM/ANALYZERS) is provided in the appendix.

  18. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis.

    PubMed

    Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P

    2015-07-01

    Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method suitable for the analysis of both chloride and sodium in small volumes of sweat. The precision, linearity, and limit of detection of the in-house developed IC/HPLC method were established. A method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing-Bablok comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfill the criteria established by UK guidelines and are comparable with inductively coupled plasma-mass spectrometry methods. Passing-Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, our method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test than a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurement of sodium concentrations will result in better detection of false positives, less test repetition, and thus faster, more accurate, and more effective diagnosis. The described IC/HPLC method therefore provides a precise, relatively cheap, and easy-to-handle application for the analysis of both chloride and sodium in sweat.

  19. Deciphering the Epigenetic Code: An Overview of DNA Methylation Analysis Methods

    PubMed Central

    Umer, Muhammad

    2013-01-01

    Abstract Significance: Methylation of cytosine in DNA is linked with gene regulation, and this has profound implications in development, normal biology, and disease conditions in many eukaryotic organisms. A wide range of methods and approaches exist for its identification, quantification, and mapping within the genome. While the earliest approaches were nonspecific and were at best useful for quantification of total methylated cytosines in the chunk of DNA, this field has seen considerable progress and development over the past decades. Recent Advances: Methods for DNA methylation analysis differ in their coverage and sensitivity, and the method of choice depends on the intended application and desired level of information. Potential results include global methyl cytosine content, degree of methylation at specific loci, or genome-wide methylation maps. Introduction of more advanced approaches to DNA methylation analysis, such as microarray platforms and massively parallel sequencing, has brought us closer to unveiling the whole methylome. Critical Issues: Sensitive quantification of DNA methylation from degraded and minute quantities of DNA and high-throughput DNA methylation mapping of single cells still remain a challenge. Future Directions: Developments in DNA sequencing technologies as well as the methods for identification and mapping of 5-hydroxymethylcytosine are expected to augment our current understanding of epigenomics. Here we present an overview of methodologies available for DNA methylation analysis with special focus on recent developments in genome-wide and high-throughput methods. While the application focus relates to cancer research, the methods are equally relevant to broader issues of epigenetics and redox science in this special forum. Antioxid. Redox Signal. 18, 1972–1986. PMID:23121567

  20. [CONTEMPORARY MOLECULAR-GENETIC METHODS USED FOR ETIOLOGIC DIAGNOSTICS OF SEPSIS].

    PubMed

    Gavrilov, S N; Skachkova, T S; Shipulina, O Yu; Savochkina, Yu A; Shipulin, G A; Maleev, V V

    2016-01-01

    Etiologic diagnosis of sepsis is one of the most difficult problems of contemporary medicine due to the wide variety of sepsis causative agents, many of which are components of the normal human microflora. The disadvantages of the current "gold standard" of microbiological diagnosis of sepsis etiology, blood culture for sterility, are the duration of cultivation, limitations in detecting non-culturable forms of microorganisms, and the significant effect of preliminary empiric antibiotic therapy on the results of the analysis. Molecular diagnostic methods, which have been actively developed and integrated over the last decade, are free of these disadvantages. The main contemporary molecular-biological diagnostic methods are examined in this review, and current data on their diagnostic characteristics are provided. Special attention is given to methods of PCR diagnostics, including novel Russian developments. Methods of nucleic acid hybridization and proteomic analysis are examined comparatively. An evaluation of the application and prospects for development of molecular methods for the diagnosis of sepsis is given.

  1. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  2. Criteria for Developing a Successful Privatization Project

    DTIC Science & Technology

    1989-05-01

    conceptualization and planning are required when pursuing privatization projects. In fact, privatization project proponents need to know how to...selection of projects for analysis, methods of acquiring information about these projects, and the analysis framework. Chapter IV includes the analysis. A...performed an analysis to determine common conceptual and creative approaches and lessons learned. This analysis was then used to develop criteria for

  3. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    PubMed

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  4. a Cognitive Approach to Teaching a Graduate-Level Geobia Course

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel A.

    2016-06-01

    Remote sensing image analysis training occurs both in the classroom and the research lab. Classroom education in traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, along with plans for the development of an open-source repository for course materials.

  5. The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics

    DTIC Science & Technology

    1974-08-01

    VIKING LANDER DYNAMICS, Mr. Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado. Structural Dynamics: PERFORMANCE OF STATISTICAL ENERGY ANALYSIS...aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods...have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated

  6. High accuracy method for the application of isotope dilution to gas chromatography/mass spectrometric analysis of gases.

    PubMed

    Milton, Martin J T; Wang, Jian

    2003-01-01

    A new isotope dilution mass spectrometry (IDMS) method for high-accuracy quantitative analysis of gases has been developed and validated by the analysis of standard mixtures of carbon dioxide in nitrogen. The method does not require certified isotopic reference materials and does not require direct measurements of the highly enriched spike. The relative uncertainty of the method is shown to be 0.2%.
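    The abstract does not give the working equation, but the textbook single-dilution IDMS relation for two isotopes can be sketched; the symbols here are generic conventions, not necessarily the paper's notation or full uncertainty treatment.

```python
def idms_sample_to_spike(R_sample, R_spike, R_blend):
    # Ratio of reference-isotope amount contributed by the sample to
    # that contributed by the spike, from three measured isotope
    # amount ratios: R = n(a)/n(b) in the sample, the spike, and
    # their blend (textbook single-dilution IDMS relation).
    return (R_spike - R_blend) / (R_blend - R_sample)
```

Because the result depends only on measured isotope ratios rather than absolute signal intensities, IDMS largely cancels instrument drift and recovery losses, which is the source of its high accuracy.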

  7. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.

  8. Skeletal Mechanism Generation of Surrogate Jet Fuels for Aeropropulsion Modeling

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Jen; Niemeyer, Kyle E.

    2010-05-01

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and demonstrated with skeletal reductions of two hydrocarbon components important to surrogate jet fuel development, n-heptane and n-decane. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP): DRGEP is first applied to efficiently remove many unimportant species, and sensitivity analysis then removes further unimportant species, producing an optimally small skeletal mechanism for a given error limit. The combination of the DRGEP and DRGASA methods allows DRGEPSA to overcome the weaknesses of each, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal.
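    The graph search at the heart of DRGEP can be sketched: each species receives an overall interaction coefficient equal to the maximum over paths from the target of the product of direct interaction coefficients, computable with a max-product Dijkstra-style search. The direct coefficients are assumed inputs here; deriving them from reaction rates, and the subsequent thresholding and sensitivity analysis, are described in the cited methods.

```python
import heapq

def drgep_overall_coefficients(direct, target):
    # direct[(a, b)] = direct interaction coefficient r_ab in [0, 1]
    # between species a and b. The overall coefficient of a species
    # is the maximum over all paths from the target of the product
    # of edge coefficients -- a max-product Dijkstra search.
    species = {s for edge in direct for s in edge}
    R = {s: 0.0 for s in species}
    R[target] = 1.0
    heap = [(-1.0, target)]
    while heap:
        neg_r, a = heapq.heappop(heap)
        r = -neg_r
        if r < R[a]:
            continue  # stale heap entry
        for (u, v), w in direct.items():
            if u == a and r * w > R[v]:
                R[v] = r * w
                heapq.heappush(heap, (-R[v], v))
    return R
```

Species whose overall coefficient falls below a chosen error threshold become removal candidates; in DRGEPSA, sensitivity analysis then decides the borderline cases.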

  9. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  10. Advanced Productivity Analysis Methods for Air Traffic Control Operations.

    DOT National Transportation Integrated Search

    1976-12-01

    This report gives a description of the Air Traffic Control (ATC) productivity analysis methods developed, implemented, and refined by the Stanford Research Institute (SRI) under the sponsorship of FAA and TSC. Two models are included in the productiv...

  11. A Review of Classical Methods of Item Analysis.

    ERIC Educational Resources Information Center

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure that combines methods used to evaluate important characteristics of test items, such as difficulty, discrimination, and distractibility. This paper reviews some of the classical methods for…

  12. A review of promising new immunoassay technology for monitoring forest herbicides

    Treesearch

    Charles K. McMahon

    1993-01-01

    Rising costs of classical instrumental methods of chemical analysis, coupled with an increasing need for environmental monitoring, have led to the development of highly sensitive, low-cost immunochemical methods of analysis for the detection of environmental contaminants. These methods, known simply as immunoassays, are chemical assays that use antibodies as reagents. A...

  13. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  14. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  15. Immobilized monolithic enzymatic reactor and its application for analysis of in-vitro fertilization media samples.

    PubMed

    Chen, Wei-Qiang; Obermayr, Philipp; Černigoj, Urh; Vidič, Jana; Panić-Janković, Tanta; Mitulović, Goran

    2017-11-01

    Classical proteomics approaches involve enzymatic hydrolysis of proteins (either separated on polyacrylamide gels or in solution) followed by peptide identification using LC-MS/MS analysis. This method normally requires more than 16 h to complete. In clinical analysis, it is of the utmost importance to provide fast and reproducible analysis with minimal manual sample handling. Herein we report the method development for online protein digestion on immobilized monolithic enzymatic reactors (IMER) to accelerate protein digestion, reduce manual sample handling, and bring reproducibility to the digestion process in the clinical laboratory. An integrated online digestion and separation method using a monolithic immobilized enzymatic reactor was developed and applied to the digestion and separation of in-vitro-fertilization media. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Boundary element analysis of corrosion problems for pumps and pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyasaka, M.; Amaya, K.; Kishimoto, K.

    1995-12-31

    Three-dimensional (3D) and axi-symmetric boundary element methods (BEM) were developed to quantitatively estimate cathodic protection and macro-cell corrosion. For 3D analysis, a multiple-region method (MRM) was developed in addition to a single-region method (SRM). The validity and usefulness of the BEMs were demonstrated by comparing numerical results with experimental data from galvanic corrosion systems of a cylindrical model and a seawater pipe, and from a cathodic protection system of an actual seawater pump. It was shown that a highly accurate analysis could be performed for fluid machines handling seawater with complex 3D fields (e.g. seawater pump) by taking account of flow rate and time dependencies of the polarization curve. Compared to the 3D BEM, the axi-symmetric BEM permitted large reductions in numbers of elements and nodes, which greatly simplified analysis of axi-symmetric fields such as pipes. Computational accuracy and CPU time were compared between analyses using two approximation methods for polarization curves: a logarithmic-approximation method and a linear-approximation method.

  17. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    PubMed Central

    2012-01-01

    Background Response surface methodology by Box–Behnken design, employing a multivariate approach, enables substantial improvement in method development using fewer experiments, without wasting the large volumes of organic solvents that drive up analysis cost. This methodology had not previously been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of response surface methodology in the optimization and validation of a new microwell-based UV-Visible spectrophotometric method for determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), reaction time and temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories: it has high-throughput capability, consumes a minimal volume of organic solvent (reducing analysts' exposure to the toxic effects of organic solvents, an environmentally friendly "green" approach) and reduces the analysis cost 50-fold. PMID:23146143
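
    As a rough illustration of the design side of such studies, a Box–Behnken design in coded units sets each pair of factors to ±1 while holding the remaining factors at the center point, plus replicated center runs. The generator below is a generic sketch, not the authors' software; the three factors could stand for DDQ concentration, reaction time, and temperature as in the abstract.

    ```python
    from itertools import combinations, product

    def box_behnken(k, center_points=3):
        """Box-Behnken design in coded units: every pair of factors at +/-1
        with the remaining factors at 0, plus replicated center points."""
        runs = []
        for i, j in combinations(range(k), 2):
            for a, b in product((-1, 1), repeat=2):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
        runs.extend([[0] * k for _ in range(center_points)])
        return runs

    # Three factors: 3 pairs x 4 sign combinations + 3 center points = 15 runs.
    design = box_behnken(3)
    ```

    A quadratic response surface is then fitted to the measured responses at these 15 runs, which is where the regression equations mentioned in the abstract come from.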

  18. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method is used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of probabilistic approximate analysis in determining efficient solution strategies.

  19. Development and validation of an extraction method for the analysis of perfluoroalkyl substances in human hair.

    PubMed

    Kim, Da-Hye; Oh, Jeong-Eun

    2017-05-01

    Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
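
    The recovery and method-detection-limit figures quoted above follow standard definitions: recovery is the measured concentration as a percentage of the known spike, and the EPA-style MDL is the one-tailed 99 % Student's-t multiple of the standard deviation of replicate spiked samples. The snippet below illustrates both with invented replicate values, not data from the study.

    ```python
    from statistics import mean, stdev

    def recovery_percent(measured, spiked):
        """Spike recovery as a percentage of the known spiked concentration."""
        return 100.0 * measured / spiked

    # Seven replicate spiked-hair measurements in ng/g (invented values).
    replicates = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.51]

    # EPA-style MDL: one-tailed Student's t at 99 % confidence for
    # n - 1 = 6 degrees of freedom, times the replicate standard deviation.
    T_99_DF6 = 3.143
    mdl = T_99_DF6 * stdev(replicates)
    rec = recovery_percent(mean(replicates), spiked=0.50)
    ```

    With seven replicates this yields a recovery near 100 % and an MDL of roughly 0.09 ng/g, the same order as the 0.114-0.796 ng/g range reported for the SPE method.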

  20. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  1. Error analysis of motion correction method for laser scanning of moving objects

    NASA Astrophysics Data System (ADS)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned must be static. The need to scan moving objects has driven the development of new methods capable of generating correct 3D geometry of moving objects. The literature describes only a few methods that address object motion during scanning, each relying on its own models or sensors, and studies on error modelling or analysis of these motion correction methods are lacking. In this paper, we develop the error budget and present the analysis of one such `motion correction' method. This method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by using tracking devices. It then uses this information along with the laser scanner data to correct the laser data, resulting in correct geometry despite the object being mobile during scanning. The major applications of this method lie in the shipping industry, to scan ships either moving or parked at sea, and in scanning objects such as hot air balloons or aerostats. Notably, the other "motion correction" methods described in the literature cannot be applied to the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components, both alone and in combination. The analysis can be used to gain insights into optimal utilization of the available components for achieving the best results.
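
    The correction step described here amounts to mapping each scanner-frame point back into the moving object's body frame using the POS-reported pose at the measurement instant, p_body = Rᵀ(p_scan − T). The sketch below assumes a yaw-only rotation and a hypothetical pose; the paper's actual sensor and error model is richer.

    ```python
    import math

    def rot_z(yaw):
        """Rotation matrix for a yaw-only (about-z) object attitude."""
        c, s = math.cos(yaw), math.sin(yaw)
        return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

    def correct_point(p_scan, pose):
        """Map a scanner-frame point into the moving object's body frame:
        p_body = R^T (p_scan - T), with (R, T) the object's pose at the
        instant the point was measured."""
        yaw, (tx, ty, tz) = pose
        R = rot_z(yaw)
        d = [p_scan[0] - tx, p_scan[1] - ty, p_scan[2] - tz]
        # Multiply transpose(R) by d.
        return [sum(R[i][j] * d[i] for i in range(3)) for j in range(3)]

    # A body-frame point [1, 0, 0]: after the object yaws 90 degrees and
    # drifts to (5, 5, 0), the scanner observes it at (5, 6, 0).
    p = correct_point([5.0, 6.0, 0.0], (math.pi / 2, (5.0, 5.0, 0.0)))
    ```

    Errors in the reported pose (yaw, position) propagate through this transform, which is exactly the sensitivity the paper's error budget quantifies.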

  2. Using mixed methods to develop and evaluate complex interventions in palliative care research.

    PubMed

    Farquhar, Morag C; Ewing, Gail; Booth, Sara

    2011-12-01

    There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research, where the majority of interventions are complex and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination, and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.

  3. Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan

    2016-10-01

    Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.

  4. Mapping forest inventory and analysis data attributes within the framework of double sampling for stratification design

    Treesearch

    David C. Chojnacky; Randolph H. Wynne; Christine E. Blinn

    2009-01-01

    Methodology is lacking to easily map Forest Inventory and Analysis (FIA) inventory statistics for all attribute variables without having to develop separate models and methods for each variable. We developed a mapping method that can directly transfer tabular data to a map on which pixels can be added any way desired to estimate carbon (or any other variable) for a...

  5. Development of Novel Noninvasive Methods of Stress Assessment in Baleen Whales

    DTIC Science & Technology

    2014-09-30

    large whales. Few methods exist for assessment of physiological stress levels of free-swimming cetaceans (Amaral 2010, ONR 2010, Hunt et al. 2013...hormone aldosterone. Our aim in this project is to further develop both techniques - respiratory hormone analysis and fecal hormone analysis - for use...noninvasive aldosterone assay (for both feces and blow) that can be used as an alternative measure of adrenal gland activation relative to stress

  6. Automated Analysis of Counselor Style and Effects: The Development and Evaluation of Methods and Materials to Assess the Stylistic Accuracy and Outcome Effectiveness of Counselor Verbal Behavior. Final Report.

    ERIC Educational Resources Information Center

    Pepyne, Edward W.

    This project attempts to develop, evaluate and implement methods and materials for the automated analysis of the stylistic characteristics of counselor verbal behavior and its effects on client verbal behavior within the counseling interview. To achieve this purpose, the project designed a system of computer programs, the DISCOURSE ANALYSIS…

  7. The CSM testbed software system: A development environment for structural analysis methods on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Gillian, Ronnie E.; Lotts, Christine G.

    1988-01-01

    The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.

  8. A study of commuter airplane design optimization

    NASA Technical Reports Server (NTRS)

    Keppel, B. V.; Eysink, H.; Hammer, J.; Hawley, K.; Meredith, P.; Roskam, J.

    1978-01-01

    The usability of the general aviation synthesis program (GASP) was enhanced by the development of separate computer subroutines which can be added as a package to this assembly of computerized design methods or used as a separate subroutine program to compute the dynamic longitudinal, lateral-directional stability characteristics for a given airplane. Currently available analysis methods were evaluated to ascertain those most appropriate for the design functions which the GASP computerized design program performs. Methods for providing proper constraint and/or analysis functions for GASP were developed as well as the appropriate subroutines.

  9. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate structural sizing and the associated active control system that is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  10. Wilsonian methods of concept analysis: a critique.

    PubMed

    Hupcey, J E; Morse, J M; Lenz, E R; Tasón, M C

    1996-01-01

    Wilsonian methods of concept analysis--that is, the method proposed by Wilson and Wilson-derived methods in nursing (as described by Walker and Avant; Chinn and Kramer [Jacobs]; Schwartz-Barcott and Kim; and Rodgers)--are discussed and compared in this article. The evolution and modifications of Wilson's method in nursing are described, and research that has used these methods is assessed. The transformation of Wilson's method is traced as each author has adopted his techniques and attempted to modify the method to correct for limitations. We suggest that these adaptations and modifications ultimately erode Wilson's method. Further, the Wilson-derived methods have been overly simplified and used by nurse researchers in a prescriptive manner, and the results often do not serve the purpose of expanding nursing knowledge. We conclude that, considering the significance of concept development for the nursing profession, the development of new methods and a means for evaluating conceptual inquiry must be given priority.

  11. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed parameters.
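
    The variance-based index underlying this method, S_i = Var(E[Y|X_i]) / Var(Y), can be illustrated with a brute-force double-loop Monte Carlo estimator on a toy additive model; the toy function and sample sizes below are assumptions for illustration, and this quadratic cost is precisely what the paper's hierarchical, geostatistical approach is designed to avoid.

    ```python
    import random

    def variance(vals):
        """Population variance of a list of samples."""
        m = sum(vals) / len(vals)
        return sum((y - m) ** 2 for y in vals) / len(vals)

    def sobol_first_order(f, n_outer=200, n_inner=200, seed=0):
        """Brute-force S1 = Var(E[Y | X1]) / Var(Y) for f(x1, x2)
        with independent Uniform(0, 1) inputs: freeze x1 in the outer
        loop, average over x2 in the inner loop."""
        rng = random.Random(seed)
        cond_means, all_y = [], []
        for _ in range(n_outer):
            x1 = rng.random()
            ys = [f(x1, rng.random()) for _ in range(n_inner)]
            cond_means.append(sum(ys) / n_inner)
            all_y.extend(ys)
        return variance(cond_means) / variance(all_y)

    # Toy additive model Y = 4*x1 + x2: analytically S1 = 16/17 ~ 0.94,
    # i.e. x1 dominates the output variance.
    s1 = sobol_first_order(lambda x1, x2: 4.0 * x1 + x2)
    ```

    Even this two-input toy needs 40,000 model runs; for a groundwater model with high-dimensional correlated fields, grouping parameters into a hierarchy and exploiting geostatistical structure is what keeps the realization count manageable.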

  12. Finite element analysis and computer graphics visualization of flow around pitching and plunging airfoils

    NASA Technical Reports Server (NTRS)

    Bratanow, T.; Ecer, A.

    1973-01-01

    A general computational method for analyzing unsteady flow around pitching and plunging airfoils was developed. The finite element method was applied in developing an efficient numerical procedure for the solution of equations describing the flow around airfoils. The numerical results were employed in conjunction with computer graphics techniques to produce visualization of the flow. The investigation involved mathematical model studies of flow in two phases: (1) analysis of a potential flow formulation and (2) analysis of an incompressible, unsteady, viscous flow from Navier-Stokes equations.

  13. Spectrophotometric and HPLC determinations of anti-diabetic drugs, rosiglitazone maleate and metformin hydrochloride, in pure form and in pharmaceutical preparations.

    PubMed

    Onal, Armağan

    2009-12-01

    In this study, three spectrophotometric methods and one HPLC method were developed for analysis of anti-diabetic drugs in tablets. The two spectrophotometric methods were based on the reaction of rosiglitazone (RSG) with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ) and bromocresol green (BCG). Linear relationship between the absorbance at lambda(max) and the drug concentration was found to be in the ranges 6.0-50.0 and 1.5-12 microg ml(-1) for DDQ and BCG methods, respectively. The third spectrophotometric method consists of a zero-crossing first-derivative spectrophotometric method for simultaneous analysis of RSG and metformin (MTF) in tablets. The calibration curves were linear within the concentration ranges of 5.0-50 microg ml(-1) for RSG and 1.0-10.0 microg ml(-1) for MTF. The fourth method is a rapid stability-indicating HPLC method developed for the determination of RSG. A linear response was observed within the concentration range of 0.25-2.5 microg ml(-1). The proposed methods have been successfully applied to the tablet analysis.

  14. Quasi-Experimental Analysis: A Mixture of Methods and Judgment.

    ERIC Educational Resources Information Center

    Cordray, David S.

    1986-01-01

    The role of human judgment in the development and synthesis of evidence has not been adequately developed or acknowledged within quasi-experimental analysis. Corrective solutions need to confront the fact that causal analysis within complex environments will require a more active assessment that entails reasoning and statistical modeling.…

  15. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example with the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak in CRE samples, the signal-to-noise ratio of the D-glucose peak in CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated as accurate and precise, and was then applied to the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis targets with complex compositions.

  16. Liquid chromatography/electrospray ionization/isotopic dilution mass spectrometry analysis of n-(phosphonomethyl) glycine and mass spectrometry analysis of aminomethyl phosphonic acid in environmental water and vegetation matrixes.

    PubMed

    Grey, L; Nguyen, B; Yang, P

    2001-01-01

    A liquid chromatography/electrospray/mass spectrometry (LC/ES/MS) method was developed for the analysis of glyphosate (n-phosphonomethyl glycine) and its metabolite, aminomethyl phosphonic acid (AMPA) using isotope-labelled glyphosate as a method surrogate. Optimized parameters were achieved to derivatize glyphosate and AMPA using 9-fluorenylmethyl chloroformate (FMOC-Cl) in borate buffer prior to a reversed-phase LC analysis. Method spike recovery data obtained using laboratory and real world sample matrixes indicated an excellent correlation between the recovery of the native and isotope-labelled glyphosate. Hence, the first performance-based, isotope dilution MS method with superior precision, accuracy, and data quality was developed for the analysis of glyphosate. There was, however, no observable correlation between the isotope-labelled glyphosate and AMPA. Thus, the use of this procedure for the accurate analysis of AMPA was not supported. Method detection limits established using standard U.S. Environmental Protection Agency protocol were 0.06 and 0.30 microg/L, respectively, for glyphosate and AMPA in water matrixes and 0.11 and 0.53 microg/g, respectively, in vegetation matrixes. Problems, solutions, and the method performance data related to the analysis of chlorine-treated drinking water samples are discussed. Applying this method to other environmental matrixes, e.g., soil, with minimum modifications is possible, assuring accurate, multimedia studies of glyphosate concentration in the environment and the delivery of useful multimedia information for regulatory applications.

  17. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated, and these methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were also investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. These methods were also successfully applied to the second prepreg system.

  18. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will lay the groundwork for further improvement and more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in this process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  19. Identification of hemoglobin variants by top-down mass spectrometry using selected diagnostic product ions.

    PubMed

    Coelho Graça, Didia; Hartmer, Ralf; Jabs, Wolfgang; Beris, Photis; Clerici, Lorella; Stoermer, Carsten; Samii, Kaveh; Hochstrasser, Denis; Tsybin, Yury O; Scherl, Alexander; Lescuyer, Pierre

    2015-04-01

    Hemoglobin disorder diagnosis is a complex procedure combining several analytical steps. Because the currently used protein analysis methods lack specificity, identifying uncommon hemoglobin variants (proteoforms) can be difficult. The aim of this work was to develop a mass spectrometry-based approach to quickly identify mutated protein sequences within globin chain variants. To reach this goal, a top-down electron transfer dissociation mass spectrometry method was developed for hemoglobin β chain analysis. A diagnostic product ion list was established with a color code strategy, allowing a mutation to be localized quickly and specifically in the hemoglobin β chain sequence. The method was applied to the analysis of rare hemoglobin β chain variants and an (A)γ-β fusion protein. The results showed that the developed data analysis process allows fast and reliable interpretation of top-down electron transfer dissociation mass spectra by nonexpert users in the clinical area.

  20. Recent Advance in Liquid Chromatography/Mass Spectrometry Techniques for Environmental Analysis in Japan

    PubMed Central

    Suzuki, Shigeru

    2014-01-01

    The techniques and measurement methods developed in the Environmental Survey and Monitoring of Chemicals by Japan’s Ministry of the Environment, as well as the large amount of knowledge archived in the survey, have advanced environmental analysis. Recently, technologies such as non-target liquid chromatography/high-resolution mass spectrometry and liquid chromatography with microbore columns have further developed the field. Here, the general strategy of a method developed for the liquid chromatography/mass spectrometry (LC/MS) analysis of environmental chemicals is presented with a brief description. Also presented is a non-target analysis for the identification of environmental pollutants using a provisional fragment database and “MsMsFilter,” an elemental composition elucidation tool. This analytical method is shown to be highly effective in the identification of a model chemical, the pesticide bendiocarb. Our improved micro-liquid chromatography injection system showed substantially enhanced sensitivity to perfluoroalkyl substances, with peak areas 32–71 times larger than those observed in conventional LC/MS. PMID:26819891
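    The elemental-composition elucidation step mentioned above (matching a measured monoisotopic mass to candidate formulas, as tools like "MsMsFilter" do) can be sketched as a brute-force CHNO search. This is an illustrative reconstruction with made-up search limits, not the actual tool:

```python
# Monoisotopic masses (Da) of common elements; well-established physical constants.
MASS = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}

def fmt(c, h, n, o):
    """Formula string, omitting zero counts."""
    out = ""
    for el, cnt in (("C", c), ("H", h), ("N", n), ("O", o)):
        if cnt == 1:
            out += el
        elif cnt > 1:
            out += f"{el}{cnt}"
    return out

def candidate_formulas(target, tol=0.005, max_c=30, max_h=60, max_n=5, max_o=10):
    """Brute-force CHNO compositions whose monoisotopic mass lies within tol of target."""
    hits = []
    for c in range(max_c + 1):
        for n in range(max_n + 1):
            for o in range(max_o + 1):
                base = c * MASS["C"] + n * MASS["N"] + o * MASS["O"]
                h = round((target - base) / MASS["H"])  # hydrogen count closest to the残remaining mass
                if 0 <= h <= max_h:
                    mass = base + h * MASS["H"]
                    if abs(mass - target) <= tol:
                        hits.append((fmt(c, h, n, o), mass))
    return sorted(hits, key=lambda t: abs(t[1] - target))

# The neutral monoisotopic mass of ethanol (C2H6O) is ~46.0419 Da.
print(candidate_formulas(46.0419)[0][0])  # C2H6O
```

    Real tools additionally apply chemical plausibility rules (ring-double-bond equivalents, element ratios) to prune the candidate list.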

  1. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention because it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumed mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
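    The univariate fixed-effect pooling that these multivariate methods generalize can be sketched in a few lines; the effect sizes below are invented for illustration, and this is not the authors' software:

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance weighted pooling of per-study effect estimates."""
    weights = [1.0 / se ** 2 for se in ses]                       # precision weights
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))                     # SE of the pooled estimate
    return pooled, pooled_se

# Two hypothetical studies with equal standard errors -> simple average.
beta, se = fixed_effect_meta([0.30, 0.10], [0.10, 0.10])
print(round(beta, 3), round(se, 3))  # 0.2 0.071
```

    Multivariate extensions replace the scalar weights with inverse covariance matrices so that correlated outcomes borrow strength from one another.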

  2. The Comparison Of In-Flight Pitot Static Calibration Method By Using Radio Altimeter As Reference with GPS and Tower Fly By Methods On CN235-100 MPA

    NASA Astrophysics Data System (ADS)

    Derajat; Hariowibowo, Hindawan

    2018-04-01

    The newly proposed in-flight pitot-static calibration method was carried out during the development and qualification of the CN235-100 MPA (Military Patrol Aircraft). The method is expected to reduce flight hours, require fewer personnel and no additional special equipment, and involve simple analysis calculations, and thus to minimize operational cost. At the Indonesian Aerospace (IAe) Flight Test Center Division, new flight test techniques and data analysis methods, especially for flight physics test subjects, continue to be developed as long as they are safe for flight and add value for the industry. For more than 30 years, flight test data engineers at the Flight Test Center Division have worked together with the air crew (test pilots, co-pilots, and flight test engineers) to execute flight test activities with standard procedures for both existing and developmental test techniques and test data analysis. In this paper, the mathematical model approximation, data reduction, and flight test technique of the in-flight pitot-static calibration using the radio altimeter as reference are described, and the test results are compared with other methods, i.e., the Global Positioning System (GPS) method and the traditional tower fly-by method, which were used previously during this flight test program (Ref. [10]). The case uses CN235-100 MPA flight test data from the development and qualification flight test program at Cazaux Airport, France, in June-November 2009 (Ref. [2]).

  3. Simultaneous quantitation of 14 active components in Yinchenhao decoction with an ultrahigh performance liquid chromatography-diode array detector: Method development and ingredient analysis of different commonly prepared samples.

    PubMed

    Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin

    2016-11-01

    J. Sep. Sci. 2016, 39, 4147-4157 DOI: 10.1002/jssc.201600284 Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, prescribed by Zhang Zhongjing during 150-219 AD. A novel quantitative analysis method based on ultrahigh performance liquid chromatography coupled with a diode array detector was developed for the simultaneous determination of 14 main active components in Yinchenhao decoction. The method was then applied to analyze compositional differences in the 14 components across eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. This research could help hospitals, factories, and laboratories choose the best way to prepare Yinchenhao decoction for better efficacy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-and-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting the development process along with the estimation method itself.

  5. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  6. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  7. ANALYSIS OF SWINE LAGOONS AND GROUND WATER FOR ENVIRONMENTAL ESTROGENS

    EPA Science Inventory

    A method was developed for analysis of low levels of natural (estradiol, estrone, estriol) and synthetic (ethynylestradiol) estrogens in ground water and swine waste lagoon effluent. The method includes solid phase extraction of the estrogens, preparation of pentafluorobenzyl der...

  8. ANALYSIS OF ALDEHYDES AND KETONES IN THE GAS PHASE

    EPA Science Inventory

    The development and testing of a 2,4-dinitrophenylhydrazine-acetonitrile (DNPH-ACN) method for the analysis of aldehydes and ketones in ambient air are described. A discussion of interferences, preparation of calibration standards, analytical testing, fluorescence methods and car...

  9. MIXING QUANTIFICATION BY VISUAL IMAGING ANALYSIS

    EPA Science Inventory

    This paper reports on development of a method for quantifying two measures of mixing, the scale and intensity of segregation, through flow visualization, video recording, and software analysis. This non-intrusive method analyzes a planar cross section of a flowing system from an ...

  10. Research Trends in Evidence-Based Medicine: A Joinpoint Regression Analysis of More than 50 Years of Publication Data

    PubMed Central

    Hung, Bui The; Long, Nguyen Phuoc; Hung, Le Phi; Luan, Nguyen Thien; Anh, Nguyen Hoang; Nghi, Tran Diem; Van Hieu, Mai; Trang, Nguyen Thi Huyen; Rafidinarivo, Herizo Fabien; Anh, Nguyen Ky; Hawkes, David; Huy, Nguyen Tien; Hirayama, Kenji

    2015-01-01

    Background Evidence-based medicine (EBM) has developed as the dominant paradigm of assessment of evidence that is used in clinical practice. Since its development, EBM has been applied to integrate the best available research into diagnosis and treatment with the purpose of improving patient care. In the EBM era, a hierarchy of evidence has been proposed, including various types of research methods, such as meta-analysis (MA), systematic review (SRV), randomized controlled trial (RCT), case report (CR), practice guideline (PGL), and so on. Although there are numerous studies examining the impact and importance of specific cases of EBM in clinical practice, there is a lack of research quantitatively measuring publication trends in the growth and development of EBM. Therefore, a bibliometric analysis was conducted to determine the scientific productivity of EBM research over decades. Methods The NCBI PubMed database was used to search, retrieve and classify publications according to research method and year of publication. Joinpoint regression analysis was undertaken to analyze trends in research productivity and the prevalence of individual research methods. Findings The analysis indicates that MA and SRV, which are classified as the highest rank of evidence in EBM, accounted for a relatively small but promising number of publications. For most research methods, the annual percent change (APC) indicates a consistent increase in publication frequency. MA, SRV and RCT show the highest rate of publication growth in the past twenty years. Only controlled clinical trials (CCT) show a non-significant reduction in publications over the past ten years. Conclusions Higher quality research methods, such as MA, SRV and RCT, are showing continuous publication growth, which suggests an acknowledgement of the value of these methods. This study provides the first quantitative assessment of research method publication trends in EBM. PMID:25849641
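    Within a single joinpoint segment, the annual percent change statistic reduces to a log-linear fit of counts against year, APC = (e^slope − 1) × 100. A minimal sketch with synthetic counts (not the authors' analysis pipeline):

```python
import numpy as np

def annual_percent_change(years, counts):
    """Log-linear trend of counts over years; APC = (e^slope - 1) * 100."""
    slope, _ = np.polyfit(np.asarray(years, float),
                          np.log(np.asarray(counts, float)), 1)
    return (np.exp(slope) - 1.0) * 100.0

# Synthetic publication counts growing 5% per year recover an APC of 5.
years = list(range(2000, 2011))
counts = [100 * 1.05 ** (y - 2000) for y in years]
print(round(annual_percent_change(years, counts), 2))  # 5.0
```

    Joinpoint software additionally searches for the change-points at which the slope (and thus the APC) shifts significantly.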

  11. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    ERIC Educational Resources Information Center

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  12. Development of indirect EFBEM for radiating noise analysis including underwater problems

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun

    2013-09-01

    For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is an analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). Fundamental solutions representing the spherical wave properties of radiating noise problems in the open field, and accounting for the free-surface effect underwater, are developed. A directivity factor is also developed to express the directivity patterns of waves in medium-to-high frequency ranges. Indirect EFBEM using the fundamental solutions and fictitious sources was applied successfully to open-field and underwater noise problems. Through numerical applications, the acoustic energy density distributions due to vibration of a simple plate model and a sphere model were compared with those of a commercial code, and the comparison showed good agreement in the level and pattern of the energy density distributions.

  13. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
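    The measurement-and-reconstruction loop common to single-pixel techniques can be sketched as follows; this toy example uses random illumination patterns and least-squares inversion, and is only a schematic of the frame-theoretic treatment in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                      # pixels in the (flattened) scene
x = rng.random(n)           # the unknown scene

# A single-element detector records one inner product of the scene with each
# illumination pattern; stacking the patterns gives the measurement operator.
patterns = rng.standard_normal((n, n))   # complete, well-conditioned pattern set
y = patterns @ x                         # one scalar measurement per pattern

# With a complete pattern set ("frame"), least squares inverts the operator.
x_hat = np.linalg.lstsq(patterns, y, rcond=None)[0]
print(np.allclose(x_hat, x))  # True
```

    Frame theory characterizes how conditioning of the pattern set controls noise amplification when fewer or noisier measurements are used.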

  14. Development and validation of a multiclass method for the quantification of veterinary drug residues in honey and royal jelly by liquid chromatography-tandem mass spectrometry.

    PubMed

    Jin, Yue; Zhang, Jinzhen; Zhao, Wen; Zhang, Wenwen; Wang, Lin; Zhou, Jinhui; Li, Yi

    2017-04-15

    The aim of this study was to develop an analytical method for the analysis of a wide range of veterinary drugs in honey and royal jelly. A modified sample preparation procedure based on the quick, easy, cheap, effective, rugged and safe (QuEChERS) method was developed, followed by liquid chromatography tandem mass spectrometry determination. Use of a single sample preparation method for the analysis of 42 veterinary drugs is particularly valuable because honey and royal jelly are completely different complex matrices. Another main advantage of the proposed method is its ability to identify and quantify 42 veterinary drugs with higher sensitivity than China's reference methods. This work demonstrated that the reported method is convenient and reliable for the quick monitoring of veterinary drugs in honey and royal jelly samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
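    The quantification step common to such LC-MS/MS methods, back-calculating a concentration from a linear calibration curve, can be sketched as follows (the standards and responses below are hypothetical, not the authors' data):

```python
import numpy as np

def quantify(peak_area, cal_conc, cal_area):
    """Back-calculate concentration from a linear calibration fit."""
    slope, intercept = np.polyfit(cal_conc, cal_area, 1)
    return (peak_area - intercept) / slope

# Hypothetical calibration standards (ng/g) and simulated instrument responses.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
area = 200.0 * conc + 30.0          # perfectly linear synthetic response
print(round(quantify(10030.0, conc, area), 2))  # 50.0
```

    Validated methods additionally check linearity, recovery, and matrix effects per matrix, which is why separate curves are typically run for honey and royal jelly.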

  15. Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The data gathered from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.

  16. Boosting Sensitivity in Liquid Chromatography–Fourier Transform Ion Cyclotron Resonance–Tandem Mass Spectrometry for Product Ion Analysis of Monoterpene Indole Alkaloids

    PubMed Central

    Nakabayashi, Ryo; Tsugawa, Hiroshi; Kitajima, Mariko; Takayama, Hiromitsu; Saito, Kazuki

    2015-01-01

    In metabolomics, the analysis of product ions in tandem mass spectrometry (MS/MS) is a noteworthy way to chemically assign structural information. However, the development of relevant analytical methods is less advanced. Here, we developed a method to boost sensitivity in liquid chromatography–Fourier transform ion cyclotron resonance–tandem mass spectrometry analysis (MS/MS boost analysis). To verify the MS/MS boost analysis, both quercetin and uniformly 13C-labeled quercetin were analyzed, revealing that the product ions originate not from the instrument but from the analyzed compounds, resulting in sensitive product ions. Next, we applied this method to the analysis of monoterpene indole alkaloids (MIAs). The comparative analyses of MIAs having an indole basic skeleton (ajmalicine, catharanthine, hirsuteine, and hirsutine) and an oxindole skeleton (formosanine, isoformosanine, pteropodine, isopteropodine, rhynchophylline, isorhynchophylline, and mitraphylline) identified 86 and 73 common monoisotopic ions, respectively. The comparative analyses of the three pairs of stereoisomers showed more than 170 common monoisotopic ions in each pair. This method was also applied to the targeted analysis of MIAs in Catharanthus roseus and Uncaria rhynchophylla to profile indole and oxindole compounds using the product ions. This analysis is suitable for chemically assigning features of the metabolite groups, which contributes to targeted metabolome analysis. PMID:26734034
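    The comparison underlying the common-ion counts above, finding product ions shared by two spectra within a mass tolerance, can be sketched as follows (illustrative m/z values, not the measured MIA spectra):

```python
def common_ions(spec_a, spec_b, ppm=5.0):
    """Product ions (m/z) of spec_a that match an ion in spec_b within a ppm tolerance."""
    shared = []
    for mz_a in spec_a:
        tol = mz_a * ppm * 1e-6                      # absolute tolerance at this m/z
        if any(abs(mz_a - mz_b) <= tol for mz_b in spec_b):
            shared.append(mz_a)
    return shared

# Two hypothetical product-ion lists sharing two fragments.
a = [144.0808, 158.0964, 352.1907]
b = [144.0810, 200.1070, 352.1910]
print(common_ions(a, b))  # [144.0808, 352.1907]
```

    The high resolving power of FT-ICR instruments is what makes such tight ppm tolerances meaningful.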

  17. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
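    The resistance-capacitance analogy for vacuum flow can be illustrated with a toy two-chamber network integrated explicitly; the volumes and conductances below are invented, and this is a schematic of the R-C idea, not the SINDA'85/FLUINT model itself:

```python
import numpy as np

# Two chambers vented in series to a vacuum sink through tube conductances.
V = np.array([50.0, 20.0])     # chamber volumes (capacitances), liters
C = np.array([5.0, 2.0])       # conductances (1/resistance), L/s: 0->1, 1->sink
p = np.array([760.0, 760.0])   # initial pressures, torr

dt = 0.01                      # s; small versus the V/C time constants
for _ in range(int(200 / dt)): # march 200 s of pump-down
    q01 = C[0] * (p[0] - p[1])          # throughput between chambers, torr*L/s
    q1s = C[1] * (p[1] - 0.0)           # throughput to the vacuum sink
    p = p + dt * np.array([-q01 / V[0], (q01 - q1s) / V[1]])

print(p[0] < 760.0 and p[1] < p[0])  # True: both pump down, downstream chamber lower
```

    In the real model the conductances are pressure-dependent, switching between viscous, transition, and molecular flow correlations as the gas rarefies.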

  18. Bio++: a set of C++ libraries for sequence analysis, phylogenetics, molecular evolution and population genetics.

    PubMed

    Dutheil, Julien; Gaillard, Sylvain; Bazin, Eric; Glémin, Sylvain; Ranwez, Vincent; Galtier, Nicolas; Belkhir, Khalid

    2006-04-04

    A large number of bioinformatics applications in the fields of bio-sequence analysis, molecular evolution and population genetics typically share input/output methods, data storage requirements and data analysis algorithms. Such common features may be conveniently bundled into re-usable libraries, which enable the rapid development of new methods and robust applications. We present Bio++, a set of Object Oriented libraries written in C++. Available components include classes for data storage and handling (nucleotide/amino-acid/codon sequences, trees, distance matrices, population genetics datasets), various input/output formats, basic sequence manipulation (concatenation, transcription, translation, etc.), phylogenetic analysis (maximum parsimony, Markov models, distance methods, likelihood computation and maximization), population genetics/genomics (diversity statistics, neutrality tests, various multi-locus analyses) and various algorithms for numerical calculus. The implementation of methods aims to be both efficient and user-friendly. Special concern was given to the library design to enable easy extension and new method development. We defined a general hierarchy of classes that allows developers to implement their own algorithms while remaining compatible with the rest of the libraries. Bio++ source code is distributed free of charge under the CeCILL general public licence from its website http://kimura.univ-montp2.fr/BioPP.
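    Bio++ itself is a set of C++ libraries; as a language-neutral illustration of one of the listed tasks, building a pairwise distance matrix from aligned sequences, here is a p-distance sketch in Python (toy sequences, not the Bio++ API):

```python
def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Three short, hypothetical aligned sequences.
seqs = {"s1": "ACGTACGT", "s2": "ACGTACGA", "s3": "TCGTACGA"}
names = sorted(seqs)
matrix = [[p_distance(seqs[i], seqs[j]) for j in names] for i in names]
print(matrix[0][1])  # s1 vs s2 differ at 1 of 8 sites -> 0.125
```

    Distance-based phylogenetic methods (also provided by Bio++) then build a tree from such a matrix, typically after correcting p-distances with a substitution model.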

  19. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  20. Low-Impact Development Design—Integrating Suitability Analysis and Site Planning For Reduction Of Post-Development Stormwater Quantity

    EPA Science Inventory

    A land-suitability analysis (LSA) was integrated with open-space conservation principles, based on watershed physiographic and soil characteristics, to derive a low-impact development (LID) residential plan for a three hectare site in Coshocton OH, USA. The curve number method wa...

  1. Using Multidimensional Methods to Understand the Development, Interpretation and Enactment of Quality Assurance Policy within the Educational Development Community

    ERIC Educational Resources Information Center

    Smith, Karen

    2018-01-01

    Policy texts are representations of practice that both reflect and shape the world around them. There is, however, little higher education research that critically analyses the impact of higher education policy on educational developers and educational development practice. Extending methods from critical discourse analysis by combining textual…

  2. Guidelines for Analysis of Health Facilities Planning in Developing Countries. Volume 5: Health Facilities Planning. International Health Planning Methods Series.

    ERIC Educational Resources Information Center

    Porter, Dennis R.; And Others

    Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this fifth of ten manuals in the International Health Planning Methods Series deals with health facilities planning in developing countries. While several specific…

  3. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances from the published ones by introducing the contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those using the benchmark and has better prediction performance. PMID:24395534
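    The idea of a contrast penalty that smooths regression coefficients across datasets can be illustrated with a simplified two-dataset ridge variant solved in closed form; this sketch uses synthetic data, ridge rather than the authors' sparse penalties, and a direct solve rather than coordinate descent, so it only conveys the smoothing effect:

```python
import numpy as np

def fused_ridge(X1, y1, X2, y2, lam=0.1, gam=10.0):
    """Joint ridge for two datasets with a contrast penalty gam*||b1 - b2||^2
    that shrinks the per-dataset coefficient vectors toward each other."""
    p = X1.shape[1]
    I = np.eye(p)
    # Normal equations of the coupled objective, as one block system.
    A = np.block([[X1.T @ X1 + (lam + gam) * I, -gam * I],
                  [-gam * I, X2.T @ X2 + (lam + gam) * I]])
    rhs = np.concatenate([X1.T @ y1, X2.T @ y2])
    b = np.linalg.solve(A, rhs)
    return b[:p], b[p:]

rng = np.random.default_rng(1)
X1, X2 = rng.standard_normal((40, 3)), rng.standard_normal((40, 3))
beta = np.array([1.0, 0.0, -1.0])      # truth shared across datasets
y1, y2 = X1 @ beta, X2 @ beta          # noiseless for clarity
b1, b2 = fused_ridge(X1, y1, X2, y2)
print(np.allclose(b1, b2, atol=0.05))  # True: the contrast penalty pulls estimates together
```

    Under the heterogeneity model, setting gam to an intermediate value lets the estimates differ where the data demand it while still borrowing strength across studies.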

  4. Development of Composite Materials with High Passive Damping Properties

    DTIC Science & Technology

    2006-05-15

    Composite materials with high passive damping properties were developed and characterized using frequency response function analysis. Sound transmission through sandwich panels was studied using statistical energy analysis (SEA). Topics covered include modal density, finite element models, the statistical energy analysis method, and the analysis of damping in sandwich materials, including the face sheets and the core. Finite element models are generally only efficient for problems at low and middle frequencies.

  5. Fuzzy Decision Analysis for Integrated Environmental Vulnerability Assessment of the Mid-Atlantic Region

    Treesearch

    Liem T. Tran; C. Gregory Knight; Robert V. O' Neill; Elizabeth R. Smith; Kurt H. Riitters; James D. Wickham

    2002-01-01

    A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams,...
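    The AHP component of such a method derives indicator weights from a pairwise-comparison matrix, conventionally via its principal eigenvector; a sketch with a hypothetical three-indicator comparison matrix (not the study's actual indicators):

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix via power
    iteration, normalized to sum to 1; these are the AHP priority weights."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w = w / w.sum()
    return w

# Hypothetical judgments: A is 3x as important as B, 5x as C; B is 2x C.
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(M)
print(w.argmax())  # 0: indicator A receives the largest weight
```

    In a full AHP application one would also compute the consistency ratio of the judgment matrix; the fuzzy ranking step then operates on the weighted indicators.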

  6. Computational Methods Development at Ames

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis and design capabilities. Current thrusts of the Ames research include: 1) methods to enhance and accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and flow physics studies. The presentation gives historical precedents for the above research and speculates on its future course.

  7. Chromatographic immunoassays: strategies and recent developments in the analysis of drugs and biological agents

    PubMed Central

    Matsuda, Ryan; Rodriguez, Elliott; Suresh, Doddavenkatanna; Hage, David S

    2015-01-01

    A chromatographic immunoassay is a technique in which an antibody or antibody-related agent is used as part of a chromatographic system for the isolation or measurement of a specific target. Various binding agents, detection methods, supports and assay formats have been developed for this group of methods, and applications have been reported that range from drugs, hormones and herbicides to peptides, proteins and bacteria. This review discusses the general principles and applications of chromatographic immunoassays, with an emphasis being given to methods and formats that have been developed for the analysis of drugs and biological agents. The relative advantages or limitations of each format are discussed. Recent developments and research in this field, as well as possible future directions, are also considered. PMID:26571109

  8. Status and Prospects for Developing Electromagnetic Methods and Facilities for Engineer Reconnaissance in Russia

    NASA Astrophysics Data System (ADS)

    Potekaev, A. I.; Donchenko, V. A.; Zambalov, S. D.; Parvatov, G. N.; Smirnov, I. M.; Svetlichnyi, V. A.; Yakubov, V. P.; Yakovlev, I. A.

    2018-03-01

    An analysis of the most effective methods, techniques, and scientific research developments in induction mine detectors is performed, their comparative tactical and technical characteristics are reported, and priority avenues for further research are outlined.

  9. Eleventh NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1983-01-01

    NASTRAN (NASA Structural Analysis) is a large, comprehensive, nonproprietary, general-purpose finite element computer code for structural analysis which was developed under NASA sponsorship. The Eleventh Colloquium provides comprehensive general papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  10. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

A recently developed algorithm for nonlinear system performance analysis has been applied to an F-16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has the potential to be much more efficient than current methods of aircraft performance analysis. This paper is the initial step in evaluating that potential.

  11. An instructional guide for leaf color analysis using digital imaging software

    Treesearch

    Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg

    2005-01-01

    Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH image public domain software to quantify leaf color. This...

  12. Manipulating Ratio Spectra for the Spectrophotometric Analysis of Diclofenac Sodium and Pantoprazole Sodium in Laboratory Mixtures and Tablet Formulation

    PubMed Central

    Bhatt, Nejal M.; Chavada, Vijay D.; Sanyal, Mallika; Shrivastav, Pranav S.

    2014-01-01

Objective. Three sensitive, selective, and precise spectrophotometric methods based on manipulation of ratio spectra have been developed and validated for the determination of diclofenac sodium and pantoprazole sodium. Materials and Methods. The first method is based on peak-to-peak measurement of the ratio spectra using the amplitudes at 251 and 318 nm; the second involves the first derivative of the ratio spectra (Δλ = 4 nm) using the peak amplitudes at 326.0 nm for diclofenac sodium and 337.0 nm for pantoprazole sodium. The third is the method of mean centering of ratio spectra, using the values at 318.0 nm for both analytes. Results. All three methods were linear over the concentration range of 2.0–24.0 μg/mL for diclofenac sodium and 2.0–20.0 μg/mL for pantoprazole sodium. The methods were validated according to the ICH guidelines, and accuracy, precision, repeatability, and robustness were found to be within acceptable limits. The results of single-factor ANOVA indicated no significant difference among the developed methods. Conclusions. The developed methods provide simple resolution of this binary combination from laboratory mixtures and pharmaceutical preparations and can be conveniently adopted for routine quality control analysis. PMID:24701171
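
The ratio-spectra manipulations described in this abstract can be sketched numerically. The following is a minimal illustration using synthetic Gaussian absorbance bands rather than the published diclofenac/pantoprazole spectra; only the generic steps (division by a divisor spectrum of one analyte, then differentiation to cancel its constant contribution) follow the abstract, and all band positions and intensities are invented.

```python
import numpy as np

# Synthetic absorbance spectra on a common wavelength grid (illustrative only).
wavelengths = np.linspace(240, 360, 601)

def gaussian(center, width, height):
    return height * np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

mixture = gaussian(276, 12, 0.8) + gaussian(295, 15, 0.5)   # two overlapping analytes
divisor = gaussian(295, 15, 1.0)                            # standard spectrum of analyte 2

# Ratio spectrum: mixture divided by the divisor spectrum of one analyte.
ratio = mixture / np.clip(divisor, 1e-6, None)

# First derivative of the ratio spectrum, which cancels the now-constant
# contribution of the divisor analyte and isolates the other analyte.
d_ratio = np.gradient(ratio, wavelengths)
```

In the central wavelength region the ratio spectrum is the first analyte's contribution plus a constant (here exactly 0.5), so differentiating leaves a signal that depends only on the first analyte, which is the basis of all three manipulations the abstract lists.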

  13. Equity in Pharmaceutical Pricing and Reimbursement: Crossing the Income Divide in Asia Pacific.

    PubMed

    Daems, Rutger; Maes, Edith; Glaetzer, Christoph

    2013-05-01

The article takes a three-dimensional approach (triangulation) to defining international pricing policy for pharmaceuticals, using cost-effectiveness analysis (CEA), willingness-to-pay (WTP) analysis, and ability-to-pay (ATP) analysis. It attempts to strike a balance among economic methods, some of which focus on effectiveness while others incorporate equity into the equation. A critical review of the first two established economic methods and their ability to evaluate not only "efficacy" but also "fairness" in pricing decisions identifies a gap in the latter. Therefore, a third analytic method is presented that measures ATP based on a country's score in the human development index of the United Nations Development Program for 120 countries. This approach allows differential pricing to be practiced among and within countries. To refine this equity-driven pricing concept, two additional parameters can be added to the model: the Oxford Multidimensional Poverty Index and the out-of-pocket ("self-pay") health expenditure reported by the World Bank. There is no hierarchy among the three pricing methods. Because one method provides further insight into the others, however, it is recommended to start with CEA followed by WTP analysis. These analyses are closely linked in that the first provides the cost-effectiveness ratio for the compound investigated and the second sets the anticipated ceiling threshold of the payer's WTP in a particular country. The ATP method provides a supplementary "equity" check, with the human development index as a yardstick, and facilitates equity-based differential pricing; it should be used in conjunction with standard CEA and WTP analysis to provide sustainable and equitable access to medicines. We recommend that ATP analysis become an additional practice in policy decision making and in defining international pricing strategies for pharmaceuticals.
Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. Development of a quantitative method for the analysis of cocaine analogue impregnated into textiles by Raman spectroscopy.

    PubMed

    Xiao, Linda; Alder, Rhiannon; Mehta, Megha; Krayem, Nadine; Cavasinni, Bianca; Laracy, Sean; Cameron, Shane; Fu, Shanlin

    2018-04-01

Cocaine trafficking in the form of textile impregnation is routinely encountered as a concealment method. Raman spectroscopy has been a popular and successful testing method for in situ screening of cocaine in textiles and other matrices, but quantitative analysis of cocaine in these matrices by Raman spectroscopy has not been reported to date. This study aimed to develop a simple Raman method for quantifying cocaine, using atropine as a model analogue, in various types of textiles. Textiles were impregnated with solutions of atropine in methanol. The impregnated atropine was extracted using less hazardous acidified water, with potassium thiocyanate (KSCN) added as an internal standard for Raman analysis. Despite the presence of background matrix signals arising from the textiles, the cocaine analogue could easily be identified by its characteristic Raman bands. The use of KSCN normalised the analyte signal response across different textile matrix backgrounds and thus removed the need for matrix-matched calibration. The method was linear over a concentration range of 6.25-37.5 mg/cm², with a coefficient of determination (R²) of 0.975 and acceptable precision and accuracy. A simple and accurate Raman spectroscopy method for the analysis and quantification of a cocaine analogue impregnated in textiles has thus been developed and validated for the first time. This proof-of-concept study has demonstrated that atropine can act as a model compound for studying cocaine impregnation in textiles, and the method has the potential to be further developed and implemented in real-world forensic casework. Copyright © 2017 John Wiley & Sons, Ltd.
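
The internal-standard normalisation at the heart of this method can be sketched as follows. All peak intensities, response factors, and matrix scaling factors below are invented for illustration and are not taken from the paper; the point is only that dividing the analyte signal by the internal-standard signal cancels a textile-dependent scaling factor, so a single calibration line serves all matrices.

```python
import numpy as np

# Hypothetical peak intensities: analyte band and KSCN internal-standard band,
# measured on textiles with different matrix backgrounds (synthetic numbers).
conc = np.array([6.25, 12.5, 18.75, 25.0, 31.25, 37.5])   # mg/cm^2, range from the abstract
matrix_gain = np.array([1.0, 0.8, 1.2, 0.9, 1.1, 1.05])   # textile-dependent signal scaling

analyte_peak = 2.0 * conc * matrix_gain   # analyte signal scales with the matrix
kscn_peak = 50.0 * matrix_gain            # internal standard scales the same way

# Normalising to the internal standard cancels the matrix factor.
ratio = analyte_peak / kscn_peak

# Ordinary least-squares calibration line: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, ratio, 1)
predicted = (ratio - intercept) / slope
```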

  15. ANALYSIS OF SWINE LAGOONS AND GROUND WATER FOR ENVIRONMENTAL ESTROGENS

    EPA Science Inventory

    A method was developed for analysis of low levels of natural (estradiol, estrone, estriol) and synthetic (ethinyl estradiol) estrogens in ground water and swine waste lagoon effluent. The method includes solid phase extraction of the estrogens, preparation of pentafluorobenzyl de...

  16. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt(1), T.R. Dean(1), A. Evans(2), and G. Byfield(2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several...

  17. COMPENDIUM OF SELECTED METHODS FOR SAMPLING AND ANALYSIS AT GEOTHERMAL FACILITIES

    EPA Science Inventory

    The establishment of generally accepted methods for characterizing geothermal emissions has been hampered by the independent natures of both geothermal industrial development and sampling/analysis procedures despite three workshops on the latter (Las Vegas 1975, 1977, 1980). An i...

  18. Radiation and scattering from printed antennas on cylindrically conformal platforms

    NASA Technical Reports Server (NTRS)

    Kempel, Leo C.; Volakis, John L.; Bindiganavale, Sunil

    1994-01-01

    The goal was to develop suitable methods and software for the analysis of antennas on cylindrical coated and uncoated platforms. Specifically, the finite element boundary integral and finite element ABC methods were employed successfully and associated software were developed for the analysis and design of wraparound and discrete cavity-backed arrays situated on cylindrical platforms. This work led to the successful implementation of analysis software for such antennas. Developments which played a role in this respect are the efficient implementation of the 3D Green's function for a metallic cylinder, the incorporation of the fast Fourier transform in computing the matrix-vector products executed in the solver of the finite element-boundary integral system, and the development of a new absorbing boundary condition for terminating the finite element mesh on cylindrical surfaces.

  19. [Improvement of 2-mercaptoimidazoline analysis in rubber products containing chlorine].

    PubMed

    Kaneko, Reiko; Haneishi, Nahoko; Kawamura, Yoko

    2012-01-01

An improved method for the analysis of 2-mercaptoimidazoline in rubber products containing chlorine was developed. In the official method, 2-mercaptoimidazoline (20 µg/mL) is detected by TLC with two developing solvents, but that method is not quantitative. Instead, we employed HPLC using water-methanol (9 : 1) as the mobile phase. This procedure decreased interfering peaks, and the quantitation limit was 2 µg/mL of standard solution. 2-Mercaptoimidazoline was confirmed by GC/MS (5 µg/mL) and LC/MS (1 µg/mL) in scan mode. To prepare the test solution, a soaking extraction method was used, in which 20 mL of methanol was added to the sample and allowed to stand overnight at about 40°C. This gave values similar to those of the Soxhlet extraction method (the official method) and was more convenient. The results indicate that our procedure is suitable for the analysis of 2-mercaptoimidazoline; when 2-mercaptoimidazoline is detected, it is confirmed by either GC/MS or LC/MS.

  20. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA*) [KANG90] approach in the late eighties... Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software... ese obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the

  1. The Health Services Mobility Study Method of Task Analysis and Curriculum Design. Research Report No. 11. Volume 3: Using the Computer to Develop Job Ladders.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    This document is volume 3 of a four-volume report which describes the components of the Health Services Mobility Study (HSMS) method of task analysis, job ladder design, and curriculum development. Divided into four chapters, volume 3 is a manual for using HSMS computer based statistical procedures to design job structures and job ladders. Chapter…

  2. Aerodynamic design and analysis system for supersonic aircraft. Part 1: General description and theoretical development

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.

    1975-01-01

    An integrated system of computer programs has been developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This part presents a general description of the system and describes the theoretical methods used.

  3. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  4. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  5. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens

We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these sub-models into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
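
The sub-model idea can be sketched with ordinary least squares standing in for the PLS regressions the ChemCam team uses. Everything below is a synthetic illustration, not the instrument's calibration: two invented composition classes with different channel responses stand in for matrix effects, one regression is trained per class, and a full-range model supplies a first-guess composition that sets the blending weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 5 channels whose response to the target element differs
# between two composition classes (a stand-in for matrix effects in LIBS).
def simulate(n, coeffs):
    conc = rng.uniform(0, 100, n)
    X = np.outer(conc, coeffs) + rng.normal(0, 0.5, (n, 5))
    return X, conc

coeffs_low = np.array([0.9, 0.2, 0.05, 0.4, 0.1])
coeffs_high = np.array([0.5, 0.6, 0.30, 0.1, 0.3])
X_low, y_low = simulate(200, coeffs_low)
X_high, y_high = simulate(200, coeffs_high)

def fit(X, y):
    # Least-squares regression with intercept (a stand-in for PLS).
    A = np.hstack([X, np.ones((len(X), 1))])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(beta, X):
    return np.hstack([X, np.ones((len(X), 1))]) @ beta

# One sub-model per limited composition range, plus a full-range model
# used only to decide how to blend the sub-model outputs.
beta_low, beta_high = fit(X_low, y_low), fit(X_high, y_high)
beta_full = fit(np.vstack([X_low, X_high]), np.concatenate([y_low, y_high]))

def blended_prediction(X, split=50.0, width=20.0):
    # Blend sub-model outputs with weights from the full-range first guess.
    guess = predict(beta_full, X)
    w_high = np.clip((guess - (split - width / 2)) / width, 0.0, 1.0)
    return (1 - w_high) * predict(beta_low, X) + w_high * predict(beta_high, X)
```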

  6. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE PAGES

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; ...

    2016-12-15

We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these sub-models into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.

  7. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

The component element method was used to develop a transient dynamic analysis computer program essentially based on modal synthesis combined with a central finite-difference numerical integration scheme. The methodology leads to a modular, building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.

  8. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin transferring the technology developed under the Probabilistic Structural Analysis Method (PSAM) effort into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.

  9. An inverse method for the aerodynamic design of three-dimensional aircraft engine nacelles

    NASA Technical Reports Server (NTRS)

    Bell, R. A.; Cedar, R. D.

    1991-01-01

A fast, efficient and user-friendly inverse design system for 3-D nacelles was developed. The system is a product of a 2-D inverse design method originally developed at NASA Langley and the CFL3D analysis code, which was also developed at NASA Langley and modified for nacelle analysis. The design system uses a predictor/corrector approach in which an analysis code calculates the flow field for an initial geometry, and the geometry is then modified based on the difference between the calculated and target pressures. A detailed discussion of the design method, the process of linking it to the modified CFL3D solver, and its extension to 3-D is presented, followed by a number of examples of the use of the design system for the design of both axisymmetric and 3-D nacelles.

  10. GSA-PCA: gene set generation by principal component analysis of the Laplacian matrix of a metabolic network

    PubMed Central

    2012-01-01

    Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
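
A toy version of the set-generation step can be written directly from the abstract: build the graph Laplacian of a metabolic network, eigendecompose it, and group nodes by their loadings on a leading component. The 6-node network and the 0.3 loading threshold below are illustrative assumptions, not the paper's data or tuning.

```python
import numpy as np

# Toy metabolic-network adjacency matrix (6 nodes, undirected, illustrative).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# Graph Laplacian L = D - A, where D is the diagonal degree matrix.
L = np.diag(A.sum(axis=1)) - A

# Principal components of the Laplacian via eigendecomposition; the leading
# eigenvectors capture the dominant topological structure of the network.
eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order

# Form a candidate gene set from the loadings of one leading component:
# nodes whose absolute loading exceeds a threshold go into the set.
component = eigvecs[:, -1]             # largest-eigenvalue component
gene_set = set(np.flatnonzero(np.abs(component) > 0.3))
```

Repeating the thresholding over several leading components yields the semi-exhaustive family of topology-aware sets that the abstract contrasts with simple pathway-based sets.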

  11. Proposed method for determining the thickness of glass in solar collector panels

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1980-01-01

    An analytical method was developed for determining the minimum thickness for simply supported, rectangular glass plates subjected to uniform normal pressure environmental loads such as wind, earthquake, snow, and deadweight. The method consists of comparing an analytical prediction of the stress in the glass panel to a glass breakage stress determined from fracture mechanics considerations. Based on extensive analysis using the nonlinear finite element structural analysis program ARGUS, design curves for the structural analysis of simply supported rectangular plates were developed. These curves yield the center deflection, center stress and corner stress as a function of a dimensionless parameter describing the load intensity. A method of estimating the glass breakage stress as a function of a specified failure rate, degree of glass temper, design life, load duration time, and panel size is also presented.
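
The stress-comparison logic of the report, reduced to a sketch: increase the plate thickness until an analytical stress estimate falls below the allowable breakage stress. The pressure, the allowable stress, and the closed-form small-deflection stress formula (with a fixed plate coefficient for an aspect ratio of 1.5) are all stand-in assumptions; the actual method uses nonlinear ARGUS design curves and a fracture-mechanics breakage stress tied to failure rate, temper, and load duration.

```python
# Hypothetical sizing loop for a simply supported rectangular glass plate.
def plate_center_stress(pressure, b, t):
    # Small-deflection estimate: sigma = beta * p * b^2 / t^2, with a
    # Roark-style coefficient beta for aspect ratio a/b = 1.5 (a stand-in
    # for the report's nonlinear ARGUS design curves).
    beta = 0.4872
    return beta * pressure * b**2 / t**2

pressure = 2400.0   # Pa, assumed design wind load
b = 1.0             # m, short-side dimension of the plate
allowable = 25e6    # Pa, assumed breakage stress for the target failure rate

t = 0.003           # m, starting thickness
while plate_center_stress(pressure, b, t) > allowable:
    t += 0.001      # step up to the next candidate thickness
```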

  12. Dynamic Pressure Distribution due to Horizontal Acceleration in Spherical LNG Tank with Cylindrical Central Part

    NASA Astrophysics Data System (ADS)

    Ko, Dae-Eun; Shin, Sang-Hoon

    2017-11-01

    Spherical LNG tanks having many advantages such as structural safety are used as a cargo containment system of LNG carriers. However, it is practically difficult to fabricate perfectly spherical tanks of different sizes in the yard. The most effective method of manufacturing LNG tanks of various capacities is to insert a cylindrical part at the center of existing spherical tanks. While a simplified high-precision analysis method for the initial design of the spherical tanks has been developed for both static and dynamic loads, in the case of spherical tanks with a cylindrical central part, the analysis method available only considers static loads. The purpose of the present study is to derive the dynamic pressure distribution due to horizontal acceleration, which is essential for developing an analysis method that considers dynamic loads as well.

  13. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    PubMed

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

Most analytical methods for butyltins are based on high-resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using High Performance Liquid Chromatography (HPLC) with UV detection to determine tributyltin (TBT), dibutyltin (DBT) and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed against standard GC/MS techniques and verified by a statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) for TBT and DBT were 0.70 and 0.50 microg/mL, respectively. The optimised extraction method for butyltins in water and sediment samples used hexane containing 0.05-0.5% tropolone and 0.2% sodium chloride in water at pH 1.7. Quantitative extraction of butyltin compounds from a certified reference material (BCR-646) and naturally contaminated samples was achieved, with recoveries ranging from 95 to 108% and RSDs of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the level of butyltin contamination in environmental samples collected from the Forth and Clyde canal, Scotland, UK; the values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high-resolution methods are used extensively in this field, the developed method is cheaper in terms of both equipment and running costs, is faster in analysis time, and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.

  14. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  15. Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.

    PubMed

    Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K

    2018-02-01

Type 2 diabetes drug tablets of various brands containing voglibose at dose strengths of 0.2 and 0.3 mg have been examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods such as principal component analysis (PCA) and partial least squares regression (PLSR) were employed on the LIBS spectral data for classifying the drug samples and developing calibration models. We have developed a ratio-based calibration model, applying PLSR to the relative spectral intensity ratios H/C, H/N and O/N, and used it to predict the relative concentrations of elements in unknown drug samples. The experiment was performed in air and argon atmospheres, and the results obtained were compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement in a wide variety of pharmaceutical industrial applications.

  16. The application of computer image analysis in life sciences and environmental engineering

    NASA Astrophysics Data System (ADS)

    Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.

    2014-04-01

The main aim of the article was to present research on the application of computer image analysis in Life Science and Environmental Engineering. The authors used different methods of computer image analysis in the development of an innovative biotest for modern biomonitoring of water quality. The tools created were based on living organisms, namely the bioindicators Lemna minor L. and Hydra vulgaris Pallas, combined with computer image analysis of the organisms' adverse reactions during exposure to selected water toxicants. All of these methods belong to the class of acute toxicity tests and are particularly important in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture, in the study of the adverse effects on water quality of various compounds used in agriculture and industry.

  17. Methods for network meta-analysis of continuous outcomes using individual patient data: a case study in acupuncture for chronic pain.

    PubMed

    Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh

    2016-10-06

Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis, and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. It introduces network meta-analysis of individual patient data models for continuous outcomes using the analysis of covariance framework, and compares this approach with the change-score and final-score-only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised; consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing instead analysis of covariance models to be applied, improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated as the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methodological development is required to address the challenge of analysing aggregate-level data in the presence of baseline imbalance.
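
The contrast the paper draws between analysis of covariance and change-score models can be sketched on synthetic data for a single two-arm trial (the effect size, variances, and single-trial setup below are invented; the paper synthesises 28 real trials in a network). ANCOVA regresses the follow-up score on treatment and baseline; the change-score model regresses the follow-up-minus-baseline difference on treatment alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-arm trial: baseline pain score, true treatment effect of -2 points.
n = 500
baseline = rng.normal(60, 10, n)
treat = rng.integers(0, 2, n)                # randomised 0/1 allocation
followup = 0.6 * baseline + 24.0 - 2.0 * treat + rng.normal(0, 5, n)

def ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

ones = np.ones(n)

# Analysis of covariance: follow-up regressed on treatment AND baseline.
beta_ancova = ols(np.column_stack([ones, treat, baseline]), followup)

# Change-score analysis: (follow-up - baseline) regressed on treatment only.
beta_change = ols(np.column_stack([ones, treat]), followup - baseline)

effect_ancova = beta_ancova[1]
effect_change = beta_change[1]
```

Both estimators are unbiased under randomisation, but the ANCOVA residuals are smaller whenever the baseline-to-follow-up correlation differs from one, which is the precision argument the paper makes.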

  18. Needle Trap Device as a New Sampling and Preconcentration Approach for Volatile Organic Compounds of Herbal Medicines and its Application to the Analysis of Volatile Components in Viola tianschanica.

    PubMed

    Qin, Yan; Pang, Yingming; Cheng, Zhihong

    2016-11-01

    The needle trap device (NTD) technique is a new microextraction method for sampling and preconcentration of volatile organic compounds (VOCs). Previous NTD studies predominantly focused on analysis of environmental volatile compounds in the gaseous and liquid phases. Little work has been done on its potential application to biological samples, and no work has been reported on analysis of bioactive compounds in essential oils from herbal medicines. The main purpose of the present study is to develop an NTD sampling method for profiling VOCs in biological samples using herbal medicines as a case study. A combined method of NTD sample preparation and gas chromatography-mass spectrometry was developed for qualitative analysis of VOCs in Viola tianschanica. A 22-gauge stainless steel, triple-bed needle packed with Tenax, Carbopack X and Carboxen 1000 sorbents was used for analysis of VOCs in the herb. Furthermore, different parameters affecting the extraction efficiency and capacity were studied. The peak capacity obtained by NTDs was 104, higher than those of static headspace (46) and hydrodistillation (93). This NTD method shows potential to trap a wide range of VOCs including both lower and higher volatile components, while static headspace and hydrodistillation detect only lower volatile components, and semi-volatile and higher volatile components, respectively. The developed NTD sample preparation method is a more rapid, simple, convenient, and sensitive extraction/desorption technique for analysis of VOCs in herbal medicines than conventional methods such as static headspace and hydrodistillation. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbines; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  20. Evaluation of next generation sequencing for the analysis of Eimeria communities in wildlife.

    PubMed

    Vermeulen, Elke T; Lott, Matthew J; Eldridge, Mark D B; Power, Michelle L

    2016-05-01

    Next-generation sequencing (NGS) techniques are well-established for studying bacterial communities but not yet for microbial eukaryotes. Parasite communities remain poorly studied, due in part to the lack of reliable and accessible molecular methods to analyse eukaryotic communities. We aimed to develop and evaluate a methodology to analyse communities of the protozoan parasite Eimeria from populations of the Australian marsupial Petrogale penicillata (brush-tailed rock-wallaby) using NGS. An oocyst purification method for small sample sizes and polymerase chain reaction (PCR) protocol for the 18S rRNA locus targeting Eimeria was developed and optimised prior to sequencing on the Illumina MiSeq platform. A data analysis approach was developed by modifying methods from bacterial metagenomics and utilising existing Eimeria sequences in GenBank. Operational taxonomic unit (OTU) assignment at a high similarity threshold (97%) was more accurate at assigning Eimeria contigs into Eimeria OTUs, but at a lower threshold (95%) there was greater resolution between OTU consensus sequences. The assessment of two amplification PCR methods prior to Illumina MiSeq, single and nested PCR, determined that single PCR was more sensitive to Eimeria, as more Eimeria OTUs were detected in single amplicons. We have developed a simple and cost-effective approach to a data analysis pipeline for community analysis of eukaryotic organisms using Eimeria communities as a model. The pipeline provides a basis for evaluation using other eukaryotic organisms and potential for diverse community analysis studies. Copyright © 2016 Elsevier B.V. All rights reserved.
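    The threshold-based OTU assignment step can be sketched as a greedy clustering pass. This is a toy stand-in, not the study's pipeline: real analyses use alignment-based identity on 18S amplicons, whereas difflib's similarity ratio is used here only to keep the sketch dependency-free.

```python
# Greedy OTU assignment sketch: each read joins the first existing OTU whose
# seed sequence it matches at or above the similarity threshold, otherwise it
# seeds a new OTU. Reads below are invented toy sequences, not real amplicons.

from difflib import SequenceMatcher

def assign_otus(seqs, threshold=0.97):
    """Return a list of OTUs, each a list of member sequences."""
    otus = []   # each OTU is represented by its first (seed) sequence
    for s in seqs:
        for otu in otus:
            if SequenceMatcher(None, otu[0], s).ratio() >= threshold:
                otu.append(s)
                break
        else:
            otus.append([s])
    return otus

reads = [
    "ACGTACGTACGTACGTACGTACGTACGTACGTACGT",  # seed of OTU 1
    "ACGTACGTACGTACGTACGTACGTACGTACGTACGA",  # one mismatch: joins OTU 1 at 97%
    "TTTTGGGGCCCCAAAATTTTGGGGCCCCAAAATTTT",  # dissimilar: seeds OTU 2
]
clusters = assign_otus(reads, threshold=0.97)
print(len(clusters))              # 2 OTUs
print([len(c) for c in clusters])  # sizes [2, 1]
```

    Lowering the threshold to 0.95, as the study compares, merges more divergent reads into fewer, broader OTUs.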

  1. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in the Langley Integrated Fluid-Thermal-Structural (LIFTS) analyzer. The evolution and status of LIFTS are reviewed and illustrated through applications.

  2. MDAS: an integrated system for metabonomic data analysis.

    PubMed

    Liu, Juan; Li, Bo; Xiong, Jiang-Hui

    2009-03-01

    Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process this data to yield useful information about a biological system, e.g., the mechanism of diseases. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, some methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis which can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool MDAS (Metabonomic Data Analysis System) for metabonomic data analysis which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionalities can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification which can be a powerful function for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.

  3. Geometrical optics analysis of the structural imperfection of retroreflection corner cubes with a nonlinear conjugate gradient method.

    PubMed

    Kim, Hwi; Min, Sung-Wook; Lee, Byoungho

    2008-12-01

    Geometrical optics analysis of the structural imperfection of retroreflection corner cubes is described. In the analysis, a geometrical optics model of six-beam reflection patterns generated by an imperfect retroreflection corner cube is developed, and its structural error extraction is formulated as a nonlinear optimization problem. The nonlinear conjugate gradient method is employed for solving the nonlinear optimization problem, and its detailed implementation is described. The proposed method of analysis is a mathematical basis for the nondestructive optical inspection of imperfectly fabricated retroreflection corner cubes.
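    The optimizer itself can be sketched independently of the optics. Below is a hedged implementation of the Fletcher-Reeves nonlinear conjugate gradient iteration with a backtracking line search, applied to a simple smooth test function rather than the paper's reflection-pattern mismatch objective.

```python
# Fletcher-Reeves nonlinear conjugate gradient with Armijo backtracking and a
# steepest-descent restart safeguard. The objective below is an invented smooth
# test function with an analytic gradient, used only to exercise the iteration.

def fr_conjugate_gradient(f, grad, x0, iters=2000, tol=1e-12):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                       # start with steepest descent
    for _ in range(iters):
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0.0:                        # safeguard: restart on ascent direction
            d = [-gi for gi in g]
            slope = -sum(gi * gi for gi in g)
        t, fx = 1.0, f(x)                       # backtracking (Armijo) line search
        while t > 1e-12 and f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if sum(gi * gi for gi in g_new) < tol:  # gradient small: converged
            break
        beta = sum(gi * gi for gi in g_new) / sum(gi * gi for gi in g)  # FR update
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

f = lambda v: (v[0] - 1.0) ** 2 + 2.0 * (v[1] + 2.0) ** 2
grad = lambda v: [2.0 * (v[0] - 1.0), 4.0 * (v[1] + 2.0)]
xmin = fr_conjugate_gradient(f, grad, [0.0, 0.0])
print([round(c, 4) for c in xmin])   # close to the minimizer [1.0, -2.0]
```

    In the paper, the decision variables would be the corner cube's structural error parameters and the objective the mismatch with the measured six-beam pattern.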

  4. Finite element modeling of truss structures with frequency-dependent material damping

    NASA Technical Reports Server (NTRS)

    Lesieutre, George A.

    1991-01-01

    A physically motivated modelling technique for structural dynamic analysis that accommodates frequency dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (AFT) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The AFT method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.

  5. A simplified competition data analysis for radioligand specific activity determination.

    PubMed

    Venturino, A; Rivera, E S; Bergoc, R M; Caro, R A

    1990-01-01

    Non-linear regression and two-step linear fit methods were developed to determine the actual specific activity of 125I-ovine prolactin by radioreceptor self-displacement analysis. The experimental results obtained by the different methods are superimposable. The non-linear regression method is considered the most adequate procedure for calculating the specific activity, but if its software is not available, the other described methods are also suitable.

  6. An exact computational method for performance analysis of sequential test algorithms for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Lacy, Fred; Carriere, Patrick

    2015-05-01

    Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as port scanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and average detection time up to arbitrarily pre-specified accuracy.
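    As an illustration of the class of algorithms being analyzed, here is a hedged sketch of a Wald-style sequential probability ratio test of the kind portscan detectors apply per source host (in the spirit of threshold random walk detection). The success probabilities and error rates are invented, and this is not the paper's exact computational method.

```python
# Sequential probability ratio test sketch: each probe outcome updates a
# log-likelihood ratio until it crosses one of two decision thresholds.
# Benign hosts mostly succeed (p=0.8); scanners mostly fail (p=0.2).

import math

def sprt(outcomes, p_benign=0.8, p_scanner=0.2, alpha=0.01, beta=0.01):
    """Return ('scanner'|'benign'|'undecided', steps); outcomes[i]=1 on success."""
    upper = math.log((1 - beta) / alpha)      # cross above: declare 'scanner'
    lower = math.log(beta / (1 - alpha))      # cross below: declare 'benign'
    llr = 0.0
    for n, ok in enumerate(outcomes, 1):
        if ok:
            llr += math.log(p_scanner / p_benign)
        else:
            llr += math.log((1 - p_scanner) / (1 - p_benign))
        if llr >= upper:
            return "scanner", n
        if llr <= lower:
            return "benign", n
    return "undecided", len(outcomes)

print(sprt([0] * 10))   # repeated failed connections: flagged as a scanner
print(sprt([1] * 10))   # repeated successes: declared benign
```

    The paper's contribution is computing the false-alarm probability and average detection time of such a test exactly rather than by approximation.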

  7. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
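    The efficiency-weighted Cq idea can be sketched in a few lines. The group layout, efficiencies, and Cq values below are invented, and this simplified sketch normalizes against a single reference gene rather than several.

```python
# Efficiency-weighted Cq sketch: keep everything in log10 scale as log10(E)*Cq,
# normalize against a reference gene, and exponentiate only at the end.

import math

def weighted_cq(e, cq):
    """Efficiency-weighted Cq: log10(E) * Cq (E = 2 for perfect doubling)."""
    return math.log10(e) * cq

def log10_expression(e_t, cq_t, e_r, cq_r):
    """log10 of target expression relative to the reference gene in one sample."""
    return weighted_cq(e_r, cq_r) - weighted_cq(e_t, cq_t)

# (E, Cq) pairs: target gene then reference gene, per sample (invented values).
control = [((1.95, 24.0), (2.00, 20.0)), ((1.95, 24.4), (2.00, 20.3))]
treated = [((1.95, 22.0), (2.00, 20.1)), ((1.95, 22.2), (2.00, 20.0))]

def group_mean(samples):
    vals = [log10_expression(et, ct, er, cr) for (et, ct), (er, cr) in samples]
    return sum(vals) / len(vals)

log10_ratio = group_mean(treated) - group_mean(control)   # statistics stay in log scale
print(round(10 ** log10_ratio, 2))                        # fold change, exponentiated last
```

    All statistical testing would be applied to the log10-scale values; only the final fold change is exponentiated for reporting.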

  8. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  9. EnvironmentalWaveletTool: Continuous and discrete wavelet analysis and filtering for environmental time series

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Pla, C.; Fernandez-Cortes, A.; Cuezva, S.; Ortiz, J.; Benavente, D.

    2014-10-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of several environmental time series, particularly focused on the analyses of cave monitoring data. The continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform have been implemented to provide a fast and precise time-period examination of the time series at different period bands. Moreover, statistical methods to examine the relation between two signals have been included. Finally, entropy-of-curves and spline-based methods have also been developed for segmenting and modeling the analyzed time series. All these methods together provide a user-friendly and fast program for environmental signal analysis, with useful, practical and understandable results.
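    The period-band decomposition at the heart of such a tool can be illustrated with the simplest discrete wavelet. This dependency-free sketch is not the MATLAB tool (which supports many wavelet families and the continuous transform); it shows one Haar level splitting an invented series into a slow approximation band and a fast detail band.

```python
# One level of the Haar discrete wavelet transform: split a series into a
# smooth approximation (long-period content) and a detail band (short-period
# fluctuations), with exact reconstruction from the two bands.

def haar_step(x):
    """One Haar DWT level: (approximation, detail); len(x) must be even."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Rebuild the original series from the two bands."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

series = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]   # invented monitoring samples
a, d = haar_step(series)
print(a)   # slow (long-period) component
print(d)   # fast (short-period) fluctuations
```

    Filtering a period band amounts to zeroing one set of coefficients before reconstruction; repeating the step on the approximation gives coarser bands.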

  10. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    PubMed

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment unit conformed to expectations. Conclusion. It is suggested that this method should be further researched and popularized.

  11. Land cover mapping and change detection in urban watersheds using QuickBird high spatial resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Hester, David Barry

    The objective of this research was to develop methods for urban land cover analysis using QuickBird high spatial resolution satellite imagery. Such imagery has emerged as a rich commercially available remote sensing data source and has enjoyed high-profile broadcast news media and Internet applications, but methods of quantitative analysis have not been thoroughly explored. The research described here consists of three studies focused on the use of pan-sharpened 61-cm spatial resolution QuickBird imagery, the spatial resolution of which is the highest of any commercial satellite. In the first study, a per-pixel land cover classification method is developed for use with this imagery. This method utilizes a per-pixel classification approach to generate an accurate six-category high spatial resolution land cover map of a developing suburban area. The primary objective of the second study was to develop an accurate land cover change detection method for use with QuickBird land cover products. This work presents an efficient fuzzy framework for transforming map uncertainty into accurate and meaningful high spatial resolution land cover change analysis. The third study described here is an urban planning application of the high spatial resolution QuickBird-based land cover product developed in the first study. This work both meaningfully connects this exciting new data source to urban watershed management and makes an important empirical contribution to the study of suburban watersheds. Its analysis of residential roads and driveways as well as retail parking lots sheds valuable light on the impact of transportation-related land use on the suburban landscape. Broadly, these studies provide new methods for using state-of-the-art remote sensing data to inform land cover analysis and urban planning. These methods are widely adaptable and produce land cover products that are both meaningful and accurate. As additional high spatial resolution satellites are launched and the cost of high resolution imagery continues to decline, this research makes an important contribution to this exciting era in the science of remote sensing.

  12. [Causal analysis approaches in epidemiology].

    PubMed

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

    Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains discussed. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown the limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the formulation of causal hypotheses, which will be a basis for all methodological choices. Beyond this step, statistical analysis tools recently developed offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
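    The counterfactual machinery behind marginal structural models can be illustrated at its simplest: inverse probability of treatment weighting with one binary confounder and one time point. The data below are invented, and this toy is far simpler than a full marginal structural model for time-varying confounding.

```python
# Inverse probability weighting sketch: reweight each subject by the inverse
# probability of the treatment actually received given a confounder L, so the
# weighted sample mimics a randomized trial.

def ipw_effect(records):
    """records: (confounder L, treatment A, outcome Y) tuples.
    Returns the IPW-weighted difference E[Y|A=1] - E[Y|A=0]."""
    p_a1 = {}                                  # P(A=1 | L) estimated from the data
    for l in {r[0] for r in records}:
        sub = [r for r in records if r[0] == l]
        p_a1[l] = sum(r[1] for r in sub) / len(sub)

    def weighted_mean(a):
        num = den = 0.0
        for l, arm, y in records:
            if arm != a:
                continue
            p = p_a1[l] if a == 1 else 1 - p_a1[l]   # P(received own treatment | L)
            num += y / p
            den += 1 / p
        return num / den

    return weighted_mean(1) - weighted_mean(0)

# Sicker stratum (L=1) is more often treated and has worse outcomes; the true
# treatment effect is +1 in both strata.
data = [(0, 0, 2), (0, 0, 2), (0, 0, 2), (0, 1, 3),
        (1, 0, 0), (1, 1, 1), (1, 1, 1), (1, 1, 1)]
print(ipw_effect(data))   # recovers the true effect despite confounding
```

    In this example the naive unweighted difference in means is 0, while the weighted contrast recovers the true effect of 1.0; marginal structural models extend this reweighting to treatments and confounders that vary over time.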

  13. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 12, December 2006

    DTIC Science & Technology

    2006-12-01

    Feature-Oriented Domain Analysis (FODA): FODA is a domain analysis and engineering method that focuses on developing reusable assets [9]. By examining... Eliciting Security Requirements: This article describes an approach for doing trade-off analysis among requirements elicitation methods. by Dr. Nancy R... high-level requirements are addressed and met in the requirements work products. 3. Unclear requirements Mitigation: Perform requirements analysis and

  14. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  15. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
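    The quantification step can be sketched with a hedged toy: threshold a stained grayscale image against the unfouled background and report the fraction of pixels covered. The real method photographs stained coupons and uses a more robust algorithm; the 8x8 "image" and intensity values below are invented.

```python
# Coverage quantification sketch: pixels darker than the background by more
# than a margin are counted as stained biofilm.

def fouling_coverage(image, background=200, margin=40):
    """Fraction of pixels darker than (background - margin), i.e. stained."""
    cutoff = background - margin
    pixels = [p for row in image for p in row]
    return sum(1 for p in pixels if p < cutoff) / len(pixels)

image = [[200] * 8 for _ in range(8)]   # clean surface at background intensity
for r in range(2, 6):                   # a stained 4x4 patch of biofilm
    for c in range(2, 6):
        image[r][c] = 90

print(fouling_coverage(image))   # 16 of 64 pixels covered: 0.25
```

    Tracking this coverage fraction over time gives the growth-intensity curve that the study compares against cell counts.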

  16. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.

  17. Gait Analysis Using Wearable Sensors

    PubMed Central

    Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian

    2012-01-01

    Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and analysis methods, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications. PMID:22438763

  18. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  19. Assessment of Complement Activation by Nanoparticles: Development of a SPR Based Method and Comparison with Current High Throughput Methods.

    PubMed

    Coty, Jean-Baptiste; Noiray, Magali; Vauthier, Christine

    2018-04-26

    A surface plasmon resonance (SPR) chip was developed to study activation of the complement system triggered by nanomaterials in contact with human serum, an important concern today in warranting the safety of nanomedicines. The developed chip was tested for its specificity in complex medium and its longevity of use. It was then employed to assess the release of complement fragments upon incubation of nanoparticles in serum. A comparison was made with other current methods assessing complement activation (μC-IE, ELISA). The SPR chip was found to give a consistent response for C3a release upon activation by nanoparticles. Results were similar to those obtained by μC-IE. However, ELISA detection of iC3b fragments showed a high non-specific background. The impact of sample preparation preceding the analysis was assessed with the newly developed SPR method. The removal of nanoparticles before analysis caused an important modification of the obtained response, possibly leading to false negative results. The SPR chip developed in this work allows for an automated assessment of complement activation triggered by nanoparticles, with the possibility of multiplexed analysis. The design of the chip proved to give consistent results of complement activation by nanoparticles.

  20. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

  1. Chemical Fingerprinting of Materials Developed Due to Environmental Issues

    NASA Technical Reports Server (NTRS)

    Smith, Doris A.; McCool, A. (Technical Monitor)

    2000-01-01

    Instrumental chemical analysis methods are developed and used to chemically fingerprint new and modified External Tank materials made necessary by changing environmental requirements. Chemical fingerprinting can detect and diagnose variations in material composition. To chemically characterize each material, fingerprint methods are selected from an extensive toolbox based on the material's chemistry and the ability of the specific methods to detect the material's critical ingredients. Fingerprint methods have been developed for a variety of materials including Thermal Protection System foams, adhesives, primers, and composites.

  2. INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS

    EPA Science Inventory

    A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...

  3. Analysis methods for tocopherols and tocotrienols

    USDA-ARS?s Scientific Manuscript database

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  4. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
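    The deterministic kernel of one of the two instability criteria above can be sketched directly. This hedged example implements the Routh-Hurwitz test for a characteristic polynomial; the paper wraps such criteria in fast probability integration and importance sampling, which is not reproduced here, and the polynomials below are invented.

```python
# Routh-Hurwitz stability test: build the Routh array for a characteristic
# polynomial and require a sign-change-free first column (all roots in the
# left half plane means the linear system is stable).

def routh_stable(coeffs):
    """coeffs: characteristic polynomial, highest degree first, coeffs[0] != 0."""
    n = len(coeffs) - 1
    rows = [coeffs[0::2], coeffs[1::2]]
    while len(rows[1]) < len(rows[0]):        # pad second row to equal length
        rows[1] = rows[1] + [0.0]
    for _ in range(n - 1):
        a, b = rows[-2], rows[-1]
        if b[0] == 0:
            return False                      # degenerate pivot: not provably stable
        new = [(b[0] * a[j + 1] - a[0] * b[j + 1]) / b[0]
               for j in range(len(a) - 1)] + [0.0]
        rows.append(new)
    first_col = [r[0] for r in rows[:n + 1]]
    return all(c > 0 for c in first_col) or all(c < 0 for c in first_col)

# s^3 + 6s^2 + 11s + 6 = (s+1)(s+2)(s+3): all roots in the left half plane.
print(routh_stable([1.0, 6.0, 11.0, 6.0]))   # True
# s^3 + s^2 + s + 10 has right-half-plane roots: unstable.
print(routh_stable([1.0, 1.0, 1.0, 10.0]))   # False
```

    A probabilistic analysis of the kind the paper describes would evaluate such a criterion over the distribution of the system's uncertain parameters to estimate the probability of instability.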

  5. Development of a numerical model for vehicle-bridge interaction analysis of railway bridges

    NASA Astrophysics Data System (ADS)

    Kim, Hee Ju; Cho, Eun Sang; Ham, Jun Su; Park, Ki Tae; Kim, Tae Heon

    2016-04-01

    In the field of civil engineering, analyzing dynamic response has long been a central concern. Such analyses can be divided into moving-load and moving-mass methods, and formulating separate equations of motion for vehicle and bridge has been studied in recent years. In this study, a numerical method is presented that can consider various train types and solve the equations of motion for vehicle-bridge interaction analysis by a non-iterative procedure, through formulation of the coupled equations of motion. An accurate three-dimensional numerical model of the KTX vehicle was also developed in order to analyze its dynamic response characteristics. The equations of motion for the conventional trains are derived, and the numerical models of the trains are idealized as a set of linear springs and dashpots with 18 degrees of freedom. The bridge models are simplified using three-dimensional space frame elements based on Euler-Bernoulli beam theory. Vertical and lateral rail irregularities are generated from the power spectral density (PSD) functions of the Federal Railroad Administration (FRA).

  6. [Applications of meta-analysis in multi-omics].

    PubMed

    Han, Mingfei; Zhu, Yunping

    2014-07-01

    As a statistical method for integrating multiple features and data sets, meta-analysis was introduced to the life sciences in the 1990s. With rapid advances in high-throughput technologies, omics, centered on genomics, transcriptomics and proteomics, is becoming the new focus of life science. Although the fast output of massive data has promoted omics studies, it also produces excessive data that are difficult to integrate systematically. Consequently, meta-analysis is frequently applied to analyze different types of data and continues to be improved. Here, we first summarize representative meta-analysis methods systematically, then survey the current applications of meta-analysis across omics fields, and finally discuss remaining problems and the future development of meta-analysis.

  7. Molecularly imprinted membrane extraction combined with high-performance liquid chromatography for selective analysis of cloxacillin from shrimp samples.

    PubMed

    Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang

    2018-09-01

    Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety. It is therefore imperative to develop a simple and selective method for monitoring the illegal use of antibiotics in aquatic products. In this study, a method combining molecularly imprinted membrane (MIM) extraction and liquid chromatography was developed for the selective analysis of cloxacillin in shrimp samples. The MIMs were synthesized by UV photopolymerization and characterized by scanning electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis, and a swelling test. The results showed that the MIMs exhibited excellent permselectivity, high adsorption capacity, and a fast adsorption rate for cloxacillin. Finally, the method was used to determine cloxacillin in shrimp samples, with good accuracy and acceptable relative standard deviation values for precision. The proposed method is a promising alternative for the selective analysis of cloxacillin in shrimp samples, owing to its easy operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.

  8. N-of-1-pathways MixEnrich: advancing precision medicine via single-subject analysis in discovering dynamic changes of transcriptomes.

    PubMed

    Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A

    2017-05-24

    Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed for single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples from a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways containing both up- and down-regulated genes (bidirectional dysregulation), which are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectionally and concordantly dysregulated pathways one patient at a time. We assessed its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway, or of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and in the HNSCC data analysis (ROC curves; higher true-positive rates; lower false-positive rates). Bidirectionally and concordantly dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared with other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods in meeting the promise of accurate personal transcriptome analysis to support precision medicine at the point of care.

  9. A methodology for commonality analysis, with applications to selected space station systems

    NASA Technical Reports Server (NTRS)

    Thomas, Lawrence Dale

    1989-01-01

    The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.

  10. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
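
    The importance-sampling idea behind such codes can be illustrated on a toy reliability problem. This is only a sketch: NESSUS's adaptive algorithm refines the sampling density iteratively, which is omitted here, and `tail_probability` is an illustrative name, not a NESSUS routine:

    ```python
    import math
    import random

    def tail_probability(threshold, n=100_000, seed=1):
        """Importance-sampling estimate of P(X > threshold) for standard
        normal X, drawing from a proposal density shifted to the threshold
        and reweighting each draw by the density ratio."""
        random.seed(seed)
        total = 0.0
        for _ in range(n):
            y = random.gauss(threshold, 1.0)             # proposal N(threshold, 1)
            if y > threshold:
                # weight = phi(y) / phi(y - threshold)
                total += math.exp(-0.5 * y * y + 0.5 * (y - threshold) ** 2)
        return total / n

    p = tail_probability(3.0)   # exact value is about 1.35e-3
    ```

    Shifting the proposal to the failure region means roughly half the samples land past the threshold, instead of the 0.1% that crude Monte Carlo would deliver, which is the efficiency gain the abstract refers to.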

  11. Developments in mycotoxin analysis: an update for 2010 - 2011

    USDA-ARS?s Scientific Manuscript database

    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2010 and mid-2011. It covers the major mycotoxins aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. Analytical methods for mycotoxins conti...

  12. Developments in mycotoxin analysis: an update for 2009 - 2010

    USDA-ARS?s Scientific Manuscript database

    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2009 and mid-2010. It covers the major mycotoxins aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. New and improved methods for mycotoxins...

  13. DEVELOPMENT OF METHOD 535 FOR THE DETERMINATION OF CHLOROACETANILIDE AND OTHER ACETAMIDE HERBICIDE DEGRADATES IN DRINKING WATER BY SOLID PHASE EXTRACTION AND LIQUID CHROMATOGRAPHY/TANDEM MASS SPECTROMETRY

    EPA Science Inventory

    EPA Method 535 has been developed in order to provide a method for the analysis of "Alachlor ESA and other acetanilide degradation products" which are listed on U.S. EPA's 1998 Drinking Water Contaminant Candidate List. Method 535 uses solid phase extraction with a nonporous gr...

  14. Development, verification, and application of a simplified method to estimate total-streambed scour at bridge sites in Illinois

    USGS Publications Warehouse

    Holmes, Robert R.; Dunn, Chad J.

    1996-01-01

    A simplified method to estimate total-streambed scour was developed for application to bridges in the State of Illinois. Scour envelope curves, developed as empirical relations between calculated total scour and bridge-site characteristics for 213 State highway bridges in Illinois, are used in the method to estimate the 500-year flood scour. These 213 bridges, geographically distributed throughout Illinois, had been previously evaluated for streambed scour with the application of conventional hydraulic and scour-analysis methods recommended by the Federal Highway Administration. The bridge characteristics necessary for application of the simplified bridge scour-analysis method can be obtained from an office review of bridge plans, examination of topographic maps, and a reconnaissance-level site inspection. The estimates computed with the simplified method generally resulted in a larger value of 500-year flood total-streambed scour than the more detailed conventional method. The simplified method was successfully verified with a separate data set of 106 State highway bridges, geographically distributed throughout Illinois, and 15 county highway bridges.

  15. A qualitative and quantitative HPTLC densitometry method for the analysis of cannabinoids in Cannabis sativa L.

    PubMed

    Fischedick, Justin T; Glas, Ronald; Hazekamp, Arno; Verpoorte, Rob

    2009-01-01

    Cannabis and cannabinoid based medicines are currently under serious investigation for legitimate development as medicinal agents, necessitating new low-cost, high-throughput analytical methods for quality control. The goal of this study was to develop and validate, according to ICH guidelines, a simple and rapid HPTLC method for the quantification of Δ9-tetrahydrocannabinol (Δ9-THC) and qualitative analysis of the other main neutral cannabinoids found in cannabis. The method was developed and validated using pure cannabinoid reference standards and two medicinal cannabis cultivars. Accuracy was determined by comparing results obtained from the HPTLC method with those obtained from a validated HPLC method. Δ9-THC gives linear calibration curves in the range of 50-500 ng at 206 nm, with a linear regression of y = 11.858x + 125.99 and r² = 0.9968. The results show that the HPTLC method is reproducible and accurate for the quantification of Δ9-THC in cannabis, and it is also useful for qualitative screening of the main neutral cannabinoids found in cannabis cultivars.
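
    The reported calibration line can be reproduced numerically. The data points below are synthetic (generated to lie on the published line, since the record gives no raw densitometry data), and the helper names are illustrative:

    ```python
    import numpy as np

    def calibration_fit(amount_ng, peak_area):
        """Least-squares calibration line for densitometry data;
        returns (slope, intercept, r_squared)."""
        x = np.asarray(amount_ng, float)
        y = np.asarray(peak_area, float)
        slope, intercept = np.polyfit(x, y, 1)
        r = np.corrcoef(x, y)[0, 1]
        return slope, intercept, r ** 2

    def back_calculate(area, slope, intercept):
        """Invert the calibration line to recover the applied amount (ng)."""
        return (area - intercept) / slope

    # synthetic points lying on the reported line y = 11.858x + 125.99
    x = np.array([50.0, 100.0, 200.0, 300.0, 400.0, 500.0])
    y = 11.858 * x + 125.99
    slope, intercept, r2 = calibration_fit(x, y)
    ```

    In practice the back-calculation step is what turns a measured spot area into a Δ9-THC amount for a sample of unknown potency.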

  16. Application of artificial neural networks in nonlinear analysis of trusses

    NASA Technical Reports Server (NTRS)

    Alam, J.; Berke, L.

    1991-01-01

    A method is developed to incorporate a neural network model of material response, based on the backpropagation algorithm, into nonlinear elastic truss analysis using the initial stiffness method. Different network configurations are developed to assess the accuracy of neural network modeling of nonlinear material response. A scheme based on linear interpolation of material data is also implemented for comparison. It is found that the neural network approach can yield very accurate results if used with care. For the type of problems under consideration, it offers a viable alternative to other material modeling methods.

  17. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    PubMed

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign in industry, the first step is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical changes in the business environment after the start of the zero accident campaign, using quantitative time-series analysis methods. These methods include the sum of squared errors (SSE), the regression analysis method (RAM), the exponential smoothing method (ESM), the double exponential smoothing method (DESM), the auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). A program was developed to estimate the accident rate, the zero accident time, and the achievement probability of an efficient industrial environment. MFC (Microsoft Foundation Class) software in Visual Studio 2008 was used to develop the zero accident program. The results provide key information for industrial accident prevention and an important stimulus for the zero accident campaign within all industrial environments.
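
    Two of the listed forecasting methods, ESM and DESM, are simple enough to sketch directly. The accident-rate series below is hypothetical, and the smoothing constants are arbitrary choices, not values from the study:

    ```python
    def exp_smooth(series, alpha=0.3):
        """Simple exponential smoothing (ESM): each smoothed value is a
        weighted average of the new observation and the previous estimate."""
        s = series[0]
        out = [s]
        for x in series[1:]:
            s = alpha * x + (1 - alpha) * s
            out.append(s)
        return out

    def double_exp_smooth_forecast(series, alpha=0.5, beta=0.3, steps=3):
        """Holt-style double exponential smoothing (DESM): tracks a level
        and a trend, then extrapolates the trend over the forecast horizon."""
        level, trend = series[0], series[1] - series[0]
        for x in series[1:]:
            prev = level
            level = alpha * x + (1 - alpha) * (level + trend)
            trend = beta * (level - prev) + (1 - beta) * trend
        return [level + (k + 1) * trend for k in range(steps)]

    rates = [1.8, 1.7, 1.6, 1.55, 1.45, 1.4]   # hypothetical annual accident rates
    smoothed = exp_smooth(rates)
    forecast = double_exp_smooth_forecast(rates)
    ```

    A declining forecast crossing a target threshold is, in essence, how an estimated zero accident time would be read off such a model.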

  18. Heuristic Task Analysis on E-Learning Course Development: A Formative Research Study

    ERIC Educational Resources Information Center

    Lee, Ji-Yeon; Reigeluth, Charles M.

    2009-01-01

    Utilizing heuristic task analysis (HTA), a method developed for eliciting, analyzing, and representing expertise in complex cognitive tasks, a formative research study was conducted on the task of e-learning course development to further improve the HTA process. Three instructional designers from three different post-secondary institutions in the…

  19. Analysis of the Technical Writing Profession through the DACUM Process.

    ERIC Educational Resources Information Center

    Nolan, Timothy; Green, Marc

    To help develop a curriculum program for technical writers, Cincinnati Technical College used the Developing a Curriculum (DACUM) method to produce a technical writing skills profile. DACUM develops an occupation analysis through a modified brainstorming process by a panel of expert workers under the direction of a qualified coordinator. This…

  20. The need for a usable assessment tool to analyse the efficacy of emergency care systems in developing countries: proposal to use the TEWS methodology.

    PubMed

    Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A

    2012-11-01

    Ninety percent of emergency incidents occur in developing countries, and this proportion is only expected to grow as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This paper analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted quantitative methods for assessing the efficacy of emergency care systems cannot be applied in most developing countries because of weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is growing rapidly, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool that can be handled by low-resource developing countries. By relying on the most basic universal parameters, the simplest calculations, and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.

  1. Application of ECT inspection to the first wall of a fusion reactor with wavelet analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G.; Yoshida, Y.; Miya, K.

    1994-12-31

    The first wall of a fusion reactor will be subjected to intensive loads during fusion operations. Since these loads may cause defects in the first wall, nondestructive evaluation techniques for the first wall should be developed. In this paper, we apply the eddy current testing (ECT) technique to inspection of the first wall. A method based on the current vector potential and wavelet analysis is proposed. Owing to the use of wavelet analysis, a recently developed theory, the accuracy of the present method is shown to be better than that of a conventional one.
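
    The core wavelet idea can be shown with the simplest case, a one-level Haar transform (the paper does not specify its wavelet basis, and the spike signal below is illustrative, not ECT data):

    ```python
    import math

    def haar_dwt(signal):
        """One level of the Haar discrete wavelet transform: returns
        (approximation, detail) coefficients. A localized defect-like
        feature concentrates its energy in a few detail coefficients."""
        s = math.sqrt(2.0)
        approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
        return approx, detail

    def haar_idwt(approx, detail):
        """Inverse of one Haar level (exact reconstruction)."""
        s = math.sqrt(2.0)
        out = []
        for a, d in zip(approx, detail):
            out.extend([(a + d) / s, (a - d) / s])
        return out

    sig = [1.0, 1.0, 1.0, 5.0, 1.0, 1.0, 1.0, 1.0]   # a spike standing in for a flaw signal
    approx, detail = haar_dwt(sig)
    rec = haar_idwt(approx, detail)
    ```

    The single large detail coefficient localizes the spike, which is the property that makes wavelets attractive for picking defect signatures out of ECT probe signals.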

  2. Stress analysis of ribbon parachutes

    NASA Technical Reports Server (NTRS)

    Reynolds, D. T.; Mullins, W. M.

    1975-01-01

    An analytical method has been developed for determining the internal load distribution for ribbon parachutes subjected to known riser and aerodynamic forces. Finite elements with non-linear elastic properties represent the parachute structure. This method is an extension of the analysis previously developed by the authors and implemented in the digital computer program CANO. The present analysis accounts for the effect of vertical ribbons in the solution for canopy shape and stress distribution. Parametric results are presented which relate the canopy stress distribution to such factors as vertical ribbon strength, number of gores, and gore shape in a ribbon parachute.

  3. Emerging and recurrent issues in drug development.

    PubMed

    Anello, C

    This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.

  4. Integrating Allergen Analysis Within a Risk Assessment Framework: Approaches to Development of Targeted Mass Spectrometry Methods for Allergen Detection and Quantification in the iFAAM Project.

    PubMed

    Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare

    2018-01-01

    Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.

  5. Thematic Analysis of the Children's Drawings on Museum Visit: Adaptation of the Kuhn's Method

    ERIC Educational Resources Information Center

    Kisovar-Ivanda, Tamara

    2014-01-01

    Researchers are using techniques that allow children to express their perspectives. In 2003, Kuhn developed the method of data collection and analysis which combined thematic drawing and focused, episodic interview. In this article the Kuhn's method is adjusted using the draw and write technique as a research methodology. Reflections on the…

  6. Social Phenomenological Analysis as a Research Method in Art Education: Developing an Empirical Model for Understanding Gallery Talks

    ERIC Educational Resources Information Center

    Hofmann, Fabian

    2016-01-01

    Social phenomenological analysis is presented as a research method to study gallery talks or guided tours in art museums. The research method is based on the philosophical considerations of Edmund Husserl and sociological/social science concepts put forward by Max Weber and Alfred Schuetz. Its starting point is the everyday lifeworld; the…

  7. High-resolution gas chromatography/mas spectrometry method for characterization and quantitative analysis of ginkgolic acids in ginkgo biloba plants, extracts, and dietary supplements

    USDA-ARS?s Scientific Manuscript database

    A high resolution GC/MS with Selected Ion Monitor (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...

  8. Recent statistical methods for orientation data

    NASA Technical Reports Server (NTRS)

    Batschelet, E.

    1972-01-01

    The application of statistical methods to animal orientation and navigation data is discussed. The methods employed are limited to the two-dimensional case. Various tests for assessing the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.
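
    The basic two-dimensional quantities in this line of work, the mean direction and resultant length, can be sketched briefly; the headings below are invented for illustration, and the Rayleigh statistic shown is one of several tests of the kind the paper surveys:

    ```python
    import math

    def circular_stats(angles_deg):
        """Mean direction, resultant length r, and Rayleigh statistic
        z = n * r**2 for two-dimensional orientation data."""
        n = len(angles_deg)
        c = sum(math.cos(math.radians(a)) for a in angles_deg) / n
        s = sum(math.sin(math.radians(a)) for a in angles_deg) / n
        r = math.hypot(c, s)                 # 1 = perfect agreement, 0 = uniform
        mean_dir = math.degrees(math.atan2(s, c)) % 360
        return mean_dir, r, n * r * r

    # headings clustered around north imply a strong directional preference
    mean_dir, r, z = circular_stats([350, 5, 10, 355, 0])
    ```

    Averaging unit vectors rather than raw angles is what makes the statistics well defined on a circle, where 350° and 10° are neighbors.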

  9. The Analysis of Likert Scales Using State Multipoles: An Application of Quantum Methods to Behavioral Sciences Data

    ERIC Educational Resources Information Center

    Camparo, James; Camparo, Lorinda B.

    2013-01-01

    Though ubiquitous, Likert scaling's traditional mode of analysis is often unable to uncover all of the valid information in a data set. Here, the authors discuss a solution to this problem based on methodology developed by quantum physicists: the state multipole method. The authors demonstrate the relative ease and value of this method by…

  10. Analysis of Carbamate Pesticides: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS666

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.

  11. Analysis of Thiodiglycol: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS777

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for the analysis of thiodiglycol, the breakdown product of the sulfur mustard HD, in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS777 (hereafter referred to as EPA CRL SOP MS777). This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to verify the analytical procedures described in MS777 for analysis of thiodiglycol in aqueous samples. The gathered data from this study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS777 can be determined.

  12. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES, by Amanda Donnelly. A thesis...This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies including net assessment, scenarios and

  13. COMPARE : a method for analyzing investment alternatives in industrial wood and bark energy systems

    Treesearch

    Peter J. Ince

    1983-01-01

    COMPARE is a FORTRAN computer program resulting from a study to develop methods for comparative economic analysis of alternatives in industrial wood and bark energy systems. COMPARE provides complete guidelines for economic analysis of wood and bark energy systems. As such, COMPARE can be useful to those who have only basic familiarity with investment analysis of wood...

  14. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  15. Using Personal Ads and Online Self-Help Groups to Teach Content Analysis in a Research Methods Course

    ERIC Educational Resources Information Center

    Finn, Jerry; Dillon, Caroline

    2007-01-01

    This paper describes methods for teaching content analysis as part of the research sequence in social work education. Teaching content analysis develops research skills and promotes students' knowledge of, and critical thinking about, new information technology resources that are increasingly used by the general public. The…

  16. Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media

    PubMed Central

    Chen, Jing; Fang, Yanjun

    2007-01-01

    A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis using the flow injection technique over the last 10 years. A brief discussion of both chemiluminescence and the flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.

  17. Reliability Validation and Improvement Framework

    DTIC Science & Technology

    2012-11-01

    systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon...embedded software • cyber-physical systems (CPSs) to indicate that the embedded software interacts with, manages, and controls a physical system [Lee...the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results

  18. Teaching learning methods of an entrepreneurship curriculum.

    PubMed

    Esmi, Keramat; Marzoughi, Rahmatallah; Torkzadeh, Jafar

    2015-10-01

    One of the most significant elements of entrepreneurship curriculum design is the teaching-learning method, which plays a key role in studies of such curricula. It is the teaching method, and the systematic, organized, and logical way of providing lessons, that should be consistent with entrepreneurship goals and contents and should be developed according to learners' needs. The current study therefore aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and to validate them. This is a mixed-methods study of the sequential exploratory type, conducted in two stages: (a) developing teaching methods for an entrepreneurship curriculum, and (b) validating the developed framework. Data were collected through triangulation (study of documents, investigation of the theoretical foundations and the literature, and semi-structured interviews with key experts). Since the literature on this topic is rich and the views of key experts are extensive, directed and summative content analysis was used. In the second stage, the qualitative credibility of the findings was established using qualitative validation criteria (credibility, confirmability, and transferability) and various techniques, and a reliability test was applied to the qualitative part. Quantitative validation of the developed framework was conducted using exploratory and confirmatory factor analysis and Cronbach's alpha. Data were gathered by distributing a three-aspect questionnaire (direct-presentation, interactive, and practical-operational teaching methods) with 29 items among 90 curriculum scholars. The target population was selected by purposive sampling with a representative sample.
Results obtained from exploratory factor analysis showed that a three factor structure is an appropriate method for describing elements of teaching-learning methods of entrepreneurship curriculum. Moreover, the value for Kaiser Meyer Olkin measure of sampling adequacy equaled 0.72 and the value for Bartlett's test of variances homogeneity was significant at the 0.0001 level. Except for internship element, the rest had a factor load of higher than 0.3. Also, the results of confirmatory factor analysis showed the model appropriateness, and the criteria for qualitative accreditation were acceptable. Developed model can help instructors in selecting an appropriate method of entrepreneurship teaching, and it can also make sure that the teaching is on the right path. Moreover, the model is comprehensive and includes all the effective teaching methods in entrepreneurship education. It is also based on qualities, conditions, and requirements of Higher Education Institutions in Iranian cultural environment.
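Cronbach's alpha, one of the validation statistics named in the record above, can be computed directly from an item-score matrix. A minimal NumPy sketch; the questionnaire data below are simulated for illustration, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - sum_item_var / total_var)

# Simulated 29-item questionnaire answered by 90 respondents: items share a
# common underlying trait, so internal consistency should be high
rng = np.random.default_rng(0)
trait = rng.normal(size=(90, 1))
scores = trait + 0.5 * rng.normal(size=(90, 29))
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Alpha near 1 indicates high internal consistency; values above roughly 0.7 are conventionally considered acceptable for research instruments.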

  19. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
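One building block of short-term extreme response analysis is fitting an extreme value distribution to response peaks and reading off a high quantile. A hedged SciPy sketch: the peak data are simulated, and the WDRT's own API (not shown here) may organize this differently:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated response peaks for one sea state (e.g., mooring load maxima),
# drawn from a Gumbel distribution in arbitrary units
peaks = stats.gumbel_r.rvs(loc=100.0, scale=10.0, size=2000, random_state=rng)

# Fit a generalized extreme value (GEV) distribution to the peaks
shape, loc, scale = stats.genextreme.fit(peaks)

# 99th-percentile short-term extreme response (exceeded by 1% of peaks)
x99 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(round(x99, 1))
```

For the Gumbel parent used here, the true 99th percentile is about 146, so the fitted estimate should land nearby.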

  20. Monitoring for airborne allergens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burge, H.A.

    1992-07-01

    Monitoring for allergens can provide information on the kinds and levels of exposure experienced by local patient populations, provided that volumetric methods are used for sample collection and that analysis is accurate and consistent. Such data can also be used to develop standards for the specific environment and to begin to develop predictive models. Comparing outdoor allergen aerosols between different monitoring sites requires identical collection and analysis methods and some kind of rational standard, whether arbitrary or based on recognized health effects. (32 references)

  1. Continuous monitoring of seasonal phenological development by BBCH code

    NASA Astrophysics Data System (ADS)

    Cornelius, Christine; Estrella, Nicole; Menzel, Annette

    2010-05-01

    Phenology, the science of recurring seasonal natural events, serves as a proxy for ecosystem change under recent global climate change. Phenological studies mostly deal with the onset of development stages, e.g. budburst or the beginning of flowering; only a few focus on the end of stages, such as the end of flowering or seed dispersal. Information about the entire development cycle of a plant, including end-of-stage data, is obtained by observing plants according to the extended BBCH scale (MEIER 1997). The scale is a standardized growth-stage key that allows a less labor-intensive, weekly observation rhythm. Every week the frequencies of all occurring phenological stages are noted; these frequencies then form the basis for interpolating the development of each stage, even if it was not seen during field work. Because of the lack of studies using this kind of key over the entire development cycle, there is no common methodology for analyzing the data. Our objective was therefore to find a method of analysis with which onset dates as well as endpoints of each development stage could be defined. Three different methods of analysis were compared. Results show no significant difference in the onset dates of phenological stages between the methods tested. However, the method of pooled pre/post stage development seems most suitable for climate change studies, followed by the method of cumulative stage development and the method of weighted plant development.
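Interpolating stage onset from weekly frequency counts can be illustrated with a simple threshold-crossing estimate. A sketch under the assumption that onset is taken as the date when half the observed plants have reached the stage; the record's three methods differ in detail, and `stage_onset_doy` is a hypothetical helper, not from the paper:

```python
import numpy as np

def stage_onset_doy(doys, frac_reached, threshold=0.5):
    """Linearly interpolate the day of year at which the fraction of plants
    having reached a stage first crosses `threshold`."""
    doys = np.asarray(doys, dtype=float)
    frac = np.asarray(frac_reached, dtype=float)
    idx = int(np.argmax(frac >= threshold))   # first weekly visit at/above threshold
    if frac[idx] < threshold:
        return float("nan")                   # stage never reached the threshold
    if idx == 0:
        return float(doys[0])
    d0, d1 = doys[idx - 1], doys[idx]
    f0, f1 = frac[idx - 1], frac[idx]
    return float(d0 + (threshold - f0) / (f1 - f0) * (d1 - d0))

# Weekly visits (day of year) and fraction of plants in or past flowering
print(stage_onset_doy([100, 107, 114, 121], [0.0, 0.2, 0.6, 1.0]))  # 112.25
```

The same interpolation applied to the fraction of plants *past* a stage gives an endpoint estimate, which is what makes end-of-stage dates recoverable from weekly data.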

  2. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  3. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  4. Analysis of methods of processing of expert information by optimization of administrative decisions

    NASA Astrophysics Data System (ADS)

    Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.

    2018-03-01

    This work proposes a methodology for defining measures in the expert estimation of the quality and reliability of application software products. Methods for aggregating expert estimates are described using the example of a collective choice among candidate instrumentation and control projects in the development of special-purpose software for institutions. Results from the operation of a dialogue decision-support system are given, together with an algorithm for solving the choice problem based on the analytic hierarchy process. The developed algorithm can be applied in expert systems to solve a wide class of tasks involving multicriteria choice.
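The analytic hierarchy process mentioned in the record derives priority weights from a reciprocal pairwise-comparison matrix. A minimal sketch using the row geometric-mean approximation (the canonical AHP method uses the principal eigenvector; the matrix below is an invented example, not the paper's data):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector of a reciprocal pairwise-comparison matrix via the
    row geometric-mean approximation, normalized to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return gm / gm.sum()

# Three candidate projects; A[i, j] = how strongly project i is preferred
# over project j on the standard 1-9 AHP scale
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 3.0],
              [1/5.0, 1/3.0, 1.0]])
w = ahp_weights(A)
print(np.round(w, 3))  # weights sum to 1; the first project ranks highest
```

For a perfectly consistent matrix the geometric-mean and eigenvector solutions coincide; in practice a consistency ratio check would accompany this step.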

  5. A MULTI-RESIDUE METHOD FOR THE ANALYSIS OF INSECTICIDES COLLECTED ON COTTON SURFACE WIPES

    EPA Science Inventory

    A method was developed for the extraction, clean-up, and analysis of multiple pesticides from cotton wipe media used in human exposure studies to collect residues from residential hard surfaces. Measurements of pesticides are critical for estimating dermal and indirect ingestion ...

  6. Multiplexed microsatellite recovery using massively parallel sequencing

    Treesearch

    T.N. Jennings; B.J. Knaus; T.D. Mullins; S.M. Haig; R.C. Cronn

    2011-01-01

    Conservation and management of natural populations requires accurate and inexpensive genotyping methods. Traditional microsatellite, or simple sequence repeat (SSR), marker analysis remains a popular genotyping method because of the comparatively low cost of marker development, ease of analysis and high power of genotype discrimination. With the availability of...

  7. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three-dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.

  8. Simplified method to solve sound transmission through structures lined with elastic porous material.

    PubMed

    Lee, J H; Kim, J

    2001-11-01

    An approximate analysis method is developed to calculate sound transmission through structures lined with porous material. Because porous material has both a solid phase and a fluid phase, three wave components exist in the material, which makes the related analysis very complicated. The main idea in developing the approximate method is simple: model the porous material using only the strongest of the three waves, which in effect idealizes the material as an equivalent fluid. The analysis proceeds in two steps. In the first step, sound transmission through a flat double panel of infinite extent with a porous liner, having the same cross-sectional construction as the actual structure, is solved based on the full theory, and the strongest wave component is identified. In the second step, sound transmission through the actual structure is solved by modeling the porous material as an equivalent fluid while using the actual geometry of the structure. The development and validation of the method are discussed in detail. As an application example, the transmission loss through double-walled cylindrical shells with a porous core is calculated using the simplified method.

  9. Development, validation and comparison of NIR and Raman methods for the identification and assay of poor-quality oral quinine drops.

    PubMed

    Mbinze, J K; Sacré, P-Y; Yemoa, A; Mavar Tayey Mbay, J; Habyalimana, V; Kalenda, N; Hubert, Ph; Marini, R D; Ziemons, E

    2015-01-01

    Poor-quality antimalarial drugs are one of the major public health problems in Africa. The depth of this problem may be explained in part by the lack of effective enforcement and of efficient local drug analysis laboratories. To tackle part of this issue, two spectroscopic methods able to detect and quantify quinine dihydrochloride in children's oral drop formulations were developed and validated. Raman and near-infrared (NIR) spectroscopy were selected for the drug analysis because they are low cost, non-destructive, and rapid. Both methods were successfully validated using the total error approach in the range of 50-150% of the target concentration (20% w/v) within the 10% acceptance limits. Samples collected on the Congolese pharmaceutical market were analyzed by both techniques to detect potentially substandard drugs. After a comparison of the analytical performance of both methods, it was decided to implement the NIR-based method for routine analysis of quinine oral drop samples in the Quality Control Laboratory of Drugs at the University of Kinshasa (DRC). Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Sample size and power considerations in network meta-analysis

    PubMed Central

    2012-01-01

    Background Network meta-analysis is becoming increasingly popular for establishing comparative effectiveness among multiple interventions for the same disease. Network meta-analysis inherits all the methodological challenges of standard pairwise meta-analysis, but with increased complexity due to the multitude of intervention comparisons. One issue that is now widely recognized in pairwise meta-analysis is the issue of sample size and statistical power. This issue, however, has so far received little attention in network meta-analysis. To date, no approaches have been proposed for evaluating the adequacy of the sample size, and thus power, in a treatment network. Findings In this article, we develop easy-to-use flexible methods for estimating the ‘effective sample size’ in indirect comparison meta-analysis and network meta-analysis. The effective sample size for a particular treatment comparison can be interpreted as the number of patients in a pairwise meta-analysis that would provide the same degree and strength of evidence as that provided by the indirect comparison or network meta-analysis. We further develop methods for retrospectively estimating the statistical power for each comparison in a network meta-analysis. We illustrate the performance of the proposed methods for estimating effective sample size and statistical power using data from a network meta-analysis on interventions for smoking cessation, including over 100 trials. Conclusion The proposed methods are easy to use and will be of high value to regulatory agencies and decision makers who must assess the strength of the evidence supporting comparative effectiveness estimates. PMID:22992327
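The idea of an effective sample size for an indirect comparison can be motivated by variance addition: if the variance of a treatment effect scales as 1/n, and an indirect A-vs-B estimate through a common comparator C sums the variances of the A-C and B-C estimates, then 1/n_eff = 1/n_AC + 1/n_BC. A sketch of that simple heuristic; it is a simplification, not a reproduction of the paper's exact methods:

```python
def effective_sample_size(n_ac, n_bc):
    """Effective sample size of an indirect A-vs-B comparison through a
    common comparator C, assuming effect-estimate variance scales as 1/n:
    1/n_eff = 1/n_AC + 1/n_BC  =>  n_eff = n_AC * n_BC / (n_AC + n_BC)."""
    return n_ac * n_bc / (n_ac + n_bc)

# Balanced evidence on both sides of the comparator
print(effective_sample_size(400, 400))    # 200.0
# Unbalanced evidence: the smaller side dominates the loss of information
print(effective_sample_size(1000, 250))   # 200.0
```

Note that the indirect comparison is always weaker than either direct comparison alone, and is bounded by the smaller of the two sample sizes.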

  11. Comprehensive evaluation of global energy interconnection development index

    NASA Astrophysics Data System (ADS)

    Liu, Lin; Zhang, Yi

    2018-04-01

    Against the background of building global energy interconnection and realizing green and low-carbon development, this article constructs a global energy interconnection development index system based on the current state of global energy interconnection development. Using the entropy method to weight the components of the index, and then using the grey relational method to analyze the selected countries, the article obtains a ranking of countries by global energy interconnection development index and a classification into development levels.
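The entropy weighting step named in the record assigns larger weights to indicators with more dispersion across alternatives. A minimal NumPy sketch of the standard entropy weight method; the country/indicator scores below are invented for illustration:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an (alternatives x indicators) matrix of
    non-negative scores: more-dispersed indicators get more weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-normalized proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)         # entropy per indicator, in [0, 1]
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

# Four countries, three indicators (hypothetical normalized scores);
# the second indicator is constant and should receive zero weight
X = np.array([[0.9, 0.5, 0.2],
              [0.8, 0.5, 0.8],
              [0.7, 0.5, 0.5],
              [0.6, 0.5, 0.1]])
w = entropy_weights(X)
print(np.round(w, 3))
```

A weighted grey relational analysis would then combine these weights with relational coefficients per country to produce the final ranking.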

  12. Heading in the right direction: thermodynamics-based network analysis and pathway engineering.

    PubMed

    Ataman, Meric; Hatzimanikatis, Vassily

    2015-12-01

    Thermodynamics-based network analysis, through the introduction of thermodynamic constraints into metabolic models, allows a deeper analysis of metabolism and guides pathway engineering. The number and areas of application of thermodynamics-based network analysis methods have been increasing over the last ten years. We review recent applications of these methods, identify the areas to which such analysis can contribute significantly, and outline needs for future development. We find that organisms with multiple compartments and extremophiles present challenges for modeling and thermodynamics-based flux analysis. The evolution of current and new methods must also address the issues of multiple alternative flux directionalities and the uncertainties and partial information from analytical methods. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Developments in mycotoxin analysis: an update for 2012 – 2013

    USDA-ARS?s Scientific Manuscript database

    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2012 and mid-2013. It covers the major mycotoxins: aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. A wide range of analytical methods for...

  14. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education

    ERIC Educational Resources Information Center

    Wetzel, Angela Payne

    2011-01-01

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  15. Checking Equity: Why Differential Item Functioning Analysis Should Be a Routine Part of Developing Conceptual Assessments

    ERIC Educational Resources Information Center

    Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M.

    2017-01-01

    We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…

  16. Development of test methods for scale model simulation of aerial applications in the NASA Langley Vortex Research Facility. [agricultural aircraft

    NASA Technical Reports Server (NTRS)

    Jordan, F. L., Jr.

    1980-01-01

    As part of basic research to improve aerial applications technology, methods were developed at the Langley Vortex Research Facility to simulate and measure deposition patterns of aerially-applied sprays and granular materials by means of tests with small-scale models of agricultural aircraft and dynamically-scaled test particles. Interactions between the aircraft wake and the dispersed particles are being studied with the objective of modifying wake characteristics and dispersal techniques to increase swath width, improve deposition pattern uniformity, and minimize drift. The particle scaling analysis, test methods for particle dispersal from the model aircraft, visualization of particle trajectories, and measurement and computer analysis of test deposition patterns are described. An experimental validation of the scaling analysis and test results that indicate improved control of chemical drift by use of winglets are presented to demonstrate test methods.

  17. An improved method for the analysis of sennosides in Cassia angustifolia by high-performance liquid chromatography.

    PubMed

    Bala, S; Uniyal, G C; Dubey, T; Singh, S P

    2001-01-01

    A reversed-phase column liquid chromatographic method for the analysis of sennosides A and B present in leaf and pod extracts of Cassia angustifolia has been developed using a Symmetry C18 column and a linear binary gradient profile. The method can also be used for the quantitative determination of other sennosides, as baseline resolution of most of the constituents was achieved. The method is economical in terms of the time taken and the amount of solvent used (25 mL) for each analysis. The validity of the method was confirmed by comparing the UV spectra of each peak with those of reference compounds using a photodiode array detector.

  18. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  19. Partial spline models for the inclusion of tropopause and frontal boundary information in otherwise smooth two- and three-dimensional objective analysis

    NASA Technical Reports Server (NTRS)

    Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.

    1986-01-01

    A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described, and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, and other discontinuities.

  20. Analysis Study of Stevioside and Rebaudioside A from Stevia rebaudiana Bertoni by Normal Phase SPE and RP-HPLC

    NASA Astrophysics Data System (ADS)

    Martono, Y.; Rohman, A.; Riyanto, S.; Martono, S.

    2018-04-01

    A solid-phase extraction (SPE) method using silica as the sorbent for the analysis of stevioside and rebaudioside A in Stevia rebaudiana Bertoni leaf has not previously been reported. The aim of this study is to develop an SPE method using silica as the sorbent for reversed-phase high-performance liquid chromatography (RP-HPLC) analysis of stevioside and rebaudioside A in S. rebaudiana leaf. The results indicate that the optimal conditions for the normal-phase SPE (silica) cartridge are conditioning with 3.0 mL of hexane and a sample loading volume of 0.1 mL. The cartridge is eluted with 1.0 mL acetonitrile:water (80:20, v/v) to separate both analytes, then washed with 0.3 mL each of chloroform and water. The developed SPE sample preparation method meets the accuracy and precision criteria and can be used for the analysis of stevioside and rebaudioside A by RP-HPLC.

  1. Development of a computer technique for the prediction of transport aircraft flight profile sonic boom signatures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Coen, Peter G.

    1991-01-01

    A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.

  2. Data analysis software for the autoradiographic enhancement process. Volumes 1, 2, and 3, and appendix

    NASA Technical Reports Server (NTRS)

    Singh, S. P.

    1979-01-01

    The computer software developed to implement a method for Wiener spectrum analysis of photographic films is presented. This method is used for quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listings for a modification of the data analysis software are included.
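The Wiener spectrum of film granularity is, in modern terms, a noise power spectrum. A hedged sketch of such an estimate using SciPy's Welch method on a simulated microdensitometer trace; the trace, sampling interval, and units are invented for illustration, not taken from the report:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
dx = 10e-6                        # assumed sample spacing: 10 micrometres
x = np.arange(8192) * dx
# Simulated density trace: white granularity noise plus a periodic
# density variation at 500 cycles/m (period 2 mm)
trace = 0.02 * rng.normal(size=x.size) + 0.005 * np.sin(2 * np.pi * x / 2e-3)

# Welch estimate of the noise power spectrum vs. spatial frequency (cycles/m);
# constant detrending removes the mean density level
freq, nps = welch(trace, fs=1.0 / dx, nperseg=1024, detrend="constant")
print(freq[np.argmax(nps)])       # peak should sit near 500 cycles/m
```

Segment averaging trades frequency resolution for a lower-variance spectrum estimate, which is the usual compromise in granularity measurements.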

  3. Quantitative analysis of boeravinones in the roots of Boerhaavia Diffusa by UPLC/PDA.

    PubMed

    Bairwa, Khemraj; Srivastava, Amit; Jachak, Sanjay Madhukar

    2014-01-01

    Boerhaavia diffusa is a perennial herb belonging to the Nyctaginaceae. Various classes of chemical constituents, such as phenolics (boeravinones), terpenoids, and organic acids, have been reported in B. diffusa roots. As boeravinones have been proposed as the putative active constituents for the anti-cancer, spasmolytic, and anti-inflammatory activities exhibited by B. diffusa extracts, it is worthwhile to develop and validate an ultra-performance liquid chromatography (UPLC) method for the analysis of boeravinones in B. diffusa roots. The objective was to develop and validate a simple, accurate, robust, and rapid UPLC analytical method for quality control of B. diffusa roots. Samples for analysis were prepared by refluxing powdered root material with methanol for 2 h. The extracts were concentrated, dried, and stored at -20°C until use. A UPLC method with photodiode array (PDA) detection was developed and validated for the quantification of boeravinones in the roots of B. diffusa. The separation of boeravinones was achieved on a BEH Shield C18 column (2.1 × 100 mm, 1.7 µm) with gradient elution of methanol and water (0.1% acetic acid) at a flow rate of 0.4 mL/min; detection was carried out at λmax 273 nm. The method showed good linearity (r² ≥ 0.9999), accuracy, and precision, and provided a selective, sensitive, and rapid analytical method for the quantification of boeravinones in B. diffusa roots. All the validation parameters were within the permissible limits per International Conference on Harmonisation guidelines. Copyright © 2014 John Wiley & Sons, Ltd.
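Linearity figures such as r² ≥ 0.9999 come from an external-standard calibration curve. A generic NumPy sketch of that step; the concentrations and peak areas below are invented, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards: concentration (ug/mL) vs. peak area
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
area = np.array([1020.0, 2040.0, 4110.0, 8150.0, 16320.0])

# Least-squares calibration line and coefficient of determination r^2
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1.0 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()

# Quantify an unknown sample from its peak area (external standard method)
unknown_area = 6100.0
est = (unknown_area - intercept) / slope
print(round(est, 2), round(r2, 5))
```

In a validated method the estimate would only be reported if the unknown's area falls inside the calibrated range, as extrapolation beyond the standards is not covered by the linearity claim.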

  4. Transient analysis mode participation for modal survey target mode selection using MSC/NASTRAN DMAP

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Ibrahim, Omar M.; Sullivan, Timothy L.; Goodnight, Thomas W.

    1994-01-01

    Many methods have been developed to aid analysts in identifying component modes which contribute significantly to component responses. These modes, typically targeted for dynamic model correlation via a modal survey, are known as target modes. Most methods used to identify target modes are based on component global dynamic behavior. It is sometimes unclear if these methods identify all modes contributing to responses important to the analyst. These responses are usually those in areas of hardware design concerns. One method used to check the completeness of target mode sets and identify modes contributing significantly to important component responses is mode participation. With this method, the participation of component modes in dynamic responses is quantified. Those modes which have high participation are likely modal survey target modes. Mode participation is most beneficial when it is used with responses from analyses simulating actual flight events. For spacecraft, these responses are generated via a structural dynamic coupled loads analysis. Using MSC/NASTRAN DMAP, a method has been developed for calculating mode participation based on transient coupled loads analysis results. The algorithm has been implemented to be compatible with an existing coupled loads methodology and has been used successfully to develop a set of modal survey target modes.
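Mode participation quantifies how much each retained mode contributes to a physical response. A simplified sketch for a modal transient solution u(t) = Φ q(t), scoring the participation of mode i in response DOF j by the peak of |Φ_ji q_i(t)|; this is one common convention and not necessarily the DMAP implementation described in the record:

```python
import numpy as np

def mode_participation(phi, q, dof):
    """Fraction of the peak response at `dof` attributable to each mode,
    scored by max-over-time of |phi[dof, i] * q[i, t]|, normalized to 1."""
    contrib = np.abs(phi[dof, :, None] * q)   # (n_modes, n_steps)
    peak = contrib.max(axis=1)
    return peak / peak.sum()

# Two modes observed at one output DOF, with simple transient modal coordinates
t = np.linspace(0.0, 1.0, 500)
q = np.vstack([np.sin(2 * np.pi * 5 * t),           # mode 1 dominates
               0.2 * np.sin(2 * np.pi * 12 * t)])   # mode 2 is minor
phi = np.array([[1.0, 0.8]])                        # mode shapes at the DOF
p = mode_participation(phi, q, dof=0)
print(np.round(p, 3))   # mode 1 carries most of the participation
```

Modes whose participation exceeds a chosen cutoff across the responses of interest would be candidate modal survey target modes.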

  5. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  6. Analysis of Flexible Bars and Frames with Large Displacements of Nodes By Finite Element Method in the Form of Classical Mixed Method

    NASA Astrophysics Data System (ADS)

    Ignatyev, A. V.; Ignatyev, V. A.; Onischenko, E. V.

    2017-11-01

    This article is the continuation of the work made bt the authors on the development of the algorithms that implement the finite element method in the form of a classical mixed method for the analysis of geometrically nonlinear bar systems [1-3]. The paper describes an improved algorithm of the formation of the nonlinear governing equations system for flexible plane frames and bars with large displacements of nodes based on the finite element method in a mixed classical form and the use of the procedure of step-by-step loading. An example of the analysis is given.

  7. Element Library for Three-Dimensional Stress Analysis by the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.

    1996-01-01

    The Integrated Force Method, a recently developed method for analyzing structures, is extended in this paper to three-dimensional structural analysis. First, a general formulation is developed to generate the stress interpolation matrix in terms of complete polynomials of the required order. The formulation is based on definitions of the stress tensor components in term of stress functions. The stress functions are written as complete polynomials and substituted into expressions for stress components. Then elimination of the dependent coefficients leaves the stress components expressed as complete polynomials whose coefficients are defined as generalized independent forces. Such derived components of the stress tensor identically satisfy homogenous Navier equations of equilibrium. The resulting element matrices are invariant with respect to coordinate transformation and are free of spurious zero-energy modes. The formulation provides a rational way to calculate the exact number of independent forces necessary to arrive at an approximation of the required order for complete polynomials. The influence of reducing the number of independent forces on the accuracy of the response is also analyzed. The stress fields derived are used to develop a comprehensive finite element library for three-dimensional structural analysis by the Integrated Force Method. Both tetrahedral- and hexahedral-shaped elements capable of modeling arbitrary geometric configurations are developed. A number of examples with known analytical solutions are solved by using the developments presented herein. The results are in good agreement with the analytical solutions. The responses obtained with the Integrated Force Method are also compared with those generated by the standard displacement method. In most cases, the performance of the Integrated Force Method is better overall.

  8. Influence of ECG sampling rate in fetal heart rate variability analysis.

    PubMed

    De Jonckheere, J; Garabedian, C; Charlier, P; Champion, C; Servan-Schreiber, E; Storme, L; Debarge, V; Jeanne, M; Logier, R

    2017-07-01

    Fetal hypoxia results in fetal blood acidosis (pH < 7.10). In such a situation, the fetus develops several adaptation mechanisms regulated by the autonomic nervous system. Many studies have demonstrated significant changes in heart rate variability in hypoxic fetuses, so fetal heart rate variability analysis could be of great help for fetal hypoxia prediction. Commonly used fetal heart rate variability analysis methods have been shown to be sensitive to the ECG signal sampling rate: a low sampling rate can induce jitter in heart beat detection, which alters the heart rate variability estimate. In this paper, we introduce an original fetal heart rate variability analysis method that we hypothesize is less sensitive to changes in ECG sampling frequency than common heart rate variability analysis methods. We then compared the results of this new method at two different sampling frequencies (250 and 1000 Hz).
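The sampling-rate sensitivity the record describes is easy to demonstrate: quantizing beat detections to a coarser ECG sampling grid inflates a time-domain variability index such as RMSSD. A simulated sketch, with all numbers invented for illustration:

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rng = np.random.default_rng(3)
# True beat times for a ~140 bpm fetal heart with mild beat-to-beat variability
rr_true = 428.0 + 2.0 * rng.normal(size=1000)   # RR intervals in ms
beats_s = np.cumsum(rr_true) / 1000.0           # beat times in seconds

def rmssd_at_fs(fs_hz):
    """RMSSD after quantizing beat detections to the ECG sampling grid."""
    detected = np.round(beats_s * fs_hz) / fs_hz
    return rmssd(np.diff(detected) * 1000.0)

# 250 Hz gives 4 ms timing resolution; the quantization jitter adds to RMSSD
print(round(rmssd_at_fs(1000.0), 2), round(rmssd_at_fs(250.0), 2))
```

The bias matters most when the true variability is small, which is exactly the low-variability regime of clinical interest in fetal monitoring.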

  9. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.

  10. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurement of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of the phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method, which produces components with sparse loadings as compared to standard principal component analysis (PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
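As a rough sketch of the kind of PCA-plus-Hotelling-T2 fault screening the abstract mentions (the metric vectors, retained component count, and fault model below are invented; the actual SPCA modification is not reproduced), one can project a scan's metric vector onto the leading principal components and score it against the reference population:

```python
# Sketch of Hotelling T^2 fault detection on principal-component scores.
# Synthetic stand-ins for CT quality-metric vectors; not the paper's SPCA.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(60, 8))      # 60 reference scans x 8 metrics
mu = X.mean(axis=0)
Xc = X - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                        # retained components
P = Vt[:k].T                                 # loadings (8 x 3)
lam = (s[:k] ** 2) / (len(X) - 1)            # variances of the scores

def t2(x):
    """Hotelling T^2 of one scan's metric vector in the PCA subspace."""
    score = (x - mu) @ P
    return float(np.sum(score ** 2 / lam))

nominal = mu + rng.normal(0.0, 0.1, size=8)  # near the reference population
faulty = mu + 6.0 * Vt[0]                    # large shift along PC1
print(t2(nominal), t2(faulty))               # faulty scores much higher
```

A scan whose T2 exceeds a control limit derived from the reference population would be flagged for inspection.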

  11. GenomeFingerprinter: the genome fingerprint and the universal genome fingerprint analysis for systematic comparative genomics.

    PubMed

    Ai, Yuncan; Ai, Hannan; Meng, Fanmei; Zhao, Lei

    2013-01-01

    No attention has been paid to comparing a set of genome sequences across genetic components and biological categories with wide divergence over a large size range. We define this as systematic comparative genomics and aim to develop its methodology. First, we create a method, GenomeFingerprinter, to unambiguously produce a set of three-dimensional coordinates from a sequence, followed by one three-dimensional plot and six two-dimensional trajectory projections, to illustrate the genome fingerprint of a given genome sequence. Second, we develop a set of concepts and tools and thereby establish a method called the universal genome fingerprint analysis (UGFA). In particular, we define the total genetic component configuration (TGCC) (including chromosome, plasmid, and phage) for describing a strain as a systematic unit, the universal genome fingerprint map (UGFM) of the TGCC for differentiating strains within a universal system, and systematic comparative genomics (SCG) for comparing a set of genomes across genetic components and biological categories. Third, we construct a method of quantitative analysis to compare two genomes using the outcome dataset of the genome fingerprint analysis. Specifically, we define the geometric center and its geometric mean for a given genome fingerprint map, followed by the Euclidean distance, the differentiate rate, and the weighted differentiate rate to quantitatively describe the difference between two genomes under comparison. Moreover, we demonstrate these applications through case studies on various genome sequences, giving insight into critical issues in microbial genomics and taxonomy. In summary, we have created GenomeFingerprinter for rapidly computing, geometrically visualizing, and intuitively comparing a set of genomes at the genome fingerprint level; established the universal genome fingerprint analysis method; and developed a method for quantitative analysis of the outcome dataset. Together these establish a methodology for systematic comparative genomics based on genome fingerprint analysis.
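A toy version of the fingerprint idea can be sketched as follows; the base-to-step mapping, the geometric center, and the Euclidean distance between centers are illustrative assumptions, not the published definitions:

```python
# Toy illustration: map a sequence to a 3-D walk (one point per base),
# then compare two "genomes" by the distance between geometric centers.
# The base-to-step mapping below is an assumption, not the paper's.
STEP = {"A": (1, 0, 0), "C": (0, 1, 0), "G": (0, 0, 1), "T": (-1, -1, -1)}

def fingerprint(seq):
    """Cumulative 3-D coordinates, one point per base."""
    x = y = z = 0
    pts = []
    for base in seq:
        dx, dy, dz = STEP[base]
        x, y, z = x + dx, y + dy, z + dz
        pts.append((x, y, z))
    return pts

def center(pts):
    """Geometric center (centroid) of a fingerprint."""
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def distance(seq_a, seq_b):
    """Euclidean distance between the two geometric centers."""
    ca, cb = center(fingerprint(seq_a)), center(fingerprint(seq_b))
    return sum((a - b) ** 2 for a, b in zip(ca, cb)) ** 0.5

print(distance("ACGTACGT", "ACGTACGT"))   # identical sequences → 0.0
print(distance("ACGTACGT", "AAAAAAAA"))   # diverged sequences → > 0
```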

  12. Analysis of imazaquin in soybeans by solid-phase extraction and high-performance liquid chromatography.

    PubMed

    Guo, C; Hu, J-Y; Chen, X-Y; Li, J-Z

    2008-02-01

    An analytical method for the determination of imazaquin residues in soybeans was developed. The liquid/liquid partition and strong anion exchange solid-phase extraction procedures developed provide effective cleanup, removing the greatest number of sample matrix interferences. By optimizing the pH of the water/acetonitrile mobile phase with phosphoric acid, using a C-18 reverse-phase chromatographic column, and employing ultraviolet detection, excellent peak resolution was achieved. The combined cleanup and chromatographic steps reported herein were sensitive and reliable for determining imazaquin residues in soybean samples. The method is characterized by recovery >88.4%, precision <6.7% CV, and sensitivity of 0.005 ppm, in agreement with directives for method validation in residue analysis. Imazaquin residues in soybeans were further confirmed by high-performance liquid chromatography-mass spectrometry (LC-MS). The proposed method was successfully applied to the analysis of imazaquin residues in soybean samples grown in an experimental field after treatment with an imazaquin formulation.

  13. Development and Validation of an HPLC Method for Karanjin in Pongamia pinnata linn. Leaves.

    PubMed

    Katekhaye, S; Kale, M S; Laddha, K S

    2012-01-01

    A rapid, simple and specific reversed-phase HPLC method has been developed for the analysis of karanjin in Pongamia pinnata Linn. leaves. HPLC analysis was performed on a C(18) column using an 85:13.5:1.5 (v/v) mixture of methanol, water and acetic acid as isocratic mobile phase at a flow rate of 1 ml/min. UV detection was at 300 nm. The method was validated for accuracy, precision, linearity and specificity; validation revealed the method is specific, accurate, precise, reliable and reproducible. Good linear correlation coefficients (r(2)>0.997) were obtained for calibration plots in the ranges tested. The limit of detection was 4.35 μg and the limit of quantification was 16.56 μg. Intra- and inter-day RSD of retention times and peak areas was less than 1.24%, and recovery was between 95.05 and 101.05%. The established HPLC method is appropriate for efficient quantitative analysis of karanjin in Pongamia pinnata leaves.

  14. Development and Validation of an HPLC Method for Karanjin in Pongamia pinnata linn. Leaves

    PubMed Central

    Katekhaye, S; Kale, M. S.; Laddha, K. S.

    2012-01-01

    A rapid, simple and specific reversed-phase HPLC method has been developed for the analysis of karanjin in Pongamia pinnata Linn. leaves. HPLC analysis was performed on a C18 column using an 85:13.5:1.5 (v/v) mixture of methanol, water and acetic acid as isocratic mobile phase at a flow rate of 1 ml/min. UV detection was at 300 nm. The method was validated for accuracy, precision, linearity and specificity; validation revealed the method is specific, accurate, precise, reliable and reproducible. Good linear correlation coefficients (r2>0.997) were obtained for calibration plots in the ranges tested. The limit of detection was 4.35 μg and the limit of quantification was 16.56 μg. Intra- and inter-day RSD of retention times and peak areas was less than 1.24%, and recovery was between 95.05 and 101.05%. The established HPLC method is appropriate for efficient quantitative analysis of karanjin in Pongamia pinnata leaves. PMID:23204626

  15. Neutral monosaccharide composition analysis of plant-derived oligo- and polysaccharides by high performance liquid chromatography.

    PubMed

    Yan, Jun; Shi, Songshan; Wang, Hongwei; Liu, Ruimin; Li, Ning; Chen, Yonglin; Wang, Shunchun

    2016-01-20

    A novel analytical method for neutral monosaccharide composition analysis of plant-derived oligo- and polysaccharides was developed using hydrophilic interaction liquid chromatography coupled to a charged aerosol detector. The effects of column type, additives, pH and column temperature on retention and separation were evaluated. Additionally, the method could distinguish potential impurities in samples, including chloride, sulfate and sodium, from sugars. The results of validation demonstrated that this method had good linearity (R(2) ≥ 0.9981), high precision (relative standard deviation ≤ 4.43%), and adequate accuracy (94.02-103.37% recovery) and sensitivity (detection limit: 15-40 ng). Finally, the monosaccharide compositions of the polysaccharide from Eclipta prostrasta L. and stachyose were successfully profiled through this method. This report represents the first time that all of these common monosaccharides could be well-separated and determined simultaneously by high performance liquid chromatography without additional derivatization. This newly developed method is convenient, efficient and reliable for monosaccharide analysis.

  16. A theoretical treatment of technical risk in modern propulsion system design

    NASA Astrophysics Data System (ADS)

    Roth, Bryce Alexander

    2000-09-01

    A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method, enabling the analysis of risk via a consistent, comprehensive treatment of the aerothermodynamic and mass-properties aspects of vehicle design. The key elements enabling this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. Each is shown to possess unique properties that make it useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated in the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze the aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunities for design improvement lie. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. This technique is then applied to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.

  17. 40 CFR 766.16 - Developing the analytical test method.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Resolution Gas Chromatography (HRGC) with High Resolution Mass Spectrometry (HRMS) is the method of choice... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas...

  18. 40 CFR 766.16 - Developing the analytical test method.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Resolution Gas Chromatography (HRGC) with High Resolution Mass Spectrometry (HRMS) is the method of choice... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas...

  19. Developing Measures of Job Performance for Support Staff in Housing Services for People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Hatton, Chris; Wigham, Sarah; Craig, Jaime

    2009-01-01

    Background: There is an absence of research concerning the assessment of housing support worker job performance, particularly in the development of job performance measures that reflect the priorities of people with intellectual disabilities and their families. Method: A worker-oriented job analysis method was used to develop four short job…

  20. Recent Advances in Clinical Natural Language Processing in Support of Semantic Analysis

    PubMed Central

    Mowery, D.; South, B. R.; Kvist, M.; Dalianis, H.

    2015-01-01

    Objectives: We present a review of recent advances in clinical Natural Language Processing (NLP), with a focus on semantic analysis and key subtasks that support such analysis. Methods: We conducted a literature review of clinical NLP research from 2008 to 2014, emphasizing recent publications (2012-2014), based on PubMed and ACL proceedings as well as relevant referenced publications from the included papers. Results: Significant articles published within this time-span were included and are discussed from the perspective of semantic analysis. Three key clinical NLP subtasks that enable such analysis were identified: 1) developing more efficient methods for corpus creation (annotation and de-identification), 2) generating building blocks for extracting meaning (morphological, syntactic, and semantic subtasks), and 3) leveraging NLP for clinical utility (NLP applications and infrastructure for clinical use cases). Finally, we provide a reflection upon the most recent developments and potential areas of future NLP development and applications. Conclusions: There has been an increase of advances within key NLP subtasks that support semantic analysis. Performance of NLP semantic analysis is, in many cases, close to that of agreement between humans. The creation and release of corpora annotated with complex semantic information models has greatly supported the development of new tools and approaches. Research on non-English languages is continuously growing. NLP methods have sometimes been successfully employed in real-world clinical tasks. However, there is still a gap between the development of advanced resources and their utilization in clinical settings. A plethora of new clinical use cases are emerging due to established health care initiatives and additional patient-generated sources through the extensive use of social media and other devices. PMID:26293867

  1. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and gain practical experience of the difference between performing an external standardization and a standard addition.
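The standard-addition calculation at the heart of the experiment can be sketched numerically (the spike levels and signals below are invented): fit signal against added concentration, then extrapolate to the x-intercept to recover the unknown's concentration.

```python
# Minimal sketch of a standard-addition calculation (illustrative data).
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

added_ppm = [0.0, 1.0, 2.0, 3.0]           # spike levels added to aliquots
signal    = [0.25, 0.45, 0.65, 0.85]       # measured instrument responses
slope, intercept = linear_fit(added_ppm, signal)
c_unknown = intercept / slope               # magnitude of the x-intercept
print(round(c_unknown, 3))                  # → 1.25 (ppm in the sample)
```

Because the calibration is built inside the sample matrix itself, standard addition compensates for the matrix effects that an external calibration would miss.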

  2. DETERMINATION OF PERCHLORATE AT PARTS-PER-BILLION LEVELS IN PLANTS BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The method greatly reduced the ionic interferences in water extracts of plant materials. The high background conduct...

  3. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    EPA Science Inventory

    The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  4. METHOD FOR THE DETERMINATION OF PERCHLORATE ANION IN PLANT AND SOLID MATRICES BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A standardized method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The procedure greatly reduced the ionic interferences in water extracts of plant materials. Ion chro...

  5. CHIRAL METHODS AND ANALYSIS OF PCB 95 AND CIS -PERMETHRIN IN ENVIRONMENTAL SAMPLES FROM THE CTEPP STUDY

    EPA Science Inventory

    The creation of chiral chromatography techniques significantly advanced the development of methods for the analysis of individual enantiomers of chiral compounds. These techniques are being employed at the US EPA for human exposure and ecological research studies with indoor samp...

  6. GC/FT-IR ANALYSIS OF THE THERMALLY LABILE COMPOUND TRIS (2,3-DIBROMOPROPYL) PHOSPHATE

    EPA Science Inventory

    A fast and convenient GC method has been developed for a compound [tris(2,3-dibromopropyl)phosphate] that poses a difficult analytical problem for both GC (thermal instability/low volatility) and LC (not amenable to commonly available, sensitive detectors) analysis. This method em...

  7. ANALYSIS OF FERRIC AND FERROUS IONS IN SOIL EXTRACTS BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A method using ion chromatography (IC) for the analysis of ferrous (Fe 2+) and ferric (Fe 3+) ions in soil extracts has been developed. This method uses an ion exchange column with detection at 520 nm after post-column derivatization. Selectivity is achieved by using an anionic...

  8. GEOS-2 C-band radar system project. Spectral analysis as related to C-band radar data analysis

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Work performed on spectral analysis of data from the C-band radars tracking GEOS-2 and on the development of a data compaction method for the GEOS-2 C-band radar data is described. The purposes of the spectral analysis study were to determine the optimum data recording and sampling rates for C-band radar data and to determine the optimum method of filtering and smoothing the data. The optimum data recording and sampling rate is defined as the rate which includes an optimum compromise between serial correlation and the effects of frequency folding. The goal in development of a data compaction method was to reduce to a minimum the amount of data stored, while maintaining all of the statistical information content of the non-compacted data. A digital computer program for computing estimates of the power spectral density function of sampled data was used to perform the spectral analysis study.
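The two central ideas in the abstract, periodogram-based power spectral density estimation and frequency folding at too low a sampling rate, can be sketched as follows (the signal and rates are invented for illustration):

```python
# Sketch: a periodogram-style PSD peak estimate, and frequency folding
# (aliasing) when the sampling rate falls below the Nyquist rate.
import numpy as np

def peak_freq_hz(signal, fs):
    """Frequency of the largest periodogram peak (DC bin excluded)."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[1 + np.argmax(psd[1:])]

f_true = 9.0                                  # Hz, tone in the "data"
for fs in (50.0, 12.0):                       # adequate vs. undersampled
    t = np.arange(0, 4.0, 1.0 / fs)
    x = np.sin(2 * np.pi * f_true * t)
    print(fs, peak_freq_hz(x, fs))            # at 12 Hz, 9 Hz folds to 3 Hz
```

Choosing the recording rate to balance serial correlation against such folding is exactly the "optimum compromise" the study refers to.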

  9. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  10. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  11. Linear least-squares method for global luminescent oil film skin friction field analysis

    NASA Astrophysics Data System (ADS)

    Lee, Taekjin; Nonomura, Taku; Asai, Keisuke; Liu, Tianshu

    2018-06-01

    A data analysis method based on the linear least-squares (LLS) method was developed for the extraction of high-resolution skin friction fields from global luminescent oil film (GLOF) visualization images of a surface in an aerodynamic flow. In this method, the oil film thickness distribution and its spatiotemporal development are measured by detecting the luminescence intensity of the thin oil film. From the resulting set of GLOF images, the thin oil film equation is solved to obtain an ensemble-averaged (steady) skin friction field as an inverse problem. In this paper, the formulation of a discrete linear system of equations for the LLS method is described, and an error analysis is given to identify the main error sources and the relevant parameters. Simulations were conducted to evaluate the accuracy of the LLS method and the effects of the image patterns, image noise, and sample numbers on the results in comparison with the previous snapshot-solution-averaging (SSA) method. An experimental case is shown to enable the comparison of the results obtained using conventional oil flow visualization and those obtained using both the LLS and SSA methods. The overall results show that the LLS method is more reliable than the SSA method and the LLS method can yield a more detailed skin friction topology in an objective way.
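The core numerical step, solving an overdetermined discrete linear system in the least-squares sense, can be sketched generically. The toy system below stands in for the discretized thin oil film equation, which is not reproduced here; the unknowns, operator, and noise level are invented.

```python
# Generic linear least-squares sketch: many equations, few unknowns,
# solved in the least-squares sense (stand-in for the GLOF LLS step).
import numpy as np

rng = np.random.default_rng(0)
tau_true = np.array([1.2, -0.7])             # unknown parameter pair
A = rng.normal(size=(100, 2))                # discrete operator, 100 eqs
b = A @ tau_true + rng.normal(0, 0.01, 100)  # observations with noise

tau_ls, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(tau_ls)                                # close to [1.2, -0.7]
```

Averaging information from many images into one overdetermined system, rather than averaging per-snapshot solutions, is the design choice the abstract credits for the LLS method's robustness over the SSA method.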

  12. Direct rapid analysis of trace bioavailable soil macronutrients by chemometrics-assisted energy dispersive X-ray fluorescence and scattering spectrometry.

    PubMed

    Kaniu, M I; Angeyo, K H; Mwala, A K; Mangala, M J

    2012-06-04

    Precision agriculture depends on the knowledge and management of soil quality (SQ), which calls for affordable, simple and rapid but accurate analysis of bioavailable soil nutrients. Conventional SQ analysis methods are tedious and expensive. We demonstrate the utility of a new chemometrics-assisted energy dispersive X-ray fluorescence and scattering (EDXRFS) spectroscopy method we have developed for direct rapid analysis of trace 'bioavailable' macronutrients (i.e. C, N, Na, Mg, P) in soils. The method exploits, in addition to X-ray fluorescence, the scatter peaks detected from soil pellets to develop a model for SQ analysis. Spectra were acquired from soil samples held in a Teflon holder and analyzed for 200 s using a (109)Cd isotope source EDXRF spectrometer. Chemometric techniques, namely principal component analysis (PCA), partial least squares (PLS) and artificial neural networks (ANNs), were utilized for pattern recognition based on the fluorescence and Compton scatter peak regions, and to develop multivariate quantitative calibration models based on the Compton scatter peak, respectively. SQ analyses were realized with high CMD (R(2)>0.9) and low SEP (0.01% for N and Na, 0.05% for C, 0.08% for Mg and 1.98 μg g(-1) for P). Comparison of predicted macronutrients with reference standards using a one-way ANOVA test showed no statistical difference at the 95% confidence level. To the best of the authors' knowledge, this is the first time that an XRF method has demonstrated utility in trace analysis of macronutrients in soil or related matrices.

  13. Conceptual framework on the application of biomechanical measurement methods in driving behavior study

    NASA Astrophysics Data System (ADS)

    Sanjaya, Kadek Heri; Sya'bana, Yukhi Mustaqim Kusuma

    2017-01-01

    Research on eco-friendly vehicle development in Indonesia has largely neglected ergonomic study, despite the fact that traffic accidents impose a greater economic cost than the fuel subsidy. We have previously performed a biomechanical experiment on human locomotion. In this article, we describe the importance of implementing biomechanical measurement methods in transportation ergonomics. Instruments such as the electromyogram (EMG), load cells and pressure sensors, together with motion analysis methods and cross-correlation function analysis, are explained, and the possibility of their application in driving behavior studies is described. We describe the potential and challenges of these biomechanical methods with respect to future vehicle development. The methods offer advantages in objective and accurate measurement, not only of human task performance but also of its correlation with vehicle performance.

  14. Development of Methods for Sampling and Analysis of Particulate and Gaseous Fluorides from Stationary Sources.

    ERIC Educational Resources Information Center

    Peters, E. T.; And Others

    A study was conducted which resulted in the development of tentative methods for the sampling and analysis of fluorides emitted from various stationary sources. The study was directed toward developing an understanding of the kinds of species which are present in each source emission. The report presents the following information: review of the various unit…

  15. Comparative Analysis of Various Single-tone Frequency Estimation Techniques in High-order Instantaneous Moments Based Phase Estimation Method

    NASA Astrophysics Data System (ADS)

    Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod

    2010-04-01

    For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.

  16. Numerical bifurcation analysis of immunological models with time delays

    NASA Astrophysics Data System (ADS)

    Luzyanina, Tatyana; Roose, Dirk; Bocharov, Gennady

    2005-12-01

    In recent years, a large number of mathematical models that are described by delay differential equations (DDEs) have appeared in the life sciences. To analyze the models' dynamics, numerical methods are necessary, since analytical studies can only give limited results. In turn, the availability of efficient numerical methods and software packages encourages the use of time delays in mathematical modelling, which may lead to more realistic models. We outline recently developed numerical methods for bifurcation analysis of DDEs and illustrate the use of these methods in the analysis of a mathematical model of human hepatitis B virus infection.

  17. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measurements. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values for two different intersections are estimated over one day, from 8:00 a.m. to 6:00 p.m.
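The two safety measures can be sketched with their commonly used definitions (TTC as remaining gap over closing speed, PET as the time gap between two road users occupying the same spot); all numbers below are invented:

```python
# Sketch of the two surrogate safety measures named in the abstract,
# using their common definitions (not the paper's exact algorithms).
def time_to_collision(gap_m, v_follower, v_leader):
    """Seconds until collision if speeds stay constant; inf if the gap opens."""
    closing = v_follower - v_leader
    return gap_m / closing if closing > 0 else float("inf")

def post_encroachment_time(t_first_leaves, t_second_arrives):
    """Seconds between the first user leaving a spot and the second arriving."""
    return t_second_arrives - t_first_leaves

print(time_to_collision(30.0, 20.0, 10.0))            # → 3.0 s
print(time_to_collision(30.0, 10.0, 20.0))            # gap opening → inf
print(round(post_encroachment_time(12.4, 13.1), 2))   # small PET → conflict
```

In practice both quantities are computed frame by frame from the tracked trajectories, and small values flag near-miss conflict events.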

  18. Linking stressors and ecological responses

    USGS Publications Warehouse

    Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.

    1999-01-01

    To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.

  19. Green Curriculum Analysis in Technological Education

    ERIC Educational Resources Information Center

    Chakraborty, Arpita; Singh, Manvendra Pratap; Roy, Mousumi

    2018-01-01

    With rapid industrialization and technological development, India is facing the adverse effects of unsustainable patterns of production and consumption. Education for sustainable development has been widely recognized as a way to reduce the threat of environmental degradation and resource depletion. This paper used the content analysis method to explore the…

  20. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    ERIC Educational Resources Information Center

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research method and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  1. Some New Mathematical Methods for Variational Objective Analysis

    NASA Technical Reports Server (NTRS)

    Wahba, G.; Johnson, D. R.

    1984-01-01

    New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.

  2. Development and comparison of advanced reduced-basis methods for the transient structural analysis of unconstrained structures

    NASA Technical Reports Server (NTRS)

    Mcgowan, David M.; Bostic, Susan W.; Camarda, Charles J.

    1993-01-01

    The development of two advanced reduced-basis methods, the force derivative method and the Lanczos method, and two widely used modal methods, the mode displacement method and the mode acceleration method, for transient structural analysis of unconstrained structures is presented. Two example structural problems are studied: an undamped, unconstrained beam subject to a uniformly distributed load which varies as a sinusoidal function of time and an undamped high-speed civil transport aircraft subject to a normal wing tip load which varies as a sinusoidal function of time. These example problems are used to verify the methods and to compare the relative effectiveness of each of the four reduced-basis methods for performing transient structural analyses on unconstrained structures. The methods are verified with a solution obtained by integrating directly the full system of equations of motion, and they are compared using the number of basis vectors required to obtain a desired level of accuracy and the associated computational times as comparison criteria.
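    The contrast between the two modal methods can be shown on a toy problem. The sketch below uses a constrained 3-DOF spring-mass chain under a step load (deliberately sidestepping the rigid-body-mode complications of unconstrained structures that the paper addresses): the mode displacement method simply truncates the modal sum, while the mode acceleration method adds a quasi-static correction for the dropped modes. The matrices and load are hypothetical, not from the report.

```python
import numpy as np

# Toy 3-DOF spring-mass chain, fixed at one end, unit masses (M = I).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
w2, Phi = np.linalg.eigh(K)          # eigenvalues w_i^2 and mass-normalized modes
w = np.sqrt(w2)

f = np.array([0.0, 0.0, 1.0])        # suddenly applied (step) tip load
p = Phi.T @ f                        # modal forces
t = np.linspace(0.0, 50.0, 2001)

# Exact undamped modal response to a step load: q_i(t) = (p_i/w_i^2)(1 - cos w_i t)
q = (p / w2)[:, None] * (1.0 - np.cos(w[:, None] * t))

x_exact = Phi @ q                    # all three modes: reference solution
kept = [0, 1]                        # truncate the basis: keep the two lowest modes

# Mode displacement: just sum the kept modal contributions.
x_md = Phi[:, kept] @ q[kept]

# Mode acceleration: add the full static solution and remove the kept modes'
# static parts, so the dropped mode is represented quasi-statically.
x_static = np.linalg.solve(K, f)
x_ma = x_static[:, None] + Phi[:, kept] @ (q[kept] - (p[kept] / w2[kept])[:, None])

rms_md = np.sqrt(np.mean((x_md - x_exact) ** 2))
rms_ma = np.sqrt(np.mean((x_ma - x_exact) ** 2))
```

    With the same truncated basis, the mode acceleration variant has a smaller RMS error because the omitted mode's static contribution is recovered exactly, which mirrors the relative accuracy comparison the report performs.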

  3. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1991-01-01

    New methods were developed for efficient aeroservoelastic analysis and optimization. The main target was to develop a method for investigating large structural variations using a single set of modal coordinates. This task was accomplished by basing the structural modal coordinates on normal modes calculated with a set of fictitious masses loading the locations of anticipated structural changes. The following subject areas are covered: (1) modal coordinates for aeroelastic analysis with large local structural variations; and (2) time simulation of flutter with large stiffness changes.

  4. Shape design sensitivity analysis using domain information

    NASA Technical Reports Server (NTRS)

    Seong, Hwal-Gyeong; Choi, Kyung K.

    1985-01-01

    A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.

  5. Parameterization, sensitivity analysis, and inversion: an investigation using groundwater modeling of the surface-mined Tivoli-Guidonia basin (Metropolitan City of Rome, Italy)

    NASA Astrophysics Data System (ADS)

    La Vigna, Francesco; Hill, Mary C.; Rossetto, Rudy; Mazza, Roberto

    2016-09-01

    With respect to model parameterization and sensitivity analysis, this work uses a practical example to suggest that methods that start with simple models and use computationally frugal model analysis methods remain valuable in any toolbox of model development methods. In this work, groundwater model calibration starts with a simple parameterization that evolves into a moderately complex model. The model is developed for a water management study of the Tivoli-Guidonia basin (Rome, Italy) where surface mining has been conducted in conjunction with substantial dewatering. The approach to model development used in this work employs repeated analysis using sensitivity and inverse methods, including use of a new observation-stacked parameter importance graph. The methods are highly parallelizable and require few model runs, which make the repeated analyses and attendant insights possible. The success of a model development design can be measured by insights attained and demonstrated model accuracy relevant to predictions. Example insights were obtained: (1) A long-held belief that, except for a few distinct fractures, the travertine is homogeneous was found to be inadequate, and (2) The dewatering pumping rate is more critical to model accuracy than expected. The latter insight motivated additional data collection and improved pumpage estimates. Validation tests using three other recharge and pumpage conditions suggest good accuracy for the predictions considered. The model was used to evaluate management scenarios and showed that similar dewatering results could be achieved using 20 % less pumped water, but would require installing newly positioned wells and cooperation between mine owners.
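    One example of a computationally frugal analysis in this spirit is a composite scaled sensitivity, which ranks parameters by how much information the observations carry about them using only two model runs per parameter. The sketch below is a generic finite-difference version applied to a hypothetical linear toy model, not the groundwater model of the study.

```python
def composite_scaled_sensitivity(model, params, j, n_obs, rel_step=1e-6):
    """Finite-difference composite scaled sensitivity of parameter j:

        css_j = sqrt( (1/N) * sum_i (theta_j * dy_i/dtheta_j)^2 )

    Large values mean the observations carry information about theta_j.
    """
    theta = list(params)
    h = rel_step * abs(theta[j]) if theta[j] != 0 else rel_step
    up, dn = list(theta), list(theta)
    up[j] += h
    dn[j] -= h
    y_up, y_dn = model(up), model(dn)      # two model runs per parameter
    total = 0.0
    for i in range(n_obs):
        dy = (y_up[i] - y_dn[i]) / (2.0 * h)   # central difference
        total += (theta[j] * dy) ** 2
    return (total / n_obs) ** 0.5

# Hypothetical "model": y_i = a * x_i + b, observed at x = 1..4
xs = [1.0, 2.0, 3.0, 4.0]
model = lambda th: [th[0] * x + th[1] for x in xs]
theta = [2.0, 5.0]
css_a = composite_scaled_sensitivity(model, theta, 0, len(xs))  # sqrt(30) ~ 5.48
css_b = composite_scaled_sensitivity(model, theta, 1, len(xs))  # 5.0
```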

  6. [Qualitative research in health services research - discussion paper, Part 2: Qualitative research in health services research in Germany - an overview].

    PubMed

    Karbach, U; Stamer, M; Holmberg, C; Güthlin, C; Patzelt, C; Meyer, T

    2012-08-01

    This is the second part of a 3-part discussion paper by the working group on "Qualitative Methods" in the German network of health services research (DNVF) that shall contribute to the development of a memorandum concerning qualitative health services research. It aims to depict the different types of qualitative research that are conducted in health services research in Germany. In addition, the authors present a specific set of qualitative data collection and analysis tools to demonstrate the potential of qualitative research for health services research. QUALITATIVE RESEARCH IN HEALTH SERVICES RESEARCH - AN OVERVIEW: To give an overview of the types of qualitative research conducted in German health services research, the abstracts of the 8th German Conference on Health Services Research were filtered to identify qualitative or mixed-methods studies. These were then analysed by looking at the context which was studied, who was studied, the aims of the studies, and what type of methods were used. Those methods that were mentioned most often for data collection and analysis are described in detail. QUALITATIVE RESEARCH AT THE CONFERENCE FOR HEALTH SERVICES RESEARCH 2009: Approximately a fifth of all abstracts (n=74) had a qualitative (n=47) or a mixed-methods approach combining quantitative and qualitative methods (n=27). Research aims included needs assessment (41%), survey development (36%), evaluation (22%), and theorizing (1%). Data collection mostly consisted of one-on-one interviews (n=45) and group discussions (n=29). Qualitative content analysis was named in 35 abstracts, 30 abstracts did not reference their method of analysis. In addition to a quantitative summary of the abstract findings, the diversity of fields addressed by qualitative methods is highlighted. 
Although drawing conclusions on the use of qualitative methods in German health services research from the analysis of conference abstracts is not possible, the overview we present demonstrates the diversity of methods used for data collection and analysis and showed that a few select methods are extensively used. One of the tasks a memorandum of qualitative health services research should accomplish is to highlight underutilized research methods, which may help to develop the potential of qualitative methodology in German health services research. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Soil analysis based on samples withdrawn from different volumes: correlation versus calibration

    Treesearch

    Lucian Wielopolski; Kurt Johnsen; Yuen Zhang

    2010-01-01

    Soil, particularly in forests, is replete with spatial variation with respect to soil C. The present standard chemical method for soil analysis by dry combustion (DC) is destructive, and comprehensive sampling is labor intensive and time consuming. These, among other factors, are contributing to the development of new methods for soil analysis. These include a near...

  8. Task 2 Report: Algorithm Development and Performance Analysis

    DTIC Science & Technology

    1993-07-01

    Example GC data for schedule 3 phosphites showing an analysis method which integrates ... more closely follows the baseline ... an analysis method resulting in unwanted ... much of the ambiguity that can arise in GC/MS with trace environmental samples, for example. Correlated chromatography, on the other hand, separates the ...

  9. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type I error rates. The power of the SFPCA-based statistic and of 22 existing statistics is also evaluated. We found that the SFPCA-based statistic has a much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.
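    The final multiple-testing step mentioned above is simple to state: with m pathways tested, the Bonferroni correction rejects only when p < alpha/m. A minimal sketch with hypothetical pathway names and P-values (not the EOMI results):

```python
def bonferroni_significant(pvalues, alpha=0.05):
    """Return indices of hypotheses that remain significant after the
    Bonferroni correction: reject when p < alpha / m for m tests."""
    m = len(pvalues)
    return [i for i, p in enumerate(pvalues) if p < alpha / m]

# Hypothetical pathway P-values; threshold here is 0.05 / 3 ~ 0.0167
pathway_p = {"pathway_A": 1e-6, "pathway_B": 0.004, "pathway_C": 0.03}
names = list(pathway_p)
hits = [names[i] for i in bonferroni_significant(list(pathway_p.values()))]
```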

  10. Thermal-hydraulic analysis capabilities and methods development at NYPA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.

    1987-01-01

    The operation of a nuclear power plant must be regularly supported by various thermal-hydraulic (T/H) analyses that may include final safety analysis report (FSAR) design basis calculations and licensing evaluations and conservative and best-estimate analyses. The development of in-house T/H capabilities provides the following advantages: (a) it leads to a better understanding of the plant design basis and operating characteristics; (b) methods developed can be used to optimize plant operations and enhance plant safety; (c) such a capability can be used for design reviews, checking vendor calculations, and evaluating proposed plant modifications; and (d) in-house capability reduces the cost of analysis. This paper gives an overview of the T/H capabilities and current methods development activity within the engineering department of the New York Power Authority (NYPA) and will focus specifically on reactor coolant system (RCS) transients and plant dynamic response for non-loss-of-coolant accident events. This paper describes NYPA experience in performing T/H analyses in support of pressurized water reactor plant operation.

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure caption: The main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a remote teleoperation case study that assesses the effect of communication delays and the allocation of robot control functions on control-system hardware requirements.
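    Scheduling-theory analyses of this kind typically rest on feasibility tests for periodic hard real-time tasks. As a hedged illustration (classic rate-monotonic analysis, not the paper's specific formulation), the sketch below checks the Liu-Layland utilization bound and computes exact worst-case response times by fixed-point iteration; the task set is hypothetical.

```python
import math

def liu_layland_bound(n):
    """Utilization bound below which any n periodic tasks are guaranteed
    rate-monotonic schedulable: n * (2^(1/n) - 1). Sufficient, not necessary."""
    return n * (2 ** (1.0 / n) - 1.0)

def worst_case_response(tasks, i):
    """Exact response-time iteration for task i (tasks sorted by period,
    shorter period = higher priority):

        R = C_i + sum over higher-priority j of ceil(R / T_j) * C_j
    """
    C, T = tasks[i]
    R = C
    while True:
        interference = sum(math.ceil(R / Tj) * Cj for Cj, Tj in tasks[:i])
        R_next = C + interference
        if R_next == R:
            return R
        if R_next > T:
            return None       # deadline (taken equal to the period) missed
        R = R_next

# Hypothetical (execution time C, period T) pairs, sorted by period
tasks = [(1.0, 4.0), (1.0, 5.0), (2.0, 10.0)]
U = sum(c / t for c, t in tasks)                        # 0.65
feasible_by_bound = U <= liu_layland_bound(len(tasks))  # 0.65 <= ~0.780
responses = [worst_case_response(tasks, i) for i in range(len(tasks))]
```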

  13. Advanced methods of structural and trajectory analysis for transport aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1995-01-01

    This report summarizes the efforts in two areas: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of trajectory optimization. The majority of the effort was spent in the structural weight area. A draft of 'Analytical Fuselage and Wing Weight Estimation of Transport Aircraft', resulting from this research, is included as an appendix.

  14. Development and application of a multilocus sequence analysis method for the identification of genotypes within genus Bradyrhizobium and for establishing nodule occupancy of soybean (Glycine max L. Merr)

    USDA-ARS?s Scientific Manuscript database

    A Multilocus Sequence Typing (MLST) method based on allelic variation of 7 chromosomal loci was developed for characterizing genotypes within the genus Bradyrhizobium. With the method 29 distinct multilocus genotypes (GTs) were identified among 191 culture collection soybean strains. The occupancy ...

  15. CFD Analysis of the SBXC Glider Airframe

    DTIC Science & Technology

    2016-06-01

    ... based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with the previous research data ... greater than 15 m/s. Subject terms: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data, fluid ...

  16. Analysis of Bisphenol A, Alkylphenols, and Alkylphenol Ethoxylates in NIST SRM 2585 and Indoor House Dust by Gas Chromatography-Tandem Mass Spectrometry (GC/MS/MS).

    PubMed

    Fan, Xinghua; Kubwabo, Cariton; Wu, Fang; Rasmussen, Pat E

    2018-06-26

    Background: Ingestion of house dust has been demonstrated to be an important exposure pathway to several contaminants in young children. These compounds include bisphenol A (BPA), alkylphenols (APs), and alkylphenol ethoxylates (APEOs). Analysis of these compounds in house dust is challenging because of the complex composition of the sample matrix. Objective: The objective was to develop a simple and sensitive method to measure BPA, APs, and APEOs in indoor house dust. Methods: An integrated method that involved solvent extraction using sonication, sample cleanup by solid-phase extraction, derivatization by 2,2,2-trifluoro-N-methyl-N-(trimethylsilyl)acetamide, and analysis by GC coupled with tandem MS was developed for the simultaneous determination of BPA, APs, and APEOs in NIST Standard Reference Material (SRM) 2585 (Organic contaminants in house dust) and in settled house dust samples. Results: Target analytes included BPA, 4-tert-octylphenol (OP), OP monoethoxylate, OP diethoxylate, 4-n-nonylphenol (4nNP), 4nNP monoethoxylate (4nNP1EO), branched nonylphenol (NP), NP monoethoxylate, NP diethoxylate, NP triethoxylate, and NP tetraethoxylate. The method was sensitive, with method detection limits ranging from 0.05 to 5.1 μg/g, and average recoveries between 82 and 115%. All target analytes were detected in SRM 2585 and house dust except 4nNP and 4nNP1EO. Conclusions: The method is simple and fast, with high sensitivity and good reproducibility. It is applicable to the analysis of target analytes in similar matrixes, such as sediments, soil, and biosolids. Highlights: Values measured in SRM 2585 will be useful for future research in method development and method comparison.

  17. Laboratory and 3-D distinct element analysis of the failure mechanism of a slope under external surcharge

    NASA Astrophysics Data System (ADS)

    Li, N.; Cheng, Y. M.

    2015-01-01

    Landslide is a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations in slope stability analysis in the past, while the detailed post-failure conditions of slopes have not been considered in sufficient detail. There is, however, increasing interest in the consequences after the initiation of failure, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, microcosmic failure mechanisms and the post-failure mechanisms of slopes is carried out. The numerical modeling method and the various findings from the present work can provide an alternate method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.
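    For contrast with the DEM and laboratory approaches, the classical factor-of-safety notion the abstract refers to can be made concrete with the simplest limit-equilibrium case, the dry infinite slope. This closed-form benchmark is an illustration only, not the method of the study, and all soil parameters below are hypothetical.

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_kn_m3, depth_m, beta_deg):
    """Infinite-slope (dry, planar slip surface) factor of safety:

        FS = [c + gamma*z*cos^2(beta)*tan(phi)] / [gamma*z*sin(beta)*cos(beta)]

    c: cohesion, phi: friction angle, gamma: unit weight, z: slip depth,
    beta: slope angle. FS < 1 indicates failure.
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c_kpa + gamma_kn_m3 * depth_m * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Cohesionless sanity check: FS reduces to tan(phi) / tan(beta)
fs = infinite_slope_fs(c_kpa=0.0, phi_deg=30.0, gamma_kn_m3=18.0,
                       depth_m=2.0, beta_deg=30.0)   # ~ 1.0 (tan 30 / tan 30)
```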

  18. Laboratory and 3-D-distinct element analysis of failure mechanism of slope under external surcharge

    NASA Astrophysics Data System (ADS)

    Li, N.; Cheng, Y. M.

    2014-09-01

    Landslide is a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations in slope stability analysis in the past, while the detailed post-failure conditions of slopes have not been considered in sufficient detail. There is, however, increasing interest in the consequences after the initiation of failure, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, microcosmic failure mechanisms and the post-failure mechanisms of slopes is carried out. The numerical modeling method and the various findings from the present work can provide an alternate method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.

  19. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. 
Graphical abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.
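    A small but universal ingredient of the disulfide mapping methods surveyed above is mass matching: a disulfide-linked peptide pair has the mass of the two peptides minus two hydrogens (about 2.016 Da, monoisotopic), and observed precursor masses are matched against candidate pairs within a ppm tolerance. A hedged sketch; the peptide names and masses are hypothetical.

```python
from itertools import combinations

H_MONO = 1.007825  # monoisotopic mass of hydrogen, Da

def linked_pair_mass(m1, m2):
    """Disulfide bond formation between two peptides releases two hydrogens."""
    return m1 + m2 - 2 * H_MONO

def match_disulfide_pairs(peptide_masses, observed, tol_ppm=10.0):
    """Return (name_a, name_b) candidate pairs whose linked mass matches an
    observed precursor mass within tol_ppm."""
    hits = []
    for (a, ma), (b, mb) in combinations(peptide_masses.items(), 2):
        expected = linked_pair_mass(ma, mb)
        if abs(expected - observed) / expected * 1e6 <= tol_ppm:
            hits.append((a, b))
    return hits

# Hypothetical tryptic peptide monoisotopic masses (Da)
peptides = {"T3": 1045.532, "T7": 1528.704, "T12": 988.451}
obs = linked_pair_mass(1045.532, 1528.704)   # simulate one observed precursor
pairs = match_disulfide_pairs(peptides, obs)
```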

  20. Bandwidth and Detection of Packet Length Covert Channels

    DTIC Science & Technology

    2011-03-01

    Shared Resource Matrix (SRM): develop a matrix of all resources on one side and all the processes on the other; then determine which process uses which ... system calls. This method is similar to that of the SRM. Covert channels have also been created by modulating packet timing, data, and headers of network ... analysis, noninterference analysis, the SRM method, and the covert flow tree method [4]. These methods can be used during the design phase of a system.

  1. Determination of dasatinib in the tablet dosage form by ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis.

    PubMed

    Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel

    2017-01-01

    Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with the Folin-Ciocalteu reagent, forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Method of Evaluating the Life Cycle Cost of Small Earth Dams Considering the Risk of Heavy Rainfall and Selection Method of the Optimum Countermeasure

    NASA Astrophysics Data System (ADS)

    Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru

    In recent years, the increasing frequency of heavy rainfall events such as unpredictable cloudbursts has made it necessary to improve the safety of the embankments of small earth dams. However, the severe financial condition of national and local governments requires the cost of such improvements to be reduced. This study develops a method of evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of their downstream areas at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and hence the life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, use of the life cycle cost evaluation method will lead to a technique for selecting the optimal improvement or countermeasure against heavy rainfall.

  3. Development and validation of a multiresidue method for the analysis of polybrominated diphenyl ethers, new brominated and organophosphorus flame retardants in sediment, sludge and dust.

    PubMed

    Cristale, Joyce; Lacorte, Silvia

    2013-08-30

    This study presents a multiresidue method for simultaneous extraction, clean-up and analysis of priority and emerging flame retardants in sediment, sewage sludge and dust. Studied compounds included eight polybrominated diphenyl ether congeners, nine new brominated flame retardants and ten organophosphorus flame retardants. The analytical method was based on ultrasound-assisted extraction with ethyl acetate/cyclohexane (5:2, v/v), clean-up with Florisil cartridges and analysis by gas chromatography coupled to tandem mass spectrometry (GC-EI-MS/MS). The method development and validation protocol included spiked samples, certified reference material (for dust), and participation in an interlaboratory calibration. The method proved to be efficient and robust for extraction and determination of the three flame retardant families in the studied solid matrices. The method was applied to river sediment, sewage sludge and dust samples, and allowed detection of 24 of the 27 studied flame retardants. Organophosphate esters, BDE-209 and decabromodiphenyl ethane were the most ubiquitous contaminants detected. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Application of Plackett-Burman and Doehlert designs for optimization of selenium analysis in plasma with electrothermal atomic absorption spectrometry.

    PubMed

    El Ati-Hellal, Myriam; Hellal, Fayçal; Hedhili, Abderrazek

    2014-10-01

    The aim of this study was the optimization of selenium determination in plasma samples with electrothermal atomic absorption spectrometry using experimental design methodology. Eleven variables that could influence selenium analysis in human blood plasma by electrothermal atomic absorption spectrometry (ETAAS) were evaluated with a Plackett-Burman experimental design. These factors were selected from the sample preparation, furnace program and chemical modification steps. Both absorbance and background signals were chosen as responses in the screening approach. A Doehlert design was used for method optimization. Results showed that only the ashing temperature has a statistically significant effect on the selected responses. Optimization with the Doehlert design allowed the development of a reliable method for selenium analysis with ETAAS. Samples were diluted 1/10 with 0.05% (v/v) Triton X-100 + 2.5% (v/v) HNO3 solution. Optimized ashing and atomization temperatures for the nickel modifier were 1070°C and 2270°C, respectively. A detection limit of 2.1 μg/L Se was obtained. Accuracy of the method was checked by the analysis of selenium in Seronorm™ Trace Elements quality control serum level 1. The developed procedure was applied to the analysis of total selenium in fifteen plasma samples with the standard addition method. Concentrations ranged between 24.4 and 64.6 μg/L, with a mean of 42.6 ± 4.9 μg/L. The use of experimental designs allowed the development of a cheap and accurate method for selenium analysis in plasma that could be applied routinely in clinical laboratories. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
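    The standard addition quantification used for the plasma samples above extrapolates a least-squares line through the spiked responses; the magnitude of its x-intercept (intercept/slope) is the endogenous concentration. A sketch with hypothetical spike levels and absorbances, not the study's data:

```python
def standard_addition_conc(added, response):
    """Fit response = slope*added + intercept by least squares; the original
    sample concentration is intercept/slope (magnitude of the x-intercept
    of the extrapolated calibration line)."""
    n = len(added)
    xbar = sum(added) / n
    ybar = sum(response) / n
    sxx = sum((x - xbar) ** 2 for x in added)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(added, response))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    return intercept / slope

# Hypothetical spike concentrations (ug/L) and absorbance readings
added = [0.0, 20.0, 40.0, 60.0]
signal = [0.085, 0.127, 0.169, 0.211]          # linear, slope 0.0021 per ug/L
conc = standard_addition_conc(added, signal)   # ~ 40.5 ug/L Se in the sample
```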

  5. Detecting spatio-temporal modes in multivariate data by entropy field decomposition

    NASA Astrophysics Data System (ADS)

    Frank, Lawrence R.; Galinsky, Vitaly L.

    2016-09-01

    A new data analysis method that addresses the general problem of detecting spatio-temporal variations in multivariate data is presented. The method utilizes two recent and complementary general approaches to data analysis, information field theory (IFT) and entropy spectrum pathways (ESP). Both methods reformulate and incorporate Bayesian theory, and thus use prior information to uncover the underlying structure of the unknown signal. The unification of ESP and IFT creates an approach that is non-Gaussian and nonlinear by construction, and is found to produce unique spatio-temporal modes of signal behavior that can be ranked according to their significance, from which space-time trajectories of parameter variations can be constructed and quantified. Two brief examples of real-world applications of the theory, to data sets of completely different and unrelated nature, are also presented. The first example provides an analysis of resting-state functional magnetic resonance imaging data that allowed us to create an efficient and accurate computational method for assessing and categorizing brain activity. The second example demonstrates the potential of the method in application to the analysis of a strong atmospheric storm circulation system during the complicated stage of tornado development and formation, using data recorded by a mobile Doppler radar. A reference implementation of the method will be made available as part of the QUEST toolkit that is currently under development at the Center for Scientific Computation in Imaging.

  6. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  7. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily call the analytical results into question. In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
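
    The comparison between propagation of error and Monte Carlo can be illustrated with a short sketch. The component names and cost figures below are invented for illustration, not taken from the report; the point is that for a sum of independent costs the analytic standard deviation closely matches a Monte Carlo estimate.

```python
import math
import random
import statistics

# Hypothetical component cost estimates (mean, std dev) in $M;
# names and numbers are illustrative only.
COMPONENTS = {
    "reactor vessel": (120.0, 18.0),
    "turbine island": (60.0, 9.0),
    "civil works": (200.0, 40.0),
}

def propagated_sigma(components):
    # Propagation of error for a sum of independent costs:
    # sigma_total = sqrt(sum of sigma_i^2)
    return math.sqrt(sum(s * s for _, s in components.values()))

def monte_carlo_sigma(components, n=200_000, seed=1):
    # Sample each component cost as an independent normal and take the
    # empirical standard deviation of the total.
    rng = random.Random(seed)
    totals = [sum(rng.gauss(m, s) for m, s in components.values())
              for _ in range(n)]
    return statistics.stdev(totals)
```

    The analytic route needs only a spreadsheet formula, which is the accessibility argument made in the report.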

  8. [Amperometric biosensor for lactate analysis in wines and grape must during fermentation].

    PubMed

    Shkotova, L V; Horiushkina, T B; Slast'ia, E A; Soldatkin, O P; Tranh-Minh, S; Chovelon, J M; Dziadevych, S V

    2005-01-01

    An amperometric biosensor based on lactate oxidase for the determination of lactate has been developed, and two methods of immobilization of lactate oxidase on the surface of industrial screen-printed platinum electrodes (SensLab) were compared. A sensor with lactate oxidase immobilized in Resydrol polymer by physical adsorption is characterized by a narrower dynamic range and a greater response than a biosensor based on lactate oxidase immobilized in poly(3,4-ethylenedioxythiophene) by electrochemical polymerization. The operational stability of the developed biosensors was studied, and it was shown that the immobilization method does not influence their stability. Analysis of lactate in wine and during wine fermentation was conducted. A high correlation was shown between the data obtained with the amperometric lactate biosensor and a standard ion chromatography method. The developed biosensor could be applied in the food industry for the control and optimization of the wine fermentation process and for quality control of wine.

  9. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine

    Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8–13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well-characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.

  10. The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions

    DOE PAGES

    Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine; ...

    2017-07-18

    Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8–13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well-characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.
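
    The colorimetric analysis described above reduces to a linear Beer-Lambert calibration, A = kC + b, fitted to standards and then inverted for unknowns. A minimal least-squares sketch with made-up calibration points (not the paper's data):

```python
def fit_line(conc, absorb):
    # Ordinary least squares for A = k*C + b (Beer-Lambert calibration)
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorb) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
         / sum((x - mx) ** 2 for x in conc))
    return k, my - k * mx

def concentration(absorbance, k, b):
    # Invert the calibration for an unknown sample
    return (absorbance - b) / k

# Hypothetical calibration standards: concentration (mgU/g) vs absorbance
standards_c = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
standards_a = [0.101, 0.201, 0.301, 0.401, 0.501, 0.601]
```

    The impurity check mentioned in the abstract amounts to flagging samples whose spectra deviate from this fitted line.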

  11. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore, mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods over the fully implicit method or the fully explicit method is also demonstrated.

  12. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1983-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore, mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods over the fully implicit method or the fully explicit method is also demonstrated.
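
    The critical-time-step idea can be illustrated on the simplest case, the 1D conduction stencil, where the explicit (forward-Euler) update is stable only for dt ≤ dx²/(2α). The grid and parameters below are illustrative, not from the paper:

```python
def critical_dt(alpha, dx):
    # Stability limit of the explicit update for the 1D conduction stencil
    return dx * dx / (2.0 * alpha)

def step_explicit(T, alpha, dx, dt):
    # One forward-Euler step of 1D conduction with fixed-temperature ends
    r = alpha * dt / (dx * dx)
    return ([T[0]]
            + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
               for i in range(1, len(T) - 1)]
            + [T[-1]])
```

    Running below the limit damps an initial disturbance; running above it amplifies the shortest-wavelength mode. That stiffness mismatch between thermal and mechanical time scales is what motivates mixed implicit-explicit partitioning.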

  13. Pulse analysis of acoustic emission signals

    NASA Technical Reports Server (NTRS)

    Houghton, J. R.; Packman, P. F.

    1977-01-01

    A method for the signature analysis of pulses in the frequency domain and the time domain is presented. The Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio were examined in the frequency-domain analysis, and pulse shape deconvolution was developed for use in the time-domain analysis. Comparisons of the relative performance of each analysis technique are made for the characterization of acoustic emission pulses recorded by a measuring system. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer-modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emission associated with (a) crack propagation, (b) a ball dropping on a plate, (c) spark discharge, and (d) defective and good ball bearings. Deconvolution shows the first few microseconds of the pulse train to be the region in which the significant signatures of the acoustic emission event are to be found.

  14. Pulse analysis of acoustic emission signals

    NASA Technical Reports Server (NTRS)

    Houghton, J. R.; Packman, P. F.

    1977-01-01

    A method for the signature analysis of pulses in the frequency domain and the time domain is presented. The Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio were examined in the frequency-domain analysis, and pulse shape deconvolution was developed for use in the time-domain analysis. Comparisons of the relative performance of each analysis technique are made for the characterization of acoustic emission pulses recorded by a measuring system. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer-modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emissions associated with: (1) crack propagation, (2) a ball dropping on a plate, (3) spark discharge and (4) defective and good ball bearings. Deconvolution shows the first few microseconds of the pulse train to be the region in which the significant signatures of the acoustic emission event are to be found.

  15. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  16. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  17. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    ERIC Educational Resources Information Center

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  18. Relativity Concept Inventory: Development, Analysis, and Results

    ERIC Educational Resources Information Center

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  19. Using participative inquiry in usability analysis to align a development team's mental model with its users' needs

    NASA Technical Reports Server (NTRS)

    Kneifel, A. A.; Guerrero, C.

    2003-01-01

    In this web site usability case study, two methods of participative inquiry are used to align a development team's objectives with its users' needs and to promote the team's awareness of the benefits of qualitative usability analysis.

  20. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  1. Correlative and multivariate analysis of increased radon concentration in underground laboratory.

    PubMed

    Maletić, Dimitrije M; Udovičić, Vladimir I; Banjanac, Radomir M; Joković, Dejan R; Dragić, Aleksandar L; Veselinović, Nikola B; Filipović, Jelena

    2014-11-01

    The results of an analysis, using correlative and multivariate methods developed for data analysis in high-energy physics and implemented in the Toolkit for Multivariate Analysis software package, of the relation between elevated radon concentration and climate variables in a shallow underground laboratory are presented. Multivariate regression analysis identified a number of multivariate methods that can give a good evaluation of increased radon concentrations based on climate variables. The use of multivariate regression methods will enable investigation of the relation of specific climate variables to increased radon concentration, with the regression analysis yielding a 'mapped' underlying functional behaviour of radon concentration as it depends on a wide spectrum of climate variables. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
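
    The regression idea, mapping climate variables to radon concentration, can be sketched in its simplest form as ordinary least squares via the normal equations. The climate variables and numbers below are invented for illustration; the TMVA methods referenced in the abstract are far richer, but fit the same kind of functional mapping.

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_linear(X, y):
    # Ordinary least squares via normal equations; X rows are predictor
    # tuples (e.g. temperature, pressure) and an intercept is prepended.
    Z = [[1.0] + list(row) for row in X]
    p = len(Z[0])
    XtX = [[sum(Z[i][a] * Z[i][c] for i in range(len(Z))) for c in range(p)]
           for a in range(p)]
    Xty = [sum(Z[i][a] * y[i] for i in range(len(Z))) for a in range(p)]
    return solve(XtX, Xty)
```
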

  2. Non-contact method of search and analysis of pulsating vessels

    NASA Astrophysics Data System (ADS)

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods of recording the human pulse and a solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods, based on advanced image processing, caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with the addition of a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During the testing of the method, several series of experiments were carried out with both artificial oscillating objects as well as with the target signal source (human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.
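
    The core of imaging photoplethysmography is recovering a periodic component from a mean-intensity time series of the skin region. A naive DFT sketch, stdlib only; the sampling rate and waveform in the test are synthetic, not experimental data:

```python
import cmath
import math

def dominant_frequency(signal, fs):
    # Remove the DC level, then return the frequency (Hz) of the
    # largest-magnitude DFT bin (naive O(n^2) DFT; fine for short windows).
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2 + 1):
        X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(X) > best_mag:
            best_k, best_mag = k, abs(X)
    return best_k * fs / n
```

    A dominant frequency of 1.2 Hz in the intensity series corresponds to a pulse of 72 beats per minute.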

  3. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
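
    The Beta Factor model referenced above splits each train's failure probability into a common-cause fraction β, which fails all trains simultaneously, and an independent remainder. For a one-out-of-three system this gives a two-term unavailability; the numbers in the test are illustrative, not from the study:

```python
def one_of_three_unavailability(q, beta):
    # q: failure probability of a single train; beta: common-cause fraction.
    # The system fails if the common cause strikes (beta*q), or if all
    # three trains fail independently ((1-beta)*q each).
    q_ccf = beta * q
    q_ind = (1.0 - beta) * q
    return q_ccf + q_ind ** 3
```

    With q = 0.01, even β = 0.1 raises system unavailability from about 1e-6 to about 1e-3, which is why the common-cause term dominates the risk of redundant systems.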

  4. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  5. Porous Graphitic Carbon Liquid Chromatography-Mass Spectrometry Analysis of Drought Stress-Responsive Raffinose Family Oligosaccharides in Plant Tissues.

    PubMed

    Jorge, Tiago F; Florêncio, Maria H; António, Carla

    2017-01-01

    Drought is a major limiting factor in agriculture and is responsible for dramatic crop yield losses worldwide. The adjustment of metabolic status via the accumulation of drought stress-responsive osmolytes is one of the many strategies that some plants have developed to cope with water deficit conditions. Osmolytes are highly polar compounds, the analysis of which is difficult with typical reversed-phase chromatography. Porous graphitic carbon (PGC) has been shown to be a suitable alternative to reversed-phase stationary phases for the analysis of highly polar compounds typically found in the plant metabolome. In this chapter, we describe the development and validation of a PGC-based liquid chromatography-tandem mass spectrometry (LC-MSn) method suitable for the targeted analysis of water-soluble carbohydrates, such as raffinose family oligosaccharides (RFOs). We present detailed information regarding PGC column equilibration, LC-MSn system operation, and data analysis, together with important notes to be considered during the steps of method development and validation.

  6. A method for the design of transonic flexible wings

    NASA Technical Reports Server (NTRS)

    Smith, Leigh Ann; Campbell, Richard L.

    1990-01-01

    Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.

  7. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models was implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code, including fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  8. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies.

    PubMed

    Torrens, George Edward

    2018-01-01

    Summative content analysis was used to define methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods, and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as having been applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used, under 10 within the 20 case studies, and the hundreds available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: •The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; •The review within the study suggests only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and, •Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  9. A quality quantitative method of silicon direct bonding based on wavelet image analysis

    NASA Astrophysics Data System (ADS)

    Tan, Xiao; Tao, Zhi; Li, Haiwang; Xu, Tiantong; Yu, Mingxing

    2018-04-01

    The rapid development of MEMS (micro-electro-mechanical systems) has received significant attention from researchers in various fields and subjects. The MEMS fabrication process is elaborate and, as such, has been the focus of extensive research. However, in MEMS fabrication, component bonding is difficult to achieve and requires a complex approach, so improvements in bonding quality are important objectives. A higher quality bond can only be achieved with improved measurement and testing capabilities. Traditional testing methods mainly include infrared testing, tensile testing, and strength testing, but using these methods to measure bond quality often results in low efficiency or destructive analysis. Therefore, this paper focuses on the development of a precise, nondestructive visual testing method based on wavelet image analysis that is shown to be highly effective in practice. The process of wavelet image analysis includes wavelet image denoising, wavelet image enhancement, and contrast enhancement, and as an end result can display an image with low background noise. In addition, because the wavelet analysis software was developed with MATLAB, it can reveal the bonding boundaries and bonding rates to precisely indicate the bond quality at all locations on the wafer. This work also presents a set of orthogonal experiments on three prebonding factors, the prebonding temperature, the positive pressure value, and the prebonding time, which are used to analyze the prebonding quality. This method was used to quantify the quality of silicon-to-silicon wafer bonding, yielding standard treatment quantities that could be practical for large-scale use.
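The denoising stage of such a pipeline can be sketched without the authors' MATLAB tooling. The following toy example, with a synthetic image and an arbitrarily chosen threshold, hand-rolls a one-level 2-D Haar transform and soft-thresholds the detail subbands:

```python
import numpy as np

def haar2d(x):
    """One level of a 2-D Haar transform -> (LL, LH, HL, HH) subbands."""
    a, d = (x[0::2] + x[1::2]) / 2.0, (x[0::2] - x[1::2]) / 2.0
    return ((a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0,
            (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0)

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((2 * a.shape[0], a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

def soft(c, t):
    """Soft threshold: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                  # a "bonded" square region
noisy = clean + rng.normal(scale=0.2, size=clean.shape)

ll, lh, hl, hh = haar2d(noisy)
denoised = ihaar2d(ll, soft(lh, 0.25), soft(hl, 0.25), soft(hh, 0.25))

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
print(mse_noisy, mse_denoised)
```

A production pipeline would use a longer wavelet, multiple decomposition levels, and a noise-estimated threshold, but the subband-thresholding principle is the same.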

  10. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. W. Parry; J.A Forester; V.N. Dang

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  11. Development and validation of an improved method for the determination of chloropropanols in paperboard food packaging by GC-MS.

    PubMed

    Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G

    2015-01-01

    The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking with a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised, and the cost and time of the analysis were reduced by using 10 times less sample, solvent and reagent than in previously described methods. Overall, the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) in the aqueous extract was 0.010 mg kg⁻¹ (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
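The precision figure reported above (RSD) is simple to derive from replicate measurements. A minimal sketch with hypothetical replicate values, not the paper's data:

```python
import statistics

# Hypothetical replicate results (mg/kg) for one spiked sample, used only
# to show how a relative standard deviation (RSD) is computed.
replicates = [0.101, 0.098, 0.103, 0.099, 0.104, 0.097]

mean = statistics.mean(replicates)
rsd = 100.0 * statistics.stdev(replicates) / mean  # sample SD as % of mean
print(f"mean = {mean:.4f} mg/kg, RSD = {rsd:.2f}%")
```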

  12. A method for the determination of syringe needle punctures in rubber stoppers using stereoscopic light microscopy.

    PubMed

    Platek, S Frank; Keisler, Mark A; Ranieri, Nicola; Reynolds, Todd W; Crowe, John B

    2002-09-01

    The ability to accurately determine the number of syringe needle penetration holes through the rubber stoppers in pharmaceutical vials and rubber septa in intravenous (i.v.) line and bag ports has been a critical factor in a number of forensic cases involving the thefts of controlled substances or suspected homicide by lethal injection. In the early 1990s, the microscopy and microanalysis group of the U.S. Food and Drug Administration's Forensic Chemistry Center (FCC) developed and implemented a method (unpublished) to locate needle punctures in rubber pharmaceutical vial stoppers. In 1996, as part of a multiple homicide investigation, the Indiana State Police Laboratory (ISPL) contacted the FCC for information on a method to identify and count syringe needle punctures through rubber stoppers in pharmaceutical vials. In a joint project and investigation using the FCC's needle hole location method and applying a method of puncture site mapping developed by the ISPL, a systematic method was developed to locate, identify, count, and map syringe punctures in rubber bottle stoppers or i.v. bag ports using microscopic analysis. The method requires documentation of punctures on both sides of the rubber stoppers and microscopic analysis of each suspect puncture site. The final result of an analysis using the method is a detailed diagram of puncture holes on both sides of a questioned stopper and a record of the minimum number of puncture holes through a stopper.

  13. Linnorm: improved statistical analysis for single cell RNA-seq expression data

    PubMed Central

    Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung

    2017-01-01

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm was developed to remove technical noise while preserving biological variation in scRNA-seq data, so that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748
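Linnorm itself is an R/Bioconductor package; as a rough stand-in, the sketch below shows the kind of normalization-plus-transformation step it competes with: library-size (counts-per-million) scaling followed by a log transform, on a tiny synthetic count matrix. This is not Linnorm's algorithm, only the baseline it improves on:

```python
import numpy as np

# Synthetic genes-by-cells count matrix: 2 genes, 3 cells with very
# different sequencing depths.
counts = np.array([[10, 100,  0],
                   [90, 900, 50]], dtype=float)

lib_sizes = counts.sum(axis=0)        # total counts per cell
cpm = counts / lib_sizes * 1e6        # scale each cell to one million counts
log_expr = np.log1p(cpm)              # variance-stabilizing log transform

print(log_expr.round(2))
```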

  14. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while accounting for the uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is developed from the results of numerical simulations and then evaluated with FORM. The implementation of the proposed methodology is demonstrated on a large potential rock wedge at Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error relative to MCS.
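For the special case of a linear limit state with independent normal variables, FORM has a closed form, which makes the reliability-index idea easy to see. The parameters below are hypothetical, not the paper's Sumela Monastery model:

```python
import math

# Limit state g = R - S: failure when driving demand S exceeds
# resisting capacity R.  For independent normal R and S, the
# Hasofer-Lind reliability index is beta = (mu_R - mu_S) / sqrt(var_R + var_S),
# and the probability of failure is Pf = Phi(-beta).
mu_R, sigma_R = 1500.0, 200.0   # hypothetical resisting capacity
mu_S, sigma_S = 900.0, 150.0    # hypothetical demand

beta = (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)
p_failure = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta) via erfc
print(f"beta = {beta:.3f}, Pf = {p_failure:.2e}")
```

For a nonlinear or implicit performance function, as in the numerical slope models above, beta must instead be found by iterative search for the design point, which is why a response-surface approximation of the performance function is needed.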

  15. Teaching learning methods of an entrepreneurship curriculum

    PubMed Central

    ESMI, KERAMAT; MARZOUGHI, RAHMATALLAH; TORKZADEH, JAFAR

    2015-01-01

    Introduction: One of the most significant elements of entrepreneurship curriculum design is the teaching-learning method, which plays a key role in studies and research related to such a curriculum. It is the teaching method, and the systematic, organized, and logical ways of providing lessons, that should be consistent with entrepreneurship goals and contents, and should also be developed according to the learners' needs. Therefore, the current study aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and to validate them. Methods: This is a mixed methods study of a sequential exploratory kind, conducted in two stages: (a) developing teaching methods for an entrepreneurship curriculum, and (b) validating the developed framework. Data were collected through triangulation (study of documents, investigation of theoretical foundations and the literature, and semi-structured interviews with key experts). Since the literature on this topic is very rich and the views of the key experts are vast, directed and summative content analysis was used. In the second stage, the qualitative credibility of the research findings was established using qualitative validation criteria (credibility, confirmability, and transferability) and various techniques, and a reliability test was used to make sure the qualitative part is dependable. Quantitative validation of the developed framework was conducted using exploratory and confirmatory factor analysis and Cronbach's alpha. The data were gathered by distributing a three-aspect questionnaire (direct-presentation, interactive, and practical-operational teaching methods) with 29 items among 90 curriculum scholars; the target population was selected by purposive sampling with a representative sample.
Results: Results from the exploratory factor analysis showed that a three-factor structure is appropriate for describing the elements of teaching-learning methods of an entrepreneurship curriculum. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.72, and Bartlett's test of homogeneity of variances was significant at the 0.0001 level. Except for the internship element, all items had factor loadings above 0.3. The confirmatory factor analysis confirmed the model's appropriateness, and the criteria for qualitative accreditation were acceptable. Conclusion: The developed model can help instructors select an appropriate method of entrepreneurship teaching and verify that the teaching is on the right path. The model is comprehensive, includes all the effective teaching methods in entrepreneurship education, and is based on the qualities, conditions, and requirements of higher education institutions in the Iranian cultural environment. PMID:26457314
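The Cronbach's alpha used for quantitative validation here has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch with a hypothetical 5-respondent, 4-item Likert matrix, not the study's 90 questionnaires:

```python
import numpy as np

# Rows are respondents, columns are questionnaire items (1-5 Likert).
scores = np.array([[4, 5, 4, 4],
                   [3, 3, 2, 3],
                   [5, 5, 5, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5]], dtype=float)

k = scores.shape[1]                            # number of items
item_vars = scores.var(axis=0, ddof=1)         # sample variance per item
total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```

Values of alpha above roughly 0.7 are conventionally read as acceptable internal consistency.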

  16. A method to estimate weight and dimensions of large and small gas turbine engines

    NASA Technical Reports Server (NTRS)

    Onat, E.; Klees, G. W.

    1979-01-01

    A computerized method was developed to estimate the weight and envelope dimensions of large and small gas turbine engines to within ±5% to 10%. The method is based on correlations of component weight and design features of 29 data base engines. Rotating components were estimated by a preliminary design procedure which is sensitive to blade geometry, operating conditions, material properties, shaft speed, hub-tip ratio, etc. The development and justification of the selected method and the various methods of analysis are discussed.
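The correlation approach can be illustrated with a toy power-law fit. The data points and the thrust-only model below are hypothetical, not NASA's method or database:

```python
import numpy as np

# Hypothetical engine data: fit  weight = a * thrust**b  by least squares
# in log-log space, where the power law becomes a straight line.
thrust = np.array([20.0, 50.0, 100.0, 200.0, 400.0])      # kN
weight = np.array([300.0, 650.0, 1150.0, 2100.0, 3800.0])  # kg

b, log_a = np.polyfit(np.log(thrust), np.log(weight), 1)
a = np.exp(log_a)

predicted = a * thrust ** b
max_err = np.max(np.abs(predicted - weight) / weight)
print(f"weight ~ {a:.1f} * thrust^{b:.2f}, worst-case error {100 * max_err:.1f}%")
```

A real correlation method regresses on many design features at once (blade geometry, shaft speed, hub-tip ratio, and so on), but each correlation is fitted in essentially this way.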

  17. Guidelines for Analysis of Communicable Disease Control Planning in Developing Countries. Volume 1: Communicable Diseases Control Planning. International Health Planning Methods Series.

    ERIC Educational Resources Information Center

    Chin, James

    Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this first of ten manuals in the International Health Planning Methods Series deals with planning and evaluation of communicable disease control programs. The first…

  18. Uncovering a Hidden Professional Agenda for Teacher Educators: A Mixed Method Study on Flemish Teacher Educators and Their Professional Development

    ERIC Educational Resources Information Center

    Tack, Hanne; Valcke, Martin; Rots, Isabel; Struyven, Katrien; Vanderlinde, Ruben

    2018-01-01

    Taking into account the pressing need to understand more about what teacher educators' professional development characterises, this article adopts a mixed method approach to explore Flemish (Dutch-speaking part of Belgium) teacher educators' professional development needs and opportunities. Analysis results of a large-scale survey study with 611…

  19. Guidelines for Analysis of Environmental Health Planning in Developing Countries. Volume 2: Environmental Health Planning. International Health Planning Methods Series.

    ERIC Educational Resources Information Center

    Fraser, Renee White; Shani, Hadasa

    Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this second of ten manuals in the International Health Planning Methods Series deals with assessment, planning, and evaluation in the field of environmental health.…

  20. Study of swelling behavior in ArF resist during development by the QCM method (3): observations of swelling layer elastic modulus

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Atsushi

    2013-03-01

    The QCM method allows measurement of impedance, an index of the viscosity of the swelling layer in a photoresist during development. While impedance is sometimes used as a qualitative index of change in the viscosity of the swelling layer, it has to date not been used quantitatively for data analysis. We explored a method for converting impedance values to elastic modulus (Pa), a coefficient expressing viscosity. Applying this method, we compared changes in the viscosity of the swelling layer in an ArF resist generated during development in a TMAH developing solution and in a TBAH developing solution. This paper reports the results of this comparative study.
