Sample records for analysis methods psam

  1. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainty or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
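
    The record above describes propagating input uncertainty (for example, in modulus of elasticity) through a structural model to obtain a response distribution and an exceedance probability. As a minimal sketch of that idea only, the following Python Monte Carlo example substitutes a hypothetical closed-form cantilever deflection for the finite element model; all distributions and values are illustrative assumptions, not PSAM data.

      # Monte Carlo sketch: probabilistic structural response of a toy model.
      # A real PSAM analysis would use a finite element model and fast
      # probability integration; a closed-form deflection stands in here.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Hypothetical random inputs: modulus of elasticity E (Pa), load P (N).
      E = rng.normal(200e9, 10e9, n)                 # ~5% material scatter
      P = rng.lognormal(np.log(5e3), 0.2, n)         # uncertain launch load

      L, I = 1.0, 8e-7                               # fixed geometry (m, m^4)
      tip = P * L**3 / (3.0 * E * I)                 # cantilever tip deflection

      limit = 12e-3                                  # hypothetical allowable (m)
      print(f"mean deflection = {tip.mean()*1e3:.2f} mm")
      print(f"P(deflection > 12 mm) = {np.mean(tip > limit):.4f}")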

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included, as well as three types of structural models: the probabilistic finite element method (PFEM), probabilistic approximate analysis methods (PAAM), and the probabilistic boundary element method (PBEM). The purpose of probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high-performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  3. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included, as well as three types of structural models: the probabilistic finite element method (PFEM), probabilistic approximate analysis methods (PAAM), and the probabilistic boundary element method (PBEM). The purpose of probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high-performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  4. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
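
    The fast probability integration approach referenced above builds on first-order reliability concepts: locate the most probable failure point in standard normal space and estimate the failure probability from the reliability index. The sketch below is a generic FORM calculation with a made-up limit state, offered only as an illustration of that family of methods, not as the Wu-Wirsching FPI algorithm itself.

      # Generic first-order reliability (FORM) sketch in standard normal space.
      # FPI-type algorithms refine this idea; the limit state g(u) here is
      # hypothetical.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def g(u):
          # Hypothetical limit state: failure when g(u) <= 0.
          return 3.0 - u[0] - 0.5 * u[1] ** 2

      # Most probable point: the point on g(u) = 0 closest to the origin.
      res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                     constraints={"type": "eq", "fun": g})
      beta = np.sqrt(res.fun)                        # reliability index
      print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.2e}")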

  5. Increased Sensitivity of HIV-1 p24 ELISA Using a Photochemical Signal Amplification System.

    PubMed

    Bystryak, Simon; Santockyte, Rasa

    2015-10-01

    In this study we describe a photochemical signal amplification method (PSAM) for increasing the sensitivity of enzyme-linked immunosorbent assay (ELISA) for determination of HIV-1 p24 antigen. The photochemical signal amplification method is based on an autocatalytic photochemical reaction of a horseradish peroxidase (HRP) substrate, orthophenylenediamine (OPD). To compare the performance of PSAM-boosted ELISA with a conventional colorimetric ELISA for determination of HIV-1 p24 antigen, we employed a PerkinElmer HIV-1 p24 ELISA kit, using conventional ELISA alongside ELISA + PSAM. In the present study, we show that PSAM technology allows one to increase the analytical sensitivity and dynamic range of a commercial HIV-1 p24 ELISA kit, with and without immune-complex disruption, by a factor of approximately 40. ELISA + PSAM is compatible with commercially available microtiter plate readers, requires only an inexpensive illumination device, and the PSAM amplification step takes no longer than 15 min. This method can be used for both commercially available and in-house ELISA tests, and has the advantage of being considerably simpler and less costly than alternative signal amplification methods.

  6. Combined calculi for photon orbital and spin angular momenta

    NASA Astrophysics Data System (ADS)

    Elias, N. M.

    2014-08-01

    Context. Wavelength, photon spin angular momentum (PSAM), and photon orbital angular momentum (POAM), completely describe the state of a photon or an electric field (an ensemble of photons). Wavelength relates directly to energy and linear momentum, the corresponding kinetic quantities. PSAM and POAM, themselves kinetic quantities, are colloquially known as polarization and optical vortices, respectively. Astrophysical sources emit photons that carry this information. Aims: PSAM characteristics of an electric field (intensity) are compactly described by the Jones (Stokes/Mueller) calculus. Similarly, I created calculi to represent POAM characteristics of electric fields and intensities in an astrophysical context. Adding wavelength dependence to all of these calculi is trivial. The next logical steps are to 1) form photon total angular momentum (PTAM = POAM + PSAM) calculi; 2) prove their validity using operators and expectation values; and 3) show that instrumental PSAM can affect measured POAM values for certain types of electric fields. Methods: I derive the PTAM calculi of electric fields and intensities by combining the POAM and PSAM calculi. I show how these quantities propagate from celestial sphere to image plane. I also form the PTAM operator (the sum of the POAM and PSAM operators), with and without instrumental PSAM, and calculate the corresponding expectation values. Results: Apart from the vector, matrix, dot product, and direct product symbols, the PTAM and POAM calculi appear superficially identical. I provide tables with all possible forms of PTAM calculi. I prove that PTAM expectation values are correct for instruments with and without instrumental PSAM. I also show that POAM measurements of "unfactored" PTAM electric fields passing through non-zero instrumental circular PSAM can be biased. Conclusions: The combined PTAM calculi provide insight into mathematically modeling PTAM sources and calibrating POAM- and PSAM-induced measurement errors.
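
    The PSAM (polarization) side of the calculi above is conventionally handled with Jones vectors. As a small illustration of that standard formalism only (not of the PTAM/POAM calculi the record develops), the sketch below computes the per-photon spin angular momentum expectation of a polarization state from its circular-basis amplitudes; the handedness sign convention is an assumption.

      # Jones-calculus sketch: photon spin expectation, in units of hbar,
      # from the left/right circular decomposition of a Jones vector.
      import numpy as np

      def spin_expectation(jones):
          """<S_z>/hbar for a Jones vector (Ex, Ey); sign convention assumed."""
          jones = np.asarray(jones, dtype=complex)
          jones = jones / np.linalg.norm(jones)
          a_left = (jones[0] - 1j * jones[1]) / np.sqrt(2)
          a_right = (jones[0] + 1j * jones[1]) / np.sqrt(2)
          return abs(a_left) ** 2 - abs(a_right) ** 2

      print(spin_expectation([1.0, 0.0]))    # linear x polarization -> 0.0
      print(spin_expectation([1.0, 1j]))     # circular polarization -> +1.0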

  7. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite element codes, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.

  8. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on critical tests to demonstrate key reliability issues, improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  9. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  10. Proximate composition and nutritional evaluation of the adductor muscle of pen shell.

    PubMed

    Wu, Shengjun; Wu, Yuping

    2017-07-01

    The proximate composition of pen shell adductor muscle (PSAM) was determined, and its nutritional value was evaluated. Proximate composition analysis indicated that PSAM contained 91.07% (w/w) protein, 5.77% (w/w) ash, and 2.46% (w/w) fat. Calcium was the predominant mineral, followed by zinc and then iron. The amino acid profile was in accordance with the FAO/WHO recommended pattern except for histidine, which was also the first limiting amino acid. Fatty acid composition showed that docosahexaenoic acid was the major fatty acid, followed by palmitic, stearic, and arachidonic acids. The results indicated that PSAM is rich in nutrients and may be developed as a functional food.

  11. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PAAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.

  12. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here are the technical effort and computer code developed during the five-year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  13. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  14. Prostate-specific Antigen Mass Density--A Measure Predicting Prostate Cancer Volume and Accounting for Overweight and Obesity-related Prostate-specific Antigen Hemodilution.

    PubMed

    Kryvenko, Oleksandr N; Diaz, Mireya; Matoso, Andres; Kates, Max; Cohen, Jason; Swanson, Gregory P; Epstein, Jonathan I

    2016-04-01

    To test prostate-specific antigen mass density (PSAMD) as a predictor of total tumor volume (TTV) at radical prostatectomy (RP), we conducted a detailed pathologic analysis of 469 RPs from men with NCCN low-risk prostate cancer who had a Gleason score of 3 + 3 = 6 (grade group 1) at RP. We then compared the ability of PSA, PSA density (PSAD), PSA mass (PSAM, the absolute amount of PSA in the patient's circulation), and PSAM density (PSAM divided by prostate weight without seminal vesicles) to predict TTV at RP. PSAM was calculated by multiplying plasma volume (estimated body surface area [0.007184 x weight (kg)^0.425 x height (cm)^0.725] x 1.67) by PSA. Performance of the above measures in different BMI categories was assessed. The Kruskal-Wallis test was used to compare the means, and Spearman's rank correlation coefficient was used to assess the correlations. The 469 men were normal weight (n = 129), overweight (n = 253), and obese (n = 87). Mean patient age was 57.4 years, and mean PSA was 4.53 ng/mL. The increase of prostate weight with body mass index (BMI) was reflected in PSAM (both P < .001) but not in the other measures. BMI did not correlate with TTV or PSA. Among PSA, PSAD, PSAM, and PSAMD, PSAMD had the highest correlation with TTV (r = 0.336; P < .001). Prostate weight had a stronger (negative) association with PSAMD (r = -0.394; P < .001) than TTV. PSAMD is the biochemical measure with the best correlation with TTV at RP. Unlike the other measures, it is not affected by BMI-related hemodilution. Thresholds should be established to use this more objective measure clinically in surveillance algorithms and in planning radical prostatectomy.
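
    The PSAM and PSAMD definitions quoted above amount to a short calculation: estimate body surface area, convert it to plasma volume, multiply by serum PSA, and normalize by prostate weight. Below is a worked sketch with entirely hypothetical patient values, assuming the Du Bois-style weight/height constants quoted in the abstract (height in cm):

      # Worked PSAM/PSAMD example following the record's definitions.
      # Patient values are hypothetical, not study data.
      def psamd(psa_ng_ml, weight_kg, height_cm, prostate_g):
          bsa = 0.007184 * weight_kg**0.425 * height_cm**0.725  # body surface, m^2
          plasma_l = bsa * 1.67                                 # plasma volume, L
          psam_ug = psa_ng_ml * plasma_l                        # 1 ng/mL == 1 ug/L
          return psam_ug / prostate_g                           # ug PSA per g prostate

      # e.g. PSA 4.5 ng/mL, 85 kg, 178 cm, 45 g prostate -> ~0.34 ug/g
      print(f"{psamd(4.5, 85, 178, 45):.3f}")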

  15. Characterization of Homopolymer and Polymer Blend Films by Phase Sensitive Acoustic Microscopy

    NASA Astrophysics Data System (ADS)

    Ngwa, Wilfred; Wannemacher, Reinhold; Grill, Wolfgang

    2003-03-01

    We have used phase-sensitive acoustic microscopy (PSAM) to study homopolymer thin films of polystyrene (PS) and poly(methyl methacrylate) (PMMA), as well as PS/PMMA blend films. We show from our results that PSAM can be used as a complementary and highly valuable technique for elucidating the three-dimensional (3D) morphology and micromechanical properties of thin films. Three-dimensional image acquisition with vector contrast provides the basis for complex V(z) analysis (per image pixel), 3D image processing, height profiling, and subsurface image analysis of the polymer films. Results show good agreement with previous studies. In addition, important new information on the three-dimensional structure and properties of polymer films is obtained. Homopolymer film structure analysis reveals (pseudo-)dewetting by retraction of droplets, resulting in a morphology that can serve as a starting point for the analysis of polymer blend thin films. The outcomes of confocal laser scanning microscopy studies performed on the same samples are correlated with the obtained results. Advantages and limitations of PSAM are discussed.

  16. NESSUS (Numerical Evaluation of Stochastic Structures Under Stress)/EXPERT: Bridging the gap between artificial intelligence and FORTRAN

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.; Palmer, Karol K.

    1988-01-01

    The development of a probabilistic structural analysis methodology (PSAM) is described. In the near-term, the methodology will be applied to designing critical components of the next generation space shuttle main engine. In the long-term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities regarding the probabilistic analysis make the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code are being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge that this entails is, in general, heuristically-based, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and how integration of the conventional and AI technologies was achieved.

  17. High-Resolution Structure of a Self-Assembly-Competent Form of a Hydrophobic Peptide Captured in a Soluble β-Sheet Scaffold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makabe, Koki; Biancalana, Matthew; Yan, Shude

    2010-02-08

    β-Rich self-assembly is a major structural class of polypeptides, but still little is known about its atomic structures and biophysical properties. Major impediments for structural and biophysical studies of peptide self-assemblies include their insolubility and heterogeneous composition. We have developed a model system, termed peptide self-assembly mimic (PSAM), based on the single-layer β-sheet of Borrelia outer surface protein A. PSAM allows for the capture of a defined number of self-assembly-like peptide repeats within a water-soluble protein, making structural and energetic studies possible. In this work, we extend our PSAM approach to a highly hydrophobic peptide sequence. We show that a penta-Ile peptide (Ile5), which is insoluble and forms β-rich self-assemblies in aqueous solution, can be captured within the PSAM scaffold in a form capable of self-assembly. The 1.1-Å crystal structure revealed that the Ile5 stretch forms a highly regular β-strand within this flat β-sheet. Self-assembly models built with multiple copies of the crystal structure of the Ile5 peptide segment showed no steric conflict, indicating that this conformation represents an assembly-competent form. The PSAM retained high conformational stability, suggesting that the flat β-strand of the Ile5 stretch primed for self-assembly is a low-energy conformation, rationalizing its high propensity for self-assembly. The ability of the PSAM to 'solubilize' an otherwise insoluble peptide stretch suggests the potential of the PSAM approach for the characterization of self-assembling peptides.

  18. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 3: Literature surveys and technical reports

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.

  19. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain, and verify their models, and also to automatically convert the mathematical models into FORTRAN code for conventional computation. A demonstration program, Propulsion System Automatic Modeling (PSAM), was designed for the Space Shuttle Main Engine simulation mathematical model. PSAM provides a very friendly and well-organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  20. Comparative Cost-Effectiveness Analysis of Three Different Automated Medication Systems Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2018-02-01

    Automated medication systems have been found to reduce errors in the medication process, but little is known about the cost-effectiveness of such systems. The objective of this study was to perform a model-based indirect cost-effectiveness comparison of three different, real-world automated medication systems compared with current standard practice. The considered automated medication systems were a patient-specific automated medication system (psAMS), a non-patient-specific automated medication system (npsAMS), and a complex automated medication system (cAMS). The economic evaluation used original effect and cost data from prospective, controlled, before-and-after studies of medication systems implemented at a Danish hematological ward and an acute medical unit. Effectiveness was described as the proportion of clinical and procedural error opportunities that were associated with one or more errors. An error was defined as a deviation from the electronic prescription, from standard hospital policy, or from written procedures. The cost assessment was based on 6-month standardization of observed cost data. The model-based comparative cost-effectiveness analyses were conducted with system-specific assumptions about effect size and costs, in scenarios with consumption of 15,000, 30,000, and 45,000 doses per 6-month period. With 30,000 doses, the cost-effectiveness model showed that the cost-effectiveness ratio, expressed as the cost per avoided clinical error, was €24 for the psAMS, €26 for the npsAMS, and €386 for the cAMS. Comparison of the cost-effectiveness of the three systems in relation to different valuations of an avoided error showed that the psAMS was the most cost-effective system regardless of error type or valuation. The model-based indirect comparison against conventional practice showed that psAMS and npsAMS were more cost-effective than the cAMS alternative, and that psAMS was more cost-effective than npsAMS.
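
    The cost-effectiveness ratio reported above is, in essence, incremental cost divided by the number of errors avoided relative to standard practice. A minimal sketch of that arithmetic follows, with hypothetical inputs (none of these numbers come from the study):

      # Cost per avoided error: incremental cost / errors avoided vs. baseline.
      # All inputs below are hypothetical.
      def cost_per_avoided_error(system_cost, baseline_cost,
                                 system_error_rate, baseline_error_rate,
                                 opportunities):
          extra_cost = system_cost - baseline_cost
          errors_avoided = (baseline_error_rate - system_error_rate) * opportunities
          return extra_cost / errors_avoided

      # e.g. 30,000 doses, error rate 6% -> 2%, EUR 30,000 extra per 6 months
      print(cost_per_avoided_error(80_000, 50_000, 0.02, 0.06, 30_000))  # -> 25.0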

  21. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  22. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 2: Literature surveys of critical Space Shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Rajagopal, K. R.

    1992-01-01

    The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.

  23. Examination of the Cross-cultural and Cross-language Equivalence of the Parenting Self-Agency Measure.

    ERIC Educational Resources Information Center

    Dumka, Larry E.; And Others

    1996-01-01

    Discusses initial validation and evaluation of the cross-cultural and cross-language equivalence of the Parenting Self-Agency Measure (PSAM) with English-speaking middle-income Anglo mothers (n=90) and Spanish-speaking low-income Mexican immigrant mothers (n=94). Hypothesized relationship of PSAM with measures of parents' coping strategies and…

  24. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering, and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  25. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  26. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  27. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle Main Engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive relative to the finite element approach.

  28. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.

  29. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.

  30. NASA Johnson Space Center's Planetary Sample Analysis and Mission Science (PSAMS) Laboratory: A National Facility for Planetary Research

    NASA Technical Reports Server (NTRS)

    Draper, D. S.

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, enable multidisciplinary planetary research that is possible at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.

  31. Ultrafast proton shuttling in Psammocora cyan fluorescent protein.

    PubMed

    Kennis, John T M; van Stokkum, Ivo H M; Peterson, Dayna S; Pandit, Anjali; Wachter, Rebekka M

    2013-09-26

    Cyan, green, yellow, and red fluorescent proteins (FPs) homologous to green fluorescent protein (GFP) are used extensively as model systems to study fundamental processes in photobiology, such as the capture of light energy by protein-embedded chromophores, color tuning by the protein matrix, energy conversion by Förster resonance energy transfer (FRET), and excited-state proton transfer (ESPT) reactions. Recently, a novel cyan fluorescent protein (CFP) termed psamFP488 was isolated from the genus Psammocora of reef building corals. Within the cyan color class, psamFP488 is unusual because it exhibits a significantly extended Stokes shift. Here, we applied ultrafast transient absorption and pump-dump-probe spectroscopy to investigate the mechanistic basis of psamFP488 fluorescence, complemented with fluorescence quantum yield and dynamic light scattering measurements. Transient absorption spectroscopy indicated that, upon excitation at 410 nm, the stimulated cyan emission rises in 170 fs. With pump-dump-probe spectroscopy, we observe a very short-lived (110 fs) ground-state intermediate that we assign to the deprotonated, anionic chromophore. In addition, a minor fraction (14%) decays with 3.5 ps to the ground state. Structural analysis of homologous proteins indicates that Glu-167 is likely positioned in sufficiently close vicinity to the chromophore to act as a proton acceptor. Our findings support a model where unusually fast ESPT from the neutral chromophore to Glu-167 with a time constant of 170 fs and resulting emission from the anionic chromophore forms the basis of the large psamFP488 Stokes shift. When dumped to the ground state, the proton on neutral Glu is very rapidly shuttled back to the anionic chromophore in 110 fs. Proton shuttling in excited and ground states is a factor of 20-4000 faster than in GFP, which probably results from a favorable hydrogen-bonding geometry between the chromophore phenolic oxygen and the glutamate acceptor, possibly involving a short hydrogen bond. At any time in the reaction, the proton is localized on either the chromophore or Glu-167, which implies that most likely no low-barrier hydrogen bond exists between these molecular groups. This work supports the notion that proton transfer in biological systems, be it in an electronic excited or ground state, can be an intrinsically fast process that occurs on a 100 fs time scale. PsamFP488 represents an attractive model system that exhibits an ultrafast proton-transfer regime in discrete steps. It constitutes a valuable model system in addition to wild-type GFP, where proton transfer is relatively slow, and the S65T/H148D GFP mutant, where the effects of low-barrier hydrogen bonds dominate.

  32. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models was implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code, including fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  33. Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on iterative perturbation that uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store, and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably, but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, and having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
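
    The iterative perturbation idea described above can be sketched in a few lines: factorize the unperturbed stiffness K0 once, then solve the perturbed system (K0 + dK) u = f by repeated preconditioned corrections, never factorizing the perturbed matrix. The toy example below (dense matrices, made-up data) only illustrates that scheme; it is not the record's PFEM implementation.

      # Iterative perturbation sketch: reuse the factorization of K0 as the
      # preconditioner for the perturbed system (K0 + dK) u = f.
      import numpy as np
      from scipy.linalg import cho_factor, cho_solve

      rng = np.random.default_rng(1)
      n = 50
      A = rng.standard_normal((n, n))
      K0 = A @ A.T + n * np.eye(n)        # SPD "unperturbed" stiffness
      dK = 0.05 * (A + A.T)               # small symmetric perturbation
      f = rng.standard_normal(n)

      factor = cho_factor(K0)             # factorize once, reuse below
      u = cho_solve(factor, f)            # unperturbed solution as start
      for _ in range(50):
          r = f - (K0 @ u + dK @ u)       # residual of the perturbed system
          if np.linalg.norm(r) < 1e-10 * np.linalg.norm(f):
              break
          u += cho_solve(factor, r)       # preconditioned correction

      print(np.linalg.norm((K0 + dK) @ u - f))  # ~1e-10 or below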

  34. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This annual report summarizes the work completed during the third year of technical effort on the referenced contract. Principal developments continue to focus on the Probabilistic Finite Element Method (PFEM), which has been under development for three years. Essentially all of the linear capabilities within the PFEM code are in place. Major progress was achieved in the application and verification phase. An EXPERT module architecture was designed and partially implemented. EXPERT is a user interface module which incorporates an expert system shell for the implementation of a rule-based interface utilizing the experience and expertise of the user community. The Fast Probability Integration (FPI) algorithm continues to demonstrate outstanding performance characteristics for the integration of probability density functions for multiple variables. Additionally, an enhanced Monte Carlo simulation algorithm was developed and demonstrated for a variety of numerical strategies.

  35. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the simulation of composite loads (dynamic, acoustic, high-pressure, high-rotational-speed, etc.) with statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods, with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described, to illustrate the effectiveness of the coupled model approach.

  36. Molecular Mechanism of Thioflavin-T Binding to the Surface of β-Rich Peptide Self-Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biancalana, Matthew; Makabe, Koki; Koide, Akiko

    A number of small organic molecules have been developed that bind to amyloid fibrils, a subset of which also inhibit fibrillization. Among these, the benzothiazole dye Thioflavin-T (ThT) has been used for decades in the diagnosis of protein-misfolding diseases and in kinetic studies of self-assembly (fibrillization). Despite its importance, efforts to characterize the ThT-binding mechanism at the atomic level have been hampered by the inherent insolubility and heterogeneity of peptide self-assemblies. To overcome these challenges, we have developed a minimalist approach to designing a ThT-binding site in a 'peptide self-assembly mimic' (PSAM) scaffold. PSAMs are engineered water-soluble proteins that mimic a segment of β-rich peptide self-assembly, and they are amenable to standard biophysical techniques and systematic mutagenesis. The PSAM β-sheet contains rows of repetitive amino acid patterns running perpendicular to the strands (cross-strand ladders) that represent a ubiquitous structural feature of fibril-like surfaces. We successfully designed a ThT-binding site that recapitulates the hallmarks of ThT-fibril interactions by constructing a cross-strand ladder consisting of contiguous tyrosines. The X-ray crystal structures suggest that ThT interacts with the β-sheet by docking onto surfaces formed by a single tyrosine ladder, rather than in the space between adjacent ladders. Systematic mutagenesis further demonstrated that tyrosine surfaces across four or more β-strands formed the minimal binding site for ThT. Our work thus provides structural insights into how this widely used dye recognizes a prominent subset of peptide self-assemblies, and proposes a strategy to elucidate the mechanisms of fibril-ligand interactions.

  37. Electrochemical Impedance Spectroscopy for Real-Time Detection of Lipid Membrane Damage Based on a Porous Self-Assembly Monolayer Support.

    PubMed

    Zhang, Meng; Zhai, Qingyu; Wan, Liping; Chen, Li; Peng, Yu; Deng, Chunyan; Xiang, Juan; Yan, Jiawei

    2018-06-19

    Layer-by-layer dissolution and permeable pore formation are two typical membrane damage pathways, which induce membrane function disorder and result in serious diseases such as Alzheimer's disease, Keshan disease, and sickle-cell disease. To effectively distinguish and sensitively monitor these two typical membrane damage pathways, a facile electrochemical impedance strategy was developed on a porous self-assembly monolayer (pSAM) supported bilayer lipid membrane (BLM). The pSAM was prepared by selective electrochemical reductive desorption of the mercaptopropionic acid in a mixed mercaptopropionic acid/11-mercaptoundecanoic acid self-assembled monolayer, which created plenty of nanopores tens of nanometers in diameter and several nanometers in height (defined as inner-pores). The ultralow aspect ratio of the inner-pores was advantageous to the mass transfer of the electrochemical probe [Fe(CN)6]3-/4-, simplifying the equivalent electric circuit for electrochemical impedance spectroscopy analysis at the electrode/membrane interface. [Fe(CN)6]3-/4- transferring from the bulk solution into the inner-pores induces significant changes in the interfacial impedance properties, improving the detection sensitivity. Based on these, the different membrane damage pathways were effectively distinguished and sensitively monitored via the normalized resistance-capacitance changes of inner-pore-related parameters, including the electrolyte resistance within the pore length (Rpore), the metal/inner-pore interfacial capacitance (Cpore), and the charge-transfer resistance (Rct-in) at the metal/inner-pore interface.
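
    Equivalent-circuit analysis of the kind described above reduces to computing the complex impedance of a resistor-capacitor network across frequency. The sketch below evaluates a generic Randles-type circuit (series solution resistance, with a charge-transfer resistance in parallel with an interfacial capacitance); it is a textbook illustration with hypothetical element values, not the record's pSAM/BLM circuit model.

      # Generic Randles-type impedance sketch: Z(w) = Rs + Rct / (1 + j*w*Rct*C).
      # Element values are hypothetical.
      import numpy as np

      def z_randles(freq_hz, Rs=100.0, Rct=5e4, C=1e-6):
          w = 2 * np.pi * freq_hz
          return Rs + Rct / (1 + 1j * w * Rct * C)   # Rct in parallel with C

      for f in np.logspace(-1, 5, 7):
          z = z_randles(f)
          print(f"{f:9.1f} Hz  |Z| = {abs(z):9.1f} ohm  "
                f"phase = {np.degrees(np.angle(z)):6.1f} deg")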

  38. Five-Axis Goniometric Stage

    DTIC Science & Technology

    2011-02-01

    PS4, Anaheim Automation PSAM24V2.7A) and a custom-built variable 24 VDC 5 A power supply (PS5). The input voltage to the variable power supply is... provide power for the four brushless DC motor controllers (Anaheim Automation, MDC050-050051). PS4 supplies power to the optional position sensors...

  39. Partnering With NASA JSC for Community Research Needs; Collaborative and Student Opportunities via Jacobs and PSAMS Initiative

    NASA Technical Reports Server (NTRS)

    Danielson, Lisa; Draper, David

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists; over the past five years, the 16 full-time contract research and technical staff members in our division have hosted a total of 223 visiting researchers, representing 35 institutions. In order to continue to provide this level of support to the planetary sciences community, and also expand our services and collaboration within the broader scientific community, we intend to submit a proposal to NASA specifically for facilities support and establishment of our laboratories as a collective, PSAMS, Planetary Sample Analyses and Mission Science. This initiative should result in substantial cost savings to PIs with NASA funding who wish to use our facilities. Another cost saving could be realized by aggregating visiting user experiments and analyses through COMPRES, which would be of particular interest to researchers in earth and material sciences. JSC is a recognized NASA center of excellence for curation, and in the future will allow PIs and mission teams easy access to samples in Curation facilities that they have been approved to study. Our curation expertise could also be used for a collection of experimental run products that could be shared and distributed to COMPRES community members. These experimental run products could range from 1 bar controlled-atmosphere furnace, piston cylinder, multi-anvil, and CETUS (see companion abstract) products to shocked products. Coordinated analysis of samples is one of the major strengths of our division, where a single sample can be prepared with minimal destruction for a variety of chemical and structural analyses, from the macro to the nano scale.

  40. Partnering with NASA JSC for Community Research Needs; Collaborative and Student Opportunities via Jacobs and PSAMS Initiative

    NASA Astrophysics Data System (ADS)

    Danielson, L. R.; Draper, D. S.

    2016-12-01

    NASA Johnson Space Center's (JSC) Astromaterials Research and Exploration Science Division houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists; over the past five years, the 16 full-time contract research and technical staff members in our division have hosted a total of 223 visiting researchers, representing 35 institutions. We intend to submit a proposal to NASA specifically for facilities support and establishment of our laboratories as a collective, PSAMS, Planetary Sample Analyses and Mission Science, which should result in substantial cost savings to PIs who wish to use our facilities. JSC is a recognized NASA center of excellence for curation, and in the future will allow PIs easy access to samples in Curation facilities that they have been approved to study. Our curation expertise could also be used for a collection of experimental run products and standards that could be shared and distributed to community members, products that could range from 1 bar controlled-atmosphere furnace, piston cylinder, and multi-anvil products to shocked products. Coordinated analysis of samples is one of the major strengths of our division, where a single sample can be prepared with minimal destruction for a variety of chemical and structural analyses, from the macro to the nano scale. A CT scanner will be delivered in August 2016 and installed in the same building as all the other division experimental and analytical facilities, allowing users to construct a 3-dimensional model of their run product and/or starting material before any destruction of their sample for follow-up analyses. The 3D printer may also be utilized to construct containers for diamond anvil cell experiments. Our staff scientists will work with PIs to maximize science return and serve the needs of the community. We welcome student visitors, and a graduate semester internship is available through Jacobs.

  41. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that can represent the decay phenomena along and across the flow, with the capability to introduce a phase delay. The analytical results from two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), are compared with each other and with experimentally observed strain gage data. The NESSUS code, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis, and a fatigue code is used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, the convection velocity coefficient, the stress concentration factor, structural damping, and the thickness of the inner and outer vanes. The need for an appropriate correlation model, in addition to the magnitude of the PSD, is emphasized: the study demonstrates that correlation characteristics, even under random pressure loads, are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the redesign is less than the endurance limit of the material, the damage due to high-cycle fatigue is negligible.
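
    Purely as a hedged illustration of the workflow this abstract describes - sample uncertain inputs, push them through a response model, and read off a probability of exceedance - the sketch below runs a toy Monte Carlo study. The response function, the distributions, and the endurance limit are invented stand-ins, not the CLS/NESSUS formulation.

    ```python
    # Toy Monte Carlo probability-of-exceedance study. All distributions and
    # the response model are hypothetical stand-ins for illustration only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    damping   = rng.lognormal(mean=np.log(0.02), sigma=0.2, size=n)  # structural damping
    thickness = rng.normal(loc=2.0, scale=0.05, size=n)              # vane thickness
    kt        = rng.normal(loc=1.8, scale=0.1, size=n)               # stress concentration factor
    psd_level = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)   # pressure PSD magnitude

    # Invented response: alternating stress grows with PSD level and Kt,
    # falls with damping and thickness (a stand-in, not a physical model).
    alt_stress = 40.0 * kt * np.sqrt(psd_level / damping) / thickness**2

    endurance_limit = 180.0  # hypothetical endurance limit, same units as alt_stress
    p_exceed = np.mean(alt_stress > endurance_limit)
    print(f"P(alternating stress > endurance limit) ~= {p_exceed:.4f}")
    ```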

  2. Metal-ion retention properties of water-soluble amphiphilic block copolymer in double emulsion systems (w/o/w) stabilized by non-ionic surfactants.

    PubMed

    Palencia, Manuel; Rivas, Bernabé L

    2011-11-15

    The metal-ion retention properties of water-soluble amphiphilic polymers in the presence of a double emulsion were studied by diafiltration. Double emulsion systems, water-in-oil-in-water, with a pH gradient between the external and internal aqueous phases were prepared. A poly(styrene-co-maleic anhydride) (PSAM) solution at pH 6.0 was added to the external aqueous phase of the double emulsion, and a divalent metal-ion stream was continuously added by application of pressure. The metal ions used were Cu(2+) and Cd(2+), at the same pH as the polymer solution. According to our results, metal-ion retention is mainly the result of polymer-metal interaction. The interaction between PSAM and reverse emulsion globules is strongly controlled by the amount of metal ions added to the external aqueous phase. In addition, as the metal-ion concentration was increased, the polymer's retention capacity decreased and flocculation was promoted. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Revisiting the Plastid Phylogenomics of Pinaceae with Two Complete Plastomes of Pseudolarix and Tsuga

    PubMed Central

    Sudianto, Edi; Wu, Chung-Shien; Lin, Ching-Ping; Chaw, Shu-Miaw

    2016-01-01

    The phylogeny of the ten Pinaceous genera has long been contentious. Plastid genomes (plastomes) provide an opportunity to resolve this problem because they contain rich evolutionary information. To comprehend the plastid phylogenomics of all ten Pinaceous genera, we sequenced the plastomes of two previously unavailable genera, Pseudolarix amabilis (122,234 bp) and Tsuga chinensis (120,859 bp). Both plastomes share a similar gene repertoire and order. Here, for the first time, we report a unique insertion of tandem repeats in accD of T. chinensis. From the 65 plastid protein-coding genes common to all Pinaceous genera, we re-examined the phylogenetic relationship among the genera. Our two phylogenetic trees are congruent, with an identical topology: the five genera of the Abietoideae subfamily constitute a monophyletic clade separate from the other three subfamilies, Pinoideae, Piceoideae, and Laricoideae. The five genera of Abietoideae were grouped into two sister clades consisting of (1) Cedrus alone and (2) two sister subclades, Pseudolarix-Tsuga and Abies-Keteleeria, with the former uniquely losing the gene psaM and the latter specifically excluding the 3′ psbA from the residual inverted repeat. PMID:27352945

  4. Revisiting the Plastid Phylogenomics of Pinaceae with Two Complete Plastomes of Pseudolarix and Tsuga.

    PubMed

    Sudianto, Edi; Wu, Chung-Shien; Lin, Ching-Ping; Chaw, Shu-Miaw

    2016-06-27

    The phylogeny of the ten Pinaceous genera has long been contentious. Plastid genomes (plastomes) provide an opportunity to resolve this problem because they contain rich evolutionary information. To comprehend the plastid phylogenomics of all ten Pinaceous genera, we sequenced the plastomes of two previously unavailable genera, Pseudolarix amabilis (122,234 bp) and Tsuga chinensis (120,859 bp). Both plastomes share a similar gene repertoire and order. Here, for the first time, we report a unique insertion of tandem repeats in accD of T. chinensis. From the 65 plastid protein-coding genes common to all Pinaceous genera, we re-examined the phylogenetic relationship among the genera. Our two phylogenetic trees are congruent, with an identical topology: the five genera of the Abietoideae subfamily constitute a monophyletic clade separate from the other three subfamilies, Pinoideae, Piceoideae, and Laricoideae. The five genera of Abietoideae were grouped into two sister clades consisting of (1) Cedrus alone and (2) two sister subclades, Pseudolarix-Tsuga and Abies-Keteleeria, with the former uniquely losing the gene psaM and the latter specifically excluding the 3′ psbA from the residual inverted repeat. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  5. Engineering Risk Assessment of Space Thruster Challenge Problem

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
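
    For readers unfamiliar with the simulation side of this approach, the sketch below shows the bare mechanics of a Monte Carlo loss-of-mission estimate and the convergence check described above. The three-thruster mission, burn schedule, failure rate, and spare count are all invented for illustration; the actual ERA models were built in GoldSim and ART.

    ```python
    # Hedged sketch of a Monte Carlo loss-of-mission (LOM) estimate with a
    # convergence check. The mission model below is fabricated for illustration.
    import numpy as np

    rng = np.random.default_rng(7)

    def mission_fails(rng, n_thrusters=3, burn_hours=(500.0, 800.0),
                      failure_rate=1e-4, spares=1):
        """One mission trial: thrusters fail exponentially (memoryless) during
        each burn; the mission is lost once failures exceed available spares."""
        failures = 0
        for burn in burn_hours:
            for _ in range(n_thrusters):
                if rng.exponential(1.0 / failure_rate) < burn:
                    failures += 1
        return failures > spares

    # Convergence study: the LOM estimate stabilizes as trials increase.
    for n_trials in (1_000, 10_000, 100_000):
        lom = sum(mission_fails(rng) for _ in range(n_trials)) / n_trials
        print(f"{n_trials:>7} trials: P(LOM) ~= {lom:.4f}")
    ```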

  6. Conserved Gene Order and Expanded Inverted Repeats Characterize Plastid Genomes of Thalassiosirales

    PubMed Central

    Ashworth, Matt P.; Baeshen, Nabih A.; Baeshen, Mohammad N.; Bahieldin, Ahmed; Theriot, Edward C.; Jansen, Robert K.

    2014-01-01

    Diatoms are mostly photosynthetic eukaryotes within the heterokont lineage. Variable plastid genome sizes and extensive genome rearrangements have been observed across the diatom phylogeny, but little is known about plastid genome evolution within order- or family-level clades. The Thalassiosirales is one of the more comprehensively studied orders in terms of both genetics and morphology. Seven complete diatom plastid genomes are reported here, including four Thalassiosirales (Thalassiosira weissflogii, Roundia cardiophora, Cyclotella sp. WC03_2, and Cyclotella sp. L04_2) and three non-Thalassiosirales species (Chaetoceros simplex, Cerataulina daemon, and Rhizosolenia imbricata). The sizes of the seven genomes vary from 116,459 to 129,498 bp; the genomes are compact and lack introns. The larger size of the Thalassiosirales plastid genomes compared to those of other diatoms is due primarily to expansion of the inverted repeat. Gene content within Thalassiosirales is more conserved than in other diatom lineages. Gene order within Thalassiosirales is also highly conserved, except for the extensive genome rearrangement in Thalassiosira oceanica. Cyclotella nana, Thalassiosira weissflogii, and Roundia cardiophora share an identical gene order, inferred to be ancestral for the Thalassiosirales, which differs from that of the other two Cyclotella species by a single inversion. The genes ilvB and ilvH are missing from all of the diatom plastid genomes reported here except that of Cerataulina daemon, suggesting an independent gain of these genes in this species. The acpP1 gene is missing in all Thalassiosirales, suggesting that its loss may be a synapomorphy for the order and that the gene may have been functionally transferred to the nucleus. Three genes involved in photosynthesis, psaE, psaI, and psaM, are missing in Rhizosolenia imbricata, which represents the first documented instance of the loss of photosynthetic genes in diatom plastid genomes. PMID:25233465

  7. Chloroplast DNA sequence of the green alga Oedogonium cardiacum (Chlorophyceae): Unique genome architecture, derived characters shared with the Chaetophorales and novel genes acquired through horizontal transfer

    PubMed Central

    Brouard, Jean-Simon; Otis, Christian; Lemieux, Claude; Turmel, Monique

    2008-01-01

    Background: To gain insight into the branching order of the five main lineages currently recognized in the green algal class Chlorophyceae, and to expand our understanding of chloroplast genome evolution, we have undertaken the sequencing of chloroplast DNA (cpDNA) from representative taxa. The complete cpDNA sequences previously reported for Chlamydomonas (Chlamydomonadales), Scenedesmus (Sphaeropleales), and Stigeoclonium (Chaetophorales) revealed tremendous variability in their architecture, the retention of only a few ancestral gene clusters, and derived clusters shared by Chlamydomonas and Scenedesmus. Unexpectedly, our recent phylogenies inferred from these cpDNAs and the partial sequences of three other chlorophycean cpDNAs disclosed two major clades, one uniting the Chlamydomonadales and Sphaeropleales (CS clade) and the other uniting the Oedogoniales, Chaetophorales, and Chaetopeltidales (OCC clade). Although molecular signatures provided strong support for this dichotomy and for the branching of the Oedogoniales as the earliest-diverging lineage of the OCC clade, more data are required to validate these phylogenies. We describe here the complete cpDNA sequence of Oedogonium cardiacum (Oedogoniales). Results: Like its three chlorophycean homologues, the 196,547-bp Oedogonium chloroplast genome displays a distinctive architecture. This genome is one of the most compact among photosynthetic chlorophytes. It has an atypical quadripartite structure, is intron-rich (17 group I and 4 group II introns), and displays 99 different conserved genes and four long open reading frames (ORFs), three of which are clustered in the spacious inverted repeat of 35,493 bp. Intriguingly, two of these ORFs (int and dpoB) revealed high similarities to genes not usually found in cpDNA. At the gene content and gene order levels, the Oedogonium genome most closely resembles its Stigeoclonium counterpart. Characters shared by these chlorophyceans but missing in members of the CS clade include the retention of psaM, rpl32, and trnL(caa), the loss of petA, the disruption of three ancestral clusters, and the presence of five derived gene clusters. Conclusion: The Oedogonium chloroplast genome disclosed additional characters that bolster the evidence for a close alliance between the Oedogoniales and Chaetophorales. Our unprecedented finding of int and dpoB in this cpDNA provides a clear example that novel genes were acquired by the chloroplast genome through horizontal transfer, possibly from a mitochondrial genome donor. PMID:18558012

  8. Community Extreme Tonnage User Service (CETUS): A 5000 Ton Open Research Facility in the United States

    NASA Technical Reports Server (NTRS)

    Danielson, L.; Righter, K.; McCubbin, F.

    2016-01-01

    Large-sample-volume 5000 ton multi-anvil presses have contributed to the exploration of deep Earth and planetary interiors and the synthesis of ultra-hard and other novel materials, and they serve as a sample complement to pressure and temperature regimes already attainable by diamond anvil cell experiments. However, no such facility exists on the North American continent. We propose the establishment of an open user facility for COMPRES members and the entire research community, with the unique capability of a 5000 ton (or larger) press, supported by a host of extant co-located experimental and analytical laboratories and research staff. We offer a wide range of complementary and/or preparatory experimental options: any required synthesis of materials or follow-up experiments can be carried out in controlled-atmosphere furnaces, piston cylinders, multi-anvil presses, or the experimental impact apparatus. Additionally, our division houses two machine shops that would facilitate any modification or custom work necessary for the development of CETUS, one for general fabrication and one located within our experimental facilities. We also have a general sample preparation laboratory, dedicated to experimental samples, that allows users to quickly and easily prepare samples for e-beam analyses and more. A further service we can offer to COMPRES community members in general, and to CETUS visiting users specifically, is a multitude of analytical instruments literally steps away from the experimental laboratories. This year we will be pursuing site funding of our laboratories through NASA's Planetary Science Directorate, which should result in substantial cost savings to all visiting users and would support our mission of interagency cooperation for the enhancement of science for all (see companion PSAMS abstract). The PI, as an employee of Jacobs Technology, is in a unique position to draw funding from multiple sources, including industry and commerce. We submitted a Planetary Major Equipment proposal to the NASA Emerging Worlds solicitation for the full cost of a press, with competitive bids from Sumitomo, Rockland Research, and Voggenreiter. Additional funding is currently being sought from industry sources through the Strategic Partnerships Office at NASA JSC, the External Pursuits Program Office on the JETS contract, and Jacobs corporate in the United States. Internal funding is available for JETS contract personnel to travel to large-press locations worldwide to study setup and operations. We also anticipate fortuitous cost savings in the installation of the large press, because plans are already underway for major renovations to the entire experimental petrology suite within the next two years to accommodate our growing user base. Our focus as contract staff is on serving the scientific needs of our users and collaborators, and we are seeking community expert input on multiple aspects of this proposed facility, such as the press type and design, access management, immediate projects, and future innovation initiatives.

  9. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  10. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  11. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  12. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures, including both uncoupled and coupled methods, are described. In addition, a global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 163.5 Section 163.5 Food and... CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of analysis prescribed in “Official Methods...

  15. Computational Methods for Structural Mechanics and Dynamics, part 1

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)

    1989-01-01

    The structural analysis methods research has several goals. One goal is to develop analysis methods that are general. This goal of generality leads naturally to finite-element methods, but the research will also include other structural analysis methods. Another goal is that the methods be amenable to error analysis; that is, given a physical problem and a mathematical model of that problem, an analyst would like to know the probable error in predicting a given response quantity. The ultimate objective is to specify the error tolerances and to use automated logic to adjust the mathematical model or solution strategy to obtain that accuracy. A third goal is to develop structural analysis methods that can exploit parallel processing computers. The structural analysis methods research will focus initially on three types of problems: local/global nonlinear stress analysis, nonlinear transient dynamics, and tire modeling.

  16. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
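
    The paired comparison reported above is straightforward to reproduce in outline. The sketch below applies a paired t-test to two sets of hypothetical measurements of the same samples; the numbers are fabricated purely to show the mechanics.

    ```python
    # Minimal sketch of a paired method comparison: two assays measure the
    # same samples, and a paired t-test checks for a systematic difference.
    import numpy as np
    from scipy import stats

    densitometric = np.array([2.71, 2.84, 2.65, 2.90, 2.77, 2.81])  # hypothetical % w/w
    image_based   = np.array([2.69, 2.88, 2.63, 2.93, 2.75, 2.79])

    t_stat, p_value = stats.ttest_rel(densitometric, image_based)
    print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
    # A large p-value is consistent with the two methods agreeing, as reported.
    ```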

  17. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  18. Methods for Determining Particle Size Distributions from Nuclear Detonations.

    DTIC Science & Technology

    1987-03-01

    [Table-of-contents fragments from the scanned report; recoverable entries include: Summary of Sample Preparation Method; Set Parameters for PCS; Analysis by Vendors; Results From Brookhaven Analysis Using the Method of Cumulants; Results From Brookhaven Analysis of Samples R-3 and R-8 Using the Histogram Method; TEM Particle ...]

  19. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, the process can be complicated and omissions or miscalculations are likely. This situation has fostered research into automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; however, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressed in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  20. Validity and consistency assessment of accident analysis methods in the petroleum industry.

    PubMed

    Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza

    2017-11-17

    Accident analysis is the main aspect of accident investigation; it connects different causes in a procedural way. It is therefore important to use valid and reliable methods to investigate the different causal factors of accidents, especially the noteworthy ones. This study assessed the accuracy (sensitivity index [SI]) and consistency of the six most commonly used accident analysis methods in the petroleum industry. Two real case studies (a process safety accident and a personal accident) from the petroleum industry were analyzed by 10 assessors, and the accuracy and consistency of the methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods achieved the highest SI scores for personal and process safety accidents, respectively. The best average consistency for a single method (based on 10 independent assessors) was in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree could enhance the sensitivity and consistency of accident analysis.

  1. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  2. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  3. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...

  4. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, multiple GWASs with the same phenotype are often analyzed together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular for jointly analyzing datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods, and our method discovers more associations than meta-analysis methods in empirical datasets of four phenotypes. The R package is available at http://bioinformatics.ust.hk/Jlfdr.html. Contact: eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
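
    The Jlfdr procedure itself is distributed as an R package at the link above; as a point of reference, the sketch below shows the standard inverse-variance fixed-effect meta-analysis that methods like Jlfdr are compared against. The per-study effect sizes and standard errors are invented.

    ```python
    # Baseline inverse-variance fixed-effect meta-analysis of GWAS summary
    # statistics for one variant (the comparison method, not Jlfdr itself).
    import numpy as np
    from scipy.stats import norm

    betas = np.array([0.12, 0.08, 0.15])  # hypothetical effect sizes from three GWASs
    ses   = np.array([0.04, 0.05, 0.06])  # their standard errors

    weights = 1.0 / ses**2                # inverse-variance weights
    beta_meta = np.sum(weights * betas) / np.sum(weights)
    se_meta = np.sqrt(1.0 / np.sum(weights))
    z = beta_meta / se_meta
    p = 2.0 * norm.sf(abs(z))             # two-sided p-value
    print(f"combined beta = {beta_meta:.4f} (SE {se_meta:.4f}), z = {z:.2f}, p = {p:.2e}")
    ```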

  5. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  6. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    PubMed

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

    Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. The methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results, owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than that of the other methods, although only 40 μL of organic solvent is used for one sample analysis, compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. The methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. The results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open-loop model analysis technique that considers the effects of modal residue magnitudes on vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 consists of the development and application of these two analysis methods.

  8. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Methods of analysis. 133.5 Section 133.5 Food and... CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture, milkfat, and phosphatase levels in cheeses will be determined by the following methods of analysis from...

  9. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural...

  10. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing...

  11. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  12. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  13. Comparability of river suspended-sediment sampling and laboratory analysis methods

    USGS Publications Warehouse

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods, which have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused the grab and TSS results to be biased substantially low, with the difference attributable to laboratory analysis methods slightly greater than that attributable to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods: grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream, and grab/TSS results agree more closely with the concentration of fines in SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to account for transport processes not captured by grab field sampling and TSS laboratory analysis methods.
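
    The closing point - that strong SSC-TSS correlations permit site-specific relations - reduces to a simple regression per site. The sketch below fits such a relation to fabricated concentration pairs; only the mechanics, not the data, reflect the study.

    ```python
    # Site-specific SSC-vs-TSS relation by least squares. The concentration
    # pairs are fabricated; in practice such relations are fit per site.
    import numpy as np

    tss = np.array([ 40.,  65.,  90., 120., 180., 240.])  # mg/L, grab/TSS method
    ssc = np.array([ 55.,  85., 130., 170., 260., 350.])  # mg/L, EWDI/SSC method

    slope, intercept = np.polyfit(tss, ssc, 1)            # linear fit SSC ~ TSS
    r = np.corrcoef(tss, ssc)[0, 1]
    print(f"SSC ~= {slope:.2f} * TSS + {intercept:.1f}  (r = {r:.3f})")
    # A slope above 1 would reflect the low bias of TSS relative to SSC.
    ```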

  14. Rapid Radiochemical Method for Radium-226 in Building ...

    EPA Pesticide Factsheets

    Technical fact sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: radium-226 in building materials. Method selected for: SAM lists this method for qualitative analysis of radium-226 in concrete or brick building materials. Summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  15. Rapid Radiochemical Method for Americium-241 in Building ...

    EPA Pesticide Factsheets

    Technical fact sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: americium-241 in building materials. Method selected for: SAM lists this method for qualitative analysis of americium-241 in concrete or brick building materials. Summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  16. Adjusting for multiple prognostic factors in the analysis of randomised trials

    PubMed Central

    2013-01-01

    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) what the best method of adjustment is in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation (1) to determine whether a stratified analysis is necessary after stratified randomisation, and (2) to compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model, depending on the outcome. Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios, so the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size. PMID:23898993
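
    As a toy version of the kind of simulation described in the Methods, the sketch below estimates the type I error of a covariate-adjusted logistic regression after stratified randomisation under a null treatment effect. The sample size, prognostic effects, and replicate count are invented, and the random-effects and Mantel-Haenszel arms of the comparison are omitted.

    ```python
    # Simulate stratified-randomised trials under a null treatment effect and
    # check the type I error of a covariate-adjusted logistic regression.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    def one_trial(n=200):
        strata = rng.integers(0, 2, size=(n, 2))   # two binary prognostic factors
        treat = np.empty(n, dtype=int)
        cell = strata[:, 0] * 2 + strata[:, 1]
        for c in range(4):                          # balance treatment within each stratum
            idx = np.flatnonzero(cell == c)
            treat[idx] = rng.permutation(np.arange(len(idx)) % 2)
        # outcome depends on the prognostic factors but NOT on treatment (null)
        logit = -0.5 + 0.8 * strata[:, 0] + 0.8 * strata[:, 1]
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
        X = sm.add_constant(np.column_stack([treat, strata]))
        p_treat = sm.Logit(y, X).fit(disp=0).pvalues[1]
        return p_treat < 0.05

    rejections = np.mean([one_trial() for _ in range(500)])
    print(f"adjusted analysis type I error ~= {rejections:.3f} (nominal 0.05)")
    ```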

  17. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention because it improves the precision of the analysis. Here, we review, summarize, and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies are discussed, including models for haplotype association studies, multiple linked polymorphisms, and multiple outcomes. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumed mode of inheritance, deviation from Hardy-Weinberg equilibrium, and gene-environment interactions are also presented. All available methods are illustrated with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.

  18. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of individuals. These methods are shown to be useful in the influence analysis of a real dataset from an experimental F2 mouse cross population. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Principal Component Relaxation Mode Analysis of an All-Atom Molecular Dynamics Simulation of Human Lysozyme

    NASA Astrophysics Data System (ADS)

    Nagai, Toshiki; Mitsutake, Ayori; Takano, Hiroshi

    2013-02-01

    A new relaxation mode analysis method, which is referred to as the principal component relaxation mode analysis method, has been proposed to handle a large number of degrees of freedom of protein systems. In this method, principal component analysis is carried out first and then relaxation mode analysis is applied to a small number of principal components with large fluctuations. To reduce the contribution of fast relaxation modes in these principal components efficiently, we have also proposed a relaxation mode analysis method using multiple evolution times. The principal component relaxation mode analysis method using two evolution times has been applied to an all-atom molecular dynamics simulation of human lysozyme in aqueous solution. Slow relaxation modes and corresponding relaxation times have been appropriately estimated, demonstrating that the method is applicable to protein systems.
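
    To make the two-stage idea concrete, the sketch below runs the PCA stage on a synthetic trajectory and then solves a lag-correlation eigenproblem on the leading components, which is the essence (though not the detail) of relaxation mode analysis with a single evolution time; the multiple-evolution-time refinement is not shown.

    ```python
    # PCA stage of a principal-component relaxation-mode-style analysis on a
    # synthetic trajectory (an AR(1) slow part plus fast noise). Illustrative
    # only; the published method and its parameters are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(0)
    n_frames, n_coords, lag = 5000, 30, 10

    slow = np.zeros((n_frames, n_coords))
    noise = rng.normal(size=(n_frames, n_coords))
    for t in range(1, n_frames):
        slow[t] = 0.99 * slow[t - 1] + noise[t]                # slowly relaxing part
    traj = slow + 0.5 * rng.normal(size=(n_frames, n_coords))  # fast fluctuations

    x = traj - traj.mean(axis=0)                         # center the coordinates
    evals, evecs = np.linalg.eigh((x.T @ x) / n_frames)  # PCA via covariance matrix
    pcs = x @ evecs[:, np.argsort(evals)[::-1][:5]]      # 5 largest-variance PCs

    # Relaxation-mode-like step: eigenproblem on lagged correlation matrices.
    c0 = (pcs[:-lag].T @ pcs[:-lag]) / (n_frames - lag)
    ct = (pcs[:-lag].T @ pcs[lag:]) / (n_frames - lag)
    rates = np.linalg.eigvals(np.linalg.solve(c0, ct))
    print("lag-10 eigenvalues (slow part should sit near 0.99**10 ~ 0.90):",
          np.round(np.sort(rates.real)[::-1], 3))
    ```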

  20. Rapid Radiochemical Method for Total Radiostrontium (Sr-90) ...

    EPA Pesticide Factsheets

    Technical fact sheet. Analysis purpose: qualitative analysis. Technique: beta counting. Method developed for: strontium-89 and strontium-90 in building materials. Method selected for: SAM lists this method for qualitative analysis of strontium-89 and strontium-90 in concrete or brick building materials. Summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  1. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…

  2. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  3. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
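
    A hedged sketch of what "frugal" means in practice: one-at-a-time scaled sensitivities by finite differences, costing four model runs here. The three-parameter model below is a stand-in for an expensive environmental simulator, and the parameter names are hypothetical.

    ```python
    # One-at-a-time dimensionless scaled sensitivities by finite differences.
    # The model and its parameters are invented stand-ins for illustration.
    import numpy as np

    def model(params):
        """Hypothetical cheap surrogate for an expensive simulator."""
        recharge, conductivity, porosity = params
        return recharge * np.log(conductivity) / porosity

    base = np.array([1.5, 50.0, 0.3])
    y0 = model(base)  # one base run, plus one run per parameter below

    for i, name in enumerate(["recharge", "conductivity", "porosity"]):
        p = base.copy()
        p[i] *= 1.01                           # 1% one-at-a-time perturbation
        sens = ((model(p) - y0) / y0) / 0.01   # approximates (p/y) * dy/dp
        print(f"{name:>12}: scaled sensitivity ~= {sens:+.3f}")
    ```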

  4. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

    During its lifetime, a ship may encounter collision or grounding and sustain permanent damage from these types of accidents. Crashworthiness assessment has been based on two main kinds of methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper, and numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate it. The results show that the simplified plastic analysis is in good agreement with the finite element simulation, which reveals that the method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.

  5. 77 FR 3230 - Codex Alimentarius Commission: Meeting of the Codex Committee on Methods of Analysis and Sampling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ... Alimentarius Commission: Meeting of the Codex Committee on Methods of Analysis and Sampling AGENCY: Office of... discussed at the 33rd Session of the Codex Committee on Methods of Analysis and Sampling (CCMAS) of the... the criteria appropriate to Codex Methods of Analysis and Sampling; serving as a coordinating body for...

  6. 76 FR 3603 - Codex Alimentarius Commission: Meeting of the Codex Committee on Methods of Analysis and Sampling

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-20

    ... Alimentarius Commission: Meeting of the Codex Committee on Methods of Analysis and Sampling AGENCY: Office of... discussed at the 32nd session of the Codex Committee on Methods of Analysis and Sampling (CCMAS) of the... appropriate to Codex Methods of Analysis and Sampling; serving as a coordinating body for Codex with other...

  7. A Review of Flow Analysis Methods for Determination of Radionuclides in Nuclear Wastes and Nuclear Reactor Coolants

    DOE PAGES

    Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.

    2018-02-13

    Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  8. A Review of Flow Analysis Methods for Determination of Radionuclides in Nuclear Wastes and Nuclear Reactor Coolants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.

    Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  9. A review of flow analysis methods for determination of radionuclides in nuclear wastes and nuclear reactor coolants.

    PubMed

    Trojanowicz, Marek; Kołacińska, Kamila; Grate, Jay W

    2018-06-01

    The safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Global/local methods research using the CSM testbed

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.

    1990-01-01

    Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  11. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique. We have proposed and evaluated a multiresolutional method, with an explicit ventilation model based on pyramid images, for ventilation analysis, and we have further extended the method to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for ...

  12. Waste Analysis Plan and Waste Characterization Survey, Barksdale AFB, Louisiana

    DTIC Science & Technology

    1991-03-01

    review to assess if analysis is needed, any analyses that are to be provided by generators, and methods to be used to meet specific waste analysis ... sampling method, sampling frequency, parameters of analysis, SW-846 test methods, Department of Transportation (DOT) shipping name and hazard class ... [remainder of snippet is report front-matter and appendix headings: Waste Analysis Plan Rationale; Sampling Method Rationale]

  13. Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single discipline analysis (aerodynamics only) to multidisciplinary analysis - in this case, static aero-structural analysis - and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the wing (single discipline analysis), the method, as implemented here, may not show significant reduction in the computational cost. Similar reductions were seen in the two-design-variable (DV) problem results but not in the 8-DV results given here.

  14. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structure element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in the computational cost and without significant modifications to the analysis tools.

  15. PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURE ON ΔB METHOD

    NASA Astrophysics Data System (ADS)

    Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao

    Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes it easier to understand stability. The ΔB method (simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of the example analysis.

  16. Glaucoma progression detection: agreement, sensitivity, and specificity of expert visual field evaluation, event analysis, and trend analysis.

    PubMed

    Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier

    2013-01-01

    To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
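
    As an illustration of the agreement statistics reported in this record, the following minimal Python sketch computes Cohen's kappa, sensitivity, and specificity from a 2x2 table of progression calls; the counts are hypothetical, not taken from the study.

        def kappa_sens_spec(tp, fp, fn, tn):
            """Cohen's kappa, sensitivity, and specificity from a 2x2 table.

            Rows: reference standard (e.g., expert assessment);
            columns: test method (e.g., GPA event analysis).
            """
            n = tp + fp + fn + tn
            po = (tp + tn) / n                            # observed agreement
            pe = ((tp + fn) / n) * ((tp + fp) / n) \
               + ((fp + tn) / n) * ((fn + tn) / n)        # chance agreement
            kappa = (po - pe) / (1 - pe)
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return kappa, sensitivity, specificity

        # Hypothetical calls for 37 eyes (for demonstration only)
        print(kappa_sens_spec(tp=5, fp=1, fn=2, tn=29))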

  17. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
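
    The following is a minimal numerical sketch of the advanced mean value idea described above, assuming independent normal inputs and a generic response function; the example function, means, and standard deviations are hypothetical, and a full implementation would iterate the most-probable-point search.

        import numpy as np

        def amv_response(g, mu, sigma, beta):
            """Advanced mean value (AMV) sketch for Y = g(X), X_i ~ N(mu_i, sigma_i^2).

            Returns first-order mean-value (MV) and AMV estimates of the response
            value whose lower-tail probability corresponds to reliability index beta.
            """
            mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
            h, d = 1e-6, len(mu)
            e = np.eye(d)
            grad = np.array([(g(mu + h * e[i]) - g(mu - h * e[i])) / (2 * h)
                             for i in range(d)])          # gradient at the mean
            a = grad * sigma                               # sensitivities in u-space
            alpha = a / np.linalg.norm(a)
            y_mv = g(mu) - beta * np.linalg.norm(a)        # linearized percentile
            x_mpp = mu - beta * sigma * alpha              # most probable point
            y_amv = g(x_mpp)                               # re-evaluate exact g there
            return y_mv, y_amv

        # Hypothetical nonlinear response (illustration only)
        g = lambda x: x[0] ** 2 + 2.0 * x[1]
        print(amv_response(g, mu=[5.0, 3.0], sigma=[0.5, 0.4], beta=3.0))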

  18. Threshold-free high-power methods for the ontological analysis of genome-wide gene-expression studies

    PubMed Central

    Nilsson, Björn; Håkansson, Petra; Johansson, Mikael; Nelander, Sven; Fioretos, Thoas

    2007-01-01

    Ontological analysis facilitates the interpretation of microarray data. Here we describe new ontological analysis methods which, unlike existing approaches, are threshold-free and statistically powerful. We perform extensive evaluations and introduce a new concept, detection spectra, to characterize methods. We show that different ontological analysis methods exhibit distinct detection spectra, and that it is critical to account for this diversity. Our results argue strongly against the continued use of existing methods, and provide directions towards an enhanced approach. PMID:17488501

  19. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.

  20. Rapid Radiochemical Method for Isotopic Uranium in Building ...

    EPA Pesticide Factsheets

    Technical Fact Sheet. Analysis Purpose: Qualitative analysis. Technique: Alpha spectrometry. Method Developed for: Uranium-234, uranium-235, and uranium-238 in concrete and brick samples. Method Selected for: SAM lists this method for qualitative analysis of uranium-234, uranium-235, and uranium-238 in concrete or brick building materials. Summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  1. Methods for Conducting Cognitive Task Analysis for a Decision Making Task.

    DTIC Science & Technology

    1996-01-01

    Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods

  2. A Method for Cognitive Task Analysis

    DTIC Science & Technology

    1992-07-01

    A method for cognitive task analysis is described based on the notion of ’generic tasks’. The method distinguishes three layers of analysis. At the...model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis , Knowledge, Strategies.

  3. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    PubMed Central

    Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F

    2016-01-01

    Abstract Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared, a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50 and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis all methods performed equally well and better than two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
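
    To make the "delta method before pooling" estimator concrete, the sketch below computes the Wald ratio IV estimate and its first-order delta-method variance per study (covariance between the two coefficients assumed zero), then pools with fixed-effect inverse-variance weights; all inputs are hypothetical.

        import numpy as np

        def iv_delta(beta_zy, se_zy, beta_zx, se_zx):
            """Wald ratio IV estimate with first-order delta-method variance
            (numerator-denominator covariance assumed zero)."""
            est = beta_zy / beta_zx
            var = se_zy ** 2 / beta_zx ** 2 + beta_zy ** 2 * se_zx ** 2 / beta_zx ** 4
            return est, var

        def fixed_effect_pool(estimates, variances):
            """Two-stage inverse-variance (fixed-effect) meta-analysis."""
            w = 1.0 / np.asarray(variances)
            return np.sum(w * np.asarray(estimates)) / np.sum(w), 1.0 / np.sum(w)

        # Hypothetical per-study gene-outcome and gene-exposure coefficients
        studies = [(0.12, 0.03, 0.40, 0.05), (0.09, 0.04, 0.35, 0.06)]
        ests, variances = zip(*(iv_delta(*s) for s in studies))
        print(fixed_effect_pool(ests, variances))  # pooled estimate, pooled variance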

  4. Analysis of Explosives in Soil Using Solid Phase Microextraction and Gas Chromatography: Environmental Analysis

    DTIC Science & Technology

    2006-01-01

    ENVIRONMENTAL ANALYSIS Analysis of Explosives in Soil Using Solid Phase Microextraction and Gas Chromatography Howard T. Mayfield Air Force Research...Abstract: Current methods for the analysis of explosives in soils utilize time consuming sample preparation workups and extractions. The method detection...chromatography/mass spectrometry to provide a con- venient and sensitive analysis method for explosives in soil. Keywords: Explosives, TNT, solid phase

  5. [Clinical usefulness of urine-formed elements' information obtained from bacteria detection by flow cytometry method that uses nucleic acid staining].

    PubMed

    Nakagawa, Hiroko; Yuno, Tomoji; Itho, Kiichi

    2009-03-01

    Recently, a specific method for detecting bacteria by flow cytometry with nucleic acid staining was developed as a function of automated urine formed-element analyzers for routine urine testing. Here, we performed a basic study of this bacteria analysis method and compared it with the conventional methods used to date: urine sediment analysis, urine Gram staining, and quantitative urine culture. The bacteria analysis by flow cytometry with nucleic acid staining showed excellent reproducibility and higher sensitivity than microscopic urinary sediment analysis. Based on an ROC curve analysis with the urine culture method as the reference standard, a cut-off level of 120/microL was defined, giving a sensitivity of 85.7% and a specificity of 88.2%. In the scattergram analysis, referenced against urine culture, 80% of the dots in 90% of rod-positive samples appeared within 30 degrees of the X axis. In addition, one case indicated that flow cytometric bacteria analysis and time-series scattergram analysis might help trace the progress of the causative bacteria, so this information is expected to be clinically significant. Reporting bacteria information obtained with the nucleic acid staining flow cytometry method is expected to contribute to rapid diagnosis and treatment of urinary tract infections, and to more efficient screening in microbiology and clinical chemistry urine analysis.

  6. Environmental and economic assessment methods for waste management decision-support: possibilities and limitations.

    PubMed

    Finnveden, Göran; Björklund, Anna; Moberg, Asa; Ekvall, Tomas

    2007-06-01

    A large number of methods and approaches that can be used for supporting waste management decisions at different levels in society have been developed. In this paper an overview of methods is provided and preliminary guidelines for the choice of methods are presented. The methods introduced include: Environmental Impact Assessment, Strategic Environmental Assessment, Life Cycle Assessment, Cost-Benefit Analysis, Cost-effectiveness Analysis, Life-cycle Costing, Risk Assessment, Material Flow Accounting, Substance Flow Analysis, Energy Analysis, Exergy Analysis, Entropy Analysis, Environmental Management Systems, and Environmental Auditing. The characteristics used are the types of impacts included, the objects under study and whether the method is procedural or analytical. The different methods can be described as systems analysis methods. Waste management systems thinking is receiving increasing attention. This is, for example, evidenced by the suggested thematic strategy on waste by the European Commission where life-cycle analysis and life-cycle thinking get prominent positions. Indeed, life-cycle analyses have been shown to provide policy-relevant and consistent results. However, it is also clear that the studies will always be open to criticism since they are simplifications of reality and include uncertainties. This is something all systems analysis methods have in common. Assumptions can be challenged and it may be difficult to generalize from case studies to policies. This suggests that if decisions are going to be made, they are likely to be made on a less than perfect basis.

  7. Analysis of the principal component algorithm in phase-shifting interferometry.

    PubMed

    Vargas, J; Quiroga, J Antonio; Belenguer, T

    2011-06-15

    We recently presented a new asynchronous demodulation method for phase-sampling interferometry. The method is based on the principal component analysis (PCA) technique. In the former work, the PCA method was derived heuristically. In this work, we present an in-depth analysis of the PCA demodulation method.

  8. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  9. Comparison of Two Global Sensitivity Analysis Methods for Hydrologic Modeling over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Hameed, M.; Demirel, M. C.; Moradkhani, H.

    2015-12-01

    The Global Sensitivity Analysis (GSA) approach helps identify the effectiveness of model parameters or inputs and thus provides essential information about model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed by using two GSA methods: Sobol' and the Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one year, four years, and seven years. Four factors are considered and evaluated by using the two sensitivity analysis methods: the simulation length, parameter range, model initial conditions, and the reliability of the global sensitivity analysis methods. The reliability of the sensitivity analysis results is compared based on 1) the agreement between the two sensitivity analysis methods (Sobol' and FAST) in highlighting the same parameters or inputs as the most influential and 2) how the methods cohere in ranking these sensitive parameters under the same conditions (sub-basins and simulation length). The results show coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, the FAST method is found to be sufficient to evaluate the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial-condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
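
    As a self-contained illustration of a variance-based first-order Sobol' index (the SAC-SMA model itself is far too heavy to reproduce here), the sketch below uses a pick-freeze estimator on the standard Ishigami test function; the sample size and test function are illustrative choices.

        import numpy as np

        def ishigami(x, a=7.0, b=0.1):
            """Standard Ishigami test function; inputs uniform on [-pi, pi]."""
            return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                    + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

        def first_order_sobol(model, d, n=100_000, seed=0):
            """First-order Sobol' indices by a pick-freeze (Saltelli-type) estimator."""
            rng = np.random.default_rng(seed)
            A = rng.uniform(-np.pi, np.pi, (n, d))
            B = rng.uniform(-np.pi, np.pi, (n, d))
            yA = model(A)
            S = np.empty(d)
            for i in range(d):
                Ci = B.copy()
                Ci[:, i] = A[:, i]          # share only column i with A
                yCi = model(Ci)
                S[i] = (np.mean(yA * yCi) - yA.mean() * yCi.mean()) / yA.var()
            return S

        print(first_order_sobol(ishigami, d=3))  # analytic: about [0.31, 0.44, 0.00]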

  10. 14 CFR 417.203 - Compliance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... analysis method is based on accurate data and scientific principles and is statistically valid. The FAA... safety analysis must also meet the requirements for methods of analysis contained in appendices A and B... from an identical or similar launch if the analysis still applies to the later launch. (b) Method of...

  11. 14 CFR 417.203 - Compliance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... analysis method is based on accurate data and scientific principles and is statistically valid. The FAA... safety analysis must also meet the requirements for methods of analysis contained in appendices A and B... from an identical or similar launch if the analysis still applies to the later launch. (b) Method of...

  12. 14 CFR 417.203 - Compliance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... analysis method is based on accurate data and scientific principles and is statistically valid. The FAA... safety analysis must also meet the requirements for methods of analysis contained in appendices A and B... from an identical or similar launch if the analysis still applies to the later launch. (b) Method of...

  13. Using Horn's Parallel Analysis Method in Exploratory Factor Analysis for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Koçak, Duygu

    2016-01-01

    In this study, the number of factors obtained from parallel analysis, a method used for determining the number of factors in exploratory factor analysis, was compared to that of the factors obtained from eigenvalue and scree plot--two traditional methods for determining the number of factors--in terms of consistency. Parallel analysis is based on…
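
    A minimal sketch of Horn's parallel analysis as described above: eigenvalues of the observed correlation matrix are retained only while they exceed the corresponding percentile eigenvalue from random data of the same shape; the demonstration data are synthetic.

        import numpy as np

        def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
            """Horn's parallel analysis: retain components whose eigenvalues exceed
            those of random data of the same shape, stopping at the first failure."""
            rng = np.random.default_rng(seed)
            n, p = data.shape
            obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
            rand = np.empty((n_iter, p))
            for k in range(n_iter):
                r = rng.standard_normal((n, p))
                rand[k] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
            thresh = np.percentile(rand, percentile, axis=0)
            below = obs <= thresh
            return int(np.argmax(below)) if below.any() else p

        # Synthetic demo: two correlated blocks of three variables -> expect 2
        rng = np.random.default_rng(1)
        f = rng.standard_normal((300, 2))
        data = (np.hstack([f[:, [0]] @ np.ones((1, 3)), f[:, [1]] @ np.ones((1, 3))])
                + 0.5 * rng.standard_normal((300, 6)))
        print(parallel_analysis(data))  # prints 2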

  14. Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods

    DTIC Science & Technology

    2012-06-01

    Master's thesis by Patrick R. Mugg, June 2012 (thesis advisor: Francis Giraldo; second reader: Hong...). [Remainder of snippet is report documentation front-matter.] The most widely known and used procedure for analyzing stability is the Von Neumann method, such that Von Neumann's stability analysis looks at

  15. Relative contributions of three descriptive methods: implications for behavioral assessment.

    PubMed

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods-the ABC method, the conditional probability method, and the conditional and background probability method-to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n = 2), social negative reinforcement (n = 2), or automatic reinforcement (n = 2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations.

  16. Method and system of Jones-matrix mapping of blood plasma films with "fuzzy" analysis in differentiation of breast pathology changes

    NASA Astrophysics Data System (ADS)

    Zabolotna, Natalia I.; Radchenko, Kostiantyn O.; Karas, Oleksandr V.

    2018-01-01

    A method for diagnosing fibroadenoma of the breast is proposed, using statistical analysis (determination and analysis of statistical moments of the 1st-4th orders) of polarization images of the imaginary Jones-matrix elements of optically thin (attenuation coefficient τ <= 0.1) blood plasma films, with further intelligent differentiation based on the method of "fuzzy" logic and discriminant analysis. The accuracy of the intelligent differentiation of blood plasma samples into "norm" and "fibroadenoma" of the breast was 82.7% by the method of linear discriminant analysis and 95.3% by the "fuzzy" logic method. The obtained results confirm the potentially high level of reliability of the method of differentiation by "fuzzy" analysis.

  17. Estimation of low back moments from video analysis: a validation study.

    PubMed

    Coenen, Pieter; Kingma, Idsart; Boot, Cécile R L; Faber, Gert S; Xu, Xu; Bongers, Paulien M; van Dieën, Jaap H

    2011-09-02

    This study aimed to develop, compare and validate two versions of a video analysis method for assessment of low back moments during occupational lifting tasks since for epidemiological studies and ergonomic practice relatively cheap and easily applicable methods to assess low back loads are needed. Ten healthy subjects participated in a protocol comprising 12 lifting conditions. Low back moments were assessed using two variants of a video analysis method and a lab-based reference method. Repeated measures ANOVAs showed no overall differences in peak moments between the two versions of the video analysis method and the reference method. However, two conditions showed a minor overestimation of one of the video analysis method moments. Standard deviations were considerable suggesting that errors in the video analysis were random. Furthermore, there was a small underestimation of dynamic components and overestimation of the static components of the moments. Intraclass correlations coefficients for peak moments showed high correspondence (>0.85) of the video analyses with the reference method. It is concluded that, when a sufficient number of measurements can be taken, the video analysis method for assessment of low back loads during lifting tasks provides valid estimates of low back moments in ergonomic practice and epidemiological studies for lifts up to a moderate level of asymmetry. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  19. Relaxation mode analysis of a peptide system: comparison with principal component analysis.

    PubMed

    Mitsutake, Ayori; Iijima, Hiromitsu; Takano, Hiroshi

    2011-10-28

    This article reports the first attempt to apply the relaxation mode analysis method to a simulation of a biomolecular system. In biomolecular systems, the principal component analysis is a well-known method for analyzing the static properties of fluctuations of structures obtained by a simulation and classifying the structures into some groups. On the other hand, the relaxation mode analysis has been used to analyze the dynamic properties of homopolymer systems. In this article, a long Monte Carlo simulation of Met-enkephalin in gas phase has been performed. The results are analyzed by the principal component analysis and relaxation mode analysis methods. We compare the results of both methods and show the effectiveness of the relaxation mode analysis.

  20. A simple method for plasma total vitamin C analysis suitable for routine clinical laboratory use.

    PubMed

    Robitaille, Line; Hoffer, L John

    2016-04-21

    In-hospital hypovitaminosis C is highly prevalent but almost completely unrecognized. Medical awareness of this potentially important disorder is hindered by the inability of most hospital laboratories to determine plasma vitamin C concentrations. The availability of a simple, reliable method for analyzing plasma vitamin C could increase opportunities for routine plasma vitamin C analysis in clinical medicine. Plasma vitamin C can be analyzed by high performance liquid chromatography (HPLC) with electrochemical (EC) or ultraviolet (UV) light detection. We modified existing UV-HPLC methods for plasma total vitamin C analysis (the sum of ascorbic and dehydroascorbic acid) to develop a simple, constant-low-pH sample reduction procedure followed by isocratic reverse-phase HPLC separation using a purely aqueous low-pH non-buffered mobile phase. Although EC-HPLC is widely recommended over UV-HPLC for plasma total vitamin C analysis, the two methods have never been directly compared. We formally compared the simplified UV-HPLC method with EC-HPLC in 80 consecutive clinical samples. The simplified UV-HPLC method was less expensive, easier to set up, required fewer reagents and no pH adjustments, and demonstrated greater sample stability than many existing methods for plasma vitamin C analysis. When compared with the gold-standard EC-HPLC method in 80 consecutive clinical samples exhibiting a wide range of plasma vitamin C concentrations, it performed equivalently. The easy set up, simplicity and sensitivity of the plasma vitamin C analysis method described here could make it practical in a normally equipped hospital laboratory. Unlike any prior UV-HPLC method for plasma total vitamin C analysis, it was rigorously compared with the gold-standard EC-HPLC method and performed equivalently. Adoption of this method could increase the availability of plasma vitamin C analysis in clinical medicine.

  1. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  2. A comparison of analysis methods to estimate contingency strength.

    PubMed

    Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T

    2018-05-09

    To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
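
    For readers unfamiliar with contingency strength metrics, the sketch below computes one common interval-based estimate, the difference of conditional probabilities P(consequence | behavior) - P(consequence | no behavior); the coded intervals are hypothetical and this is not the authors' analysis code.

        import numpy as np

        def contingency_strength(behavior, consequence):
            """Interval-based contingency estimate: P(C | B) - P(C | not B).

            behavior, consequence: binary arrays, one entry per observation interval.
            """
            b = np.asarray(behavior, bool)
            c = np.asarray(consequence, bool)
            return c[b].mean() - c[~b].mean()

        # Hypothetical 10-s interval records: 1 = event occurred in the interval
        behavior    = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
        consequence = [1, 0, 1, 0, 0, 0, 1, 1, 0, 1]
        print(contingency_strength(behavior, consequence))  # 0.8 - 0.2 = 0.6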

  3. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response indicated.

  4. Quad-Tree Visual-Calculus Analysis of Satellite Coverage

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.; Hockney, George; Kwan, Bruce

    2003-01-01

    An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.

  5. The Tongue and Quill

    DTIC Science & Technology

    2004-08-01

    ethnography, phenomenological study, grounded theory study and content analysis. [Remainder of snippet is an outline of research methods: I. Qualitative Research Methods - the historical method, general qualitative methods, phenomenological study, grounded theory study, content analysis; II. Quantitative Research Methods.]

  6. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
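
    A minimal sketch of the underlying probabilistic idea (not the paper's hybrid reliability method): sample uncertain plant parameters from distributions rather than interval bounds and build the empirical distribution of a performance metric, here the damping ratio of a hypothetical second-order system.

        import numpy as np

        def damping_ratio(m, c, k):
            """Damping ratio of the second-order system m*x'' + c*x' + k*x = 0."""
            return c / (2.0 * np.sqrt(k * m))

        rng = np.random.default_rng(0)
        n = 50_000
        # Hypothetical parameter distributions (illustrative means and spreads)
        m = rng.normal(1.0, 0.05, n)    # mass
        c = rng.normal(0.4, 0.04, n)    # damping coefficient
        k = rng.normal(10.0, 0.5, n)    # stiffness

        zeta = damping_ratio(m, c, k)   # performance metric, now a distribution
        print("mean zeta = %.4f, P(zeta < 0.05) = %.4f"
              % (zeta.mean(), np.mean(zeta < 0.05)))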

  7. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  8. Further evidence for the increased power of LOD scores compared with nonparametric methods.

    PubMed

    Durner, M; Vieland, V J; Greenberg, D A

    1999-01-01

    In genetic analysis of diseases in which the underlying model is unknown, "model-free" methods - such as affected sib pair (ASP) tests - are often preferred over LOD-score methods, although LOD-score methods under the correct or even approximately correct model are more powerful than ASP tests. However, there might be circumstances in which nonparametric methods will outperform LOD-score methods. Recently, Dizier et al. reported that, in some complex two-locus (2L) models, LOD-score methods with segregation-analysis-derived parameters had less power to detect linkage than ASP tests. We investigated whether these particular models in fact represent a situation in which ASP tests are more powerful than LOD scores. We simulated data according to the parameters specified by Dizier et al. and analyzed the data using (a) single-locus (SL) LOD-score analysis performed twice, under a simple dominant and a recessive mode of inheritance (MOI); (b) ASP methods; and (c) nonparametric linkage (NPL) analysis. We show that SL analysis performed twice and corrected for the type I error increase due to multiple testing yields almost as much linkage information as does an analysis under the correct 2L model and is more powerful than either the ASP method or the NPL method. We demonstrate that, even for complex genetic models, the most important condition for linkage analysis is that the assumed MOI at the disease locus being tested is approximately correct, not that the inheritance of the disease per se is correctly specified. In the analysis by Dizier et al., segregation analysis led to estimates of dominance parameters that were grossly misspecified for the locus tested in those models in which ASP tests appeared to be more powerful than LOD-score analyses.
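
    For context on the LOD statistic discussed above, a toy two-point example with phase-known, fully informative meioses (hypothetical counts): LOD(theta) = log10[L(theta)/L(0.5)] with L(theta) = theta^r * (1-theta)^(n-r).

        import numpy as np

        def lod_score(n_recomb, n_total, theta):
            """Two-point LOD score for phase-known, fully informative meioses."""
            r, n = n_recomb, n_total
            log_l = r * np.log10(theta) + (n - r) * np.log10(1.0 - theta)
            return log_l - n * np.log10(0.5)   # null: free recombination

        thetas = np.linspace(0.01, 0.5, 50)
        scores = lod_score(2, 20, thetas)      # 2 recombinants in 20 meioses
        print("max LOD = %.2f at theta = %.2f"
              % (scores.max(), thetas[scores.argmax()]))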

  9. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

    This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce coefficient of Cl = -1.377 in comparison to Cl = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag coefficient of Cd = 0.115 in comparison to Cd = 0.143 for the original wing.

  10. Comparative analysis of methods and sources of financing of the transport organizations activity

    NASA Astrophysics Data System (ADS)

    Gorshkov, Roman

    2017-10-01

    The article considers methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects implemented to date is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and financial support for the activities of a transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  11. Comparison of histomorphometrical data obtained with two different image analysis methods.

    PubMed

    Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B

    2007-08-01

    A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The "time and money consuming" methods and techniques used are often "in house standards". We address light microscopic investigations of bone tissue reactions on un-decalcified cut and ground sections of threaded implants. In order to screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements done by the naked eye) with a custom made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods but the results of the bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values in the visual method corresponded to low values with the automatic method. With similar resolution images and further improvements of the automatic method this difference should become insignificant. A great advantage using the new automatic image analysis method is that it is time saving--analysis time can be significantly reduced.

  12. Thermal image analysis using the serpentine method

    NASA Astrophysics Data System (ADS)

    Koprowski, Robert; Wilczyński, Sławomir

    2018-03-01

    Thermal imaging is an increasingly widespread alternative to other imaging methods. As a supplementary method in diagnostics, it can be used both statically and with dynamic temperature changes. The paper proposes a new image analysis method that allows for the acquisition of new diagnostic information as well as object segmentation. The proposed serpentine analysis uses known and new methods of image analysis and processing proposed by the authors. Affine transformations of an image and subsequent Fourier analysis provide a new diagnostic quality. The method is fully repeatable and automatic and independent of inter-individual variability in patients. The segmentation results are 10% better than those obtained from the watershed method and the hybrid segmentation method based on the Canny detector. The first and second harmonics of the serpentine analysis make it possible to determine the type of temperature changes in the region of interest (gradient, number of heat sources, etc.). The presented serpentine method provides new quantitative information on thermal imaging. Since it allows for image segmentation and designation of the contact points of two or more heat sources (local minima), it can be used to support medical diagnostics in many areas of medicine.

  13. Systems and methods for detection of blowout precursors in combustors

    DOEpatents

    Lieuwen, Tim C.; Nair, Suraj

    2006-08-15

    The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.
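
    A sketch of the kind of spectral and statistical screening the patent describes, applied to a synthetic pressure record with scipy; the frequency band and the indicators chosen here are placeholders, not values from the patent.

        import numpy as np
        from scipy.signal import welch
        from scipy.stats import kurtosis

        def blowout_indicators(pressure, fs, band=(100.0, 300.0)):
            """Toy precursor indicators from combustor pressure data:
            band-limited spectral power plus a simple statistical moment."""
            f, psd = welch(pressure, fs=fs, nperseg=1024)
            sel = (f >= band[0]) & (f <= band[1])
            return np.trapz(psd[sel], f[sel]), kurtosis(pressure)

        # Synthetic pressure: broadband noise plus a growing 200 Hz precursor tone
        fs = 5000.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        p = (np.random.default_rng(0).standard_normal(t.size)
             + 0.5 * t * np.sin(2.0 * np.pi * 200.0 * t))
        print(blowout_indicators(p, fs))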

  14. Computer Graphics-aided systems analysis: application to well completion design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.E.; Sarma, M.P.

    1985-03-01

    The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related to oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of "Systems Analysis" with the techniques of "Computer Graphics". The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis problems. The method can be extended to other design and analysis aspects of oil and gas wells.

  15. Improved dynamic analysis method using load-dependent Ritz vectors

    NASA Technical Reports Server (NTRS)

    Escobedo-Torres, J.; Ricles, J. M.

    1993-01-01

    The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by having a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load-dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load-dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
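
    A compact sketch of the textbook load-dependent Ritz vector recurrence (not the authors' code): the first vector is the static response to the spatial load pattern, and each subsequent vector is the static response to the inertia load of the previous one, M-orthonormalized at each step.

        import numpy as np

        def load_dependent_ritz(K, M, f, n_vec):
            """Generate n_vec load-dependent Ritz vectors for (K, M) and load f."""
            X = np.zeros((K.shape[0], n_vec))
            x = np.linalg.solve(K, f)                    # static response to load
            X[:, 0] = x / np.sqrt(x @ M @ x)             # M-normalize
            for i in range(1, n_vec):
                x = np.linalg.solve(K, M @ X[:, i - 1])  # response to inertia load
                for j in range(i):                       # M-orthogonalize
                    x -= (X[:, j] @ M @ x) * X[:, j]
                X[:, i] = x / np.sqrt(x @ M @ x)
            return X

        # Hypothetical 4-DOF spring-mass chain with a tip load pattern
        K = 2.0 * np.eye(4) - np.eye(4, k=1) - np.eye(4, k=-1)
        M = np.eye(4)
        X = load_dependent_ritz(K, M, np.array([0.0, 0.0, 0.0, 1.0]), 3)
        print(np.round(X.T @ M @ X, 10))                 # identity => M-orthonormal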

  16. A novel spinal kinematic analysis using X-ray imaging and vicon motion analysis: a case study.

    PubMed

    Noh, Dong K; Lee, Nam G; You, Joshua H

    2014-01-01

    This study highlights a novel spinal kinematic analysis method and the feasibility of X-ray imaging measurements to accurately assess thoracic spine motion. The advanced X-ray Nash-Moe method and analysis were used to compute the segmental range of motion in thoracic vertebra pedicles in vivo. This Nash-Moe X-ray imaging method was compared with a standardized method using the Vicon 3-dimensional motion capture system. Linear regression analysis showed an excellent and significant correlation between the two methods (R2 = 0.99, p < 0.05), suggesting that the analysis of spinal segmental range of motion using X-ray imaging measurements was accurate and comparable to the conventional 3-dimensional motion analysis system. Clinically, this novel finding is compelling evidence demonstrating that measurements with X-ray imaging are useful to accurately decipher pathological spinal alignment and movement impairments in idiopathic scoliosis (IS).

  17. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes the recent developments of the rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, portable gas chromatograph, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references to establish the selective, precise and quantitative rapid detection methods in food security analysis.

  18. Analysis and application of Fourier transform spectroscopy in atmospheric remote sensing

    NASA Technical Reports Server (NTRS)

    Park, J. H.

    1984-01-01

    An analysis method for Fourier transform spectroscopy is summarized with applications to various types of distortion in atmospheric absorption spectra. This analysis method includes the fast Fourier transform method for simulating the interferometric spectrum and the nonlinear least-squares method for retrieving the information from a measured spectrum. It is shown that spectral distortions can be simulated quite well and that the correct information can be retrieved from a distorted spectrum by this analysis technique.
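
    A toy illustration of the forward and inverse steps described above: an interferogram is simulated as the cosine transform of a model spectrum, and truncating the scan length distorts the retrieved spectrum with a sinc instrument line shape; all line positions, widths, and units are arbitrary.

        import numpy as np

        # Toy model spectrum: flat continuum with two absorption lines
        nu = np.linspace(0.0, 500.0, 1500)            # wavenumber grid
        spec = (1.0 - 0.6 * np.exp(-((nu - 150.0) / 1.5) ** 2)
                    - 0.4 * np.exp(-((nu - 300.0) / 1.5) ** 2))

        # Forward: one-sided interferogram I(x) = integral S(nu) cos(2 pi nu x) dnu
        x = np.linspace(0.0, 0.2, 4000)               # truncated optical path (cm)
        igram = np.trapz(spec * np.cos(2 * np.pi * np.outer(x, nu)), nu, axis=1)

        # Inverse: S(nu) = 4 * integral I(x) cos(2 pi nu x) dx; the truncated scan
        # convolves the result with a sinc instrument line shape (a distortion)
        recov = 4.0 * np.trapz(igram * np.cos(2 * np.pi * np.outer(nu, x)), x, axis=1)

        i0 = np.argmin(np.abs(nu - 150.0))
        print("line depth: true %.3f, retrieved %.3f" % (spec[i0], recov[i0]))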

  19. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P; Cowell, Andrew J; Gregory, Michelle L

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  20. Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...

    EPA Pesticide Factsheets

    Technical Fact Sheet. Analysis Purpose: Qualitative analysis. Technique: Alpha spectrometry. Method Developed for: Plutonium-238 and plutonium-239 in water and air filters. Method Selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • rapid methods for acid or fusion digestion; • Rapid Radiochemical Method for Plutonium-238 and Plutonium-239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. Summary of the subject analytical method, which will be posted to the SAM website to allow access to the method.

  1. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    ERIC Educational Resources Information Center

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  2. Integrative Analysis of “-Omics” Data Using Penalty Functions

    PubMed Central

    Zhao, Qing; Shi, Xingjie; Huang, Jian; Liu, Jin; Li, Yang; Ma, Shuangge

    2014-01-01

    In the analysis of omics data, integrative analysis provides an effective way of pooling information across multiple datasets or multiple correlated responses, and can be more effective than single-dataset (response) analysis. Multiple families of integrative analysis methods have been proposed in the literature. The current review focuses on the penalization methods. Special attention is paid to sparse meta-analysis methods that pool summary statistics across datasets, and integrative analysis methods that pool raw data across datasets. We discuss their formulation and rationale. Beyond “standard” penalized selection, we also review contrasted penalization and Laplacian penalization which accommodate finer data structures. The computational aspects, including computational algorithms and tuning parameter selection, are examined. This review concludes with possible limitations and extensions. PMID:25691921
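
    As a generic illustration of penalized variable selection, the building block of the reviewed methods (not any specific integrative-analysis package), the sketch below fits a lasso on synthetic data in which only the first two of ten covariates are active.

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n, p = 200, 10
        X = rng.standard_normal((n, p))
        beta_true = np.zeros(p)
        beta_true[:2] = [2.0, -1.5]              # sparse truth: 2 active covariates
        y = X @ beta_true + rng.standard_normal(n)

        model = Lasso(alpha=0.1).fit(X, y)       # L1 penalty shrinks noise terms to 0
        print(np.round(model.coef_, 2))          # near-zero entries are deselected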

  3. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; SEA methods, in turn, can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies, and it can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  4. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method fold change criteria are problematic, and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but it is also impervious to fold change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across the Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rates controls between each approach are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offers higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next generation sequencing RNA-seq data analysis.

  5. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    NASA Technical Reports Server (NTRS)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft are established. An assessment is made of more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. Capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method are demonstrated by comparison of computed results with experimental data.

  6. Using time-frequency analysis to determine time-resolved detonation velocity with microwave interferometry.

    PubMed

    Kittell, David E; Mares, Jesus O; Son, Steven F

    2015-04-01

    Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter which allows for a highly unbiased analysis method to extract velocity information. In contrast, the phase unwrapping technique introduces user based variability while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
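
    The velocity extraction step can be sketched as follows: in MI, the beat frequency obeys f = 2v/λ, so tracking the STFT ridge yields a time-resolved velocity. The sampling rate, wavelength, and synthetic velocity profile below are assumptions for illustration, not the experimental values.

```python
import numpy as np
from scipy.signal import stft

fs = 10e6                 # sampling rate [Hz] (assumed)
lam = 0.03                # microwave wavelength in the explosive [m] (assumed)
t = np.arange(0, 2e-3, 1 / fs)

# Synthetic MI signal: detonation accelerating from 3 to 6 km/s.
v_true = 3000 + 1500 * (1 + np.tanh((t - 1e-3) / 2e-4))
phase = 2 * np.pi * np.cumsum(2 * v_true / lam) / fs   # beat frequency f = 2 v / lambda
sig = np.cos(phase) + 0.2 * np.random.randn(t.size)

f, tau, Z = stft(sig, fs=fs, nperseg=1024, noverlap=896)
f_peak = f[np.abs(Z).argmax(axis=0)]    # ridge: peak frequency in each frame
v_est = f_peak * lam / 2                # invert f = 2 v / lambda
print(v_est[:5], v_est[-5:])            # ~3 km/s early, ~6 km/s late
```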

  7. Optimisation and validation of a rapid and efficient microemulsion liquid chromatographic (MELC) method for the determination of paracetamol (acetaminophen) content in a suppository formulation.

    PubMed

    McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin

    2007-05-09

    A rapid and efficient oil-in-water microemulsion liquid chromatographic method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision, and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated due to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis times and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol, and 8 g n-octane in 1 L of 0.05% TFA, modified with acetonitrile, has been shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared with British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared to the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.

  8. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.
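
    The paper's new flux formulas are not reproduced in the abstract; as generic background on numerical fluxes in the Burgers setting, here is a local Lax-Friedrichs flux inside a first-order finite-volume update (a sketch under standard assumptions, not the discontinuous Galerkin scheme itself).

```python
import numpy as np

def llf_flux(uL, uR):
    """Local Lax-Friedrichs numerical flux for Burgers' equation, f(u) = u^2/2."""
    f = lambda u: 0.5 * u * u
    a = np.maximum(np.abs(uL), np.abs(uR))       # local wave-speed bound
    return 0.5 * (f(uL) + f(uR)) - 0.5 * a * (uR - uL)

# First-order finite-volume update producing a stable moving shock.
N, dx, dt = 200, 1.0 / 200, 0.002                # CFL = dt*max|u|/dx = 0.4
x = (np.arange(N) + 0.5) * dx
u = np.where(x < 0.5, 1.0, 0.0)                  # step initial data
for _ in range(100):
    F = llf_flux(u[:-1], u[1:])                  # interface fluxes
    u[1:-1] -= dt / dx * (F[1:] - F[:-1])        # interior cells; ends held fixed
```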

  9. Analysis of Lard in Lipstick Formulation Using FTIR Spectroscopy and Multivariate Calibration: A Comparison of Three Extraction Methods.

    PubMed

    Waskitho, Dri; Lukitaningsih, Endang; Sudjadi; Rohman, Abdul

    2016-01-01

    Analysis of lard extracted from a lipstick formulation containing castor oil has been performed using an FTIR spectroscopic method combined with multivariate calibration. Three different extraction methods were compared, namely a saponification method followed by liquid/liquid extraction with hexane/dichloromethane/ethanol/water, a saponification method followed by liquid/liquid extraction with dichloromethane/ethanol/water, and the Bligh & Dyer method using chloroform/methanol/water as extracting solvent. Qualitative and quantitative analyses of lard were performed using principal component analysis (PCA) and partial least squares (PLS) analysis, respectively. The results showed that, for all samples prepared by the three extraction methods, PCA was capable of identifying lard in the wavenumber region of 1200-800 cm-1, with the best result obtained by the Bligh & Dyer method. Furthermore, PLS analysis in the same wavenumber region used for qualification showed that Bligh & Dyer was the most suitable extraction method, with the highest coefficient of determination (R2) and the lowest root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP) values.
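
    A sketch of the chemometric step with scikit-learn, using made-up spectra standing in for the 1200-800 cm-1 region; the pure-component band shapes and mixing model are assumptions, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Hypothetical FTIR data: 40 calibration spectra over 1200-800 cm^-1 (200 points),
# y = lard fraction in the extracted fat. Pure-component spectra are made up here.
wn = np.linspace(1200, 800, 200)
lard = np.exp(-((wn - 1100) / 30) ** 2)
castor = np.exp(-((wn - 950) / 40) ** 2)
y = rng.uniform(0, 1, 40)
X = np.outer(y, lard) + np.outer(1 - y, castor) + 0.01 * rng.normal(size=(40, 200))

# Qualitative step: PCA scores separate lard-rich from lard-free samples.
scores = PCA(n_components=2).fit_transform(X)

# Quantitative step: PLS calibration, with RMSEC on the training set.
pls = PLSRegression(n_components=3).fit(X, y)
y_hat = pls.predict(X).ravel()
rmsec = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"R2 = {pls.score(X, y):.3f}, RMSEC = {rmsec:.4f}")
```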

  10. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  11. Methods for Analysis and Simulation of Ballistic Impact

    DTIC Science & Technology

    2017-04-01

    ARL-RP-0597, April 2017, US Army Research Laboratory. Methods for Analysis and Simulation of Ballistic Impact, by John D Clayton, Weapons and Materials Research Directorate, ARL. [Excerpt fragment: "...analytical, and numerical methods of ballistics research. Similar lengthy references dealing with pertinent aspects include [8, 9]. In contrast, the..."]

  12. Coformer screening using thermal analysis based on binary phase diagrams.

    PubMed

    Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Terada, Katsuhide

    2014-08-01

    The advent of cocrystals has demonstrated a growing need for efficient and comprehensive coformer screening in search of better development forms, including salt forms. Here, we investigated a coformer screening system for salts and cocrystals based on binary phase diagrams using thermal analysis and examined the effectiveness of the method. Indomethacin and tenoxicam were used as models of active pharmaceutical ingredients (APIs). Physical mixtures of an API and 42 kinds of coformers were analyzed using Differential Scanning Calorimetry (DSC) and X-ray DSC. We also conducted coformer screening using a conventional slurry method and compared these results with those from the thermal analysis method and previous studies. Compared with the slurry method, the thermal analysis method was a high-performance screening system, particularly for APIs with low solubility and/or propensity to form solvates. However, this method faced hurdles for screening coformers combined with an API in the presence of kinetic hindrance for salt or cocrystal formation during heating or if there is degradation near the metastable eutectic temperature. The thermal analysis and slurry methods are considered complementary to each other for coformer screening. Feasibility of the thermal analysis method in drug discovery practice is ensured given its small scale and high throughput.

  13. Change detection for synthetic aperture radar images based on pattern and intensity distinctiveness analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang

    2018-04-01

    Synthetic aperture radar (SAR) imaging is independent of atmospheric conditions, which makes it an ideal image source for change detection. Existing methods directly analyze all regions in the speckle-noise-contaminated difference image, so their performance is easily degraded by small noisy regions. In this paper, we propose a novel saliency-guided change detection framework based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first used to obtain a difference image (DI). Then, saliency detection based on pattern and intensity distinctiveness analysis is used to obtain the changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyze the pixels in the changed-region candidates, and the final change map is obtained by classifying these pixels as changed or unchanged. Experimental results on two real SAR image datasets demonstrate the effectiveness of the proposed method.
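
    The log-ratio / PCA / k-means backbone of such pipelines can be sketched as below; the saliency step is omitted and the patch-feature construction is a generic stand-in, not the authors' pattern and intensity distinctiveness analysis.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def change_map(img1, img2, patch=5, k_dim=3):
    """Log-ratio DI, patch features, PCA, then 2-class k-means (generic pipeline)."""
    di = np.abs(np.log((img1 + 1.0) / (img2 + 1.0)))   # log-ratio difference image
    h, w = di.shape
    r = patch // 2
    pad = np.pad(di, r, mode="reflect")
    feats = np.stack([pad[i:i + h, j:j + w].ravel()    # patch around each pixel
                      for i in range(patch) for j in range(patch)], axis=1)
    z = PCA(n_components=k_dim).fit_transform(feats)
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(z)
    # Convention: the cluster with the higher mean DI is the "changed" class.
    m = [di.ravel()[labels == c].mean() for c in (0, 1)]
    return (labels == int(np.argmax(m))).reshape(h, w)
```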

  14. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    NASA Astrophysics Data System (ADS)

    Tang, J.

    2012-01-01

    Multiple signal classification (MUSIC) algorithms are introduced for estimating the variation period of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of its frequency resolution using analog signals are included. From the literature, we collected effective observational data of the BL Lac object S5 0716+714 in the V, R, and I bands from 1994 to 2008. The light variation periods of S5 0716+714 are obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. Two major periods exist for all bands: (3.33±0.08) years and (1.24±0.01) years. The period estimates from the MUSIC spectral analysis method are compared with those from the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that works with small data lengths and can be used to detect the variation periods of weak signals.
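
    A compact MUSIC sketch for a sampled light curve; the correlation-matrix size and the model order (two complex exponentials per real periodic component) are rule-of-thumb assumptions.

```python
import numpy as np

def music_spectrum(x, order, freqs, dt):
    """MUSIC pseudospectrum of a real time series sampled at interval dt.

    order = assumed number of complex sinusoids (2 per real periodic component).
    """
    m = 4 * order                          # correlation-matrix size (rule of thumb)
    X = np.lib.stride_tricks.sliding_window_view(x - x.mean(), m)
    R = X.T @ X / X.shape[0]               # sample autocorrelation matrix
    w, V = np.linalg.eigh(R)               # eigenvalues in ascending order
    En = V[:, : m - order]                 # noise subspace
    k = np.arange(m)
    P = [1.0 / np.linalg.norm(En.conj().T @ np.exp(-2j * np.pi * f * dt * k)) ** 2
         for f in freqs]                   # peak where steering vector _|_ noise space
    return np.array(P)

# Toy check: two periods (3.33 yr and 1.24 yr) in 15 years of monthly data.
dt = 1 / 12.0
t = np.arange(0, 15, dt)
x = np.sin(2 * np.pi * t / 3.33) + 0.6 * np.sin(2 * np.pi * t / 1.24)
x += 0.3 * np.random.randn(t.size)
freqs = np.linspace(0.05, 2.0, 1000)       # cycles per year
P = music_spectrum(x, order=4, freqs=freqs, dt=dt)
print("top candidates [years]:", 1 / freqs[np.argsort(P)[-5:]])
```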

  15. Analysis of biomolecular solvation sites by 3D-RISM theory.

    PubMed

    Sindhikara, Daniel J; Hirata, Fumio

    2013-06-06

    We derive, implement, and apply equilibrium solvation site analysis for biomolecules. Our method utilizes 3D-RISM calculations to quickly obtain equilibrium solvent distributions without the need for simulation or the limits of solvent sampling. Our analysis of these distributions extracts the most likely poses of solvent as well as localized entropies, enthalpies, and solvation free energies. We demonstrate our method on a structure of HIV-1 protease, for which excellent structural and thermodynamic data are available for comparison. Our results, obtained within minutes, show systematic agreement with available experimental data. Further, our results are in good agreement with established simulation-based solvent analysis methods. This method can be used not only for visual analysis of active site solvation but also for virtual screening methods and experimental refinement.

  16. The SNPforID Assay as a Supplementary Method in Kinship and Trace Analysis

    PubMed Central

    Schwark, Thorsten; Meyer, Patrick; Harder, Melanie; Modrow, Jan-Hendrick; von Wurmb-Schwark, Nicole

    2012-01-01

    Objective Short tandem repeat (STR) analysis using commercial multiplex PCR kits is the method of choice for kinship testing and trace analysis. However, under certain circumstances (deficiency testing, mutations, minute DNA amounts), STRs alone may not suffice. Methods We present a 50-plex single nucleotide polymorphism (SNP) assay based on the SNPs chosen by the SNPforID consortium as an additional method for paternity and for trace analysis. The new assay was applied to selected routine paternity and trace cases from our laboratory. Results and Conclusions Our investigation shows that the new SNP multiplex assay is a valuable method to supplement STR analysis, and is a powerful means to solve complicated genetic analyses. PMID:22851934

  17. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  18. Sampling and analysis of hexavalent chromium during exposure to chromic acid mist and welding fumes.

    PubMed

    Blomquist, G; Nilsson, C A; Nygren, O

    1983-12-01

    In view of the serious health effects of hexavalent chromium, the problems involved in its sampling and analysis in workroom air have been the subject of much concern. In this paper, the stability problems arising from the reduction of hexavalent to trivalent chromium during sampling, sample storage, and analysis are discussed. Replacement of sulfuric acid by a sodium acetate buffer (pH 4) as a leaching solution prior to analysis with the diphenylcarbazide (DPC) method is suggested and is demonstrated to be necessary in order to avoid reduction. Field samples were taken from two different industrial processes: manual metal arc welding on stainless steel without shield gas, and chromium plating. A comparison was made of the DPC method, acidic dissolution with atomic absorption spectrophotometric (AAS) analysis, and the carbonate method. For chromic acid mist, the DPC method and AAS analysis were shown to give the same results. In the analysis of welding fumes, the modified DPC method gave the same results as the laborious and less sensitive carbonate method.

  19. Gait Analysis Using Wearable Sensors

    PubMed Central

    Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian

    2012-01-01

    Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and analysis methods, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications. PMID:22438763

  20. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three-dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three-dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three-dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbations of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  1. Comparative study of joint analysis of microarray gene expression data in survival prediction and risk assessment of breast cancer patients

    PubMed Central

    2016-01-01

    Abstract Microarray gene expression data sets are jointly analyzed to increase statistical power. They can either be merged together or analyzed by meta-analysis. For a given ensemble of data sets, it cannot be foreseen which of these paradigms, merging or meta-analysis, works better. In this article, three joint analysis methods, Z-score normalization, ComBat, and the inverse normal method (meta-analysis), were selected for survival prognosis and risk assessment of breast cancer patients. The methods were applied to eight microarray gene expression data sets, totaling 1324 patients with two clinical endpoints, overall survival and relapse-free survival. The performance of the joint analysis methods was evaluated using Cox regression for survival analysis, with independent validation used for bias estimation. Overall, Z-score normalization performed better than ComBat and meta-analysis. A higher area under the receiver operating characteristic curve and hazard ratio were also obtained when independent validation was used for bias estimation. With lower time and memory complexity, Z-score normalization is a simple method for the joint analysis of microarray gene expression data sets. The derived findings suggest further assessment of this method in future survival prediction and cancer classification applications. PMID:26504096
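
    Z-score normalization for joint analysis is simple to sketch: gene-wise centering and scaling within each dataset before concatenation (a generic implementation, not the study's exact pipeline).

```python
import numpy as np

def zscore_merge(datasets):
    """Merge expression matrices (genes x samples) after per-dataset z-scoring.

    Each gene is centered and scaled within its own dataset, which removes
    dataset-specific location/scale effects before the matrices are joined.
    """
    merged = []
    for X in datasets:
        mu = X.mean(axis=1, keepdims=True)
        sd = X.std(axis=1, ddof=1, keepdims=True)
        merged.append((X - mu) / sd)
    return np.hstack(merged)

# Toy example: two "studies" of the same 100 genes with different scales.
rng = np.random.default_rng(2)
A = 5 + 2 * rng.normal(size=(100, 30))
B = 8 + 4 * rng.normal(size=(100, 50))
Z = zscore_merge([A, B])
print(Z.shape, round(Z.mean(), 3), round(Z.std(), 3))   # (100, 80), ~0, ~1
```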

  2. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020

  3. Evaluation of the direct and diffusion methods for the determination of fluoride content in table salt

    PubMed Central

    Martínez-Mier, E. Angeles; Soto-Rojas, Armando E.; Buckley, Christine M.; Margineda, Jorge; Zero, Domenick T.

    2010-01-01

    Objective The aim of this study was to assess methods currently used for analyzing fluoridated salt in order to identify the most useful method for this type of analysis. Basic research design Seventy-five fluoridated salt samples were obtained. Samples were analyzed for fluoride content, with and without pretreatment, using direct and diffusion methods. Element analysis was also conducted in selected samples. Fluoride was added to ultra pure NaCl and non-fluoridated commercial salt samples and Ca and Mg were added to fluoride samples in order to assess fluoride recoveries using modifications to the methods. Results Larger amounts of fluoride were found and recovered using diffusion than direct methods (96%–100% for diffusion vs. 67%–90% for direct). Statistically significant differences were obtained between direct and diffusion methods using different ion strength adjusters. Pretreatment methods reduced the amount of recovered fluoride. Determination of fluoride content was influenced both by the presence of NaCl and other ions in the salt. Conclusion Direct and diffusion techniques for analysis of fluoridated salt are suitable methods for fluoride analysis. The choice of method should depend on the purpose of the analysis. PMID:20088217

  4. Uncovering the requirements of cognitive work.

    PubMed

    Roth, Emilie M

    2008-06-01

    In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).

  5. Rapid Method for Sodium Hydroxide Fusion of Concrete and ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete and brick samples Method Selected for: SAM lists this method for qualitative analysis of americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete or brick building materials. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.

  6. On the Analysis of Output Information of S-tree Method

    NASA Astrophysics Data System (ADS)

    Bekaryan, Karen M.; Melkonyan, Anahit A.

    2007-08-01

    One of the most popular and effective methods for analyzing the hierarchical structure of N-body gravitating systems is the method of S-tree diagrams. Apart from many interesting features, the method is unfortunately not free of disadvantages, the most important of which is the extreme complexity of analyzing the output information. A number of methods have been suggested to solve this problem. From our point of view, the most effective approach is to apply all these methods simultaneously. This yields a more complete and objective picture of the final distribution.

  7. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
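
    For the explicit part of such schemes, a stability limit can be estimated from the generalized eigenvalues of the conductivity and capacitance matrices; the sketch below uses made-up 1D conduction numbers and does not reproduce the paper's element-level estimate for linear quadrilaterals.

```python
import numpy as np
from scipy.linalg import eigh

def critical_dt(K, M):
    """Forward-Euler stability limit for M*dT/dt + K*T = f:  dt <= 2 / lambda_max,
    where lambda are the generalized eigenvalues of (K, M)."""
    lam = eigh(K, M, eigvals_only=True)
    return 2.0 / lam.max()

# 1D conduction toy model: n-node rod with lumped capacitance (illustrative numbers).
n, L, k, rho_c = 20, 1.0, 50.0, 4e6        # nodes, length [m], W/m/K, J/m^3/K
dx = L / (n - 1)
K = (k / dx) * (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
                - np.diag(np.ones(n - 1), -1))
M = rho_c * dx * np.eye(n)
print(f"critical explicit time step: {critical_dt(K, M):.3e} s")
```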

  8. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1983-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.

  9. Probabilistic structural analysis methods for space transportation propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.

    1991-01-01

    Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for second-stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.

  10. Application of the pulsed fast/thermal neutron method for soil elemental analysis

    USDA-ARS?s Scientific Manuscript database

    Soil science is a research field where physics concepts and experimental methods are widely used, particularly in agro-chemistry and soil elemental analysis. Different methods of analysis are currently available. The evolution of nuclear physics (methodology and instrumentation), combined with the ava...

  11. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  12. Utilization of nuclear methods for materials analysis and the determination of concentration gradients

    NASA Technical Reports Server (NTRS)

    Darras, R.

    1979-01-01

    The various types of nuclear chemical analysis methods are discussed. The possibilities of analysis through activation and through direct observation of nuclear reactions are described. Such methods make it possible to analyze trace elements and impurities with selectivity, accuracy, and a high degree of sensitivity. They are also used to measure major elements in materials that are available for analysis only in small quantities. These methods are well suited to surface analyses and to the determination of concentration gradients, provided the nature and energy of the incident particles are chosen judiciously. Typical examples for steels, pure iron, and refractory metals are illustrated.

  13. A Century of Enzyme Kinetic Analysis, 1913 to 2013

    PubMed Central

    Johnson, Kenneth A.

    2013-01-01

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
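
    Fitting a full progress curve by numerical integration of the rate equation, as the modern methods reviewed here do, can be sketched in a few lines; the synthetic data and parameter values are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def progress_curve(t, Vmax, Km, S0=10.0):
    """Substrate vs time by numerically integrating dS/dt = -Vmax*S/(Km+S)."""
    sol = solve_ivp(lambda _, S: -Vmax * S / (Km + S), (0, t[-1]), [S0],
                    t_eval=t, rtol=1e-8)
    return sol.y[0]

# Synthetic full-time-course data (true Vmax = 1.0, Km = 2.0; assumed units).
t = np.linspace(0, 30, 60)
S_obs = progress_curve(t, 1.0, 2.0) + 0.05 * np.random.randn(t.size)

# With p0 of length 2, only Vmax and Km are fitted; S0 keeps its default.
popt, pcov = curve_fit(progress_curve, t, S_obs, p0=[0.5, 1.0])
print("fitted Vmax, Km:", popt)
```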

  14. What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context.

    PubMed

    Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Real Bird, Sloane; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen

    2017-07-01

    Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis.

  15. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  16. Shape design sensitivity analysis and optimal design of structural systems

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.

    1987-01-01

    The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best utilize the basic character of the finite element method, which gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable with the method. Results of the design sensitivity analysis are used to carry out design optimization of a built-up structure.

  17. Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems

    DTIC Science & Technology

    1985-10-01

    Excerpt (table-of-contents fragments): statistical energy analysis, finite element models, transfer functions ... Procedures for the Modal Analysis Method (8-22); 8.4 Summary of the Procedures for the Statistical Energy Analysis Method. The remainder of the scanned excerpt is illegible.

  18. Influence of ECG sampling rate in fetal heart rate variability analysis.

    PubMed

    De Jonckheere, J; Garabedian, C; Charlier, P; Champion, C; Servan-Schreiber, E; Storme, L; Debarge, V; Jeanne, M; Logier, R

    2017-07-01

    Fetal hypoxia results in fetal blood acidosis (pH < 7.10). In such a situation, the fetus develops several adaptation mechanisms regulated by the autonomic nervous system. Many studies have demonstrated significant changes in heart rate variability in hypoxic fetuses, so fetal heart rate variability analysis could be of great help for fetal hypoxia prediction. Commonly used fetal heart rate variability analysis methods have been shown to be sensitive to the ECG signal sampling rate: a low sampling rate can induce variability in heart beat detection, which alters the heart rate variability estimation. In this paper, we introduce an original fetal heart rate variability analysis method. We hypothesize that this method is less sensitive to ECG sampling frequency changes than common heart rate variability analysis methods. We then compare the results of this new heart rate variability analysis method at two different sampling frequencies (250 Hz and 1000 Hz).
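
    The sampling-rate effect itself is easy to illustrate (this is not the authors' new method): quantizing R-peak times to the sampling grid perturbs the RR series and hence time-domain indices such as RMSSD. The simulated beat statistics are assumed values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated "true" fetal beat times: mean RR 430 ms, 5 ms beat-to-beat variability.
rr_true = 0.430 + 0.005 * rng.normal(size=600)
beats = np.cumsum(rr_true)                       # R-peak times [s]

def rmssd(beat_times):
    rr = np.diff(beat_times) * 1000.0            # RR intervals [ms]
    return np.sqrt(np.mean(np.diff(rr) ** 2))

for fs in (250, 1000):
    detected = np.round(beats * fs) / fs         # peaks snapped to the sampling grid
    print(f"fs = {fs:4d} Hz  ->  RMSSD = {rmssd(detected):.2f} ms")
```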

  19. A scoping review of spatial cluster analysis techniques for point-event data.

    PubMed

    Fritz, Charles E; Schuurman, Nadine; Robertson, Colin; Lear, Scott

    2013-05-01

    Spatial cluster analysis is a uniquely interdisciplinary endeavour, and so it is important to communicate and disseminate ideas, innovations, best practices, and challenges across practitioners, applied epidemiology researchers, and spatial statisticians. In this research we conducted a scoping review, systematically searching peer-reviewed journal databases for research that has employed spatial cluster analysis methods on individual-level, address location, or x and y coordinate derived data. To illustrate the thematic issues raised by our results, methods were tested using a dataset where known clusters existed. Point pattern methods, spatial clustering and cluster detection tests, and a locally weighted spatial regression model were most commonly used for individual-level, address location data (n = 29). The spatial scan statistic was the most popular method for address location data (n = 19). Six themes were identified relating to the application of spatial cluster analysis methods and subsequent analyses, which we recommend researchers consider: exploratory analysis, visualization, spatial resolution, aetiology, scale, and spatial weights. It is our intention that researchers seeking direction for using spatial cluster analysis methods consider the caveats and strengths of each approach, but also explore the numerous other methods available for this type of analysis. Applied spatial epidemiology researchers and practitioners should give special consideration to applying multiple tests to a dataset. Future research should focus on developing frameworks for selecting appropriate methods and the corresponding spatial weighting schemes.

  20. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    NASA Astrophysics Data System (ADS)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    The traditional spectral analysis method, which is based on a linear system assumption, introduces errors when calculating the fatigue damage of details in liquid cargo tanks, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for assessing the fatigue damage of details in liquid cargo tanks is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves at different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage results with those of the time domain method. The proposed spectral analysis method is more accurate for calculating the fatigue damage of details in ship liquid cargo tanks.
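
    For background, the classical narrowband (Rayleigh) damage estimate that spectral fatigue methods build on can be computed from the PSD moments; the PSD shape, S-N constants, and duration below are illustrative, and this is the traditional formula rather than the paper's improved method.

```python
import numpy as np
from scipy.special import gamma

def narrowband_damage(freqs, psd, C, m, T):
    """Narrowband (Rayleigh) fatigue damage over duration T from a stress PSD.

    S-N curve N = C * S^(-m) with S the stress range; Palmgren-Miner summation.
    """
    df = freqs[1] - freqs[0]                 # uniform frequency grid assumed
    m0 = np.sum(psd) * df                    # variance of the stress process
    m2 = np.sum(psd * freqs**2) * df
    nu0 = np.sqrt(m2 / m0)                   # zero up-crossing rate [Hz]
    # E[D] = nu0*T/C * (2*sqrt(2*m0))^m * Gamma(1 + m/2)
    return nu0 * T / C * (2.0 * np.sqrt(2.0 * m0)) ** m * gamma(1.0 + m / 2.0)

# Illustrative band-limited stress PSD around 0.1 Hz (wave-frequency range), MPa^2/Hz.
f = np.linspace(0.01, 1.0, 500)
psd = 1.0e4 * np.exp(-(((f - 0.1) / 0.03) ** 2))
print("damage in one year:", narrowband_damage(f, psd, C=1.0e12, m=3.0, T=3.15e7))
```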

  1. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  2. Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection

    PubMed Central

    Matisová, Eva; Hrouzková, Svetlana

    2012-01-01

    Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing. This also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Consequently, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric detection (MS) have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is put on the separation method, the mode of MS detection and ionization, and the obtained limits of detection and quantification. Analysis time is one of the most important aspects to consider in the choice of analytical methods for routine analysis; therefore, the benefits of the developed fast GC methods are important. PMID:23202677

  3. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.

  4. Application of several methods for determining transfer functions and frequency response of aircraft from flight data

    NASA Technical Reports Server (NTRS)

    Eggleston, John M; Mathews, Charles W

    1954-01-01

    In the process of analyzing the longitudinal frequency-response characteristics of aircraft, information on some of the methods of analysis has been obtained by the Langley Aeronautical Laboratory of the National Advisory Committee for Aeronautics. In the investigation of these methods, the practical applications and limitations were stressed. In general, the methods considered may be classed as: (1) analysis of sinusoidal response, (2) analysis of transient response as to harmonic content through determination of the Fourier integral by manual or machine methods, and (3) analysis of the transient through the use of least-squares solutions of the coefficients of an assumed equation for either the transient time response or frequency response (sometimes referred to as curve-fitting methods). (author)
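
    Method (2), the harmonic content of the transient, reduces to dividing the Fourier transforms of output and input records that start and end at rest. The sketch below uses an assumed second-order short-period model; the estimate is meaningful only below the first null of the input spectrum (about 1 Hz for this pulse).

```python
import numpy as np

def frf_from_transient(u, y, fs):
    """Frequency response estimate H(f) = FFT(y) / FFT(u) for transient records
    that start and end at rest (the 'harmonic content' approach)."""
    U, Y = np.fft.rfft(u), np.fft.rfft(y)
    f = np.fft.rfftfreq(len(u), 1.0 / fs)
    H = Y / np.where(np.abs(U) < 1e-12 * np.abs(U).max(), np.nan, U)
    return f, H

# Toy example: response of an assumed 2nd-order short-period model
# (wn = 3 rad/s, zeta = 0.5) to a 1-s elevator pulse, by discrete convolution.
fs, T = 100.0, 60.0
t = np.arange(0, T, 1.0 / fs)
u = np.where(t < 1.0, 1.0, 0.0)
wn, zeta = 3.0, 0.5
wd = wn * np.sqrt(1.0 - zeta**2)
h = wn**2 / wd * np.exp(-zeta * wn * t) * np.sin(wd * t)   # impulse response
y = np.convolve(u, h)[: t.size] / fs
f, H = frf_from_transient(u, y, fs)
print(abs(H[np.argmin(abs(f - 0.3))]))   # |H| near 0.3 Hz, ~1.15 for this model
```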

  5. Diagnosis of cystic fibrosis with chloride meter (Sherwood M926S chloride analyzer®) and sweat test analysis system (CFΔ collection system®) compared to the Gibson Cooke method.

    PubMed

    Emiralioğlu, Nagehan; Özçelik, Uğur; Yalçın, Ebru; Doğru, Deniz; Kiper, Nural

    2016-01-01

    The sweat test with the Gibson Cooke (GC) method is the diagnostic gold standard for cystic fibrosis (CF). Recently, alternative methods have been introduced to simplify both the collection and analysis of sweat samples. Our aim was to compare sweat chloride values obtained by the GC method with those from other sweat test methods in patients diagnosed with CF and in patients in whom CF had been ruled out. We wanted to determine whether the other sweat test methods could reliably identify patients with CF and differentiate them from healthy subjects. Chloride concentration was measured with the GC method, a chloride meter, and a sweat test analysis system; conductivity was also determined with the sweat test analysis system. Forty-eight patients with CF and 82 patients without CF underwent the sweat test, showing median sweat chloride values of 98.9 mEq/L with the GC method, 101 mmol/L with the chloride meter, and 87.8 mmol/L with the sweat test analysis system. In the non-CF group, median sweat chloride values were 16.8 mEq/L with the GC method, 10.5 mmol/L with the chloride meter, and 15.6 mmol/L with the sweat test analysis system. The median conductivity value was 107.3 mmol/L in the CF group and 32.1 mmol/L in the non-CF group. There was a statistically significant, strong positive correlation between the GC method and the other sweat test methods (r=0.85) across all subjects. Sweat chloride concentration and conductivity from the other sweat test methods correlate highly with the GC method. We think that the other sweat test instruments can be used as reliably as the classic GC method to diagnose or exclude CF.

  6. Coupled Electro-Magneto-Mechanical-Acoustic Analysis Method Developed by Using 2D Finite Element Method for Flat Panel Speaker Driven by Magnetostrictive-Material-Based Actuator

    NASA Astrophysics Data System (ADS)

    Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou

    In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by a flat panel speaker driven by a GMM actuator depends on the vibration of the flat panel, which results from the magnetostriction property of the GMM. In this case, to predict the sound pressure level (SPL) in the audio-frequency range, it is necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method developed using the finite element method (FEM) is presented. This analysis method is used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with measurements of a prototype speaker.

  7. Methods for Force Analysis of Overconstrained Parallel Mechanisms: A Review

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Lan; Xu, Yun-Dou; Yao, Jian-Tao; Zhao, Yong-Sheng

    2017-11-01

    The force analysis of overconstrained parallel mechanisms (PMs) is relatively complex and difficult, and the methods for it have long been a research hotspot. However, few works analyze the characteristics and application scopes of the various methods, which makes it difficult for researchers and engineers to master and adopt them properly. A review of the methods for force analysis of both passive and active overconstrained PMs is presented. The existing force analysis methods for these two kinds of overconstrained PMs are classified according to their main ideas. Each category is briefly demonstrated and evaluated with respect to such aspects as the amount of calculation, the comprehensiveness with which limb deformation is considered, and the existence of explicit expressions for the solutions, providing an important reference for researchers and engineers to quickly find a suitable method. The similarities and differences between the statically indeterminate problem of passive overconstrained PMs and that of active overconstrained PMs are discussed, and a universal method for these two kinds of overconstrained PMs is pointed out. The existing deficiencies and development directions of force analysis methods for overconstrained systems are indicated based on this overview.

  8. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  9. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  10. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CHEESES AND RELATED CHEESE PRODUCTS General Provisions § 133.5 Methods of analysis. Moisture...

  11. Chapter A5. Section 6.1.F. Wastewater, Pharmaceutical, and Antibiotic Compounds

    USGS Publications Warehouse

    Lewis, Michael Edward; Zaugg, Steven D.

    2003-01-01

    The USGS differentiates between samples collected for analysis of wastewater compounds and those collected for analysis of pharmaceutical and antibiotic compounds, based on the analytical schedule for the laboratory method. Currently, only the wastewater laboratory method for field-filtered samples (SH1433) is an approved, routine (production) method. (The unfiltered wastewater method LC 8033 also is available but requires a proposal for custom analysis.) At this time, analysis of samples for pharmaceutical and antibiotic compounds is confined to research studies and is available only on a custom basis.

  12. A New View of Earthquake Ground Motion Data: The Hilbert Spectral Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Norden; Busalacchi, Antonio J. (Technical Monitor)

    2000-01-01

    A brief description of the newly developed Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) method is given. The decomposition is adaptive and can be applied to both nonlinear and nonstationary data. An example of the method applied to a sample earthquake record is given. The results indicate that low-frequency components totally missed by Fourier analysis are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show that the new method offers much better temporal and frequency resolution.
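
    The HSA half of EMD/HSA can be sketched independently of the sifting step: each intrinsic mode function (IMF) is converted to instantaneous amplitude and frequency via the analytic signal. The chirp below is a stand-in IMF, not earthquake data.

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_spectrum_of_imf(imf, fs):
    """Instantaneous amplitude and frequency of one IMF (the HSA step of EMD/HSA).

    EMD itself (sifting the record into IMFs) is omitted here; this shows only
    how each IMF is converted to a time-frequency-amplitude triplet.
    """
    z = hilbert(imf)                              # analytic signal
    amp = np.abs(z)
    phase = np.unwrap(np.angle(z))
    freq = np.gradient(phase) * fs / (2 * np.pi)  # instantaneous frequency [Hz]
    return amp, freq

# Toy IMF: a chirp whose frequency drifts from 1 Hz to 3 Hz over 20 s.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
imf = np.sin(2 * np.pi * (1.0 * t + 0.05 * t**2))
amp, freq = hilbert_spectrum_of_imf(imf, fs)
print(freq[100], freq[-100])                      # ~1 Hz near start, ~3 Hz near end
```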

  13. Theoretical analysis of HVAC duct hanger systems

    NASA Technical Reports Server (NTRS)

    Miller, R. D.

    1987-01-01

    Several methods are presented which, together, may be used in the analysis of duct hanger systems over a wide range of frequencies. The finite element method (FEM) and component mode synthesis (CMS) method are used for low- to mid-frequency range computations and have been shown to yield reasonably close results. The statistical energy analysis (SEA) method yields predictions which agree with the CMS results for the 800 to 1000 Hz range provided that a sufficient number of modes participate. The CMS approach has been shown to yield valuable insight into the mid-frequency range of the analysis. It has been demonstrated that it is possible to conduct an analysis of a duct/hanger system in a cost-effective way for a wide frequency range, using several methods which overlap for several frequency bands.

  14. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
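
    As generic background on the sampling side (not the NESSUS implementation), here is a sketch of a plain Monte Carlo failure-probability estimate and a crude importance-sampling variant for an assumed strength-minus-stress limit state.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Hypothetical limit state: failure when strength - stress < 0 (illustrative MPa values).
mu = np.array([500.0, 350.0])     # means of (strength, stress)
sd = np.array([40.0, 30.0])

def g(x):
    return x[:, 0] - x[:, 1]

n = 100_000

# Plain Monte Carlo: true Pf is about Phi(-3) ~ 1.3e-3 for these numbers.
x = rng.normal(mu, sd, size=(n, 2))
pf_mc = np.mean(g(x) < 0)

# Importance sampling: draw around a point on the failure boundary (here 425, 425)
# and reweight each sample by the density ratio p(y)/q(y).
mu_is = np.array([425.0, 425.0])
y = rng.normal(mu_is, sd, size=(n, 2))
w = np.prod(norm.pdf(y, mu, sd) / norm.pdf(y, mu_is, sd), axis=1)
pf_is = np.mean((g(y) < 0) * w)
print(f"Pf MC = {pf_mc:.2e}, Pf IS = {pf_is:.2e}")
```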

  15. Multivariate analysis in thoracic research.

    PubMed

    Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego

    2015-03-01

    Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. Multivariate methods emerged to analyze large databases and increasingly complex data. Since modeling is the best way to represent knowledge of reality, multivariate statistical methods should be used. Multivariate methods are designed to analyze data sets simultaneously, i.e., to analyze different variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each one should be employed according to the type of variables to be analyzed: dependence, interdependence, and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause and effect relationships between variables; there is a wide range of analysis types that can be used.

  16. Regularized Generalized Canonical Correlation Analysis

    ERIC Educational Resources Information Center

    Tenenhaus, Arthur; Tenenhaus, Michel

    2011-01-01

    Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…

  17. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    NASA Technical Reports Server (NTRS)

    Graham, J.

    1993-01-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.
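
    One of the textbook checks that such a worksheet automates is the classical elastic buckling stress of an unstiffened cylinder under axial compression; the sketch below evaluates it for assumed illustrative dimensions.

      # Classical elastic buckling stress of an unstiffened cylinder under
      # axial compression. Dimensions and material values are illustrative.
      import math

      E = 71.0e9      # Young's modulus, Pa (aluminum alloy, assumed)
      nu = 0.33       # Poisson's ratio
      t = 2.0e-3      # wall thickness, m
      R = 1.5         # cylinder radius, m

      sigma_cr = E * t / (R * math.sqrt(3.0 * (1.0 - nu ** 2)))
      print(f"classical buckling stress = {sigma_cr / 1e6:.1f} MPa")
      # In practice a knockdown factor is applied for imperfection
      # sensitivity before comparing against the design stress.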

  18. Operation Guiding Light-Scientific Program and Field Plan. The Pilot Field Experiment for NORDA Project Chemical Dynamics in Ocean Frontal Areas

    DTIC Science & Technology

    1985-03-01

    distribution. Samples of suspended particulates will also be collected for later image and elemental analysis... Method of analysis for particle... will be flow injection analysis. This method will allow rapid, continuous analysis of seawater nutrients. Measurements will be made at one minute... 5 m intervals) as well as from the underway pumping system. Method of pigment analysis for porphyrin and carotenoid pigments will be separation by

  19. Methodology Series Module 10: Qualitative Health Research

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

    Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher. PMID:28794545

  20. Atomistic cluster alignment method for local order mining in liquids and glasses

    NASA Astrophysics Data System (ADS)

    Fang, X. W.; Wang, C. Z.; Yao, Y. X.; Ding, Z. J.; Ho, K. M.

    2010-11-01

    An atomistic cluster alignment method is developed to identify and characterize the local atomic structural order in liquids and glasses. With the “order mining” idea for structurally disordered systems, the method can detect the presence of any type of local order in the system and can quantify the structural similarity between a given set of templates and the aligned clusters in a systematic and unbiased manner. Moreover, population analysis can also be carried out for various types of clusters in the system. The advantages of the method in comparison with other previously developed analysis methods are illustrated by performing the structural analysis for four prototype systems (i.e., pure Al, pure Zr, Zr35Cu65, and Zr36Ni64). The results show that the cluster alignment method can identify various types of short-range orders (SROs) in these systems correctly while some of these SROs are difficult to capture by most of the currently available analysis methods (e.g., Voronoi tessellation method). Such a full three-dimensional atomistic analysis method is generic and can be applied to describe the magnitude and nature of noncrystalline ordering in many disordered systems.
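
    The geometric core of cluster-template alignment is rigid-body least-squares superposition; the sketch below shows a Kabsch-style alignment and RMSD score. It is a minimal sketch of that core step only; the published method adds template sets, permutation handling, and population statistics.

      # Rigid-body least-squares alignment (Kabsch algorithm) of an atomic
      # cluster against a template; data are synthetic.
      import numpy as np

      def kabsch_rmsd(P, Q):
          """Align P onto Q (both N x 3, same atom ordering); return RMSD."""
          P = P - P.mean(axis=0)                  # remove translation
          Q = Q - Q.mean(axis=0)
          U, S, Vt = np.linalg.svd(P.T @ Q)       # covariance H = P^T Q
          d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          return np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1)))

      rng = np.random.default_rng(0)
      template = rng.normal(size=(13, 3))         # e.g., a 13-atom motif
      a = 0.7                                     # arbitrary rotation angle
      Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])
      cluster = template @ Rz.T + 0.05 * rng.normal(size=(13, 3))
      print(f"RMSD after alignment: {kabsch_rmsd(cluster, template):.3f}")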

  1. Methodology Series Module 10: Qualitative Health Research.

    PubMed

    Setia, Maninder Singh

    2017-01-01

    Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher.

  2. A validated UHPLC-tandem mass spectrometry method for quantitative analysis of flavonolignans in milk thistle (Silybum marianum) extracts.

    PubMed

    Graf, Tyler N; Cech, Nadja B; Polyak, Stephen J; Oberlies, Nicholas H

    2016-07-15

    Validated methods are needed for the analysis of natural product secondary metabolites. These methods are particularly important to translate in vitro observations to in vivo studies. Herein, a method is reported for the analysis of the key secondary metabolites, a series of flavonolignans and a flavonoid, from an extract prepared from the seeds of milk thistle [Silybum marianum (L.) Gaertn. (Asteraceae)]. This report represents the first UHPLC-MS/MS method validated for quantitative analysis of these compounds. The method takes advantage of the excellent resolution achievable with UHPLC to provide a complete analysis in less than 7 min. The method is validated using both UV and MS detectors, making it applicable in laboratories with different types of analytical instrumentation available. Lower limits of quantitation achieved with this method range from 0.0400 μM to 0.160 μM with UV and from 0.0800 μM to 0.160 μM with MS. The new method is employed to evaluate variability in constituent composition in various commercial S. marianum extracts, and to show that storage of the milk thistle compounds in DMSO leads to degradation. Copyright © 2016 Elsevier B.V. All rights reserved.
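
    By way of illustration, detection and quantitation limits of the kind reported above are often estimated from calibration-curve residuals (ICH-style LOD = 3.3σ/S, LOQ = 10σ/S). The sketch below does this with invented calibration data, not the study's validation data.

      # ICH-style LOD/LOQ from a linear calibration curve; concentrations
      # and responses below are illustrative, not data from the study.
      import numpy as np

      conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])        # uM standards
      resp = np.array([1.1e4, 2.3e4, 4.4e4, 9.1e4, 1.79e5])  # detector signal

      slope, intercept = np.polyfit(conc, resp, 1)
      residuals = resp - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)    # residual SD (2 fitted parameters)

      print(f"LOD ~ {3.3 * sigma / slope:.4f} uM")
      print(f"LOQ ~ {10.0 * sigma / slope:.4f} uM")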

  3. Statistical Methods for the Analysis of Discrete Choice Experiments: A Report of the ISPOR Conjoint Analysis Good Research Practices Task Force.

    PubMed

    Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P

    2016-06-01

    Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
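
    A minimal sketch of the conditional logit fit described above, using simulated choice sets; the data shapes and attribute values are assumptions for illustration.

      # Conditional logit for a DCE: each choice set has J alternatives
      # described by P attributes; the respondent picks one. Data simulated.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import logsumexp

      rng = np.random.default_rng(1)
      n_sets, J, P = 200, 3, 4
      X = rng.normal(size=(n_sets, J, P))           # attribute levels
      beta_true = np.array([1.0, -0.5, 0.8, 0.2])
      util = X @ beta_true + rng.gumbel(size=(n_sets, J))
      y = util.argmax(axis=1)                       # chosen alternative

      def neg_log_lik(beta):
          v = X @ beta                              # utilities (n_sets, J)
          chosen = v[np.arange(n_sets), y]
          return -(chosen - logsumexp(v, axis=1)).sum()

      fit = minimize(neg_log_lik, np.zeros(P), method="BFGS")
      print("estimated part-worths:", np.round(fit.x, 2))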

  4. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed, and a retinal abnormality grading decision-making method was put forward and used in the analysis and evaluation of multiple OCT images. The detailed analysis process is shown for four retinal OCT images with different degrees of abnormality; the final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation with 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. The method obtains parameters and features associated with retinal morphology; quantitative analysis and evaluation of these features, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.

  5. Analysis of Urinary Metabolites of Nerve and Blister Chemical Warfare Agents

    DTIC Science & Technology

    2014-08-01

    of CWAs. The analysis methods use UHPLC-MS/MS in Multiple Reaction Monitoring (MRM) mode to enhance the selectivity and sensitivity of the method... Chromatography Mass Spectrometry LOD Limit of Detection LOQ Limit of Quantitation MRM Multiple Reaction Monitoring MSMS Tandem mass... urine [1]. Those analysis methods use UHPLC-MS/MS in Multiple Reaction Monitoring (MRM) mode to enhance the selectivity and sensitivity of the method

  6. Parameters Estimation For A Patellofemoral Joint Of A Human Knee Using A Vector Method

    NASA Astrophysics Data System (ADS)

    Ciszkiewicz, A.; Knapczyk, J.

    2015-08-01

    A position and displacement analysis of a spherical model of the human knee joint using the vector method is presented. Sensitivity analysis and parameter estimation were performed using an evolutionary algorithm. Computer simulations of the mechanism with the estimated parameters proved the effectiveness of the prepared software. The method itself can be useful for solving problems concerning displacement and load analysis in the knee joint.

  7. Variability of Currents in Great South Channel and Over Georges Bank: Observation and Modeling

    DTIC Science & Technology

    1992-06-01

    Rizzoli motivated me to study the driving mechanism of stratified tidal rectification using diagnostic analysis methods. Conversations with Glen... drifter trajectories in the 1988 and 1989 surveys give further encouragement that the analysis method yields an accurate picture of the nontidal flow... harmonic truncation method. Scaling analysis argues that this method is not appropriate for a step topography because it is valid only when the

  8. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
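
    For a flavor of what nonlinear dimension reduction buys, the sketch below contrasts linear PCA with kernel PCA on a circular data set. Note that this substitutes kernel PCA, a related nonlinear generalization, for the autoassociative neural-network NLPCA the paper actually uses.

      # Linear PCA vs. kernel PCA (standing in for neural-network NLPCA)
      # on a single oscillation traced out as a circle in two variables.
      import numpy as np
      from sklearn.decomposition import PCA, KernelPCA

      rng = np.random.default_rng(8)
      theta = rng.uniform(0, 2 * np.pi, size=500)
      X = np.column_stack([np.cos(theta), np.sin(theta)])
      X += 0.05 * rng.normal(size=X.shape)

      # Linear PCA scatters the one underlying phase variable into two modes:
      ratio = PCA(n_components=2).fit(X).explained_variance_ratio_
      print("linear PCA variance ratios:", np.round(ratio, 2))  # ~[0.5, 0.5]

      # A nonlinear method can summarize the circle in a single component:
      Z = KernelPCA(n_components=1, kernel="rbf", gamma=2.0).fit_transform(X)
      print("kernel PCA first-component variance:", float(Z.var()))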

  9. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338
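
    As a generic illustration of a noise-based threshold (the paper derives its own background-noise definition for PCR-MPS data), the sketch below sets an analytical threshold from simulated background read counts.

      # Generic noise-based analytical threshold: characterize read counts at
      # known noise positions, then threshold at mean + k*SD or a quantile.
      # This shows the general idea only, not the paper's specific definition.
      import numpy as np

      rng = np.random.default_rng(2)
      noise_counts = rng.poisson(lam=6.0, size=5000)  # simulated background

      k = 3.0
      at_sd = noise_counts.mean() + k * noise_counts.std(ddof=1)
      at_q = np.quantile(noise_counts, 0.999)

      print(f"mean + {k}*SD threshold: {at_sd:.1f} reads")
      print(f"99.9th percentile      : {at_q:.1f} reads")
      # Signals above the threshold are treated as true alleles; below, noise.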

  10. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.

  11. Symposium on Parallel Computational Methods for Large-scale Structural Analysis and Design, 2nd, Norfolk, VA, US

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)

    1993-01-01

    Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.

  12. Advancing our thinking in presence-only and used-available analysis.

    PubMed

    Warton, David; Aarts, Geert

    2013-11-01

    1. The problems of analysing used-available data and presence-only data are equivalent, and this paper uses this equivalence as a platform for exploring opportunities for advancing analysis methodology. 2. We suggest some potential methodological advances in used-available analysis, made possible via lessons learnt in the presence-only literature, for example, using modern methods to improve predictive performance. We also consider the converse - potential advances in presence-only analysis inspired by used-available methodology. 3. Notwithstanding these potential advances in methodology, perhaps a greater opportunity is in advancing our thinking about how to apply a given method to a particular data set. 4. It is shown by example that strikingly different results can be achieved for a single data set by applying a given method of analysis in different ways - hence having chosen a method of analysis, the next step of working out how to apply it is critical to performance. 5. We review some key issues to consider in deciding how to apply an analysis method: apply the method in a manner that reflects the study design; consider data properties; and use diagnostic tools to assess how reasonable a given analysis is for the data at hand. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.

  13. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    PubMed Central

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary: In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212

  14. Comparison of a clinical gait analysis method using videography and temporal-distance measures with 16-mm cinematography.

    PubMed

    Stuberg, W A; Colerick, V L; Blanke, D J; Bruce, W

    1988-08-01

    The purpose of this study was to compare a clinical gait analysis method using videography and temporal-distance measures with 16-mm cinematography in a gait analysis laboratory. Ten children with a diagnosis of cerebral palsy (mean age = 8.8 ± 2.7 years) and 9 healthy children (mean age = 8.9 ± 2.4 years) participated in the study. Stride length, walking velocity, and goniometric measurements of the hip, knee, and ankle were recorded using the two gait analysis methods. A multivariate analysis of variance was used to determine significant differences between the data collected using the two methods. Pearson product-moment correlation coefficients were determined to examine the relationship between the measurements recorded by the two methods. The consistency of performance of the subjects during walking was examined by intraclass correlation coefficients. No significant differences were found between the methods for the variables studied. Pearson product-moment correlation coefficients ranged from .79 to .95, and intraclass coefficients ranged from .89 to .97. The clinical gait analysis method was found to be a valid tool in comparison with 16-mm cinematography for the variables that were studied.
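
    The agreement statistics used in studies like this one are straightforward to compute; the sketch below evaluates a Pearson correlation and a two-way random-effects ICC(2,1) on simulated paired measurements.

      # Pearson r between two measurement methods and a two-way
      # random-effects ICC(2,1). Data are simulated, not from the study.
      import numpy as np

      rng = np.random.default_rng(3)
      true = rng.normal(50, 10, size=19)               # e.g., stride length, cm
      X = np.column_stack([true + rng.normal(0, 2, 19),    # method A
                           true + rng.normal(0, 2, 19)])   # method B
      n, k = X.shape

      r = np.corrcoef(X[:, 0], X[:, 1])[0, 1]

      grand = X.mean()
      row_m, col_m = X.mean(axis=1), X.mean(axis=0)
      msr = k * np.sum((row_m - grand) ** 2) / (n - 1)   # between subjects
      msc = n * np.sum((col_m - grand) ** 2) / (k - 1)   # between methods
      sse = np.sum((X - row_m[:, None] - col_m[None, :] + grand) ** 2)
      mse = sse / ((n - 1) * (k - 1))                    # residual

      icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
      print(f"Pearson r = {r:.3f}, ICC(2,1) = {icc21:.3f}")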

  15. Aerodynamic design optimization using sensitivity analysis and computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.

    1991-01-01

    A new and efficient method is presented for aerodynamic design optimization, which is based on a computational fluid dynamics (CFD)-sensitivity analysis algorithm. The method is applied to design a scramjet-afterbody configuration for an optimized axial thrust. The Euler equations are solved for the inviscid analysis of the flow, which in turn provides the objective function and the constraints. The CFD analysis is then coupled with the optimization procedure that uses a constrained minimization method. The sensitivity coefficients, i.e., gradients of the objective function and the constraints, needed for the optimization are obtained using a quasi-analytical method rather than the traditional brute-force method of finite difference approximations. During the one-dimensional search of the optimization procedure, an approximate flow analysis (predicted flow) based on a first-order Taylor series expansion is used to reduce the computational cost. Finally, the sensitivity of the optimum objective function to various design parameters, which are kept constant during the optimization, is computed to predict new optimum solutions. The flow analysis results for the demonstrative example are compared with experimental data. It is shown that the method is more efficient than traditional methods.
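
    The cost argument for quasi-analytical sensitivities can be seen in a toy example: central finite differences need two objective evaluations per design variable, which is prohibitive when each evaluation is a flow solution. The sketch below compares an analytic gradient with its finite-difference approximation on a stand-in objective.

      # Sensitivity coefficients two ways: analytic gradient vs. brute-force
      # central differences. The toy objective stands in for a flow solution.
      import numpy as np

      def f(x):                       # stand-in for an expensive flow solve
          return x[0] ** 2 * np.sin(x[1]) + 3.0 * x[1]

      def grad_analytic(x):
          return np.array([2 * x[0] * np.sin(x[1]),
                           x[0] ** 2 * np.cos(x[1]) + 3.0])

      def grad_fd(x, h=1e-6):
          g = np.zeros_like(x)
          for i in range(len(x)):
              e = np.zeros_like(x)
              e[i] = h
              g[i] = (f(x + e) - f(x - e)) / (2 * h)  # 2 evaluations per entry
          return g

      x0 = np.array([1.5, 0.7])
      print("analytic         :", grad_analytic(x0))
      print("finite difference:", grad_fd(x0))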

  16. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  17. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    PubMed

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis were 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flaps' viability.
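
    A minimal sketch of the Bland-Altman limits-of-agreement computation referenced above, on simulated rater data; values and sample size are illustrative only.

      # Bland-Altman agreement between two raters: bias = mean difference,
      # limits of agreement = bias +/- 1.96*SD. Values are simulated.
      import numpy as np

      rng = np.random.default_rng(4)
      viable = rng.uniform(40, 90, size=16)       # % viable tissue (true)
      rater1 = viable + rng.normal(0, 1.5, 16)
      rater2 = viable + rng.normal(0, 1.5, 16)

      diff = rater1 - rater2
      bias = diff.mean()
      sd = diff.std(ddof=1)
      print(f"bias = {bias:.2f} %, limits of agreement = "
            f"[{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}] %")
      # Systematic bias would show as a bias far from zero or as differences
      # that trend with the magnitude of the measurement.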

  18. Comparison of four extraction/methylation analytical methods to measure fatty acid composition by gas chromatography in meat.

    PubMed

    Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S

    2008-05-09

    Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (in situ or one-step method, saponification method, classic method and a combination of classic extraction and saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, showing higher variation than the former methods. The combination of extraction and methylation steps had high recovery values, but the precision, repeatability and reproducibility were not acceptable. Therefore, the saponification method would be more convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However, the classic method would be the method of choice for the determination of the different lipid classes.

  19. Implementing EPA Method 537

    EPA Science Inventory

    This presentation describes EPA Method 537 for the analysis of 14 perfluorinated alkyl acids in drinking water as well as the challenges associated with preparing a laboratory for analysis using Method 537.

  20. RESEARCH METHOD FOR SAMPLING AND ANALYSIS OF FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    NRMRL hosted a meeting on July 17-18, 2003 entitled, "Analytical Method for Bulk Analysis of Vermiculite." The purpose of this effort was to produce an interim research method for use by U.S. EPA's Office of Research and Development (ORD) for the analysis of bulk vermiculite for...

  1. Turbulence excited frequency domain damping measurement and truncation effects

    NASA Technical Reports Server (NTRS)

    Soovere, J.

    1976-01-01

    Existing frequency domain modal frequency and damping analysis methods are discussed. The effects of truncation in the Laplace and Fourier transform data analysis methods are described. Methods for eliminating truncation errors from measured damping are presented. Implications of truncation effects in fast Fourier transform analysis are discussed. Limited comparison with test data is presented.

  2. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  3. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  4. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  5. 21 CFR 163.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Methods of analysis. 163.5 Section 163.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in...

  6. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  7. Risk Analysis Methods for Deepwater Port Oil Transfer Systems

    DOT National Transportation Integrated Search

    1976-06-01

    This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...

  8. A simplified and efficient method for the analysis of fatty acid methyl esters suitable for large clinical studies.

    PubMed

    Masood, Athar; Stark, Ken D; Salem, Norman

    2005-10-01

    Conventional sample preparation for fatty acid analysis is a complicated, multiple-step process, and gas chromatography (GC) analysis alone can require >1 h per sample to resolve fatty acid methyl esters (FAMEs). Fast GC analysis was adapted to human plasma FAME analysis using a modified polyethylene glycol column with smaller internal diameters, thinner stationary phase films, increased carrier gas linear velocity, and faster temperature ramping. Our results indicated that fast GC analyses were comparable to conventional GC in peak resolution. A conventional transesterification method based on Lepage and Roy was simplified to a one-step method with the elimination of the neutralization and centrifugation steps. A robotics-amenable method was also developed, with lower methylation temperatures and in an open-tube format using multiple reagent additions. The simplified methods produced results that were quantitatively similar and with similar coefficients of variation as compared with the original Lepage and Roy method. The present streamlined methodology is suitable for the direct fatty acid analysis of human plasma, is appropriate for research studies, and will facilitate large clinical trials and make possible population studies.

  9. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.

  10. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and of how method performance risk can be assessed and managed. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  11. Heading in the right direction: thermodynamics-based network analysis and pathway engineering.

    PubMed

    Ataman, Meric; Hatzimanikatis, Vassily

    2015-12-01

    Thermodynamics-based network analysis through the introduction of thermodynamic constraints in metabolic models allows a deeper analysis of metabolism and guides pathway engineering. The number and the areas of applications of thermodynamics-based network analysis methods have been increasing in the last ten years. We review recent applications of these methods, identify the areas in which such analysis can contribute significantly, and outline the needs for future development. We find that organisms with multiple compartments and extremophiles present challenges for modeling and thermodynamics-based flux analysis. The evolution of current and new methods must also address the issues of multiple alternative flux directionalities and of the uncertainties and partial information from analytical methods. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
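
    The constraint at the heart of such methods is the sign of the reaction Gibbs energy; the sketch below evaluates ΔG = ΔG° + RT ln Q for an assumed reaction and concentrations to decide whether forward flux is thermodynamically feasible.

      # Thermodynamic feasibility check: Delta_G = Delta_G0 + R*T*ln(Q);
      # flux may proceed only in a direction with Delta_G < 0.
      # Standard energy and concentrations are illustrative assumptions.
      import math

      R = 8.314e-3     # kJ/(mol*K)
      T = 298.15       # K
      dG0 = -20.0      # kJ/mol, standard reaction energy (assumed)

      # Reaction A -> B with illustrative metabolite concentrations (molar):
      conc_A, conc_B = 1e-5, 1e-2
      Q = conc_B / conc_A

      dG = dG0 + R * T * math.log(Q)
      print(f"Delta_G = {dG:.1f} kJ/mol ->",
            "forward flux allowed" if dG < 0 else "forward flux infeasible")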

  12. Chosen aspects of multi-criteria analysis applied to support the choice of materials for building structures

    NASA Astrophysics Data System (ADS)

    Szafranko, E.

    2017-08-01

    When planning a building structure, dilemmas arise as to which construction and material solutions are feasible. The decisions are not always obvious. A procedure for selecting the variant that will best satisfy the expectations of the investor and future users of a structure must be founded on mathematical methods. The following deserve special attention: the MCE methods, hierarchical analysis methods, and weighting methods. Another interesting solution, particularly useful when dealing with evaluations which take into account negative values, is the Indicator Method. MCE methods are relatively popular owing to the simplicity of the calculations and the ease of interpreting the results. Once the input data have been prepared properly, these methods enable the user to compare the variants on the same level. In a situation where an analysis involves a large amount of data, it is more convenient to divide the data into groups according to main criteria and subcriteria. This option is provided by hierarchical analysis methods. They are based on ordered sets of criteria, which are evaluated in groups. In some cases, this approach yields results that are superior and easier to read. If an analysis encompasses direct and indirect effects, an Indicator Method seems to be a justified choice for selecting the right solution. The Indicator Method is different in character and relies on weights and assessments of effects. It allows the user to evaluate the analyzed variants effectively. This article explains the methodology of conducting a multi-criteria analysis, showing its advantages and disadvantages. A calculation example in the article shows what problems can be encountered when assessing various solutions regarding building materials and structures. For comparison, an analysis based on graphical methods developed by the author is presented.
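
    A minimal weighted-sum MCE sketch along the lines described above: normalize each criterion, invert cost-type criteria, and rank variants by weighted score. Criteria, weights, and values are invented for illustration.

      # Weighted-sum multi-criteria evaluation of material/structural variants.
      import numpy as np

      # rows: variants; columns: cost, durability, thermal transmittance
      data = np.array([[100.0, 7.0, 0.30],
                       [130.0, 9.0, 0.22],
                       [ 90.0, 5.0, 0.35]])
      benefit = np.array([False, True, False])  # higher-is-better flags
      weights = np.array([0.4, 0.35, 0.25])     # must sum to 1

      lo, hi = data.min(axis=0), data.max(axis=0)
      norm = (data - lo) / (hi - lo)            # min-max normalization
      norm[:, ~benefit] = 1.0 - norm[:, ~benefit]  # flip cost-type criteria

      scores = norm @ weights
      print("variant scores:", np.round(scores, 3))
      print("preferred variant:", int(scores.argmax()) + 1)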

  13. Bayesian dose-response analysis for epidemiological studies with complex uncertainty in dose estimation.

    PubMed

    Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L

    2016-02-10

    Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Paule‐Mandel estimators for network meta‐analysis with random inconsistency effects

    PubMed Central

    Veroniki, Areti Angeliki; Law, Martin; Tricco, Andrea C.; Baker, Rose

    2017-01-01

    Network meta-analysis is used to simultaneously compare multiple treatments in a single analysis. However, network meta-analyses may exhibit inconsistency, where direct and different forms of indirect evidence are not in agreement with each other, even after allowing for between-study heterogeneity. Models for network meta-analysis with random inconsistency effects have the dual aim of allowing for inconsistencies and estimating average treatment effects across the whole network. To date, two classical estimation methods for fitting this type of model have been developed: a method of moments that extends DerSimonian and Laird's univariate method, and maximum likelihood estimation. However, the Paule and Mandel estimator is another recommended classical estimation method for univariate meta-analysis. In this paper, we extend the Paule and Mandel method so that it can be used to fit models for network meta-analysis with random inconsistency effects. We apply all three estimation methods to a variety of examples that have been used previously, and we also examine a challenging new dataset that is highly heterogeneous. We perform a simulation study based on this new example. We find that the proposed Paule and Mandel method performs satisfactorily and generally better than the previously proposed method of moments because it provides more accurate inferences. Furthermore, the Paule and Mandel method possesses some advantages over likelihood-based methods because it is both semiparametric and requires no convergence diagnostics. Although restricted maximum likelihood estimation remains the gold standard, the proposed methodology is a fully viable alternative to this and other estimation methods. PMID:28585257
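
    For reference, the univariate Paule and Mandel building block solves for the between-study variance τ² that makes the generalized Q statistic equal its expectation; the sketch below implements that with bisection on invented study data. The network extension proposed in the paper is more involved.

      # Univariate Paule-Mandel estimator of tau^2: find tau^2 such that the
      # generalized Q statistic equals k - 1. Study data are invented.
      import numpy as np

      y = np.array([0.10, 0.55, -0.20, 0.60, 0.05])  # study effect estimates
      v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])   # within-study variances

      def q_gen(tau2):
          w = 1.0 / (v + tau2)
          mu = np.sum(w * y) / np.sum(w)
          return np.sum(w * (y - mu) ** 2)

      k = len(y)
      if q_gen(0.0) <= k - 1:
          tau2 = 0.0                     # Q already below expectation: truncate
      else:
          lo, hi = 0.0, 10.0
          for _ in range(100):           # bisection; q_gen is decreasing
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if q_gen(mid) > k - 1 else (lo, mid)
          tau2 = 0.5 * (lo + hi)
      print(f"Paule-Mandel tau^2 = {tau2:.4f}")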

  15. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series

    PubMed Central

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background: The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum, S(f) ∝ 1/f^β. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
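
    A naive baseline for the spectral index is to regress the log power spectrum on log frequency; the sketch below does this on a synthetic 1/f^β series. This is the simple periodogram approach, not the averaged wavelet coefficient method the study recommends.

      # Spectral-index estimate by fitting the slope of the log-log power
      # spectrum, S(f) ~ 1/f^beta. The series is synthesized by spectral
      # shaping of white noise (amplitude ~ f^(-beta/2)).
      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(5)
      beta_true, n = 1.0, 4096
      freqs = np.fft.rfftfreq(n, d=1.0)
      spec = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
      spec[1:] /= freqs[1:] ** (beta_true / 2.0)
      spec[0] = 0.0
      x = np.fft.irfft(spec, n=n)

      f, pxx = welch(x, nperseg=1024)
      mask = f > 0
      slope, _ = np.polyfit(np.log10(f[mask]), np.log10(pxx[mask]), 1)
      print(f"estimated beta = {-slope:.2f} (true {beta_true})")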

  16. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background: When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods: In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessing the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions: When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al. (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115
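
    The information loss from binning can be demonstrated directly: the sketch below estimates an exponential rate from simulated event times, once from the raw times (unbinned MLE) and once from a crude two-bin count resembling a 14-day window. The setup is illustrative, not a reconstruction of either study.

      # Unbinned vs. binned estimation of an exponential rate.
      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(6)
      lam_true = 1.0 / 20.0                        # events per day
      times = rng.exponential(1.0 / lam_true, size=300)

      lam_unbinned = 1.0 / times.mean()            # unbinned MLE, closed form

      edges = np.array([0.0, 14.0, 1e9])           # crude "within 14 days" bin
      counts, _ = np.histogram(times, bins=edges)

      def binned_nll(lam):                         # multinomial log-likelihood
          p = np.diff(1.0 - np.exp(-lam * edges))  # bin probabilities
          return -np.sum(counts * np.log(p))

      res = minimize_scalar(binned_nll, bounds=(1e-4, 1.0), method="bounded")
      print(f"unbinned MLE: {lam_unbinned:.4f}, binned MLE: {res.x:.4f}, "
            f"true: {lam_true:.4f}")
      # With only two bins, all within-bin timing information is discarded,
      # so the binned estimate is far noisier across repeated samples.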

  17. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. Semi-Loof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted using viscoplasticity theory with a mechanical subelement model.

  18. Cluster Correspondence Analysis.

    PubMed

    van de Velden, M; D'Enza, A Iodice; Palumbo, F

    2017-03-01

    A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.

  19. Evaluation of a cost-effective loads approach. [shock spectra/impedance method for Viking Orbiter

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads predictions is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost, a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  20. David W. Templeton | NREL

    Science.gov Websites

    and algal biomass analysis methods and applications of these methods to different processes. Templeton... internally funded research project to develop microalgal compositional analysis methods that included setting methods... Closing mass and component balances around pretreatment, saccharification, and fermentation unit

  1. Coding and Commonality Analysis: Non-ANOVA Methods for Analyzing Data from Experiments.

    ERIC Educational Resources Information Center

    Thompson, Bruce

    The advantages and disadvantages of three analytic methods used to analyze experimental data in educational research are discussed. The same hypothetical data set is used with all methods for a direct comparison. The Analysis of Variance (ANOVA) method and its several analogs are collectively labeled OVA methods and are evaluated. Regression…

  2. Differences Between Ward's and UPGMA Methods of Cluster Analysis: Implications for School Psychology.

    ERIC Educational Resources Information Center

    Hale, Robert L.; Dougherty, Donna

    1988-01-01

    Compared the efficacy of two methods of cluster analysis, the unweighted pair-groups method using arithmetic averages (UPGMA) and Ward's method, for students grouped on intelligence, achievement, and social adjustment by both clustering methods. Found UPGMA more efficacious based on output, on cophenetic correlation coefficients generated by each…
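
    For illustration, the sketch below runs both linkage rules on simulated score profiles and compares them by cophenetic correlation, the criterion the study used (scipy's "average" linkage is UPGMA).

      # Ward vs. UPGMA hierarchical clustering compared by cophenetic
      # correlation. Data are simulated three-group score profiles.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, cophenet, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(7)
      # Three latent groups in intelligence/achievement/adjustment-style scores.
      X = np.vstack([rng.normal(loc, 1.0, size=(20, 3))
                     for loc in (0.0, 3.0, 6.0)])
      d = pdist(X)

      for method in ("ward", "average"):           # "average" == UPGMA
          Z = linkage(X, method=method)
          c, _ = cophenet(Z, d)                    # cophenetic correlation
          labels = fcluster(Z, t=3, criterion="maxclust")
          print(f"{method:8s} cophenetic r = {c:.3f}, "
                f"cluster sizes = {np.bincount(labels)[1:]}")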

  3. Set of new draft methods for the analysis of organic disinfection by-products, including 551 and 552. Draft report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-01-01

    The set of documents discusses the new draft methods (EPA method 551, EPA method 552) for the analysis of disinfection byproducts contained in drinking water. The methods use the techniques of liquid/liquid extraction and gas chromatography with electron capture detection.

  4. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

    A novel probabilistic analysis method for assessing structural reliability is presented, which combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.

  5. How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods

    DTIC Science & Technology

    2007-08-01

    Attack Trees for Modeling and Analysis 10 2.8 Misuse and Abuse Cases 10 2.9 Formal Methods 11 2.9.1 Software Cost Reduction 12 2.9.2 Common...modern or efficient techniques. Requirements analysis typically is either not performed at all (identified requirements are directly specified without...any analysis or modeling) or analysis is restricted to functional requirements and ignores quality requirements, other nonfunctional requirements

  6. Steroid hormones in environmental matrices: extraction method comparison.

    PubMed

    Andaluri, Gangadhar; Suri, Rominder P S; Graham, Kendon

    2017-11-09

    The U.S. Environmental Protection Agency (EPA) has developed methods for the analysis of steroid hormones in water, soil, sediment, and municipal biosolids by HRGC/HRMS (EPA Method 1698). Following the guidelines provided in US-EPA Method 1698, the extraction methods were validated with reagent water and applied to municipal wastewater, surface water, and municipal biosolids using GC/MS/MS for the analysis of the nine most commonly detected steroid hormones. This is the first reported comparison of the separatory funnel extraction (SFE), continuous liquid-liquid extraction (CLLE), and Soxhlet extraction methods developed by the U.S. EPA. Furthermore, a solid phase extraction (SPE) method was also developed in-house for the extraction of steroid hormones from aquatic environmental samples. This study provides valuable information regarding the robustness of the different extraction methods. Statistical analysis of the data showed that SPE-based methods provided better recovery efficiencies and lower variability for the steroid hormones, followed by SFE. The analytical methods developed in-house for extraction of biosolids showed a wide recovery range; however, the variability was low (≤ 7% RSD). Soxhlet extraction and CLLE are lengthy procedures and have been shown to provide highly variable recovery efficiencies. The results of this study offer guidance for better sample preparation strategies in analytical methods for steroid hormone analysis, and SPE broadens the choice of methods for environmental sample analysis.

  7. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case-only designs (e.g., case-crossover, case-time control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.

  8. Buckling analysis and test correlation of hat stiffened panels for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Percy, Wendy C.; Fields, Roger A.

    1990-01-01

    The paper discusses the design, analysis, and test of hat stiffened panels subjected to a variety of thermal and mechanical load conditions. The panels were designed using data from structural optimization computer codes and finite element analysis. Test methods included the grid shadow moire method and a single gage force stiffness method. The agreement between the test data and analysis provides confidence in the methods that are currently being used to design structures for hypersonic vehicles. The agreement also indicates that post buckled strength may potentially be used to reduce the vehicle weight.

  9. Selecting risk factors: a comparison of discriminant analysis, logistic regression and Cox's regression model using data from the Tromsø Heart Study.

    PubMed

    Brenn, T; Arnesen, E

    1985-01-01

    For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods, which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis; otherwise Cox's method should be used.

  10. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
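
    A minimal sketch of the core idea, estimating both mediation paths by median regression so the indirect effect a*b tolerates heavy-tailed errors, might look as follows (assumes statsmodels; the data, coefficients, and sample size are invented for illustration).

    ```python
    # Median-regression mediation sketch: indirect effect = a * b.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    x = rng.normal(size=n)                      # predictor
    m = 0.5 * x + rng.standard_t(df=3, size=n)  # mediator, heavy-tailed errors
    y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

    # Path a: median regression of M on X.
    a = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5).params[1]
    # Path b: median regression of Y on M, controlling for X.
    b = sm.QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5).params[1]

    print("indirect (mediated) effect a*b ~", round(a * b, 3))   # ~0.2
    ```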

  11. Conducting qualitative research in mental health: Thematic and content analyses.

    PubMed

    Crowe, Marie; Inder, Maree; Porter, Richard

    2015-07-01

    The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  12. Application of Bounded Linear Stability Analysis Method for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The bounded linear stability analysis method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. Through the application of the bounded linear stability analysis method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Metrics-driven adaptive control is evaluated for a second-order system that represents the pitch attitude control of a generic transport aircraft. The analysis shows that the system with the metrics-conforming variable adaptive gain becomes more robust to unmodeled dynamics and time delay. The effect of the analysis time window for BLSA is also evaluated against the stability margin criteria.

  13. Global sensitivity analysis for urban water quality modelling: Terminology, convergence and comparison of different methods

    NASA Astrophysics Data System (ADS)

    Vanrolleghem, Peter A.; Mannina, Giorgio; Cosenza, Alida; Neumann, Marc B.

    2015-03-01

    Sensitivity analysis represents an important step in improving the understanding and use of environmental models. Indeed, by means of global sensitivity analysis (GSA), modellers may identify both important (factor prioritisation) and non-influential (factor fixing) model factors. No general rule has yet been defined for verifying the convergence of the GSA methods. In order to fill this gap, this paper presents a convergence analysis of three widely used GSA methods (SRC, Extended FAST and Morris screening) for an urban drainage stormwater quality-quantity model. After convergence was achieved, the results of each method were compared. In particular, a discussion on peculiarities, applicability, and reliability of the three methods is presented. Moreover, a graphical Venn-diagram-based classification scheme and a precise terminology for better identifying important, interacting and non-influential factors for each method are proposed. In terms of convergence, it was shown that sensitivity indices related to factors of the quantity model achieve convergence faster. Results for the Morris screening method deviated considerably from the other methods. Factors related to the quality model require a much higher number of simulations than suggested in the literature for achieving convergence with this method. In fact, the results have shown that the term "screening" is improperly used, as the method may exclude important factors from further analysis. Moreover, for the presented application the convergence analysis shows more stable sensitivity coefficients for the Extended-FAST method compared to SRC and Morris screening. Substantial agreement in terms of factor fixing was found between the Morris screening and Extended FAST methods. In general, the water quality related factors exhibited more important interactions than factors related to water quantity. Furthermore, in contrast to water quantity model outputs, water quality model outputs were found to be characterised by high non-linearity.
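
    As one concrete example of a method named above, Standardised Regression Coefficients (SRC) can be computed directly from a Monte Carlo sample: regress the model output on the factors and scale each slope by the factor-to-output standard-deviation ratio. The sketch below substitutes an invented linear function for the stormwater model.

    ```python
    # SRC sensitivity indices from a Monte Carlo sample (toy model).
    import numpy as np

    rng = np.random.default_rng(2)
    n = 2000
    X = rng.uniform(0.0, 1.0, size=(n, 3))                     # three model factors
    y = 4 * X[:, 0] + 2 * X[:, 1] + 0.1 * rng.normal(size=n)   # stand-in model output

    # Fit y = b0 + sum_i b_i * x_i, then scale slopes by sd(x_i)/sd(y).
    A = np.column_stack([np.ones(n), X])
    beta = np.linalg.lstsq(A, y, rcond=None)[0][1:]
    src = beta * X.std(axis=0) / y.std()
    print("SRC:", np.round(src, 3))   # third factor ~0: a "factor fixing" candidate
    ```

    For near-linear models the squared SRCs sum to roughly the regression R-squared, which is the usual check that the method is applicable at all.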

  14. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently one of the main tools for the large scale studies of schools is statistical analysis. Although it is the most common method and it offers greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  15. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    PubMed

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  16. Dual ant colony operational modal analysis parameter estimation method

    NASA Astrophysics Data System (ADS)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated by the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain while others operate in the frequency domain. The former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.

  17. Methods for the visualization and analysis of extracellular matrix protein structure and degradation.

    PubMed

    Leonard, Annemarie K; Loughran, Elizabeth A; Klymenko, Yuliya; Liu, Yueying; Kim, Oleg; Asem, Marwa; McAbee, Kevin; Ravosa, Matthew J; Stack, M Sharon

    2018-01-01

    This chapter highlights methods for visualization and analysis of extracellular matrix (ECM) proteins, with particular emphasis on collagen type I, the most abundant protein in mammals. Protocols described range from advanced imaging of complex in vivo matrices to simple biochemical analysis of individual ECM proteins. The first section of this chapter describes common methods to image ECM components and includes protocols for second harmonic generation, scanning electron microscopy, and several histological methods of ECM localization and degradation analysis, including immunohistochemistry, Trichrome staining, and in situ zymography. The second section of this chapter details both a common transwell invasion assay and a novel live imaging method to investigate cellular behavior with respect to collagen and other ECM proteins of interest. The final section consists of common electrophoresis-based biochemical methods that are used in analysis of ECM proteins. Use of the methods described herein will enable researchers to gain a greater understanding of the role of ECM structure and degradation in development and matrix-related diseases such as cancer and connective tissue disorders. © 2018 Elsevier Inc. All rights reserved.

  18. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  19. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344
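
    The reduced, feature-based representation described above can be illustrated with a deliberately tiny feature set; the actual work uses thousands of features, so this is only a schematic (pure NumPy, invented signals).

    ```python
    # Map each time series to a small feature vector for comparison.
    import numpy as np

    def features(x):
        z = (np.asarray(x, float) - np.mean(x)) / np.std(x)
        lag1 = np.corrcoef(z[:-1], z[1:])[0, 1]        # lag-1 autocorrelation
        p = np.abs(np.fft.rfft(z)) ** 2
        p /= p.sum()
        spec_ent = -(p * np.log(p + 1e-12)).sum()      # spectral entropy
        skew = (z ** 3).mean()                          # skewness
        return np.array([lag1, spec_ent, skew])

    rng = np.random.default_rng(3)
    noise = rng.normal(size=500)
    ar = np.zeros(500)
    for t in range(1, 500):                             # strongly autocorrelated series
        ar[t] = 0.9 * ar[t - 1] + rng.normal()
    print("white noise:", np.round(features(noise), 2))
    print("AR(1):     ", np.round(features(ar), 2))
    ```

    Once every series is a fixed-length vector, standard clustering, retrieval, and classification machinery applies, which is the organising trick of the paper.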

  20. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk-management problems is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on analysis of a multicriteria location problem.
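
    Decerns itself is not sketched here, but the simplest MCDA building block such tools generalise, a weighted sum over min-max normalised criteria, fits in a few lines (the options, criteria directions, and weights below are hypothetical).

    ```python
    # Weighted-sum MCDA over normalised criteria (toy decision table).
    import numpy as np

    # Rows: options; columns: criteria (cost, risk, benefit).
    scores = np.array([[100.0, 0.2, 7.0],
                       [ 80.0, 0.5, 9.0],
                       [120.0, 0.1, 6.0]])
    benefit = np.array([False, False, True])   # True: higher is better
    weights = np.array([0.4, 0.4, 0.2])

    # Min-max normalise each criterion to [0, 1], flipping cost-type criteria.
    lo, hi = scores.min(axis=0), scores.max(axis=0)
    norm = (scores - lo) / (hi - lo)
    norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

    totals = norm @ weights
    print("ranking, best first:", np.argsort(-totals))
    ```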

  1. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  2. What Touched Your Heart? Collaborative Story Analysis Emerging From an Apsáalooke Cultural Context

    PubMed Central

    Hallett, John; Held, Suzanne; McCormick, Alma Knows His Gun; Simonds, Vanessa; Bird, Sloane Real; Martin, Christine; Simpson, Colleen; Schure, Mark; Turnsplenty, Nicole; Trottier, Coleen

    2017-01-01

    Community-based participatory research and decolonizing research share some recommendations for best practices for conducting research. One commonality is partnering on all stages of research; co-developing methods of data analysis is one stage with a deficit of partnering examples. We present a novel community-based and developed method for analyzing qualitative data within an Indigenous health study and explain incompatibilities of existing methods for our purposes and community needs. We describe how we explored available literature, received counsel from community Elders and experts in the field, and collaboratively developed a data analysis method consonant with community values. The method of analysis, in which interview/story remained intact, team members received story, made meaning through discussion, and generated a conceptual framework to inform intervention development, is detailed. We offer the development process and method as an example for researchers working with communities who want to keep stories intact during qualitative data analysis. PMID:27659019

  3. Comparison of two occurrence risk assessment methods for collapse gully erosion ——A case study in Guangdong province

    NASA Astrophysics Data System (ADS)

    Sun, K.; Cheng, D. B.; He, J. J.; Zhao, Y. L.

    2018-02-01

    Collapse gully erosion is a specific type of soil erosion in the red soil region of southern China, and early warning and prevention of its occurrence are very important. Based on the idea of risk assessment, this research takes Guangdong province as an example and adopts information acquisition analysis and logistic regression analysis to discuss the feasibility of collapse gully erosion risk assessment at the regional scale, and to compare the applicability of the different risk assessment methods. The results show that in Guangdong province, the risk of collapse gully erosion occurrence is high in the northeastern and western areas, and relatively low in the southwestern and central parts. The comparative analysis of the different risk assessment methods also indicated that the risk distribution patterns from the different methods were basically consistent. However, the accuracy of the risk map from the information acquisition analysis method was slightly better than that from the logistic regression analysis method.

  4. Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camins, I.; Shinn, J.H.

    We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.

  5. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
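
    The Huber-weighting idea can be illustrated with statsmodels: fit a moderation (interaction) model by OLS and by a Huber M-estimator and compare the interaction coefficient under heavy-tailed errors. This is a stand-in for the flavour of the approach, not the authors' two-level algorithm or their R program.

    ```python
    # OLS vs. Huber M-estimation for a moderation model (toy data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 300
    x = rng.normal(size=n)
    z = rng.normal(size=n)                       # moderator
    y = 1 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.standard_t(df=2, size=n)

    X = sm.add_constant(np.column_stack([x, z, x * z]))
    ols = sm.OLS(y, X).fit()
    rob = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
    print("OLS   interaction: %.3f (se %.3f)" % (ols.params[3], ols.bse[3]))
    print("Huber interaction: %.3f (se %.3f)" % (rob.params[3], rob.bse[3]))
    ```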

  6. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  7. Measurement of edge residual stresses in glass by the phase-shifting method

    NASA Astrophysics Data System (ADS)

    Ajovalasit, A.; Petrucci, G.; Scafidi, M.

    2011-05-01

    Control and measurement of residual stress in glass is of great importance in the industrial field. Since glass is a birefringent material, the residual stress analysis is based mainly on the photoelastic method. This paper considers two methods of automated analysis of membrane residual stress in glass sheets, based on the phase-shifting concept in monochromatic light. In particular these methods are the automated versions of goniometric compensation methods of Tardy and Sénarmont. The proposed methods can effectively replace manual methods of compensation (goniometric compensation of Tardy and Sénarmont, Babinet and Babinet-Soleil compensators) provided by current standards on the analysis of residual stresses in glasses.

  8. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  9. METHOD OF CHEMICAL ANALYSIS FOR OIL SHALE WASTES

    EPA Science Inventory

    Several methods of chemical analysis are described for oil shale wastewaters and retort gases. These methods are designed to support the field testing of various pollution control systems. As such, emphasis has been placed on methods which are rapid and sufficiently rugged to per...

  10. The solution of linear systems of equations with a structural analysis code on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Overman, Andrea L.

    1988-01-01

    Two methods for solving linear systems of equations on the NAS Cray-2 are described. One is a direct method; the other is an iterative method. Both methods exploit the architecture of the Cray-2, particularly the vectorization, and are aimed at structural analysis applications. To demonstrate and evaluate the methods, they were installed in a finite element structural analysis code denoted the Computational Structural Mechanics (CSM) Testbed. A description of the techniques used to integrate the two solvers into the Testbed is given. Storage schemes, memory requirements, operation counts, and reformatting procedures are discussed. Finally, results from the new methods are compared with results from the initial Testbed sparse Choleski equation solver for three structural analysis problems. The new direct solvers described achieve the highest computational rates of the methods compared. The new iterative methods are not able to achieve as high computation rates as the vectorized direct solvers but are best for well conditioned problems which require fewer iterations to converge to the solution.
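
    The direct-versus-iterative distinction can be miniaturised with SciPy: a Cholesky factor-and-solve against conjugate gradients on a symmetric positive-definite system standing in for a stiffness matrix (the Cray-2-specific vectorisation is, of course, not reproduced).

    ```python
    # Direct (Cholesky) vs. iterative (conjugate gradient) solves of K u = f.
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve
    from scipy.sparse.linalg import cg

    rng = np.random.default_rng(5)
    n = 200
    A = rng.normal(size=(n, n))
    K = A @ A.T + n * np.eye(n)     # SPD "stiffness" matrix, well conditioned
    f = rng.normal(size=n)          # load vector

    u_direct = cho_solve(cho_factor(K), f)
    u_iter, info = cg(K, f)         # converges quickly on well-conditioned K
    print(info, np.linalg.norm(u_direct - u_iter))
    ```

    The trade-off noted in the abstract shows up directly: the factorization cost is fixed, while the iterative solver's cost scales with the iteration count and hence with conditioning.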

  11. An Excel‐based implementation of the spectral method of action potential alternans analysis

    PubMed Central

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
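
    The spectral method itself is compact: Fourier-transform a beat-indexed measurement series and compare the power at 0.5 cycles/beat against a reference noise band. A NumPy sketch on a synthetic action-potential-duration series follows (the band edges and k-score threshold are common conventions, not taken from this paper).

    ```python
    # Spectral alternans detection on a toy beat-to-beat APD series.
    import numpy as np

    rng = np.random.default_rng(6)
    n_beats = 128
    apd = 200 + 2.0 * (-1) ** np.arange(n_beats) + rng.normal(0, 1, n_beats)

    spec = np.abs(np.fft.rfft(apd - apd.mean())) ** 2 / n_beats
    freqs = np.fft.rfftfreq(n_beats, d=1.0)        # cycles per beat
    alt_power = spec[-1]                           # bin at exactly 0.5 cycles/beat
    noise = spec[(freqs > 0.33) & (freqs < 0.48)]  # reference noise band
    k_score = (alt_power - noise.mean()) / noise.std()
    print("alternans k-score:", round(k_score, 1))  # k > 3 is a common cutoff
    ```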

  12. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, e.g., by machine learning methods. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Method for combined biometric and chemical analysis of human fingerprints.

    PubMed

    Staymates, Jessica L; Orandi, Shahram; Staymates, Matthew E; Gillen, Greg

    This paper describes a method for combining direct chemical analysis of latent fingerprints with subsequent biometric analysis within a single sample. The method described here uses ion mobility spectrometry (IMS) as a chemical detection method for explosives and narcotics trace contamination. A collection swab coated with a high-temperature adhesive has been developed to lift latent fingerprints from various surfaces. The swab is then directly inserted into an IMS instrument for a quick chemical analysis. After the IMS analysis, the lifted print remains intact for subsequent biometric scanning and analysis using matching algorithms. Several samples of explosive-laden fingerprints were successfully lifted and the explosives detected with IMS. Following explosive detection, the lifted fingerprints remained of sufficient quality for positive match scores using a prepared gallery consisting of 60 fingerprints. Based on our results (n = 1200), there was no significant decrease in the quality of the lifted print post IMS analysis. In fact, for a small subset of lifted prints, the quality was improved after IMS analysis. The described method can be readily applied to domestic criminal investigations, transportation security, terrorist and bombing threats, and military in-theatre settings.

  14. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal

    PubMed Central

    Mohapatra, Biswajit

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis. PMID:29854361
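
    Of the measures listed, sample entropy is the easiest to state concretely: count template matches of length m and m+1 within tolerance r, then take the negative log of their ratio. The sketch below is a simplified O(N²)-memory version for short signals, not a production implementation.

    ```python
    # Simplified sample entropy (SampEn) for short signals.
    import numpy as np

    def sampen(x, m=2, r=0.2):
        x = np.asarray(x, float)
        tol = r * x.std()
        def matches(mm):
            # All templates of length mm, compared pairwise (Chebyshev distance).
            tpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=-1)
            iu = np.triu_indices(len(tpl), k=1)    # distinct pairs only
            return np.sum(d[iu] <= tol)
        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(7)
    white = rng.normal(size=400)
    sine = np.sin(np.linspace(0, 40 * np.pi, 400))
    print("white noise SampEn:", round(sampen(white), 2))  # high: irregular
    print("sine SampEn:      ", round(sampen(sine), 2))    # low: predictable
    ```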

  15. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is concerned with those nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity is included, and applications presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data range from good to excellent.

  16. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal.

    PubMed

    Nayak, Suraj K; Bit, Arindam; Dey, Anilesh; Mohapatra, Biswajit; Pal, Kunal

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis.

  17. Weighted analysis methods for mapped plot forest inventory data: Tables, regressions, maps and graphs

    Treesearch

    Paul C. Van Deusen; Linda S. Heath

    2010-01-01

    Weighted estimation methods for analysis of mapped plot forest inventory data are discussed. The appropriate weighting scheme can vary depending on the type of analysis and graphical display. Both statistical issues and user expectations need to be considered in these methods. A weighting scheme is proposed that balances statistical considerations and the logical...

  18. Decision Making Methods in Space Economics and Systems Engineering

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).
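
    Two of the listed methods, Present Value and IRR, reduce to a few lines; the sketch below uses an invented cash-flow profile and finds the IRR by bisection on the NPV function.

    ```python
    # Net present value and internal rate of return (toy cash flows).
    def npv(rate, cashflows):
        """cashflows[0] occurs now, cashflows[t] after t periods."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
        """Bisection on npv(rate) = 0; assumes one sign change on [lo, hi]."""
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    flows = [-1000, 300, 420, 680]   # hypothetical up-front cost, later benefits
    print("NPV at 10%:", round(npv(0.10, flows), 2))
    print("IRR:", round(irr(flows), 4))
    ```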

  19. 'Dilute-and-shoot' triple parallel mass spectrometry method for analysis of vitamin D and triacylglycerols in dietary supplements

    USDA-ARS?s Scientific Manuscript database

    A method is demonstrated for analysis of vitamin D-fortified dietary supplements that eliminates virtually all chemical pretreatment prior to analysis, and is referred to as a ‘dilute and shoot’ method. Three mass spectrometers, in parallel, plus a UV detector, an evaporative light scattering detec...

  20. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
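
    Among these retention rules, Horn's parallel analysis is simple to sketch: retain factors whose observed eigenvalues exceed the corresponding percentile of eigenvalues obtained from random data of the same size (toy two-factor data below; the 95th percentile is one common choice).

    ```python
    # Parallel analysis: observed eigenvalues vs. random-data eigenvalues.
    import numpy as np

    rng = np.random.default_rng(8)
    n, p = 300, 10
    F = rng.normal(size=(n, 2))                        # two latent factors
    X = F @ rng.normal(size=(2, p)) + rng.normal(size=(n, p))

    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sims = np.array([
        np.sort(np.linalg.eigvalsh(
            np.corrcoef(rng.normal(size=(n, p)), rowvar=False)))[::-1]
        for _ in range(200)])
    thresh = np.percentile(sims, 95, axis=0)

    print("retain", int(np.sum(obs > thresh)), "factors")   # expect 2
    ```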

  1. Benefit Analysis of Proposed Information Systems

    DTIC Science & Technology

    1991-03-01

    be evaluated in such an analysis. Different benefit comparison and user satisfaction methods are reviewed for their particular advantages and disadvantages. A discussion is given on...determine the alternative that is the most advantageous to the government. Secondly, which benefit analysis methods are capable of calculating a

  2. Climate Change Discourse in Mass Media: Application of Computer-Assisted Content Analysis

    ERIC Educational Resources Information Center

    Kirilenko, Andrei P.; Stepchenkova, Svetlana O.

    2012-01-01

    Content analysis of mass media publications has become a major scientific method used to analyze public discourse on climate change. We propose a computer-assisted content analysis method to extract prevalent themes and analyze discourse changes over an extended period in an objective and quantifiable manner. The method includes the following: (1)…

  3. Development of the mathematical model for design and verification of acoustic modal analysis methods

    NASA Astrophysics Data System (ADS)

    Siner, Alexander; Startseva, Maria

    2016-10-01

    To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analysing the sound field generated by the blade machinery. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. This work presents a model that allows single modes to be set in the channel and the generated sound field to be analysed. A modal analysis of the sound generated by a ring array of point sound sources is performed, and experimental and numerical modal analysis results are compared.
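
    The core decomposition for a ring array can be written directly: project the complex microphone pressures onto azimuthal harmonics, a_m = (1/N) Σ_n p_n e^(−i m φ_n). A toy example with a single spinning mode (all parameters invented):

    ```python
    # Azimuthal mode decomposition for a ring array of N sensors.
    import numpy as np

    N = 16                                  # microphones evenly spaced on the ring
    phi = 2 * np.pi * np.arange(N) / N
    m_true = 3                              # spinning mode set in the duct
    p = np.exp(1j * m_true * phi)           # complex pressures at one frequency

    # Resolvable orders are limited by the sensor count: |m| <= N/2 - 1.
    modes = np.arange(-N // 2 + 1, N // 2)
    a = np.array([(p * np.exp(-1j * m * phi)).mean() for m in modes])
    print("dominant mode:", modes[np.argmax(np.abs(a))])   # -> 3
    ```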

  4. Portfolio Analysis for Vector Calculus

    ERIC Educational Resources Information Center

    Kaplan, Samuel R.

    2015-01-01

    Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…
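
    The classic computation such a lesson builds on can be shown in full: minimising portfolio variance w'Σw subject to the budget constraint Σᵢwᵢ = 1 with a Lagrange multiplier gives weights proportional to Σ⁻¹1 (the covariance matrix below is invented).

    ```python
    # Minimum-variance portfolio from the Lagrange-multiplier condition.
    import numpy as np

    Sigma = np.array([[0.10, 0.02, 0.04],
                      [0.02, 0.08, 0.01],
                      [0.04, 0.01, 0.12]])   # hypothetical return covariances
    ones = np.ones(3)

    # Stationarity of L = w'Σw - λ(w'1 - 1) gives Σw ∝ 1, i.e. w ∝ Σ⁻¹1.
    w = np.linalg.solve(Sigma, ones)
    w /= w.sum()
    print("weights:", np.round(w, 3), " variance:", round(w @ Sigma @ w, 4))
    ```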

  5. HDBStat!: a platform-independent software suite for statistical analysis of high dimensional biology data.

    PubMed

    Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B

    2005-04-06

    Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an HTML report summarizing the data analysis. HDBStat! is platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.

  6. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis

    PubMed Central

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107

  7. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
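
    The contrast can be made concrete with a toy exponential model of inter-event times: the unbinned likelihood has a closed-form maximiser that uses every observation, whereas a single 14-day cut collapses each observation to one bit. This sketch is schematic, not a re-analysis of either paper's data.

    ```python
    # Unbinned MLE vs. a coarse binned summary (toy inter-event times).
    import numpy as np

    rng = np.random.default_rng(9)
    times = rng.exponential(scale=30.0, size=500)   # days between events

    # Unbinned exponential likelihood: maximising sum(log(lam) - lam * t)
    # gives the closed form lam_hat = 1 / mean(t), using every data point.
    lam_hat = 1.0 / times.mean()
    print("unbinned MLE rate:", round(lam_hat, 4))

    # Binned alternative: a yes/no cut at 14 days keeps only one bit per
    # observation and discards the shape information the MLE exploits.
    print("fraction within 14 days:", round(float(np.mean(times <= 14.0)), 3))
    ```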

  8. The power-proportion method for intracranial volume correction in volumetric imaging analysis.

    PubMed

    Liu, Dawei; Johnson, Hans J; Long, Jeffrey D; Magnotta, Vincent A; Paulsen, Jane S

    2014-01-01

    In volumetric brain imaging analysis, volumes of brain structures are typically assumed to be proportional or linearly related to intracranial volume (ICV). However, evidence abounds that many brain structures have power law relationships with ICV. To take this relationship into account in volumetric imaging analysis, we propose a power law based method-the power-proportion method-for ICV correction. The performance of the new method is demonstrated using data from the PREDICT-HD study.
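
    The method reduces to a log-log regression: estimate the exponent b in V ∝ ICV^b, then normalise each volume by ICV^b rather than by ICV itself. A sketch with simulated data (the exponent 0.8 and all other numbers are invented):

    ```python
    # Power-proportion ICV correction via log-log regression.
    import numpy as np

    rng = np.random.default_rng(10)
    icv = rng.normal(1500.0, 150.0, size=400)                     # cm^3
    vol = 0.004 * icv ** 0.8 * np.exp(rng.normal(0, 0.05, 400))   # structure volume

    b, log_c = np.polyfit(np.log(icv), np.log(vol), 1)   # slope = exponent b
    corrected = vol / icv ** b                           # power-proportion correction
    print("estimated exponent b:", round(b, 2))          # ~0.8, i.e. not 1.0
    ```

    Dividing by ICV itself (b = 1) would over-correct any structure whose true exponent is below one, which is the motivation for estimating b from the data.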

  9. Spectral and correlation analysis with applications to middle-atmosphere radars

    NASA Technical Reports Server (NTRS)

    Rastogi, Prabhat K.

    1989-01-01

    The correlation and spectral analysis methods for uniformly sampled stationary random signals, the estimation of their spectral moments, and problems arising due to nonstationarity are reviewed. Some of these methods are already in routine use in atmospheric radar experiments. Other methods based on the maximum entropy principle and time series models have been used in analyzing data, but are just beginning to receive attention in the analysis of radar signals. These methods are also briefly discussed.

  10. Research on the raw data processing method of the hydropower construction project

    NASA Astrophysics Data System (ADS)

    Tian, Zhichao

    2018-01-01

    In this paper, based on the characteristics of fixed quota data, various mathematical-statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data, screening out observations for which standard processing methods are not suitable. It is shown that this method can be applied to the processing of fixed raw data. The paper provides a reference for reasonably determining effective quota analysis data.
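
    One pass of the standard Grubbs criterion can be sketched as follows, using the usual t-distribution critical value (assumes SciPy; the "improved" variant used in the paper is not specified, so this shows only the baseline test).

    ```python
    # Single-pass Grubbs outlier test at significance level alpha.
    import numpy as np
    from scipy import stats

    def grubbs_outlier(x, alpha=0.05):
        x = np.asarray(x, float)
        n = len(x)
        idx = np.argmax(np.abs(x - x.mean()))
        G = np.abs(x[idx] - x.mean()) / x.std(ddof=1)
        t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
        G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
        return (idx, x[idx]) if G > G_crit else None

    data = [9.8, 10.1, 10.0, 9.9, 10.2, 14.7, 10.0]
    print(grubbs_outlier(data))   # flags the value 14.7
    ```

    In practice the test is applied repeatedly, removing one flagged point per pass, until no observation exceeds the critical value.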

  11. Analysis of Biomass Sugars Using a Novel HPLC Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agblevor, F. A.; Hames, B. R.; Schell, D.

    The precise quantitative analysis of biomass sugars is a very important step in the conversion of biomass feedstocks to fuels and chemicals. However, the most accurate method of biomass sugar analysis is based on the gas chromatography analysis of derivatized sugars either as alditol acetates or trimethylsilanes. The derivatization method is time consuming but the alternative high-performance liquid chromatography (HPLC) method cannot resolve most sugars found in biomass hydrolysates. We have demonstrated for the first time that by careful manipulation of the HPLC mobile phase, biomass monomeric sugars (arabinose, xylose, fructose, glucose, mannose, and galactose) can be analyzed quantitatively and there is excellent baseline resolution of all the sugars. This method was demonstrated for standard sugars, pretreated corn stover liquid and solid fractions. Our method can also be used to analyze dimeric sugars (cellobiose and sucrose).

  12. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

    The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil-gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the use of cross-references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  13. Identification of suitable genes contributes to lung adenocarcinoma clustering by multiple meta-analysis methods.

    PubMed

    Yang, Ze-Hui; Zheng, Rui; Gao, Yuan; Zhang, Qiang

    2016-09-01

    With the widespread application of high-throughput technology, numerous meta-analysis methods have been proposed for differential expression profiling across multiple studies. We identified the suitable differentially expressed (DE) genes that contributed to lung adenocarcinoma (ADC) clustering based on seven popular meta-analysis methods. Seven microarray expression profiles of ADC and normal controls were extracted from the ArrayExpress database. Bioconductor was used to perform the preliminary data preprocessing. Then, DE genes across multiple studies were identified. Hierarchical clustering was applied to compare the classification performance for microarray data samples. The classification efficiency was compared based on accuracy, sensitivity and specificity. Across seven datasets, 573 ADC cases and 222 normal controls were collected. After filtering out unexpressed and noninformative genes, 3688 genes remained for further analysis. The classification efficiency analysis showed that DE genes identified by the sum-of-ranks method separated ADC from normal controls with the best accuracy, sensitivity and specificity of 0.953, 0.969 and 0.932, respectively. The gene set with the highest classification accuracy mainly participated in the regulation of response to external stimulus (P = 7.97E-04), cyclic nucleotide-mediated signaling (P = 0.01), regulation of cell morphogenesis (P = 0.01) and regulation of cell proliferation (P = 0.01). Evaluating the classification efficiency of DE genes identified by different meta-analysis methods provides a new perspective on the choice of a suitable method for a given application. Different meta-analysis methods present different strengths, so they should be weighed together when selecting a method for a particular study. © 2015 John Wiley & Sons Ltd.
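
    A minimal sketch of the sum-of-ranks combination that performed best above: rank genes within each study by a differential-expression statistic, sum the ranks across studies, and keep the smallest rank sums. The statistics here are simulated placeholders, not the paper's data.

      # Sum-of-ranks combination of DE evidence across studies (illustrative).
      import numpy as np

      rng = np.random.default_rng(0)
      n_genes, n_studies = 3688, 7
      # Hypothetical |t|-statistics per gene per study (larger = more DE)
      stats_matrix = rng.gamma(2.0, 1.0, size=(n_genes, n_studies))

      # Rank within each study (rank 1 = most differentially expressed)
      ranks = (-stats_matrix).argsort(axis=0).argsort(axis=0) + 1
      rank_sum = ranks.sum(axis=1)
      top_genes = np.argsort(rank_sum)[:100]   # candidate DE gene set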

  14. Applications of modern statistical methods to analysis of data in physical science

    NASA Astrophysics Data System (ADS)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
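
    As a rough illustration of the genetic-algorithm clustering idea described above, the sketch below evolves a population of candidate centroid sets by selection and mutation, reducing the dependence on any single initial seed. The fitness function and operators are simplified assumptions, not the dissertation's algorithms.

      # Toy genetic-algorithm search over K-means centroid sets.
      import numpy as np

      def inertia(X, centers):
          """Sum of squared distances from each point to its nearest centroid."""
          d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
          return d.min(axis=1).sum()

      def genetic_kmeans(X, k=3, pop=20, gens=50, seed=0):
          rng = np.random.default_rng(seed)
          n = len(X)
          # Population: each individual is a set of k centroids sampled from X
          population = [X[rng.choice(n, k, replace=False)] for _ in range(pop)]
          for _ in range(gens):
              fitness = np.array([inertia(X, c) for c in population])
              order = np.argsort(fitness)                 # lower inertia = fitter
              survivors = [population[i] for i in order[: pop // 2]]
              # Mutation: perturb each survivor to produce a child
              children = [c + rng.normal(0, 0.05, c.shape) for c in survivors]
              population = survivors + children
          return min(population, key=lambda c: inertia(X, c))

      X = np.random.default_rng(1).normal(size=(300, 2))
      X[:150] += 3.0                      # two loose synthetic groups
      centers = genetic_kmeans(X, k=2)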

  15. Mutual Coupling Analysis for Conformal Microstrip Antennas.

    DTIC Science & Technology

    1984-12-01

    ... the infinite integral is terminated at k = 150 k0. ... In the mutual coupling analysis section, the moment method ... provides an attractive alternative to the Green's function method on which the analysis in later sections is based. ... In the analysis by the moment method, the chosen set of expansion dipole modes plays a very important role; the efficiency as well as the accuracy of the analysis depend on it.

  16. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

  17. Test/semi-empirical analysis of a carbon/epoxy fabric stiffened panel

    NASA Technical Reports Server (NTRS)

    Spier, E. E.; Anderson, J. A.

    1990-01-01

    The purpose of this work-in-progress is to present a semi-empirical analysis method developed to predict the buckling and crippling loads of carbon/epoxy fabric blade stiffened panels in compression. This is a hand analysis method comprised of well known, accepted techniques, logical engineering judgements, and experimental data that results in conservative solutions. In order to verify this method, a stiffened panel was fabricated and tested. Both the test and analysis results are presented.

  18. Handbook for Construction of Task Inventories for Navy Enlisted Ratings

    DTIC Science & Technology

    1984-01-01

    1 March 1977. Cambardella, J. J., & Alvord, V. O. TI-CODAP: A computerized method of job analysis for personnel management. Prince George's County ... specificity needed in the occupational analysis and influences the choice of an analysis method. The primary source of job data usually is the job holder at ... analyzed, various methods of collecting and processing data were considered, and an introductory approach to the collection and analysis of Navy ...

  19. Analysis of failure and maintenance experiences of motor operated valves in a Finnish nuclear power plant

    NASA Astrophysics Data System (ADS)

    Simola, Kaisa; Laakso, Kari

    1992-01-01

    Eight years of operating experiences of 104 motor operated closing valves in different safety systems in nuclear power units were analyzed in a systematic way. The qualitative methods used were Failure Mode and Effect Analysis (FMEA) and Maintenance Effects and Criticality Analysis (MECA). These reliability engineering methods are commonly used in the design stage of equipment. The successful application of these methods for analysis and utilization of operating experiences was demonstrated.

  20. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.

  1. An improved method for the analysis of sennosides in Cassia angustifolia by high-performance liquid chromatography.

    PubMed

    Bala, S; Uniyal, G C; Dubey, T; Singh, S P

    2001-01-01

    A reversed-phase column liquid chromatographic method for the analysis of sennosides A and B present in leaf and pod extracts of Cassia angustifolia has been developed using a Symmetry C18 column and a linear binary gradient profile. The method can be utilised for the quantitative determination of other sennosides as a baseline resolution for most of the constituents was achieved. The method is economical in terms of the time taken and the amount of solvent used (25 mL) for each analysis. The validity of the method with respect to analysis was confirmed by comparing the UV spectra of each peak with those of reference compounds using a photodiode array detector.

  2. A century of enzyme kinetic analysis, 1913 to 2013.

    PubMed

    Johnson, Kenneth A

    2013-09-02

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  3. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  4. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  5. Partial spline models for the inclusion of tropopause and frontal boundary information in otherwise smooth two- and three-dimensional objective analysis

    NASA Technical Reports Server (NTRS)

    Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.

    1986-01-01

    A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.

  6. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    NASA Astrophysics Data System (ADS)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located offshore of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence in the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both the candlestick chart and Bollinger Band analysis. These results show there is a high correlation between patterns in earthquake occurrence and the trends identified by these two statistical methods, and support the applicability of these financial analysis methods to the analysis of earthquake occurrence.
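
    A minimal sketch of the Bollinger Band computation applied to a magnitude series; the window length, band width and the synthetic data are assumptions for illustration.

      # Rolling mean and +/- k*sigma Bollinger Bands over a 1-D series.
      import numpy as np

      def bollinger(series, window=20, k=2.0):
          s = np.asarray(series, dtype=float)
          out = []
          for i in range(window, len(s) + 1):
              w = s[i - window:i]
              m, sd = w.mean(), w.std(ddof=1)
              out.append((m, m - k * sd, m + k * sd))
          return np.array(out)            # columns: mid, lower, upper

      mags = np.random.default_rng(1).normal(4.8, 0.4, 500)  # synthetic Mw series
      bands = bollinger(mags)
      band_width = bands[:, 2] - bands[:, 1]  # convergence = narrowing width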

  7. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Two new methods to fit models for network meta-analysis with random inconsistency effects.

    PubMed

    Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang

    2016-07-28

    Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. Our two new methods can be used to fit models for network meta-analysis with random inconsistency effects. They are easily implemented using the accompanying R code in the Additional file 1. Using these estimation methods, the extent of inconsistency can be assessed and reported.
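
    For orientation, the sketch below implements the conventional random-effects pooling (DerSimonian-Laird) that the network model described above extends; it is a generic illustration, not the authors' importance-sampling or metafor-based estimators.

      # DerSimonian-Laird random-effects meta-analysis (pair-wise case).
      import numpy as np

      def dersimonian_laird(y, v):
          """y: study effect estimates; v: their within-study variances."""
          y, v = np.asarray(y, float), np.asarray(v, float)
          w = 1.0 / v
          y_fixed = (w * y).sum() / w.sum()
          q = (w * (y - y_fixed) ** 2).sum()          # Cochran's Q
          df = len(y) - 1
          c = w.sum() - (w ** 2).sum() / w.sum()
          tau2 = max(0.0, (q - df) / c)               # heterogeneity variance
          w_star = 1.0 / (v + tau2)                   # random-effects weights
          mu = (w_star * y).sum() / w_star.sum()
          se = np.sqrt(1.0 / w_star.sum())
          return mu, se, tau2

      mu, se, tau2 = dersimonian_laird([0.2, 0.5, 0.1], [0.04, 0.09, 0.05])
      print(f"pooled={mu:.3f} +/- {1.96 * se:.3f}, tau^2={tau2:.3f}")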

  9. Advanced composites structural concepts and materials technologies for primary aircraft structures: Structural response and failure analysis

    NASA Technical Reports Server (NTRS)

    Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    Non-linear analysis methods were adapted and incorporated in a finite element based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading. These methods include the Arc Length method and target point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive. Direct application of this method is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal or load) failure procedure and Hasin's failure criteria provides added capability in the failure predictions. Interactive Stiffened Panel Analysis modules were developed as interactive pre-and post-processors. Each module provides the means of performing self-initiated finite elements based analysis of primary structures such as a flat or curved stiffened panel; a corrugated flat sandwich panel; and a curved geodesic fuselage panel. This module brings finite element analysis into the design of composite structures without the requirement for the user to know much about the techniques and procedures needed to actually perform a finite element analysis from scratch. An interactive finite element code was developed to predict bolted joint strength considering material and geometrical non-linearity. The developed method conducts an ultimate strength failure analysis using a set of material degradation models.

  10. Methods for land use impact assessment: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perminova, Tatiana, E-mail: tatiana.perminova@utt.fr; Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk; Sirina, Natalia, E-mail: natalia.sirina@utt.fr

    Many types of methods to assess land use impact have been developed. Nevertheless, a systematic synthesis of all these approaches is necessary to highlight the most commonly used and most effective methods. Given the growing interest in this area of research, a review of the different methods of assessing land use impact (LUI) was performed using bibliometric analysis. One hundred eighty-seven articles from the agricultural and biological sciences and the environmental sciences were examined. According to our results, the most frequently used land use assessment methods are Life-Cycle Assessment, Material Flow Analysis/Input-Output Analysis, Environmental Impact Assessment and Ecological Footprint. Comparison of the methods allowed their specific features to be identified and led to the conclusion that a combination of several methods is the best basis for a comprehensive analysis of land use impact assessment. Highlights: • We identified the most frequently used methods in land use impact assessment. • A comparison of the methods based on several criteria was carried out. • Agricultural land use is by far the most common area of study within the methods. • Incentive driven methods, like LCA, arouse the most interest in this field.

  11. Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A

    NASA Technical Reports Server (NTRS)

    Perez, Reinaldo J.

    1990-01-01

    A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the suggested testing methods used for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Suggestions on present analysis and test methods are introduced to harmonize and bring the analysis and testing tools in MIL-STD-1541A into closer agreement. It is suggested that test procedures in MIL-STD-1541A must be improved by providing alternatives to the present use of shielded enclosures as the primary site for such tests. In addition, the alternate use of anechoic chambers and open field test sites must be considered.

  12. Correlative and multivariate analysis of increased radon concentration in underground laboratory.

    PubMed

    Maletić, Dimitrije M; Udovičić, Vladimir I; Banjanac, Radomir M; Joković, Dejan R; Dragić, Aleksandar L; Veselinović, Nikola B; Filipović, Jelena

    2014-11-01

    The results of an analysis, using correlative and multivariate methods as developed for data analysis in high-energy physics and implemented in the Toolkit for Multivariate Analysis software package, of the relations between variations in increased radon concentration and climate variables in a shallow underground laboratory are presented. Multivariate regression analysis identified a number of multivariate methods which can give a good evaluation of increased radon concentrations based on climate variables. The use of the multivariate regression methods will enable the investigation of the relation of a specific climate variable to increased radon concentrations, with the regression analysis yielding a 'mapped' underlying functional behaviour of radon concentrations over a wide spectrum of climate variables. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Design sensitivity analysis using EAL. Part 1: Conventional design parameters

    NASA Technical Reports Server (NTRS)

    Dopker, B.; Choi, Kyung K.; Lee, J.

    1986-01-01

    A numerical implementation of design sensitivity analysis of builtup structures is presented, using the versatility and convenience of an existing finite element structural analysis code and its database management system. The finite element code used in the implementation presented is the Engineering Analysis Language (EAL), which is based on a hybrid method of analysis. It was shown that design sensitivity computations can be carried out using the database management system of EAL, without writing a separate program and a separate database. Conventional (sizing) design parameters such as cross-sectional area of beams or thickness of plates and plane elastic solid components are considered. Compliance, displacement, and stress functionals are considered as performance criteria. The method presented is being extended to implement shape design sensitivity analysis using a domain method and a design component method.

  14. Methods for High-Order Multi-Scale and Stochastic Problems Analysis, Algorithms, and Applications

    DTIC Science & Technology

    2016-10-17

    ... finite volume schemes, discontinuous Galerkin finite element method, and related methods, for solving computational fluid dynamics (CFD) problems and ... approximation for finite element methods. (3) The development of methods of simulation and analysis for the study of large scale stochastic systems of ... laws, finite element method, Bernstein-Bezier finite elements, weakly interacting particle systems, accelerated Monte Carlo, stochastic networks

  15. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    PubMed

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
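
    As a small illustration of estimation with quantified uncertainty in the Bayesian style discussed above, the sketch below computes a posterior and a 95% credible interval for a proportion; the beta-binomial model and the flat prior are assumptions chosen for the example.

      # Bayesian estimation with a credible interval (beta-binomial example).
      import numpy as np
      from scipy import stats

      successes, trials = 27, 40
      # Beta(1, 1) flat prior -> Beta posterior by conjugacy
      posterior = stats.beta(1 + successes, 1 + (trials - successes))
      point = posterior.mean()
      ci_low, ci_high = posterior.ppf([0.025, 0.975])   # equal-tailed 95% CrI
      print(f"theta ~ {point:.3f}, 95% CrI [{ci_low:.3f}, {ci_high:.3f}]")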

  16. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and c.g. location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP-based approaches are also examined.

  17. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  18. Regression Analysis and Calibration Recommendations for the Characterization of Balance Temperature Effects

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2018-01-01

    Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.

  19. Amie Sluiter | NREL

    Science.gov Websites

    ... biomass analysis methods and is primary author on 11 Laboratory Analytical Procedures, which are ... spectroscopic analysis methods. These methods allow analysts to predict the composition of feedstock and process ... Patent No. 6,737,258 (2002). Featured Publications: "Improved methods for the determination of drying ..."

  20. Computational Methods for Analyzing Health News Coverage

    ERIC Educational Resources Information Center

    McFarlane, Delano J.

    2011-01-01

    Researchers that investigate the media's coverage of health have historically relied on keyword searches to retrieve relevant health news coverage, and manual content analysis methods to categorize and score health news text. These methods are problematic. Manual content analysis methods are labor intensive, time consuming, and inherently…

  1. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed to cross-check the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca concentrations found in food using EDXRF and AAS were not significantly different, with a p-value of 0.9687, whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
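
    A minimal sketch of the kind of cross-check reported above: paired measurements from two techniques compared with a paired t-test and a Pearson correlation. The data are synthetic stand-ins, not the paper's measurements.

      # Paired comparison of two analytical methods on the same samples.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      ca_edxrf = rng.normal(2.5, 0.4, 15)               # mg/g, hypothetical EDXRF
      ca_aas = ca_edxrf + rng.normal(0.0, 0.05, 15)     # same samples by AAS

      t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)
      r, _ = stats.pearsonr(ca_edxrf, ca_aas)
      # A large p (no significant mean difference) and r near 1 indicate agreement
      print(f"paired t-test p={p_value:.4f}, Pearson r={r:.4f}")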

  2. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest is interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  3. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are then further processed with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
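
    A minimal sketch of the pipeline described above, assuming the instruction trace has been encoded numerically: build a binary recurrence plot, then reduce it with PCA to a few classification features. The encoding and threshold are illustrative assumptions.

      # Recurrence plot of a numerically encoded execution trace, then PCA.
      import numpy as np

      def recurrence_matrix(x, eps):
          """Binary recurrence plot: R[i, j] = 1 if |x_i - x_j| < eps."""
          x = np.asarray(x, dtype=float)
          return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

      trace = np.random.default_rng(3).integers(0, 64, 200)  # encoded opcodes
      R = recurrence_matrix(trace, eps=1.0)

      # PCA of the recurrence matrix rows to obtain a few features
      Rc = R - R.mean(axis=0)
      _, s, vt = np.linalg.svd(Rc, full_matrices=False)
      features = Rc @ vt[:3].T      # first 3 principal components per row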

  4. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  5. Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.

    PubMed

    Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena

    2017-06-01

    Temporal changes in magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
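
    A minimal sketch of the retrospective CUSUM idea with bootstrap (permutation) critical values; this generic mean-shift version is an illustration, not the authors' random-effects formulation with dependent increments.

      # Retrospective CUSUM-type change test on a sequence of effect sizes.
      import numpy as np

      rng = np.random.default_rng(4)

      def cusum_stat(y):
          """Max absolute cumulative deviation from the overall mean."""
          y = np.asarray(y, dtype=float)
          return np.max(np.abs(np.cumsum(y - y.mean())))

      def bootstrap_pvalue(y, n_boot=2000):
          obs = cusum_stat(y)
          boot = np.array([cusum_stat(rng.permutation(y)) for _ in range(n_boot)])
          return float(np.mean(boot >= obs))

      # Synthetic effects with a shift halfway through the sequence of studies
      effects = np.concatenate([rng.normal(0.4, 0.1, 10), rng.normal(0.1, 0.1, 10)])
      print(bootstrap_pvalue(effects))   # small p suggests a temporal change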

  6. The Analysis of Seawater: A Laboratory-Centered Learning Project in General Chemistry.

    ERIC Educational Resources Information Center

    Selco, Jodye I.; Roberts, Julian L., Jr.; Wacks, Daniel B.

    2003-01-01

    Describes a sea-water analysis project that introduces qualitative and quantitative analysis methods and laboratory methods such as gravimetric analysis, potentiometric titration, ion-selective electrodes, and the use of calibration curves. Uses a problem-based cooperative teaching approach. (Contains 24 references.) (YDS)

  7. Generalized Full-Information Item Bifactor Analysis

    ERIC Educational Resources Information Center

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single-group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of…

  8. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  9. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, whereas they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such an uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms suggest that the studies reporting meta-analysis of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  10. GSA-PCA: gene set generation by principal component analysis of the Laplacian matrix of a metabolic network

    PubMed Central

    2012-01-01

    Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
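
    A toy sketch of the set-generation step: build the graph Laplacian of a (tiny, hypothetical) metabolic network, eigendecompose it, and group nodes that load strongly on the same component. The adjacency matrix and loading threshold are assumptions for illustration.

      # Gene set generation from eigen-analysis of a network Laplacian.
      import numpy as np

      A = np.array([[0, 1, 1, 0, 0],
                    [1, 0, 1, 0, 0],
                    [1, 1, 0, 1, 0],
                    [0, 0, 1, 0, 1],
                    [0, 0, 0, 1, 0]], dtype=float)   # toy metabolic adjacency
      L = np.diag(A.sum(axis=1)) - A                  # graph Laplacian L = D - A

      eigvals, eigvecs = np.linalg.eigh(L)
      # For each informative component, nodes with large |loading| form one set
      sets = [np.where(np.abs(eigvecs[:, k]) > 0.3)[0] for k in range(1, 4)]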

  11. A transient response analysis of the space shuttle vehicle during liftoff

    NASA Technical Reports Server (NTRS)

    Brunty, J. A.

    1990-01-01

    A proposed transient response method is formulated for the liftoff analysis of the space shuttle vehicles. It uses a power series approximation with unknown coefficients for the interface forces between the space shuttle and the mobile launch platform. This allows the equations of motion of the two structures to be solved separately, with the unknown coefficients determined at the end of each step. These coefficients are obtained by enforcing the interface compatibility conditions between the two structures. Once the unknown coefficients are determined, the total response is computed for that time step. The method is validated by a numerical example of a cantilevered beam and by the liftoff analysis of the space shuttle vehicles. The proposed method is compared to an iterative transient response analysis method used by Martin Marietta for their space shuttle liftoff analysis. It is shown that the proposed method uses less computer time than the iterative method and does not require as small an integration time step. The space shuttle vehicle model is reduced using two different types of component mode synthesis (CMS) methods, the Lanczos method and the Craig and Bampton CMS method. By varying the cutoff frequency in the Craig and Bampton method, it was shown that the space shuttle interface loads can be computed with reasonable accuracy. Both the Lanczos CMS method and the Craig and Bampton CMS method give similar results. A substantial amount of computer time is saved using the Lanczos CMS method over the Craig and Bampton method. However, when a large number of Lanczos vectors was computed, input/output time grew and increased the overall computer time. The application of several liftoff release mechanisms that can be adapted to the proposed method is discussed.

  12. Methods for network meta-analysis of continuous outcomes using individual patient data: a case study in acupuncture for chronic pain.

    PubMed

    Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh

    2016-10-06

    Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis, and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. It introduces network meta-analysis models for individual patient data on continuous outcomes using the analysis of covariance framework. Comparisons are made between this approach and the change-score and final-score-only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data from 28 randomised controlled trials were synthesised. Consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing instead analysis of covariance models to be applied and thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated as the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methods developments are required to address the challenge of analysing aggregate level data in the presence of baseline imbalance.
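
    A minimal sketch of the ANCOVA formulation favoured above, for a single pair-wise contrast: the final score is regressed on the baseline score plus a treatment indicator. The data, effect sizes and variable names are hypothetical, and a full network model would add study and design effects.

      # ANCOVA for a continuous outcome: final ~ baseline + treatment.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      n = 200
      baseline = rng.normal(60, 10, n)          # baseline pain score (0-100)
      treat = rng.integers(0, 2, n)             # 0 = control, 1 = acupuncture
      final = 0.7 * baseline - 8.0 * treat + rng.normal(0, 8, n)
      df = pd.DataFrame({"final": final, "baseline": baseline, "treat": treat})

      # Adjusting for baseline improves precision vs. a final-score-only model
      fit = smf.ols("final ~ baseline + C(treat)", data=df).fit()
      print(fit.params["C(treat)[T.1]"])        # adjusted treatment effect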

  13. A Compact Review of Multi-criteria Decision Analysis Uncertainty Techniques

    DTIC Science & Technology

    2013-02-01

    3.4 PROMETHEE-GAIA Method ... obtained (74). 3.4 PROMETHEE-GAIA Method: Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for ... greater understanding of the importance of their selections. The PROMETHEE method was designed to perform MCDA while accounting for each of these

  14. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  15. Post-16 Physics and Chemistry Uptake: Combining Large-Scale Secondary Analysis with In-Depth Qualitative Methods

    ERIC Educational Resources Information Center

    Hampden-Thompson, Gillian; Lubben, Fred; Bennett, Judith

    2011-01-01

    Quantitative secondary analysis of large-scale data can be combined with in-depth qualitative methods. In this paper, we discuss the role of this combined methods approach in examining the uptake of physics and chemistry in post compulsory schooling for students in England. The secondary data analysis of the National Pupil Database (NPD) served…

  16. Microfluidic systems and methods of transport and lysis of cells and analysis of cell lysate

    DOEpatents

    Culbertson, Christopher T.; Jacobson, Stephen C.; McClain, Maxine A.; Ramsey, J. Michael

    2004-08-31

    Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.

  17. Microfluidic systems and methods for transport and lysis of cells and analysis of cell lysate

    DOEpatents

    Culbertson, Christopher T [Oak Ridge, TN; Jacobson, Stephen C [Knoxville, TN; McClain, Maxine A [Knoxville, TN; Ramsey, J Michael [Knoxville, TN

    2008-09-02

    Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.

  18. Method of multi-dimensional moment analysis for the characterization of signal peaks

    DOEpatents

    Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A

    2012-10-23

    A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
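
    As an illustration of the underlying idea, the sketch below computes the statistical moments of a single peak; the Peclet-style sharpness score (squared centroid over variance) is a common peak-quality proxy used here as an assumption, not the patent's exact two-dimensional metric.

      # Statistical-moment characterization of a signal peak.
      import numpy as np

      def peak_moments(t, y):
          """Zeroth, first and second central moments of a peak y(t)."""
          m0 = np.trapz(y, t)                        # area
          m1 = np.trapz(t * y, t) / m0               # centroid (mean time)
          m2 = np.trapz((t - m1) ** 2 * y, t) / m0   # variance about centroid
          return m0, m1, m2

      t = np.linspace(0, 10, 1000)
      y = np.exp(-0.5 * ((t - 4.0) / 0.3) ** 2)      # synthetic Gaussian peak
      m0, m1, m2 = peak_moments(t, y)
      peclet_like = m1 ** 2 / m2                     # sharper peak -> larger score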

  19. MDAS: an integrated system for metabonomic data analysis.

    PubMed

    Liu, Juan; Li, Bo; Xiong, Jiang-Hui

    2009-03-01

    Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process this data to yield useful information about a biological system, e.g., the mechanisms of disease. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, some methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis which can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool MDAS (Metabonomic Data Analysis System) for metabonomic data analysis which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The dimensionality-reduced data can be used for training or testing through machine learning models. The system supplies appropriate visualizations for data preprocessing, feature analysis, and classification, which can be a powerful aid for users extracting knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.

  20. A homotopy analysis method for the nonlinear partial differential equations arising in engineering

    NASA Astrophysics Data System (ADS)

    Hariharan, G.

    2017-05-01

    In this article, we have applied the homotopy analysis method (HAM) to solve a few partial differential equations arising in engineering. This technique provides solutions as rapidly convergent series with computable terms for problems with a high degree of nonlinearity in the governing differential equations. The convergence analysis of the proposed method is also discussed. Finally, we give some illustrative examples to demonstrate the validity and applicability of the proposed method.

  1. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  2. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  3. An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II. A Posteriori Error Estimates and Adaptivity.

    DTIC Science & Technology

    1983-03-01

    An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II: A Posteriori Error Estimates and Adaptivity, by W. G. Szymczak and I. Babuška.

  4. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks in the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) construction of special elements which contain traction-free circular boundaries; (2) formulation of a new version of mixed variational principles and a new version of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semi-Loof plate and shell elements by the assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  5. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  6. High accuracy method for the application of isotope dilution to gas chromatography/mass spectrometric analysis of gases.

    PubMed

    Milton, Martin J T; Wang, Jian

    2003-01-01

    A new isotope dilution mass spectrometry (IDMS) method for high-accuracy quantitative analysis of gases has been developed and validated by the analysis of standard mixtures of carbon dioxide in nitrogen. The method does not require certified isotopic reference materials and does not require direct measurements of the highly enriched spike. The relative uncertainty of the method is shown to be 0.2%. Reproduced with the permission of Her Majesty's Stationery Office. © Crown copyright 2003.
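
    The abstract does not reproduce the working equations, but in the generic single-dilution IDMS form (a textbook statement, not necessarily the exact formulation validated here) the amount of analyte in the sample follows from isotope-amount ratios alone:

    \[
    n_x \;=\; n_y\,\frac{R_y - R_b}{R_b - R_x},
    \]

    where \(n_y\) is the amount of spike added, and \(R_x\), \(R_y\), \(R_b\) are the isotope-amount ratios of the sample, the spike, and the blend, respectively. Because only ratios enter, the absolute amount of the enriched spike need not be measured directly, consistent with the claim above.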

  7. High-pressure liquid chromatography analysis of antibiotic susceptibility disks.

    PubMed Central

    Hagel, R B; Waysek, E H; Cort, W M

    1979-01-01

    The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations with recoveries of greater than 95%. Relative standard deviations of lower than 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793

  8. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
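
    As background, statistical equivalent linearization replaces the nonlinear restoring force \(g(x)\) by an equivalent linear one \(k_{eq}x\), with \(k_{eq}\) chosen to minimize the mean-square error \(E[(g(x)-k_{eq}x)^2]\); for a zero-mean response this gives the standard result (generic, not specific to this report)

    \[
    k_{eq} \;=\; \frac{E[x\,g(x)]}{E[x^2]},
    \]

    with the expectations evaluated under the assumed response distribution, which is where the amplitude-modulated Press turbulence model enters.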

  9. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that can be applied to the management, control and improvement of processes for the purpose of analyzing technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. For this analysis, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  10. A method for determining spiral-bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face-milled spiral bevel gears. The method uses the basic gear design parameters in conjunction with the kinematical aspects of spiral bevel gear manufacturing machinery. A computer program, SURFACE, was developed. The computer program calculates the surface coordinates and outputs 3-D model data that can be used for finite element analysis. Development of the modeling method and an example case are presented. This analysis method could also find application for gear inspection and near-net-shape gear forging die design.

  11. An operational modal analysis method in frequency and spatial domain

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
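
    The CMIF-type decomposition underlying FSDD is conventionally written as a singular value decomposition of the output power spectral density matrix at each frequency line (a generic statement of the technique, not the paper's exact equations):

    \[
    G_{yy}(j\omega) \;=\; U(j\omega)\,\Sigma(j\omega)\,U(j\omega)^{H},
    \]

    where peaks of the first singular value in \(\Sigma\) indicate modes and the corresponding first singular vector in \(U\) approximates the operational mode shape; the enhanced PSD mentioned above is then curve-fitted around each peak.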

  12. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  13. The detection methods of dynamic objects

    NASA Astrophysics Data System (ADS)

    Knyazev, N. L.; Denisova, L. A.

    2018-01-01

    The article deals with the application of cluster analysis methods to the task of aircraft detection, based on grouping selections of navigation parameters into clusters. A modified cluster analysis method for the search and detection of objects is suggested, with iterative merging into clusters and a subsequent count of their quantity to increase the accuracy of aircraft detection. The operation of the method and the features of its implementation are considered. In conclusion, the efficiency of the proposed method for accurate cluster analysis in target finding is shown.

  14. QCL spectroscopy combined with the least squares method for substance analysis

    NASA Astrophysics Data System (ADS)

    Samsonov, D. A.; Tabalina, A. S.; Fufurin, I. L.

    2017-11-01

    The article briefly describes distinctive features of quantum cascade lasers (QCLs). It also describes an experimental set-up for acquiring mid-infrared absorption spectra using a QCL. The paper presents experimental results in the form of normalized spectra. We tested the application of the least squares method for spectrum analysis, using this method for substance identification and extraction of concentration data. We compare the results with more common methods of absorption spectroscopy. Finally, we demonstrate the feasibility of using this simple method for quantitative and qualitative analysis of experimental data acquired with a QCL.
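
    The least squares step described here amounts to fitting the measured absorption spectrum as a linear combination of reference spectra (Beer-Lambert mixing). A minimal sketch, with synthetic reference spectra standing in for a real spectral library:

    ```python
    # Least-squares estimate of component concentrations from an absorption
    # spectrum, assuming the mixture spectrum is a linear combination of
    # reference spectra (Beer-Lambert). Reference data here are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    wavenumbers = 300
    references = rng.random((wavenumbers, 3))     # 3 reference spectra as columns
    true_conc = np.array([0.5, 1.2, 0.3])
    measured = references @ true_conc + 0.01 * rng.normal(size=wavenumbers)

    conc, *_ = np.linalg.lstsq(references, measured, rcond=None)
    print(conc)  # estimated concentrations, close to true_conc
    ```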

  15. Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels

    NASA Technical Reports Server (NTRS)

    Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.

    2008-01-01

    This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.

  16. A Macro Analysis of DoD Logistics Systems. Volume 3. A Framework for Policy-Level Logistics Management

    DTIC Science & Technology

    1978-12-01

    The recoverable fragments indicate that the report uses a variation of Saaty's method in its hierarchical analysis (chapter 5), uses the word "framework" to mean an abstract structure for thinking through policy-level management problems, and relies heavily on structural and trend analysis of readiness and logistics system performance, where structural analysis means a formal method.

  17. Application of Solid Phase Microextraction Coupled with Gas Chromatography/Mass Spectrometry as a Rapid Method for Field Sampling and Analysis of Chemical Warfare Agents and Toxic Industrial Chemicals

    DTIC Science & Technology

    2003-01-01

  18. An analysis of general chain systems

    NASA Technical Reports Server (NTRS)

    Passerello, C. E.; Huston, R. L.

    1972-01-01

    A general analysis of dynamic systems consisting of connected rigid bodies is presented. The number of bodies and their manner of connection is arbitrary so long as no closed loops are formed. The analysis represents a dynamic finite element method, which is computer-oriented and designed so that nonworking internal constraint forces are automatically eliminated. The method is based upon Lagrange's form of d'Alembert's principle. Shifter matrix transformations are used for the geometrical aspects of the analysis. The method is illustrated with a space manipulator.

  19. Possibility of successive SRXFA use along with chemical-spectral methods for palladium analysis in geological samples

    NASA Astrophysics Data System (ADS)

    Kislov, E. V.; Kulikov, A. A.; Kulikova, A. B.

    1989-10-01

    Samples of basic-ultrabasic rocks and Ni-Cu ores of the Ioko-Dovyren and Chaya massifs were analysed by SRXFA and a chemical-spectral method. SRXFA fully satisfies the requirements of quantitative noble-metal analysis of ore-free rocks. The combination of SRXFA and chemical-spectral analysis has good prospects: after a great number of samples have been analysed by SRXFA, the samples showing minimal and maximal results can be selected for the chemical-spectral method.

  20. Face recognition using slow feature analysis and contourlet transform

    NASA Astrophysics Data System (ADS)

    Wang, Yuehao; Peng, Lingling; Zhe, Fuchuan

    2018-04-01

    In this paper we propose a novel face recognition approach based on slow feature analysis (SFA) in the contourlet transform domain. The method first uses the contourlet transform to decompose the face image into low-frequency and high-frequency parts, and then takes advantage of slow feature analysis for facial feature extraction. We name the new method, which combines slow feature analysis and the contourlet transform, CT-SFA. Experimental results on international standard face databases demonstrate that the new face recognition method is effective and competitive.

  1. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (17th) Held in San Antonio, Texas on 24-28 April 1989

    DTIC Science & Technology

    1989-03-01

    The recoverable fragments address statistical energy analysis (SEA), the finite element method, and the power flow method; experimental solutions are noted as the most common in the literature, with corrections for the added weights and inertias of transducers attached to an experimental structure, and SEA is described as a computational method. Cited references include "Analysis and Diagnosis," Journal of Sound and Vibration, Vol. 115, No. 3, pp. 405-422 (1987), and Lyon, R.L., Statistical Energy Analysis of Dynamical Systems.

  2. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    EPA Science Inventory

    Traditional environmental mold analysis is based on microscopic observations and counting of mold structures collected from the air on a sticky surface or culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...

  3. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Analysis and development of adjoint-based h-adaptive direct discontinuous Galerkin method for the compressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang

    2018-06-01

    In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)) and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) method and the SDDG method plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) method and the SDDG method can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of the three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in applications of adjoint-based adaptation for simulating compressible flows.

  5. Asymptotic approximation method of force reconstruction: Application and analysis of stationary random forces

    NASA Astrophysics Data System (ADS)

    Sanchez, J.

    2018-06-01

    In this paper, the asymptotic approximation method of force reconstruction is applied and analyzed for single degree-of-freedom systems. The original concepts are summarized, the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems, and these concepts are then united into theoretical and computational models. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.

  6. Determination of carrier yields for neutron activation analysis using energy dispersive X-ray spectrometry

    USGS Publications Warehouse

    Johnson, R.G.; Wandless, G.A.

    1984-01-01

    A new method is described for determining carrier yield in the radiochemical neutron activation analysis of rare-earth elements in silicate rocks by group separation. The method involves the determination of the rare-earth elements present in the carrier by means of energy-dispersive X-ray fluorescence analysis, eliminating the need to re-irradiate samples in a nuclear reactor after the gamma ray analysis is complete. Results from the analysis of USGS standards AGV-1 and BCR-1 compare favorably with those obtained using the conventional method. © 1984 Akadémiai Kiadó.

  7. Implementing EPA Method 537

    EPA Pesticide Factsheets

    This presentation describes EPA Method 537 for the analysis of 14 perfluorinated alkyl acids in drinking water as well as the challenges associated with preparing a laboratory for analysis using Method 537. The impact of this presentation is to provide the scientific community with an introduction to EPA Method 537 and to provide suggestions on implementing this method in laboratories.

  8. Liquid chromatography/electrospray ionization/isotopic dilution mass spectrometry analysis of n-(phosphonomethyl) glycine and mass spectrometry analysis of aminomethyl phosphonic acid in environmental water and vegetation matrixes.

    PubMed

    Grey, L; Nguyen, B; Yang, P

    2001-01-01

    A liquid chromatography/electrospray/mass spectrometry (LC/ES/MS) method was developed for the analysis of glyphosate (n-phosphonomethyl glycine) and its metabolite, aminomethyl phosphonic acid (AMPA) using isotope-labelled glyphosate as a method surrogate. Optimized parameters were achieved to derivatize glyphosate and AMPA using 9-fluorenylmethyl chloroformate (FMOC-Cl) in borate buffer prior to a reversed-phase LC analysis. Method spike recovery data obtained using laboratory and real world sample matrixes indicated an excellent correlation between the recovery of the native and isotope-labelled glyphosate. Hence, the first performance-based, isotope dilution MS method with superior precision, accuracy, and data quality was developed for the analysis of glyphosate. There was, however, no observable correlation between the isotope-labelled glyphosate and AMPA. Thus, the use of this procedure for the accurate analysis of AMPA was not supported. Method detection limits established using standard U.S. Environmental Protection Agency protocol were 0.06 and 0.30 microg/L, respectively, for glyphosate and AMPA in water matrixes and 0.11 and 0.53 microg/g, respectively, in vegetation matrixes. Problems, solutions, and the method performance data related to the analysis of chlorine-treated drinking water samples are discussed. Applying this method to other environmental matrixes, e.g., soil, with minimum modifications is possible, assuring accurate, multimedia studies of glyphosate concentration in the environment and the delivery of useful multimedia information for regulatory applications.

  9. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  10. Evaluation of a cost-effective loads approach. [for Viking Orbiter light weight structural design

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads prediction is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
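
    Combining the elements of uncertainty "mathematically", as described above, conventionally means root-sum-of-squares combination of the relative standard uncertainties (the GUM approach; the component list here is illustrative, not the authors' exact budget):

    \[
    u_{c,\mathrm{rel}} \;=\; \sqrt{\,u_{\mathrm{microorganism}}^2 + u_{\mathrm{product}}^2 + u_{\mathrm{reading}}^2 + \cdots\,}.
    \]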

  12. Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data

    PubMed Central

    2014-01-01

    Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high dimensional experimental data poses significant challenges to transcriptomics and metabolomics data analysis methods, which may lead to spurious instead of biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge in the analysis. This strategy is used to reduce the solution space and/or to focus the analysis on biological meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We combine the reviewed methods in three groups based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193

  13. Application of the variational-asymptotical method to composite plates

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Lee, Bok W.; Atilgan, Ali R.

    1992-01-01

    A method is developed for the 3D analysis of laminated plate deformation which is an extension of a variational-asymptotical method by Atilgan and Hodges (1991). Both methods are based on the treatment of plate deformation by splitting the 3D analysis into linear through-the-thickness analysis and 2D plate analysis. Whereas the first technique tackles transverse shear deformation in the second asymptotical approximation, the present method simplifies its treatment and restricts it to the first approximation. Both analytical techniques are applied to the linear cylindrical bending problem, and the strain and stress distributions are derived and compared with those of the exact solution. The present theory provides more accurate results than those of the classical laminated-plate theory for the transverse displacement of 2-, 3-, and 4-layer cross-ply laminated plates. The method can give reliable estimates of the in-plane strain and displacement distributions.

  14. Rapid analysis of scattering from periodic dielectric structures using accelerated Cartesian expansions.

    PubMed

    Baczewski, Andrew D; Miller, Nicholas C; Shanker, Balasubramaniam

    2012-04-01

    The analysis of fields in periodic dielectric structures arises in numerous applications of recent interest, ranging from photonic bandgap structures and plasmonically active nanostructures to metamaterials. To achieve an accurate representation of the fields in these structures using numerical methods, dense spatial discretization is required. This, in turn, affects the cost of analysis, particularly for integral-equation-based methods, for which traditional iterative methods require O(N^2) operations, N being the number of spatial degrees of freedom. In this paper, we introduce a method for the rapid solution of volumetric electric field integral equations used in the analysis of doubly periodic dielectric structures. The crux of our method is the accelerated Cartesian expansion algorithm, which is used to evaluate the requisite potentials in O(N) cost. Results are provided that corroborate our claims of acceleration without compromising accuracy, as well as the application of our method to a number of compelling photonics applications.

  15. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is a kind of high-spatial-resolution (2 m GSD) remote sensing satellite data, comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistic of an image using the Automatic Cloud Coverage Assessment (ACCA) algorithm; the cloud statistic is subsequently recorded as important metadata for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are applied in sequence to determine the cloud statistic. For post-processing analysis, the box-counting fractal method is applied. In other words, the cloud statistic is first determined via pre-processing analysis, and the correctness of the cloud statistic across spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
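
    Of the thresholding methods compared, Otsu's method is the simplest to sketch: it selects the threshold that maximizes the between-class variance of the gray-level histogram. A compact NumPy version (a generic implementation, not the authors' code, demonstrated on synthetic bimodal data):

    ```python
    # Otsu's method: choose the threshold maximizing between-class variance.
    import numpy as np

    def otsu_threshold(image, nbins=256):
        hist, edges = np.histogram(image, bins=nbins)
        p = hist.astype(float) / hist.sum()      # gray-level probabilities
        centers = 0.5 * (edges[:-1] + edges[1:])
        w0 = np.cumsum(p)                        # class-0 weight per candidate threshold
        w1 = 1.0 - w0
        m0 = np.cumsum(p * centers)              # class-0 cumulative mean (unnormalized)
        mt = m0[-1]                              # global mean
        valid = (w0 > 0) & (w1 > 0)
        var_between = np.zeros_like(w0)
        var_between[valid] = (mt * w0[valid] - m0[valid]) ** 2 / (w0[valid] * w1[valid])
        return centers[np.argmax(var_between)]

    img = np.random.default_rng(2).normal(loc=[60, 180], size=(5000, 2)).ravel()
    print(otsu_threshold(img))  # lands between the two modes, near 120
    ```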

  16. Applications of Response Surface-Based Methods to Noise Analysis in the Conceptual Design of Revolutionary Aircraft

    NASA Technical Reports Server (NTRS)

    Hill, Geoffrey A.; Olson, Erik D.

    2004-01-01

    Due to the growing problem of noise in today's air transportation system, there is a need to incorporate noise considerations in the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300 passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels and the resulting trade space is explored. Methods demonstrated include: single point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
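
    The conversion described here, from a complex noise model to a polynomial equation, is essentially a quadratic regression over sampled design points. A generic sketch in which a hypothetical function stands in for the expensive noise model (variables, ranges, and coefficients are all illustrative):

    ```python
    # Fit a quadratic response surface to samples of an expensive noise model,
    # then query the cheap polynomial surrogate. The "noise model" here is a
    # hypothetical stand-in function, not a real acoustics code.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    def noise_model(x):  # placeholder for the expensive source-noise code
        thrust, altitude = x[:, 0], x[:, 1]
        return 90 + 8 * np.log10(thrust) - 0.002 * altitude

    rng = np.random.default_rng(3)
    X = rng.uniform([50, 100], [300, 3000], size=(40, 2))  # sampled design points
    y = noise_model(X)

    surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surface.fit(X, y)
    print(surface.predict(np.array([[150.0, 1500.0]])))  # rapid surrogate query
    ```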

  17. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
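
    Schematically (a generic statement of the construction, not the article's notation), the double loop computes, for the minimum reliability case,

    \[
    \beta_{\min} \;=\; \min_{\underline{y}\,\le\, y\,\le\,\overline{y}}\;\min_{u}\;\lVert u\rVert \quad \text{s.t.}\quad G(u,y)=0,
    \]

    and the single-loop version replaces the inner optimization over the interval variables \(y\) by its Karush-Kuhn-Tucker conditions, appended as constraints; for box bounds these read \(\nabla_y G + \mu_U - \mu_L = 0\), \(\mu_U(y-\overline{y})=0\), \(\mu_L(\underline{y}-y)=0\), \(\mu_U,\mu_L \ge 0\), so that a single optimizer handles both loops at once.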

  18. Linnorm: improved statistical analysis for single cell RNA-seq expression data

    PubMed Central

    Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung

    2017-01-01

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noise and simultaneously preserve biological variation in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748

  19. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  20. Methods for assessing the stability of slopes during earthquakes-A retrospective

    USGS Publications Warehouse

    Jibson, R.W.

    2011-01-01

    During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
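
    Newmark's sliding-block method is simple enough to sketch: whenever ground acceleration exceeds the block's critical (yield) acceleration, relative velocity is integrated up, and displacement accumulates until the block comes to rest again. A minimal one-directional version with a synthetic accelerogram (a simplified illustration of the idea, not a production implementation):

    ```python
    # Newmark sliding-block analysis: integrate relative motion of a rigid
    # block whenever ground acceleration exceeds its critical acceleration.
    # The accelerogram below is synthetic, for illustration only.
    import numpy as np

    def newmark_displacement(accel, dt, a_crit):
        vel, disp = 0.0, 0.0
        for a in accel:
            if vel > 0.0 or a > a_crit:
                vel += (a - a_crit) * dt   # sliding: net acceleration drives the block
                vel = max(vel, 0.0)        # block stops when relative velocity hits zero
                disp += vel * dt
        return disp

    dt = 0.01
    t = np.arange(0, 10, dt)
    accel = 0.3 * 9.81 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.2 * t)  # m/s^2
    print(newmark_displacement(accel, dt, a_crit=0.1 * 9.81))  # displacement in meters
    ```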

  1. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-12-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content on the samples. The same limitation was found in thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in recovering enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.

  2. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. This approach, called the absolute method, allows measurements as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
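
    The fundamental activation equation referred to here is, in its common textbook form (the paper's notation may differ), the induced activity after irradiation time \(t_i\) and decay time \(t_d\):

    \[
    A \;=\; \frac{m\,N_A\,\theta}{M}\,\sigma\,\varphi\,\big(1-e^{-\lambda t_i}\big)\,e^{-\lambda t_d},
    \]

    where \(m\) is the mass of the element, \(M\) its molar mass, \(\theta\) the isotopic abundance, \(\sigma\) the activation cross section, \(\varphi\) the neutron flux, and \(\lambda\) the decay constant of the product nuclide; solving for \(m\) from the measured activity yields the concentration without a standard sample.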

  3. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition and the causality method.

  4. Microfluidic direct injection method for analysis of urinary 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) using molecularly imprinted polymers coupled on-line with LC-MS/MS.

    PubMed

    Shah, Kumar A; Peoples, Michael C; Halquist, Matthew S; Rutan, Sarah C; Karnes, H Thomas

    2011-01-25

    The work described in this paper involves development of a high-throughput on-line microfluidic sample extraction method using capillary micro-columns packed with MIP beads coupled with tandem mass spectrometry for the analysis of urinary NNAL. The method was optimized and matrix effects were evaluated and resolved. The method enabled low sample volume (200 μL) and rapid analysis of urinary NNAL by direct injection onto the microfluidic column packed with molecularly imprinted beads engineered to NNAL. The method was validated according to the FDA bioanalytical method validation guidance. The dynamic range extended from 20.0 to 2500.0 pg/mL with a percent relative error of ±5.9% and a run time of 7.00 min. The lower limit of quantitation was 20.0 pg/mL. The method was used for the analysis of NNAL and NNAL-Gluc concentrations in smokers' urine. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Compensation of hospital-based physicians.

    PubMed Central

    Steinwald, B

    1983-01-01

    This study is concerned with methods of compensating hospital-based physicians (HBPs) in five medical specialties: anesthesiology, pathology, radiology, cardiology, and emergency medicine. Data on 2232 nonfederal, short-term general hospitals came from a mail questionnaire survey conducted in Fall 1979. The data indicate that numerous compensation methods exist but these methods, without much loss of precision, can be reduced to salary, percentage of department revenue, and fee-for-service. When HBPs are compensated by salary or percentage methods, most patient billing is conducted by the hospital. In contrast, most fee-for-service HBPs bill their patients directly. Determinants of HBP compensation methods are investigated via multinomial logit analysis. This analysis indicates that choice of HBP compensation methods is sensitive to a number of hospital characteristics and attributes of both the hospital and physicians' services markets. The empirical findings are discussed in light of past conceptual and empirical research on physician compensation, and current policy issues in the health services sector. PMID:6841112

  6. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BPs of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method in solving BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578) typically used for label categorization.

  7. Evaluation of qPCR curve analysis methods for reliable biomarker discovery: bias, resolution, precision, and implications.

    PubMed

    Ruijter, Jan M; Pfaffl, Michael W; Zhao, Sheng; Spiess, Andrej N; Boggy, Gregory; Blom, Jochen; Rutledge, Robert G; Sisti, Davide; Lievens, Antoon; De Preter, Katleen; Derveaux, Stefaan; Hellemans, Jan; Vandesompele, Jo

    2013-01-01

    RNA transcripts such as mRNA or microRNA are frequently used as biomarkers to determine disease state or response to therapy. Reverse transcription (RT) in combination with quantitative PCR (qPCR) has become the method of choice to quantify small amounts of such RNA molecules. In parallel with the democratization of RT-qPCR and its increasing use in biomedical research or biomarker discovery, we witnessed a growth in the number of gene expression data analysis methods. Most of these methods are based on the principle that the position of the amplification curve with respect to the cycle-axis is a measure for the initial target quantity: the later the curve, the lower the target quantity. However, most methods differ in the mathematical algorithms used to determine this position, as well as in the way the efficiency of the PCR reaction (the fold increase of product per cycle) is determined and applied in the calculations. Moreover, there is dispute about whether the PCR efficiency is constant or continuously decreasing. Together this has led to the development of different methods to analyze amplification curves. In published comparisons of these methods, available algorithms were typically applied in a restricted or outdated way, which does not do them justice. Therefore, we aimed at the development of a framework for robust and unbiased assessment of curve analysis performance whereby various publicly available curve analysis methods were thoroughly compared using a previously published large clinical data set (Vermeulen et al., 2009) [11]. The original developers of these methods applied their algorithms and are co-authors on this study. We assessed the curve analysis methods' impact on transcriptional biomarker identification in terms of expression level, statistical significance, and patient-classification accuracy. The concentration series per gene, together with data sets from unpublished technical performance experiments, were analyzed in order to assess the algorithms' precision, bias, and resolution. While large differences exist between methods when considering the technical performance experiments, most methods perform relatively well on the biomarker data. The data and the analysis results per method are made available to serve as benchmark for further development and evaluation of qPCR curve analysis methods (http://qPCRDataMethods.hfrc.nl). Copyright © 2012 Elsevier Inc. All rights reserved.
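
    The positional principle stated above (the later the curve, the lower the target quantity) is usually written as exponential amplification with efficiency \(E\), the fold increase of product per cycle, so that the starting quantity follows from the quantification cycle \(C_q\). This is a generic form; the compared methods differ precisely in how \(E\) and \(C_q\) are estimated:

    \[
    N_c = N_0\,E^{\,c} \quad\Longrightarrow\quad N_0 = \frac{N_q}{E^{\,C_q}},
    \]

    where \(N_c\) is the amount of product after \(c\) cycles and \(N_q\) the amount at the quantification threshold.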

  8. RELATIVE CONTRIBUTIONS OF THREE DESCRIPTIVE METHODS: IMPLICATIONS FOR BEHAVIORAL ASSESSMENT

    PubMed Central

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods—the ABC method, the conditional probability method, and the conditional and background probability method—to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n  =  2), social negative reinforcement (n  =  2), or automatic reinforcement (n  =  2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations. PMID:19949536

  9. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
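
    A minimal sketch of the classical least squares (CLS) half of such a hybrid scheme, assuming synthetic data throughout (this illustrates the principle of appending a non-calibrated spectral shape, not the patented algorithm): calibrate pure-component spectra from known mixtures, append an extra shape such as a drift profile, then estimate concentrations for a new spectrum.

    ```python
    # Classical least squares (CLS): A = C @ K, where A holds mixture spectra
    # (rows), C concentrations, K pure-component spectra. An extra "spectral
    # shape" (here a drift profile) is appended to K before prediction, in the
    # spirit of the hybrid approach. All data are synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    K_true = rng.random((2, 200))            # 2 pure-component spectra
    C_cal = rng.random((10, 2))              # 10 calibration mixtures
    A_cal = C_cal @ K_true + 0.001 * rng.normal(size=(10, 200))

    K_hat, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)   # calibration step

    drift = np.linspace(0, 1, 200)[None, :]  # added non-calibrated shape
    K_aug = np.vstack([K_hat, drift])

    a_new = np.array([0.3, 0.7]) @ K_true + 0.05 * drift.ravel()
    c_hat, *_ = np.linalg.lstsq(K_aug.T, a_new, rcond=None)  # prediction step
    print(c_hat)  # approx [0.3, 0.7, 0.05]: concentrations plus drift amplitude
    ```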

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendenhall, M.R.

    The present volume discusses tactical missile aerodynamic drag, drag-prediction methods for axisymmetric missile bodies, an aerodynamic heating analysis for supersonic missiles, a component buildup method for engineering analysis of missiles at low-to-high angles of attack, experimental and analytical methods for missiles with noncircular fuselages, and a vortex-cloud model for body vortex shedding and tracking. Also discussed are panel methods with vorticity effects and corrections for nonlinear compressibility, supersonic full-potential methods for missile body analysis, space-marching Euler solvers, the time-asymptotic Euler/Navier-Stokes methods for subsonic and transonic flows, 3D boundary layers on missiles, Navier-Stokes analyses of flows over slender airframes, and the interaction of exhaust plumes with missile airframes.

  11. Analysis of titanium content in titanium tetrachloride solution

    NASA Astrophysics Data System (ADS)

    Bi, Xiaoguo; Dong, Yingnan; Li, Shanshan; Guan, Duojiao; Wang, Jianyu; Tang, Meiling

    2018-03-01

    Strontium titanate, barium titanate and lead titanate are new types of functional ceramic material with good prospects, and titanium tetrachloride is commonly used in the production of such products, which show excellent electrochemical performance and ferroelectric temperature-coefficient effects. In this article, three methods are used to calibrate samples of titanium tetrachloride solution: the back titration method, the replacement titration method, and the gravimetric analysis method. The results show that the back titration method has many advantages, for example relatively simple operation, easy judgment of the titration end point, and better accuracy and precision of analytical results, with a relative standard deviation of no more than 0.2%; it is therefore the ideal conventional analysis method for mass production.

  12. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  13. Method for predicting peptide detection in mass spectrometry

    DOEpatents

    Kangas, Lars [West Richland, WA; Smith, Richard D [Richland, WA; Petritis, Konstantinos [Richland, WA

    2010-07-13

    A method of predicting whether a peptide present in a biological sample will be detected by analysis with a mass spectrometer. The method uses at least one mass spectrometer to perform repeated analysis of a sample containing peptides from proteins with known amino acids. The method then generates a data set of peptides identified as contained within the sample by the repeated analysis. The method then calculates the probability that a specific peptide in the data set was detected in the repeated analysis. The method then creates a plurality of vectors, where each vector has a plurality of dimensions, and each dimension represents a property of one or more of the amino acids present in each peptide and adjacent peptides in the data set. Using these vectors, the method then generates an algorithm from the plurality of vectors and the calculated probabilities that specific peptides in the data set were detected in the repeated analysis. The algorithm is thus capable of calculating the probability that a hypothetical peptide represented as a vector will be detected by a mass spectrometry based proteomic platform, given that the peptide is present in a sample introduced into a mass spectrometer.
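
    In outline, the method maps each peptide to a fixed-length vector of amino-acid properties and learns detection probability from repeated observations. The toy sketch below uses hypothetical features (length, hydrophobicity statistics, lysine count) and a logistic model standing in for the patent's algorithm; the data are invented:

    ```python
    # Toy version of peptide-detectability prediction: featurize peptides by
    # amino-acid properties, fit a probabilistic classifier on observed
    # detected/undetected labels. Features, scale, and data are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    HYDROPHOBICITY = {"A": 1.8, "L": 3.8, "K": -3.9, "D": -3.5, "G": -0.4}

    def featurize(peptide):
        h = [HYDROPHOBICITY[aa] for aa in peptide]
        return [len(peptide), float(np.mean(h)), float(np.max(h)), peptide.count("K")]

    peptides = ["ALKG", "LLLA", "DDKG", "GALL", "KKDG", "ALLL"]
    detected = [0, 1, 0, 1, 0, 1]   # labels from repeated MS runs (hypothetical)

    X = np.array([featurize(p) for p in peptides])
    model = LogisticRegression().fit(X, detected)
    print(model.predict_proba(np.array([featurize("ALLK")]))[0, 1])
    ```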

  14. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  15. Automatic generation of the non-holonomic equations of motion for vehicle stability analysis

    NASA Astrophysics Data System (ADS)

    Minaker, B. P.; Rieveley, R. J.

    2010-09-01

    The mathematical analysis of vehicle stability has been utilised as an important tool in the design, development, and evaluation of vehicle architectures and stability controls. This paper presents a novel method for automatic generation of the linearised equations of motion for mechanical systems that is well suited to vehicle stability analysis. Unlike conventional methods for generating linearised equations of motion in standard linear second order form, the proposed method allows for the analysis of systems with non-holonomic constraints. In the proposed method, the algebraic constraint equations are eliminated after linearisation and reduction to first order. The described method has been successfully applied to an assortment of classic dynamic problems of varying complexity including the classic rolling coin, the planar truck-trailer, and the bicycle, as well as in more recent problems such as a rotor-stator and a benchmark road vehicle with suspension. This method has also been applied in the design and analysis of a novel three-wheeled narrow tilting vehicle with zero roll-stiffness. An application for determining passively stable configurations using the proposed method together with a genetic search algorithm is detailed. The proposed method and software implementation has been shown to be robust and provides invaluable conceptual insight into the stability of vehicles and mechanical systems.
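
    The key step, eliminating the algebraic constraint equations after linearisation and reduction to first order, can be stated generically (this is the standard descriptor-system reduction for constant constraint matrices; the authors' formulation may differ in detail): given linearised first-order equations \(E\dot{x} = Ax\) subject to constraints \(Bx = 0\), choose \(T\) whose columns span the null space of \(B\), substitute \(x = Tz\), and premultiply by \(T^{T}\) to obtain the unconstrained system

    \[
    (T^{T}ET)\,\dot{z} \;=\; (T^{T}AT)\,z,
    \]

    whose eigenvalues then determine the stability of the constrained vehicle model.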

  16. Droplet Microarray Based on Superhydrophobic-Superhydrophilic Patterns for Single Cell Analysis.

    PubMed

    Jogia, Gabriella E; Tronser, Tina; Popova, Anna A; Levkin, Pavel A

    2016-12-09

    Single-cell analysis provides fundamental information on individual cell response to different environmental cues and is of growing interest in cancer and stem cell research. However, existing methods still face challenges in performing such analysis in a high-throughput yet cost-effective manner. Here we established the Droplet Microarray (DMA) as a miniaturized screening platform for high-throughput single-cell analysis. Using the method of limited dilution and varying cell density and seeding time, we optimized the distribution of single cells on the DMA. We established culturing conditions for single cells in individual droplets on DMA, obtaining survival of nearly 100% of single cells and a doubling time comparable with that of cells cultured in bulk cell population using conventional methods. Our results demonstrate that the DMA is a suitable platform for single-cell analysis, which carries a number of advantages compared with existing technologies, allowing for treatment, staining, and spot-to-spot analysis of single cells over time using conventional analysis methods such as microscopy.
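
    The limited-dilution step above is governed by Poisson statistics, which is worth making concrete: seeding density trades single-cell purity against empty spots. A quick illustrative calculation, not the paper's protocol:

    ```python
    # Limiting dilution obeys Poisson statistics: with an average of lam cells
    # per droplet, the chance a droplet receives exactly k cells is
    # P(k) = lam**k * exp(-lam) / k!.  Lower seeding density gives purer
    # single-cell spots but more empty ones.
    from math import exp, factorial

    def poisson(k: int, lam: float) -> float:
        return lam**k * exp(-lam) / factorial(k)

    for lam in (0.3, 0.5, 1.0, 2.0):
        p0, p1 = poisson(0, lam), poisson(1, lam)
        single_among_occupied = p1 / (1.0 - p0)
        print(f"lam={lam:.1f}  empty={p0:.2f}  single={p1:.2f}  "
              f"single|occupied={single_among_occupied:.2f}")
    ```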

  17. Analytical quality assurance in veterinary drug residue analysis methods: matrix effects determination and monitoring for sulfonamides analysis.

    PubMed

    Hoff, Rodrigo Barcellos; Rübensam, Gabriel; Jank, Louise; Barreto, Fabiano; Peralba, Maria do Carmo Ruaro; Pizzolato, Tânia Mara; Silvia Díaz-Cruz, M; Barceló, Damià

    2015-01-01

    In residue analysis of veterinary drugs in foodstuffs, matrix effects are one of the most critical points. This work presents a discussion of approaches used to estimate, minimize, and monitor matrix effects in bioanalytical methods. Qualitative and quantitative methods for estimating matrix effects, such as post-column infusion, slope-ratio analysis, calibration curves (mathematical and statistical analysis), and control-chart monitoring, are discussed using real data. Matrix effects varied over a wide range depending on the analyte and the sample preparation method: pressurized liquid extraction of liver samples showed matrix effects from 15.5 to 59.2%, while ultrasound-assisted extraction gave values from 21.7 to 64.3%. The influence of the matrix itself was also evaluated: for sulfamethazine analysis, signal losses varied from -37% for fish to -96% for eggs, respectively. Advantages and drawbacks are also discussed in the context of a proposed workflow for matrix effect assessment, applied to real data from sulfonamide residue analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
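
    Of the estimation approaches listed, the slope-ratio method is the easiest to make concrete: compare calibration slopes in matrix extract and in pure solvent. A sketch with made-up numbers:

    ```python
    # Slope-ratio estimate of matrix effect: ME% > 0 suggests signal
    # enhancement, ME% < 0 suppression.  All numbers below are invented.
    import numpy as np

    conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])            # ng/mL
    resp_solvent = np.array([102., 498., 1010., 2480., 5015.])
    resp_matrix = np.array([71., 348., 705., 1740., 3510.])   # suppressed

    slope_solvent = np.polyfit(conc, resp_solvent, 1)[0]
    slope_matrix = np.polyfit(conc, resp_matrix, 1)[0]

    me_percent = (slope_matrix / slope_solvent - 1.0) * 100.0
    print(f"matrix effect: {me_percent:+.1f}%")   # about -30% here
    ```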

  18. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  19. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
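
    For the first study's setup, clustering participants by binary code profiles, a minimal sketch of two of the three compared methods (hierarchical clustering and K-means) on simulated binary data might look like this; the data and parameters are invented, not the study's.

    ```python
    # Cluster participants described by binary code profiles
    # (1 = code applied to that interview).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Two latent groups of 25 participants with different code probabilities.
    g1 = rng.random((25, 12)) < 0.7
    g2 = rng.random((25, 12)) < 0.3
    X = np.vstack([g1, g2]).astype(int)

    # Hierarchical clustering on Jaccard distances between binary profiles.
    Z = linkage(pdist(X, metric="jaccard"), method="average")
    hier_labels = fcluster(Z, t=2, criterion="maxclust")

    # K-means treats the 0/1 codes as coordinates.
    km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(hier_labels, km_labels, sep="\n")
    ```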

  20. An Excel-based implementation of the spectral method of action potential alternans analysis.

    PubMed

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
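
    The core of the spectral method is independent of the Excel implementation: take the beat-to-beat series, compute its power spectrum, and compare power at 0.5 cycles/beat against an adjacent noise band. A generic sketch, not the published VBA tool:

    ```python
    # Spectral alternans detection on a beat-to-beat series of a measured
    # quantity (e.g. AP amplitude at a fixed phase).
    import numpy as np

    rng = np.random.default_rng(1)
    n_beats = 128
    beats = 100.0 + rng.normal(0, 0.5, n_beats)
    beats += 1.0 * (-1.0) ** np.arange(n_beats)     # 2:1 alternans, amplitude 1

    spec = np.abs(np.fft.rfft(beats - beats.mean())) ** 2 / n_beats
    freqs = np.fft.rfftfreq(n_beats, d=1.0)          # cycles per beat

    alt_power = spec[-1]                             # bin at 0.5 cycles/beat
    noise = spec[(freqs >= 0.33) & (freqs < 0.48)]   # reference noise band
    k_score = (alt_power - noise.mean()) / noise.std()
    print(f"alternans power={alt_power:.2f}, k-score={k_score:.1f}")
    # A k-score above ~3 is the usual criterion for significant alternans.
    ```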

  1. Fingerprint analysis of Hibiscus mutabilis L. leaves based on ultra performance liquid chromatography with photodiode array detector combined with similarity analysis and hierarchical clustering analysis methods

    PubMed Central

    Liang, Xianrui; Ma, Meiling; Su, Weike

    2013-01-01

    Background: A method for chemical fingerprint analysis of Hibiscus mutabilis L. leaves was developed based on ultra performance liquid chromatography with photodiode array detector (UPLC-PAD) combined with similarity analysis (SA) and hierarchical clustering analysis (HCA). Materials and Methods: 10 batches of Hibiscus mutabilis L. leaf samples were collected from different regions of China. UPLC-PAD was employed to collect chemical fingerprints of Hibiscus mutabilis L. leaves. Results: The relative standard deviations (RSDs) of the relative retention times (RRT) and relative peak areas (RPA) of 10 characteristic peaks (one of them identified as rutin) in precision, repeatability, and stability tests were less than 3%, and the fingerprint analysis method was validated as suitable for Hibiscus mutabilis L. leaves. Conclusions: The chromatographic fingerprints qualitatively showed abundant diversity of chemical constituents in the 10 batches of Hibiscus mutabilis L. leaf samples from different locations by similarity analysis on the basis of the correlation coefficients calculated between each pair of fingerprints. Moreover, the HCA method clustered the samples into four classes, and the HCA dendrogram showed the close or distant relations among the 10 samples, which was consistent with the SA result to some extent. PMID:23930008
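
    The SA/HCA combination reduces to two standard computations: a correlation matrix between fingerprints, and hierarchical clustering on the corresponding distances. A sketch on synthetic chromatograms standing in for the UPLC-PAD data:

    ```python
    # SA: correlation coefficients between each pair of fingerprints;
    # HCA: hierarchical clustering of the batches on 1 - correlation.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(2)
    t = np.linspace(0, 30, 600)                       # retention time grid, min
    base = np.exp(-0.5 * ((t[:, None] - [5, 9, 14, 21]) / 0.3) ** 2).sum(axis=1)
    # Ten batches = common peak profile plus batch-specific variation.
    batches = np.array([base * rng.uniform(0.8, 1.2) +
                        rng.normal(0, 0.02, t.size) for _ in range(10)])

    sim = np.corrcoef(batches)                        # SA: similarity matrix
    print(np.round(sim[0], 3))

    Z = linkage(1.0 - sim[np.triu_indices(10, k=1)], method="average")
    labels = fcluster(Z, t=4, criterion="maxclust")   # HCA into four classes
    print(labels)
    ```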

  2. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.
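
    The FTA half of the approach can be illustrated with a minimal fault-tree evaluator; the tree below is an invented stand-in for the flooding incident, not the case study's actual tree.

    ```python
    # Minimal fault tree: the top event fires through AND/OR gates over
    # basic-event probabilities, assuming independent basic events.
    def ft_prob(node) -> float:
        """Probability of a fault tree node."""
        if isinstance(node, float):                 # basic event probability
            return node
        gate, children = node
        probs = [ft_prob(c) for c in children]
        if gate == "AND":                           # all children must fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        if gate == "OR":                            # any child failing suffices
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p
        raise ValueError(gate)

    # Flooding if (pump fails OR operator misses alarm) AND drain valve blocked.
    tree = ("AND", [("OR", [0.02, 0.10]), 0.05])
    print(f"P(top event) = {ft_prob(tree):.4f}")    # 0.0059
    ```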

  3. A time-frequency analysis method to obtain stable estimates of magnetotelluric response function based on Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Cai, Jianhua

    2017-05-01

    The time-frequency analysis method represents a signal as a function of time and frequency, and it is considered a powerful tool for handling arbitrary non-stationary time series by using instantaneous frequency and instantaneous amplitude. It also provides a possible alternative for the analysis of the non-stationary magnetotelluric (MT) signal. Based on the Hilbert-Huang transform (HHT), a time-frequency analysis method is proposed to obtain stable estimates of the magnetotelluric response function. In contrast to conventional methods, the response function estimation is performed in the time-frequency domain using instantaneous spectra rather than in the frequency domain, which allows for imaging the response parameter content as a function of time and frequency. The theory of the method is presented, and the mathematical model and calculation procedure used to estimate the response function from the HHT time-frequency spectrum are discussed. To evaluate the results, response function estimates are compared with estimates from a standard MT data processing method based on the Fourier transform. All results show that apparent resistivities and phases calculated with the HHT time-frequency method are generally more stable and reliable than those determined from simple Fourier analysis. The proposed method overcomes the drawbacks of the traditional Fourier methods, and the resulting parameter estimates minimise the bias caused by the non-stationary characteristics of the MT data.
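
    The HHT pipeline is empirical mode decomposition followed by the Hilbert transform of each mode. The second step, which yields the instantaneous amplitude and frequency underlying the time-frequency response estimate, can be sketched with scipy; EMD is omitted and the input is assumed to be a single mode.

    ```python
    # Instantaneous amplitude and frequency via the analytic signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 200.0                                   # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # A chirp-like component: frequency drifts from 5 Hz to 8 Hz over 10 s.
    x = np.sin(2 * np.pi * (5 * t + 0.15 * t**2))

    analytic = hilbert(x)
    inst_amp = np.abs(analytic)
    inst_phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(inst_phase) / (2 * np.pi) * fs   # Hz, length N-1

    print(inst_freq[100], inst_freq[-100])       # ~5 Hz early, ~8 Hz late
    ```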

  4. On the effectiveness of recession analysis methods for capturing the characteristic storage-discharge relation: An intercomparison study

    NASA Astrophysics Data System (ADS)

    Chen, X.; Kumar, M.; Basso, S.; Marani, M.

    2017-12-01

    Storage-discharge (S-Q) relations are widely used to derive watershed properties and predict streamflow responses. These relations are often obtained using different recession analysis methods, which vary in recession period identification criteria and Q vs. -dQ/dt fitting scheme. Although previous studies have indicated that different recession analysis methods can result in significantly different S-Q relations and subsequently derived hydrological variables, this observation has often been overlooked and S-Q relations have been used in as-is form. This study evaluated the effectiveness of four recession analysis methods in obtaining the characteristic S-Q relation and reconstructing the streamflow. Results indicate that while some methods generally performed better than others, none of them consistently outperformed the others. Even the best-performing method could not yield accurate reconstructed streamflow time series and their PDFs in some watersheds, implying either that the derived S-Q relations are not reliable or that S-Q relations cannot be used for hydrological simulations. Notably, the accuracy of the methods is influenced by the extent of scatter in the ln(-dQ/dt) vs. ln(Q) plot. In addition, the derived S-Q relation was very sensitive to the criteria used for identifying recession periods. This result raises a warning sign against indiscriminate application of recession analysis methods and derived S-Q relations for watershed characterization or hydrologic simulations. A thorough evaluation of the representativeness of the derived S-Q relation should be performed before it is used for hydrologic analysis.
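
    The common core of the compared methods is fitting -dQ/dt = a*Q^b to points from identified recession periods. A sketch on a synthetic recession record:

    ```python
    # Fit the recession power law in log space from (Q, -dQ/dt) pairs.
    import numpy as np

    rng = np.random.default_rng(3)
    a_true, b_true = 0.02, 1.5
    q = [10.0]
    for _ in range(60):
        q.append(q[-1] - a_true * q[-1] ** b_true)    # dQ/dt = -a*Q**b, dt = 1 day
    q = np.array(q) * (1 + rng.normal(0, 0.005, 61))  # small observation noise

    dqdt = np.diff(q)                                 # per day
    mask = dqdt < 0                                   # keep recession steps only
    lnQ = np.log(0.5 * (q[1:] + q[:-1])[mask])        # midpoint Q for each step
    ln_neg_dqdt = np.log(-dqdt[mask])

    b_fit, ln_a_fit = np.polyfit(lnQ, ln_neg_dqdt, 1)
    print(f"a = {np.exp(ln_a_fit):.3f}, b = {b_fit:.2f}")  # close to 0.02 and 1.5
    ```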

  5. Iterative Strain-Gage Balance Calibration Data Analysis for Extended Independent Variable Sets

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred

    2011-01-01

    A new method was developed that makes it possible to use an extended set of independent calibration variables for an iterative analysis of wind tunnel strain-gage balance calibration data. The new method permits the application of the iterative analysis method whenever the total number of balance loads and other independent calibration variables is greater than the total number of measured strain-gage outputs. Iteration equations used by the iterative analysis method have the limitation that the number of independent and dependent variables must match. The new method circumvents this limitation: it simply adds a missing dependent variable to the original data set by using an additional independent variable also as an additional dependent variable. Then, the desired solution of the regression analysis problem can be obtained that fits each gage output as a function of both the original and additional independent calibration variables. The final regression coefficients can be converted to data reduction matrix coefficients because the missing dependent variables were added to the data set without changing the regression analysis result for each gage output. Therefore, the new method still supports the application of the two load iteration equation choices that the iterative method traditionally uses for the prediction of balance loads during a wind tunnel test. An example discussed in the paper illustrates the application of the new method to a realistic simulation of a temperature-dependent calibration data set of a six-component balance.

  6. A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems

    DTIC Science & Technology

    2014-04-01

    A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems. Report TR-14-33, April 2014; grant HDTRA1-09-1-0036; Donald Estep and Michael … Approved for public release, distribution is unlimited. Cited in the record: "Barrier methods for critical exponent problems in geometric analysis and mathematical physics," J. Erway and M. Holst, submitted for publication.

  7. Cost Analysis at the Local Level: Applications and Attitudes. Paper and Report Series No. 103.

    ERIC Educational Resources Information Center

    Smith, Jana Kay

    This study reports the results of a survey sent to 67 metropolitan school district evaluators. The survey assessed past and anticipated conduct of cost analysis methods, as well as attitudes toward the use of these methods. The instrument used contained many items taken from a survey instrument used in a previous study of cost analysis methods at…

  8. Scintillation Method of Analysis for Determination of Properties of Wear Particles in Lubricating Oils,

    DTIC Science & Technology

    1998-01-01

    Scintillation Method of Analysis for Determination of Properties of Wear Particles in Lubricating Oils. Andrey B. Alkhimov, Applied Physics Institute. Keywords: lubricating oils; metrological properties; scintillation spectral analysis; spectrometer; unit-to-unit diagnostics; wear particles. Introduction (fragment): "…filter. The use of air reduces the metrological properties of the method, but it saves the operators the trouble and expense of using argon and…"

  9. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  10. General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies

    PubMed Central

    Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong

    2013-01-01

    We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variants tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by calculating score statistics instead that only require fitting the null model for each study and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
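
    The aggregation step is compact enough to sketch: each study contributes a score vector U_k and covariance V_k for the variants in a gene, and the meta-analysis burden test sums them into a one-degree-of-freedom chi-square. Simulated inputs below; real implementations add heterogeneity handling and variance-component (SKAT-type) tests.

    ```python
    # Meta-analysis burden test from study-specific score statistics.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(4)
    m = 5                                          # variants in the gene
    studies = []
    for _ in range(3):                             # three studies
        U = rng.normal(0.4, 1.0, m)                # per-variant score statistics
        A = rng.normal(size=(m, m))
        V = A @ A.T + m * np.eye(m)                # positive-definite covariance
        studies.append((U, V))

    U_meta = sum(U for U, _ in studies)            # aggregate scores
    V_meta = sum(V for _, V in studies)            # aggregate covariances

    w = np.ones(m)                                 # burden weights (equal here)
    T = (w @ U_meta) ** 2 / (w @ V_meta @ w)       # burden score statistic
    print(f"T = {T:.2f}, p = {chi2.sf(T, df=1):.3g}")
    ```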

  11. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the greatest impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min), and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a difference of 2.78% in analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting, and the method was successfully applied to fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
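
    The screening step can be made concrete: a 2^(6-2) fractional factorial covers the six listed GC factors in 16 runs. The generators chosen below (E = ABC, F = BCD) are a conventional resolution-IV choice, assumed here rather than taken from the paper.

    ```python
    # Generate a 2**(6-2) fractional factorial design in coded units (+/-1).
    from itertools import product

    factors = ["inj_vol", "inj_temp", "oven_rate", "det_temp", "flow", "solvent"]

    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):    # full factorial in A-D
        e = a * b * c                                # generator E = ABC
        f = b * c * d                                # generator F = BCD
        runs.append((a, b, c, d, e, f))

    print(" ".join(f"{n:>9}" for n in factors))
    for run in runs:
        print(" ".join(f"{v:>9d}" for v in run))     # 16 coded runs
    ```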

  12. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  13. Yining Zeng | NREL

    Science.gov Websites

    Researcher profile page (fragments): "…pretreatment conditions and biological digestion methods, which might not be detected by large-scale…"; listed publications include "Coherent Raman Microscopy Analysis of Plant Cell Walls," Biomass Conversion: Methods and Protocols, Methods in Molecular Biology (2012), and "Chemical, Ultrastructural and Supramolecular Analysis…"

  14. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)

  15. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    The objective was to develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code PARCS.

  16. The Structure of Mixed Method Studies in Educational Research: A Content Analysis

    ERIC Educational Resources Information Center

    Bryant, Lauren H.

    2011-01-01

    Educational researchers are beginning to use mixed methods designs to answer complex research questions. This content analysis investigates the structure and use of mixed methods in educational research in order to work toward a more standardized presentation. I used a concurrent mixed methods approach to analyze 30 studies from three prominent…

  17. An Improved Manual Method for NOx Emission Measurement.

    ERIC Educational Resources Information Center

    Dee, L. A.; And Others

    The current manual NOx sampling and analysis method was evaluated. Improved time-integrated sampling and rapid analysis methods were developed. In the new method, the sample gas is drawn through a heated bed of uniquely active, crystalline PbO2, where NOx is quantitatively absorbed. Nitrate ion is later extracted with water and the…

  18. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    PubMed

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of the absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated via power, false-positive rate, and receiver operating characteristic curves for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.
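
    The absolute-statistic variant is easy to sketch: score a gene set by the mean |t| of its members and compare against random gene sets of the same size. Toy data below; note the bidirectional signal that a signed statistic would cancel out.

    ```python
    # Gene-sampling gene-set test with the absolute gene statistic.
    import numpy as np

    rng = np.random.default_rng(5)
    n_genes = 5000
    t_stats = rng.standard_t(df=10, size=n_genes)     # genome-wide gene t-scores
    gene_set = rng.choice(n_genes, size=50, replace=False)
    t_stats[gene_set[:25]] += 1.5                     # half the set shifted up
    t_stats[gene_set[25:]] -= 1.5                     # half down: bidirectional

    observed = np.abs(t_stats[gene_set]).mean()       # absolute set statistic

    null = np.array([np.abs(t_stats[rng.choice(n_genes, 50, False)]).mean()
                     for _ in range(2000)])           # gene-sampling null
    p = (1 + np.sum(null >= observed)) / (1 + null.size)
    print(f"set score = {observed:.2f}, p = {p:.4f}")
    ```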

  19. The Effect of Laminar Flow on Rotor Hover Performance

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Martin, Preston B.

    2017-01-01

    The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade-element/momentum method coupled to an airfoil analysis method that includes the full e^N transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even up to high disk loadings approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and a benchmark against which to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
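
    The disk-loading discussion rests on momentum-theory bookkeeping: ideal hover power bounds the achievable power loading, and a figure of merit scales it for real rotors. A sketch with an assumed figure of merit; the paper's blade-element/momentum analysis adds profile drag and transition effects on top of this.

    ```python
    # Momentum theory: induced velocity v_i = sqrt(DL / (2*rho)), ideal power
    # loading T/P = 550 / v_i in lb/hp, scaled by an assumed figure of merit.
    import numpy as np

    rho = 0.002377                  # slug/ft^3, sea level
    disk_loading = np.array([2.0, 5.0, 10.0, 20.0])   # T/A, psf
    fm = 0.75                       # assumed figure of merit

    v_induced = np.sqrt(disk_loading / (2.0 * rho))   # ft/s
    pl_ideal = 550.0 / v_induced                      # lb/hp (1 hp = 550 ft*lb/s)
    pl_real = fm * pl_ideal

    for dl, pli, plr in zip(disk_loading, pl_ideal, pl_real):
        print(f"DL={dl:5.1f} psf  ideal T/P={pli:5.1f} lb/hp  FM=0.75: {plr:5.1f}")
    ```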

  20. Quantitative Determination of Cannabinoids in Cannabis and Cannabis Products Using Ultra-High-Performance Supercritical Fluid Chromatography and Diode Array/Mass Spectrometric Detection.

    PubMed

    Wang, Mei; Wang, Yan-Hong; Avula, Bharathi; Radwan, Mohamed M; Wanas, Amira S; Mehmedic, Zlatko; van Antwerp, John; ElSohly, Mahmoud A; Khan, Ikhlas A

    2017-05-01

    Ultra-high-performance supercritical fluid chromatography (UHPSFC) is an efficient analytical technique that has not been fully employed for the analysis of cannabis. Here, a novel method was developed for the analysis of 30 cannabis plant extracts and preparations using UHPSFC/PDA-MS. Nine of the most abundant cannabinoids, viz. CBD, Δ8-THC, THCV, Δ9-THC, CBN, CBG, THCA-A, CBDA, and CBGA, were quantitatively determined (RSDs < 6.9%). Unlike GC methods, no derivatization or decarboxylation was required prior to UHPSFC analysis. The UHPSFC chromatographic separation of cannabinoids displayed an inverse elution order compared to UHPLC. Combined with PDA-MS, this orthogonality is valuable for discrimination of cannabinoids in complex matrices. The developed method was validated, and the quantification results were compared with a standard UHPLC method; the RSDs of the two methods were within ±13.0%. Finally, chemometric analyses including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) were used to differentiate between cannabis samples. © 2016 American Academy of Forensic Sciences.

  1. Heuristic Analysis Model of Nitrided Layers’ Formation Consisting of the Image Processing and Analysis and Elements of Artificial Intelligence

    PubMed Central

    Wójcicki, Tomasz; Nowicki, Michał

    2016-01-01

    The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparation of nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving the quality of the images, segmentation, morphological transformations, and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. The validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389

  2. SSAW: A new sequence similarity analysis method based on the stationary discrete wavelet transform.

    PubMed

    Lin, Jie; Wei, Jing; Adjeroh, Donald; Jiang, Bing-Hua; Jiang, Yue

    2018-05-02

    Alignment-free sequence similarity analysis methods often lead to significant savings in computational time over alignment-based counterparts. A new alignment-free sequence similarity analysis method, called SSAW, is proposed. SSAW stands for Sequence Similarity Analysis using the Stationary Discrete Wavelet Transform (SDWT). It extracts k-mers from a sequence, then maps each k-mer to a complex number field. The series of complex numbers so formed is then transformed into feature vectors using the stationary discrete wavelet transform. After these steps, the original sequence is turned into a feature vector with numeric values, which can then be used for clustering and/or classification. Using two different types of applications, namely clustering and classification, we compared SSAW against state-of-the-art alignment-free sequence analysis methods. SSAW demonstrates competitive or superior performance in terms of standard indicators such as accuracy, F-score, precision, and recall, and its running time was significantly better in most cases. These qualities make SSAW a suitable method for sequence analysis, especially given the rapidly increasing volumes of sequence data handled by most modern applications.

  3. A dimension-wise analysis method for the structural-acoustic system with interval parameters

    NASA Astrophysics Data System (ADS)

    Xu, Menghui; Du, Jianke; Wang, Chong; Li, Yunlong

    2017-04-01

    The interval structural-acoustic analysis is mainly accomplished by interval and subinterval perturbation methods. Potential limitations of these intrusive methods include overestimation or the interval translation effect for the former and prohibitive computational cost for the latter. In this paper, a dimension-wise analysis method is therefore proposed to overcome these potential limitations. In this method, a sectional curve of the system response surface along each input dimension is first extracted, and its minimal and maximal points are identified based on its Legendre polynomial approximation. Two input vectors, i.e. the minimal and maximal input vectors, are then assembled dimension-wise from the minimal and maximal points of all sectional curves. Finally, the lower and upper bounds of the system response are computed by deterministic finite element analysis at the two input vectors. Two numerical examples are studied to demonstrate the effectiveness of the proposed method and show that, compared to the interval and subinterval perturbation methods, better accuracy is achieved without much compromise on efficiency, especially for nonlinear problems with large interval parameters.

  4. Stability indicating HPLC-DAD method for analysis of Ketorolac binary and ternary mixtures in eye drops: Quantitative analysis in rabbit aqueous humor.

    PubMed

    El Yazbi, Fawzy A; Hassan, Ekram M; Khamis, Essam F; Ragab, Marwa A A; Hamdy, Mohamed M A

    2017-11-15

    Ketorolac tromethamine (KTC) with phenylephrine hydrochloride (PHE) binary mixture (mixture 1) and their ternary mixture with chlorpheniramine maleate (CPM) (mixture 2) were analyzed using a validated HPLC-DAD method. The developed method was suitable for in vitro analysis as well as quantitative analysis of the targeted mixtures in rabbit aqueous humor. The analysis in dosage form (eye drops) was stability-indicating, with the drugs separated from possible degradation products arising from different stress conditions (in vitro analysis). For analysis in aqueous humor, guaifenesin (GUF) was used as internal standard and the method was validated according to FDA regulations for analysis in biological fluids. An Agilent 5 HC-C18(2) 150×4.6 mm column was used as stationary phase with a gradient eluting solvent of 20 mM phosphate buffer pH 4.6 containing 0.2% triethylamine, and acetonitrile. The drugs were resolved with retention times of 2.41, 5.26, 7.92, and 9.64 min for PHE, GUF, KTC, and CPM, respectively. The method was sensitive and selective enough to analyze the three drugs simultaneously in the presence of possible forced degradation products and dosage form excipients (in vitro analysis), and also, with the internal standard, in the presence of aqueous humor interferences (analysis in biological fluid), at a single wavelength (261 nm). No extraction procedure was required for analysis in aqueous humor. The simplicity of the method emphasizes its capability to analyze the drugs in vivo (in rabbit aqueous humor) and in vitro (in pharmaceutical formulations). Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Evaluation of Yogurt Microstructure Using Confocal Laser Scanning Microscopy and Image Analysis.

    PubMed

    Skytte, Jacob L; Ghita, Ovidiu; Whelan, Paul F; Andersen, Ulf; Møller, Flemming; Dahl, Anders B; Larsen, Rasmus

    2015-06-01

    The microstructure of protein networks in yogurts defines important physical properties of the yogurt and thereby partly its quality. Imaging this protein network using confocal scanning laser microscopy (CSLM) has shown good results, and CSLM has become a standard measuring technique for fermented dairy products. When studying such networks, hundreds of images can be obtained, and here image analysis methods are essential for using the images in statistical analysis. Previously, methods including gray-level co-occurrence matrix analysis and fractal analysis have been used with success. However, a range of other image texture characterization methods exists. These methods describe an image by a frequency distribution of predefined image features (denoted textons). Our contribution is an investigation of the choice of image analysis method through a comparative study of 7 major approaches to image texture description. Here, CSLM images from a yogurt fermentation study are investigated, in which production factors including fat content, protein content, heat treatment, and incubation temperature are varied. The descriptors are evaluated through nearest neighbor classification, variance analysis, and cluster analysis. Our investigation suggests that the texton-based descriptors provide a fuller description of the images compared to gray-level co-occurrence matrix descriptors and fractal analysis, while still being as applicable and in some cases as easy to tune. © 2015 Institute of Food Technologists®

  6. 3-D surface profilometry based on modulation measurement by applying wavelet transform method

    NASA Astrophysics Data System (ADS)

    Zhong, Min; Chen, Feng; Xiao, Chao; Wei, Yongchao

    2017-01-01

    A new analysis of 3-D surface profilometry based on modulation measurement technique by the application of Wavelet Transform method is proposed. As a tool excelling for its multi-resolution and localization in the time and frequency domains, Wavelet Transform method with good localized time-frequency analysis ability and effective de-noizing capacity can extract the modulation distribution more accurately than Fourier Transform method. Especially for the analysis of complex object, more details of the measured object can be well remained. In this paper, the theoretical derivation of Wavelet Transform method that obtains the modulation values from a captured fringe pattern is given. Both computer simulation and elementary experiment are used to show the validity of the proposed method by making a comparison with the results of Fourier Transform method. The results show that the Wavelet Transform method has a better performance than the Fourier Transform method in modulation values retrieval.

  7. Security Analysis and Improvements to the PsychoPass Method

    PubMed Central

    2013-01-01

    Background In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective To perform a security analysis of the PsychoPass method and outline the limitations of and possible improvements to the method. Methods We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. Results The first issue with the PsychoPass method is that it requires password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 key distances apart. Conclusions The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength. PMID:23942458
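
    The strength argument reduces to search-space arithmetic: expected crack time is half the space divided by the guess rate. The spaces and rate below are illustrative assumptions chosen to mirror the paper's qualitative conclusion, not its actual figures.

    ```python
    # Brute-force time estimate; all numbers are illustrative assumptions.
    def crack_years(space: float, guesses_per_s: float) -> float:
        """Expected time to find a password by exhaustive search, in years."""
        return (space / 2) / guesses_per_s / (3600 * 24 * 365)

    RATE = 1e9                      # assumed guessing rate, guesses/second

    # Original method: 24 characters, but each keystroke is constrained to be
    # adjacent to the previous one (~5 continuations assumed per step).
    weak_space = 30 * 5 ** 23
    # Improved method: 10 keystrokes, with SHIFT/ALT-GR combinations opening
    # up roughly the full ~94 printable symbols per keystroke.
    strong_space = 94 ** 10

    print(f"original (structured, 24 chars): {crack_years(weak_space, RATE):8.1f} years")
    print(f"improved (10 keys, modifiers)  : {crack_years(strong_space, RATE):8.1f} years")
    ```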

  8. A Comparison of the Effectiveness of Two Design Methodologies in a Secondary School Setting.

    ERIC Educational Resources Information Center

    Cannizzaro, Brenton; Boughton, Doug

    1998-01-01

    Examines the effectiveness of the analysis-synthesis and generator-conjuncture-analysis models of design education. Concludes that the generator-conjecture-analysis design method produced student design product of a slightly higher standard than the analysis-synthesis design method. Discusses the findings in more detail and considers implications.…

  9. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
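
    A skeleton of the computational side of PTHA: Monte Carlo sampling of sources, a runup model, and a Poissonian hazard curve P(exceed in T) = 1 - exp(-rate*T). The runup scaling law below is a placeholder assumption, not a validated propagation model.

    ```python
    # Monte Carlo PTHA skeleton with a placeholder runup model.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 100_000
    # Source magnitudes from a truncated Gutenberg-Richter relation (b = 1).
    u = rng.random(n)
    mw = 7.0 - np.log10(1 - u * (1 - 10 ** -(9.2 - 7.0)))   # 7.0 <= Mw <= 9.2
    annual_rate = 0.05                                      # events/yr, Mw >= 7

    # Placeholder runup model: log-linear in magnitude with lognormal scatter.
    runup = 10 ** (0.6 * (mw - 7.0) - 0.3) * rng.lognormal(0, 0.5, n)

    for r0 in (0.5, 1.0, 2.0, 5.0):                         # thresholds, m
        rate_exceed = annual_rate * np.mean(runup >= r0)
        p50 = 1 - np.exp(-rate_exceed * 50.0)               # 50-year probability
        print(f"runup >= {r0:3.1f} m: rate={rate_exceed:.4f}/yr, P(50yr)={p50:.3f}")
    ```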

  10. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.

  11. Frequency analysis of uncertain structures using imprecise probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modares, Mehdi; Bergerson, Joshua

    2015-01-01

    Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods along with discussions on their computational efficiency.
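
    The flavor of the interval Monte-Carlo idea can be sketched on a two-degree-of-freedom spring-mass stand-in for the finite element model: sample stiffnesses within their intervals and keep the extreme natural frequencies. The real methods use p-boxes and sharper interval arithmetic rather than plain sampling.

    ```python
    # Bounds on natural frequencies from interval-valued stiffnesses.
    import numpy as np

    rng = np.random.default_rng(7)
    m1 = m2 = 1.0                                   # masses, kg
    k_lo, k_hi = 900.0, 1100.0                      # stiffness interval, N/m

    def frequencies(k1, k2):
        K = np.array([[k1 + k2, -k2], [-k2, k2]])
        M = np.diag([m1, m2])
        lam = np.linalg.eigvalsh(np.linalg.inv(M) @ K)
        return np.sqrt(lam) / (2 * np.pi)           # natural frequencies, Hz

    samples = np.array([frequencies(rng.uniform(k_lo, k_hi),
                                    rng.uniform(k_lo, k_hi))
                        for _ in range(5000)])
    print("f1 in [%.2f, %.2f] Hz" % (samples[:, 0].min(), samples[:, 0].max()))
    print("f2 in [%.2f, %.2f] Hz" % (samples[:, 1].min(), samples[:, 1].max()))
    ```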

  12. [Methods of statistical analysis in differential diagnostics of the degree of brain glioma anaplasia during preoperative stage].

    PubMed

    Glavatskiĭ, A Ia; Guzhovskaia, N V; Lysenko, S N; Kulik, A V

    2005-12-01

    The authors propose a possible preoperative diagnostics of the degree of supratentorial brain glioma anaplasia using statistical analysis methods. It relies on a complex examination of 934 patients with grade I-IV anaplasia who had been treated in the Institute of Neurosurgery from 1990 to 2004. The use of statistical analysis methods for differential diagnostics of the degree of brain glioma anaplasia may optimize the diagnostic algorithm, increase the reliability of obtained data, and in some cases avoid unwarranted surgical interventions. Clinically important signs for the use of statistical analysis methods directed at preoperative diagnostics of brain glioma anaplasia have been defined.

  13. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a case study of remote teleoperation that assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
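
    The scheduling-theory machinery referred to above centers on tests such as exact response-time analysis for fixed-priority tasks: R_i = C_i + sum over higher-priority j of ceil(R_i/T_j)*C_j, iterated to a fixed point. A sketch with invented task parameters for teleoperation-style loops:

    ```python
    # Exact response-time analysis under rate-monotonic priorities.
    from math import ceil

    tasks = [                       # (name, period_ms, wcet_ms); RM: shorter
        ("servo_loop", 10, 3),      # period = higher priority
        ("comm_handler", 40, 10),
        ("trajectory", 100, 20),
    ]

    def response_time(i: int) -> int:
        c = tasks[i][2]
        r = c
        while True:                 # iterate R = C + interference to fixed point
            r_next = c + sum(ceil(r / t) * w for _, t, w in tasks[:i])
            if r_next == r:
                return r
            r = r_next

    for i, (name, period, _) in enumerate(tasks):
        r = response_time(i)
        ok = "meets" if r <= period else "MISSES"
        print(f"{name:12s} R={r:3d} ms, deadline {period:3d} ms -> {ok}")
    ```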

  15. Analysis of the nature and cause of turbulence upset using airline flight records

    NASA Technical Reports Server (NTRS)

    Parks, E. K.; Bach, R. E., Jr.; Wingrove, R. C.

    1982-01-01

    The development and application of methods for determining aircraft motions and related winds, using data normally recorded during airline flight operations, are described. The methods are being developed, in cooperation with the National Transportation Safety Board, to aid in the analysis and understanding of circumstances associated with aircraft accidents or incidents. Data from a recent DC-10 encounter with severe, high-altitude turbulence are used to illustrate the methods. The analysis of this encounter shows the turbulence to be a series of equally spaced horizontal swirls known as 'cat's eyes' vortices. The use of flight-data analysis methods to identify this type of turbulence phenomenon is presented for the first time.

  16. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 2: Data

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Schmidt, D. K.

    1985-01-01

    Two analysis methods are applied to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open-loop modal analysis technique. This method considers the effect of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Both analyses indicated that dynamic aeroelastic effects caused a degradation in vehicle tracking performance, based on the evaluation of some simulation results. Volume 2 consists of the state-variable models of the flexible aircraft configurations used in the analysis applications, mode shape plots for the structural modes, numerical results from the modal analysis, frequency response plots from the pilot-in-the-loop analysis, and a listing of the modal analysis computer program.

  17. Social network analysis: Presenting an underused method for nursing research.

    PubMed

    Parnell, James Michael; Robinson, Jennifer C

    2018-06-01

    This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English-language and interpreted literature was searched in Ovid Healthstar, CINAHL, PubMed Central, Scopus, and hard-copy texts from 1965-2017. Social network analysis first emerged in the nursing literature in 1995 and has appeared only minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios are illustrative of three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, there is methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks, and subgroups within networks. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.
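
    The three analysis levels named in the conclusion (individual attributes, whole network, subgroups) map directly onto standard library calls. A small sketch with networkx on an invented collaboration network:

    ```python
    # Three levels of social network analysis on a toy graph.
    import networkx as nx
    from networkx.algorithms import community

    edges = [("Ana", "Ben"), ("Ana", "Cara"), ("Ben", "Cara"),
             ("Cara", "Dev"), ("Dev", "Eli"), ("Dev", "Fay"), ("Eli", "Fay")]
    G = nx.Graph(edges)

    centrality = nx.degree_centrality(G)                  # individual level
    print(max(centrality, key=centrality.get), centrality)

    print("density:", nx.density(G))                      # whole-network level

    groups = community.greedy_modularity_communities(G)   # subgroups
    print([sorted(c) for c in groups])
    ```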

  18. Quality Analysis of Chlorogenic Acid and Hyperoside in Crataegi fructus

    PubMed Central

    Weon, Jin Bae; Jung, Youn Sik; Ma, Choong Je

    2016-01-01

    Background: Crataegi fructus is a herbal medicine used for strengthening the stomach, sterilization, and alcohol detoxification. Chlorogenic acid and hyperoside are the major compounds in Crataegi fructus. Objective: In this study, we established a novel high-performance liquid chromatography (HPLC)-diode array detection analysis method for chlorogenic acid and hyperoside for quality control of Crataegi fructus. Materials and Methods: HPLC analysis was achieved on a reverse-phase C18 column (5 μm, 4.6 mm × 250 mm) using water and acetonitrile as mobile phase with a gradient system. The method was validated for linearity, precision, and accuracy. 31 batches of Crataegi fructus samples collected from Korea and China were analyzed using the HPLC fingerprint of the developed HPLC method. Then, the contents of chlorogenic acid and hyperoside were compared for quality evaluation of Crataegi fructus. Results: The results showed that the average contents (w/w %) of chlorogenic acid and hyperoside in Crataegi fructus collected from Korea were 0.0438% and 0.0416%, respectively, while the samples from China showed average contents (w/w %) of 0.0399% and 0.0325%, respectively. Conclusion: In conclusion, the established HPLC analysis method was stable and could provide efficient quality evaluation for monitoring of commercial Crataegi fructus. SUMMARY: A quantitative analysis method for chlorogenic acid and hyperoside in Crataegi fructus was developed by high-performance liquid chromatography (HPLC)-diode array detection; the established HPLC analysis method was validated for linearity, precision, and accuracy; the developed method was successfully applied for quantitative analysis of Crataegi fructus samples collected from Korea and China. Abbreviations used: HPLC: High-performance liquid chromatography, GC: Gas chromatography, MS: Mass spectrometer, LOD: Limit of detection, LOQ: Limit of quantification, RSD: Relative standard deviation, RRT: Relative retention time, RPA: Relative peak area. PMID:27076744

  19. Visual Aggregate Analysis of Eligibility Features of Clinical Trials

    PubMed Central

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-01-01

    Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940

  20. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  1. Correlation between Wavelength Dispersive X-ray Fluorescence (WDXRF) analysis of hardened concrete for chlorides vs. Atomic Absorption (AA) analysis in accordance with AASHTO T-260; sampling and testing for chloride ion in concrete and concrete raw materials

    DOT National Transportation Integrated Search

    2014-04-01

    A correlation between Wavelength Dispersive X-ray Fluorescence (WDXRF) analysis of hardened concrete for chlorides and Atomic Absorption (AA) analysis (current method AASHTO T-260, procedure B) has been found and a new method of analysis has been ...

  2. Survival analysis: Part I — analysis of time-to-event

    PubMed Central

    2018-01-01

    Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
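
    As a concrete illustration of the estimators named above, here is a minimal sketch (hypothetical durations and censoring flags) of a Kaplan-Meier estimate and a log-rank test, using the third-party lifelines package:

      # Hedged sketch: Kaplan-Meier estimate and log-rank test on toy data.
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      durations_a = [5, 8, 12, 20, 22, 30]   # follow-up times, group A (hypothetical)
      events_a    = [1, 1, 0, 1, 0, 1]       # 1 = event observed, 0 = censored
      durations_b = [3, 6, 7, 10, 15, 18]
      events_b    = [1, 1, 1, 0, 1, 1]

      kmf = KaplanMeierFitter()
      kmf.fit(durations_a, event_observed=events_a, label="group A")
      print(kmf.survival_function_)          # step-function survival estimate

      res = logrank_test(durations_a, durations_b,
                         event_observed_A=events_a, event_observed_B=events_b)
      print(f"log-rank p = {res.p_value:.3f}")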

  3. The Oxygen Flask Method

    ERIC Educational Resources Information Center

    Boulton, L. H.

    1973-01-01

    Discusses application of Schoniger's method of quantitative organic elemental analysis in teaching of qualitative analysis of the halogens, nitrogen, sulphur, and phosphorus. Indicates that the oxygen flask method is safe and suitable for both high school and college courses because of simple apparatus requirements. (CC)

  4. Validated Test Method 5030C: Purge-and-Trap for Aqueous Samples

    EPA Pesticide Factsheets

    This method describes a purge-and-trap procedure for the analysis of volatile organic compounds in aqueous samples and water-miscible liquid samples. It also describes the analysis of high-concentration soil and waste sample extracts prepared in Method 5035.

  5. LAKE DATA ANALYSIS AND NUTRIENT BUDGET MODELING

    EPA Science Inventory

    Several quantitative methods that may be useful for lake trophic quality management planning are discussed and illustrated. An emphasis is placed on scientific methods in research, data analysis, and modeling. Proper use of statistical methods is also stressed, along with conside...

  6. Application of Gray Relational Analysis Method in Comprehensive Evaluation on the Customer Satisfaction of Automobile 4S Enterprises

    NASA Astrophysics Data System (ADS)

    Cenglin, Yao

    For car sales enterprises (4S enterprises) to continuously boost sales and expand their customer base, an important method is to enhance customer satisfaction. The customer satisfaction of 4S enterprises depends on many factors. Using the gray relational analysis method, these various factors can be combined into a single evaluation of customer satisfaction. Through vertical comparison over time, car sales enterprises can identify the specific factors that will improve customer satisfaction and thereby increase sales volume and benefits. The gray relational analysis method has thus become an effective means of analyzing and evaluating such enterprises.
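
    For readers unfamiliar with the technique, the following is a minimal sketch of gray relational analysis on hypothetical satisfaction scores; the factor values, the equal weighting, and the distinguishing coefficient rho = 0.5 are illustrative assumptions, not the paper's data:

      # Hedged sketch of gray relational analysis (GRA): rank alternatives
      # (e.g., 4S dealerships) against an ideal reference on several factors.
      import numpy as np

      # rows = enterprises, cols = satisfaction factors (higher is better)
      X = np.array([[0.82, 0.75, 0.90],
                    [0.78, 0.88, 0.70],
                    [0.91, 0.69, 0.85]])

      # 1. Normalize each factor to [0, 1] (larger-the-better convention).
      Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

      # 2. Deviation from the ideal reference series (all ones after normalization).
      delta = np.abs(1.0 - Xn)

      # 3. Gray relational coefficients, distinguishing coefficient rho = 0.5.
      rho = 0.5
      xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

      # 4. Gray relational grade: mean coefficient per enterprise (equal weights).
      grade = xi.mean(axis=1)
      print("relational grades:", np.round(grade, 3))  # higher = closer to ideal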

  7. Analysis of Flexible Bars and Frames with Large Displacements of Nodes By Finite Element Method in the Form of Classical Mixed Method

    NASA Astrophysics Data System (ADS)

    Ignatyev, A. V.; Ignatyev, V. A.; Onischenko, E. V.

    2017-11-01

    This article continues the authors' work on the development of algorithms that implement the finite element method in the form of a classical mixed method for the analysis of geometrically nonlinear bar systems [1-3]. The paper describes an improved algorithm for forming the system of nonlinear governing equations for flexible plane frames and bars with large displacements of nodes, based on the finite element method in classical mixed form and the use of a step-by-step loading procedure. An example of the analysis is given.

  8. Least Squares Moving-Window Spectral Analysis.

    PubMed

    Lee, Young Jong

    2017-08-01

    Least squares regression is proposed as a moving-window method for the analysis of a series of spectra acquired as a function of an external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
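
    A minimal sketch of the moving-window least squares idea on synthetic data (the window size is an assumption): fit a first-degree polynomial inside each window and take the slope as the derivative estimate, which remains valid for nonuniform perturbation spacing:

      # Hedged sketch of an LSMW slope estimate over a spectral series.
      import numpy as np

      def lsmw_slope(perturbation, spectra, window=5):
          """spectra: (n_perturbation, n_wavenumber); returns the fitted slope
          at each window center, per wavenumber channel."""
          half = window // 2
          n = len(perturbation)
          slopes = np.empty((n - 2 * half, spectra.shape[1]))
          for i in range(half, n - half):
              p = perturbation[i - half:i + half + 1]
              y = spectra[i - half:i + half + 1]
              coeffs = np.polyfit(p, y, 1)   # row 0 of coeffs is the slope
              slopes[i - half] = coeffs[0]
          return slopes

      pert = np.sort(np.random.uniform(0, 10, 21))             # nonuniform spacing
      spec = np.outer(pert, np.ones(100)) + 0.05 * np.random.randn(21, 100)
      print(lsmw_slope(pert, spec).mean())                     # ~1.0 for this ramp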

  9. An approximation method for configuration optimization of trusses

    NASA Technical Reports Server (NTRS)

    Hansen, Scott R.; Vanderplaats, Garret N.

    1988-01-01

    Two- and three-dimensional elastic trusses are designed for minimum weight by varying the areas of the members and the location of the joints. Constraints on member stresses and Euler buckling are imposed and multiple static loading conditions are considered. The method presented here utilizes an approximate structural analysis based on first order Taylor series expansions of the member forces. A numerical optimizer minimizes the weight of the truss using information from the approximate structural analysis. Comparisons with results from other methods are made. It is shown that the method of forming an approximate structural analysis based on linearized member forces leads to a highly efficient method of truss configuration optimization.

  10. Mathematical Methods in Wave Propagation: Part 2--Non-Linear Wave Front Analysis

    ERIC Educational Resources Information Center

    Jeffrey, Alan

    1971-01-01

    The paper presents applications and methods of analysis for non-linear hyperbolic partial differential equations. The paper is concluded by an account of wave front analysis as applied to the piston problem of gas dynamics. (JG)

  11. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  12. The use of the wavelet cluster analysis for asteroid family determination

    NASA Technical Reports Server (NTRS)

    Benjoya, Phillippe; Slezak, E.; Froeschle, Claude

    1992-01-01

    Asteroid family determination has long been dependent on the analysis method used. A new cluster analysis based on the wavelet transform has allowed an automatic definition of families with a degree of significance versus randomness. The method is in fact rather general and can be applied to any kind of structural analysis; here we concentrate on its main features. The analysis has been performed on the set of 4100 asteroid proper elements computed by Milani and Knezevic (see Milani and Knezevic 1990). Twenty-one families have been found, and the influence of the chosen metric has been tested. The results have been compared to those of Zappala et al. (see Zappala et al. 1990), obtained by using a completely different method applied to the same set of data. For the first time, a good overlap has been found between the results of both methods, not only for the big, well-known families but also for the smallest ones.

  13. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
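
    NESSUS itself uses fast probability integration; as a conceptual stand-in only, the sketch below propagates a random modulus and thickness through a toy cantilever tip-displacement formula by plain Monte Carlo. All numbers are illustrative, not SSME data:

      # Hedged sketch: Monte Carlo propagation of random material/geometry
      # variables to response statistics (conceptually what a probabilistic
      # structural analysis computes, though far less efficiently than FPI).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      P, L, b = 500.0, 0.1, 0.02                # load (N), length (m), width (m) -- illustrative
      E = rng.normal(2.0e11, 1.0e10, n)         # modulus of elasticity (Pa), random
      t = rng.normal(0.004, 0.0002, n)          # blade thickness (m), random
      I = b * t**3 / 12.0                       # second moment of area (m^4)

      tip = P * L**3 / (3.0 * E * I)            # cantilever tip displacement (m)
      print(f"mean = {tip.mean():.3e} m, std = {tip.std():.3e} m")
      print(f"P[tip > 1.0e-2 m] = {(tip > 1.0e-2).mean():.4f}")  # small exceedance probability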

  14. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
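
    For orientation, a minimal Dakota input deck might look like the sketch below; the keywords follow the Dakota reference manual, while the variable names, bounds, and driver script are hypothetical:

      # Hedged sketch of a Dakota input deck (LHS sampling study).
      environment
        tabular_data
          tabular_data_file = 'samples.dat'

      method
        sampling
          sample_type lhs
          samples = 200

      variables
        uniform_uncertain = 2
          descriptors   'thickness'  'modulus'
          lower_bounds   0.9e-3       1.8e11
          upper_bounds   1.1e-3       2.2e11

      interface
        fork
          analysis_driver = 'driver.sh'

      responses
        response_functions = 1
        no_gradients
        no_hessians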

  15. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  16. Tracking B-Cell Repertoires and Clonal Histories in Normal and Malignant Lymphocytes.

    PubMed

    Weston-Bell, Nicola J; Cowan, Graeme; Sahota, Surinder S

    2017-01-01

    Methods for tracking B-cell repertoires and clonal history in normal and malignant B-cells based on immunoglobulin variable region (IGV) gene analysis have developed rapidly with the advent of massive parallel next-generation sequencing (mpNGS) protocols. mpNGS permits a depth of analysis of IGV genes not hitherto feasible, and presents challenges of bioinformatics analysis, which can be readily met by current pipelines. This strategy offers a potential resolution of B-cell usage at a depth that may capture fully the natural state, in a given biological setting. Conventional methods based on RT-PCR amplification and Sanger sequencing are also available where mpNGS is not accessible. Each method offers distinct advantages. Conventional methods for IGV gene sequencing are readily adaptable to most laboratories and provide an ease of analysis to capture salient features of B-cell use. This chapter describes two methods in detail for analysis of IGV genes, mpNGS and conventional RT-PCR with Sanger sequencing.

  17. The Researches on Damage Detection Method for Truss Structures

    NASA Astrophysics Data System (ADS)

    Wang, Meng Hong; Cao, Xiao Nan

    2018-06-01

    This paper presents an effective method to detect damage in truss structures. Numerical simulation and experimental analysis were carried out on a damaged truss structure under instantaneous excitation. The ideal excitation point and appropriate hammering method were determined to extract time domain signals under two working conditions. The frequency response function and principal component analysis were used for data processing, and the angle between the frequency response function vectors was selected as a damage index to ascertain the location of a damaged bar in the truss structure. In the numerical simulation, the time domain signal of all nodes was extracted to determine the location of the damaged bar. In the experimental analysis, the time domain signal of a portion of the nodes was extracted on the basis of an optimal sensor placement method based on the node strain energy coefficient. The results of the numerical simulation and experimental analysis showed that the damage detection method based on the frequency response function and principal component analysis could locate the damaged bar accurately.
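
    The damage index described above reduces to the angle between FRF vectors; below is a minimal sketch with synthetic stand-in FRFs (the paper's PCA compression and sensor-placement steps are omitted):

      # Hedged sketch: angle between frequency response function vectors as a
      # damage index; a larger angle indicates a larger change in the FRF.
      import numpy as np

      def frf_angle(h_ref, h_test):
          """Angle (radians) between two FRF magnitude vectors."""
          a = np.abs(h_ref).ravel()
          b = np.abs(h_test).ravel()
          cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos, -1.0, 1.0))

      rng = np.random.default_rng(1)
      h_intact = rng.random(256)                          # stand-in FRF magnitudes
      h_same = h_intact + 0.01 * rng.random(256)          # repeat measurement
      h_damaged = h_intact * np.linspace(1.0, 0.6, 256)   # stiffness-loss-like change

      print(f"intact vs intact : {frf_angle(h_intact, h_same):.4f} rad")
      print(f"intact vs damaged: {frf_angle(h_intact, h_damaged):.4f} rad")  # larger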

  18. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  19. The study on biomass fraction estimate methodology of municipal solid waste incinerator in Korea.

    PubMed

    Kang, Seongmin; Kim, Seungjin; Lee, Jeongwoo; Yun, Hyunki; Kim, Ki-Hyun; Jeon, Eui-Chan

    2016-10-01

    In Korea, the amount of greenhouse gases released due to waste materials was 14,800,000 t CO2eq in 2012, up from 5,000,000 t CO2eq in 2010. This included the amount released due to incineration, which has gradually increased since 2010. Incineration was found to be the biggest contributor to greenhouse gases, with 7,400,000 t CO2eq released in 2012. Therefore, with regard to the trading of greenhouse gas emissions initiated in 2015 and the writing of the national inventory report, it is important to increase the reliability of the measurements related to the incineration of waste materials. This research explored methods for estimating the biomass fraction at Korean MSW incinerator facilities and compared the biomass fractions obtained with the different estimation methods. The biomass fraction was estimated by the method using the default values of fossil carbon fraction suggested by the IPCC, the method using the solid waste composition, and the method using incinerator flue gas. The highest biomass fractions in Korean municipal solid waste incinerator facilities were estimated by the IPCC Default method, followed by the MSW analysis method and the Flue gas analysis method. Therefore, the difference in the biomass fraction estimate was the greatest between the IPCC Default and the Flue gas analysis methods; the difference between the MSW analysis and the Flue gas analysis methods was smaller. This suggests that the IPCC Default method cannot reflect the characteristics of Korean waste incinerator facilities and Korean MSW. Incineration is one of the most effective methods for the disposal of municipal solid waste (MSW). This paper investigates the applicability of using biomass content to estimate the amount of CO2 released, and compares the biomass contents determined by different methods in order to establish a method for estimating biomass in the MSW incinerator facilities of Korea. After analyzing the biomass contents of the collected solid waste samples and the flue gas samples, the results were compared with the Intergovernmental Panel on Climate Change (IPCC) method; the comparison indicates that the flue gas analysis method is better suited than the IPCC method for calculating the biomass fraction. These results are valuable for the design and operation of new incineration power plants, especially for the estimation of greenhouse gas emissions.

  20. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

    Technical Note BN-962, "An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems," by I. Babuška and W. G. Szymczak, University of Maryland College Park, Institute for Physical Science and Technology, March 1981.

  1. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in the LIFTS (Langley Integrated Fluid-Thermal-Structural) analyzer. The evolution and status of LIFTS are reviewed and illustrated through applications.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  3. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

    The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem-solving, specification, and system approaches to EMC control are summarized, and the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components are outlined. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  4. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress, and strain histories throughout a complete flight mission.

  5. Improving the sensitivity and accuracy of gamma activation analysis for the rapid determination of gold in mineral ores.

    PubMed

    Tickner, James; Ganly, Brianna; Lovric, Bojan; O'Dwyer, Joel

    2017-04-01

    Mining companies rely on chemical analysis methods to determine concentrations of gold in mineral ore samples. As gold is often mined commercially at concentrations around 1 part-per-million, it is necessary for any analysis method to provide good sensitivity as well as high absolute accuracy. We describe work to improve both the sensitivity and accuracy of the gamma activation analysis (GAA) method for gold. We present analysis results for several suites of ore samples and discuss a GAA facility designed to replace conventional chemical assay in industrial applications. Copyright © 2017. Published by Elsevier Ltd.

  6. Foundation Analysis East Coast Air Combat Maneuvering Range Offshore Kitty Hawk, North Carolina.

    DTIC Science & Technology

    1976-09-01

    Crest Engineering Inc., Tulsa OK, September 1976. Recoverable front matter lists an introduction, methods of analysis, and personnel resumes; the methods-of-analysis fragment notes the computation of pipe pile capacity curves and the driving of piling to the desired penetration.

  7. Geometrical optics analysis of the structural imperfection of retroreflection corner cubes with a nonlinear conjugate gradient method.

    PubMed

    Kim, Hwi; Min, Sung-Wook; Lee, Byoungho

    2008-12-01

    Geometrical optics analysis of the structural imperfection of retroreflection corner cubes is described. In the analysis, a geometrical optics model of six-beam reflection patterns generated by an imperfect retroreflection corner cube is developed, and its structural error extraction is formulated as a nonlinear optimization problem. The nonlinear conjugate gradient method is employed for solving the nonlinear optimization problem, and its detailed implementation is described. The proposed method of analysis is a mathematical basis for the nondestructive optical inspection of imperfectly fabricated retroreflection corner cubes.
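
    As a generic stand-in for the paper's formulation, the sketch below recovers two hypothetical corner-cube error parameters from a simulated six-spot pattern by nonlinear conjugate gradient minimization via scipy; the forward model is a toy, not the geometrical-optics model of the paper:

      # Hedged sketch: structural-error extraction posed as nonlinear
      # least squares, solved with scipy's conjugate gradient optimizer.
      import numpy as np
      from scipy.optimize import minimize

      def pattern(params):
          """Toy forward model: six spot positions as a function of two
          small dihedral-angle errors (radians)."""
          e1, e2 = params
          k = np.arange(6)
          return np.column_stack([np.cos(k * e1 + e2), np.sin(k * e2 - e1)])

      true = np.array([0.02, -0.015])
      measured = pattern(true) + 1e-4 * np.random.default_rng(2).standard_normal((6, 2))

      def objective(params):
          return np.sum((pattern(params) - measured) ** 2)

      res = minimize(objective, x0=np.zeros(2), method="CG")
      print("recovered errors:", res.x)   # should be near [0.02, -0.015]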

  8. Finite element modeling of truss structures with frequency-dependent material damping

    NASA Technical Reports Server (NTRS)

    Lesieutre, George A.

    1991-01-01

    A physically motivated modelling technique for structural dynamic analysis that accommodates frequency dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (AFT) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The AFT method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.

  9. A Study on Multi-Swing Stability Analysis of Power System using Damping Rate Inversion

    NASA Astrophysics Data System (ADS)

    Tsuji, Takao; Morii, Yuki; Oyama, Tsutomu; Hashiguchi, Takuhei; Goda, Tadahiro; Nomiyama, Fumitoshi; Kosugi, Narifumi

    In recent years, much attention has been paid to nonlinear analysis methods in the field of power system stability analysis. For multi-swing stability analysis in particular, the unstable limit cycle has an important meaning as a stability margin. A high-speed method for calculating the multi-swing stability boundary is required because real-time calculation of ATC is necessary to realize flexible wheeling trades. The authors have therefore developed a new method that calculates the unstable limit cycle based on damping rate inversion. Using the unstable limit cycle, it is possible to predict multi-swing stability at the time when the faulted transmission line is reclosed. The proposed method is tested on the Lorenz equations, a single-machine infinite-bus system model, and the IEEJ WEST10 system model.

  10. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    PubMed

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noises and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Participatory Design Methods for C2 Systems (Proceedings/Presentation)

    DTIC Science & Technology

    2006-01-01

    Cognitive task analysis (CTA) methods are used to design systems that support cognitive work, such as is accomplished in a network-centric environment. A variety of cognitive task analysis methodologies exist (Schraagen et al., 2000); however, many of these methods are viewed skeptically by a domain's ...

  12. A review of recent studies on the mechanisms and analysis methods of sub-synchronous oscillation in wind farms

    NASA Astrophysics Data System (ADS)

    Wang, Chenggen; Zhou, Qian; Gao, Shuning; Luo, Jia; Diao, Junchao; Zhao, Haoran; Bu, Jing

    2018-04-01

    This paper reviews recent studies of Sub-Synchronous Oscillation (SSO) in wind farms. Mechanisms and analysis methods are the main concerns of this article. A classification method is proposed that includes new types of oscillation occurring between wind farms and HVDC systems and oscillation caused by Permanent Magnet Synchronous Generators (PMSG). Characteristics of oscillation analysis techniques are summarized.

  13. A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network

    DTIC Science & Technology

    1980-07-08

    ... to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment ... the weight assigned to each variable whenever a new one is added. Reference: Jennrich, R. I. (1977), Stepwise discriminant analysis, in Statistical Methods for ...

  14. Meta-Analysis of a Continuous Outcome Combining Individual Patient Data and Aggregate Data: A Method Based on Simulated Individual Patient Data

    ERIC Educational Resources Information Center

    Yamaguchi, Yusuke; Sakamoto, Wataru; Goto, Masashi; Staessen, Jan A.; Wang, Jiguang; Gueyffier, Francois; Riley, Richard D.

    2014-01-01

    When some trials provide individual patient data (IPD) and the others provide only aggregate data (AD), meta-analysis methods for combining IPD and AD are required. We propose a method that reconstructs the missing IPD for AD trials by a Bayesian sampling procedure and then applies an IPD meta-analysis model to the mixture of simulated IPD and…

  15. The Shock and Vibration Digest, Volume 17, Number 8

    DTIC Science & Technology

    1985-08-01

    Recoverable fragments of this digest record: procedures based on acoustic power flow, statistical energy analysis (SEA), and modal methods [22-28]; and entry 85-1642, "Statistical Energy Analysis, Structural Resonances, and Beam Networks," L. J. Lee, Heriot-Watt Univ., Chambers St., Edinburgh EH1 1HX, Scotland (keywords: energy methods, structural resonance).

  16. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advance has been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.

  17. Parallel scalability and efficiency of vortex particle method for aeroelasticity analysis of bluff bodies

    NASA Astrophysics Data System (ADS)

    Tolba, Khaled Ibrahim; Morgenthal, Guido

    2018-01-01

    This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied for the numerical aerodynamic analysis of line-like structures. The numerical code runs on multicore CPU and GPU architectures using OpenCL framework. The focus of this paper is the analysis of the parallel efficiency and scalability of the method being applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The target is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to efficiently utilise the method within the computational resources available for a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming type computer.

  18. A Finite Element Analysis for Predicting the Residual Compressive Strength of Impact-Damaged Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Jackson, Wade C.

    2008-01-01

    A simple analysis method has been developed for predicting the residual compressive strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compressive loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.

  19. A Finite Element Analysis for Predicting the Residual Compression Strength of Impact-Damaged Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Jackson, Wade C.

    2008-01-01

    A simple analysis method has been developed for predicting the residual compression strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compression loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.

  20. Evaluation of quantification methods for real-time PCR minor groove binding hybridization probe assays.

    PubMed

    Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V

    2007-02-01

    Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
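
    A minimal sketch of the sigmoidal curve-fit idea mentioned above, on synthetic fluorescence data: fit a four-parameter logistic to an amplification curve and read off the inflection cycle, which plays a role analogous to a quantification cycle:

      # Hedged sketch: four-parameter logistic fit to a qPCR-like curve.
      import numpy as np
      from scipy.optimize import curve_fit

      def logistic4(x, base, amp, x0, slope):
          """Four-parameter logistic amplification curve."""
          return base + amp / (1.0 + np.exp(-(x - x0) / slope))

      cycles = np.arange(1, 41, dtype=float)
      rng = np.random.default_rng(3)
      truth = logistic4(cycles, 0.05, 1.0, 24.0, 1.6)      # synthetic ground truth
      fluor = truth + 0.01 * rng.standard_normal(cycles.size)

      popt, _ = curve_fit(logistic4, cycles, fluor, p0=[0.0, 1.0, 20.0, 2.0])
      print(f"fitted inflection cycle x0 = {popt[2]:.2f}")   # ~24 here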

  1. Structure identification methods for atomistic simulations of crystalline materials

    DOE PAGES

    Stukowski, Alexander

    2012-05-28

    Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.

  2. Evaluating wood failure in plywood shear by optical image analysis

    Treesearch

    Charles W. McMillin

    1984-01-01

    This exploratory study evaluates the potential of using an automatic image analysis method to measure percent wood failure in plywood shear specimens. The results suggest that this method may be as accurate as the visual method in tracking long-term gluebond quality. With further refinement, the method could lead to automated equipment replacing the subjective visual...

  3. Screening for Multiple Genes Influencing Dyslexia.

    ERIC Educational Resources Information Center

    Smith, Shelley D.; And Others

    1991-01-01

    Examines the "sib pair" method of linkage analysis designed to locate genes influencing dyslexia, which has several advantages over the "LOD" score method. Notes that the sib pair analysis was able to detect the same linkages as the LOD method, plus a possible third region. Confirms that the sib pair method is an effective means of screening. (RS)

  4. A (2+1)-dimensional Korteweg-de Vries type equation in water waves: Lie symmetry analysis; multiple exp-function method; conservation laws

    NASA Astrophysics Data System (ADS)

    Adem, Abdullahi Rashid

    2016-05-01

    We consider a (2+1)-dimensional Korteweg-de Vries type equation which models the shallow-water waves, surface and internal waves. In the analysis, we use the Lie symmetry method and the multiple exp-function method. Furthermore, conservation laws are computed using the multiplier method.
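
    The record does not reproduce the specific (2+1)-dimensional equation studied; for orientation only, the classical KdV equation and its best-known (2+1)-dimensional relative, the Kadomtsev-Petviashvili (KP) equation, are:

      % Classical (1+1)-dimensional KdV equation:
      \[ u_t + 6\,u\,u_x + u_{xxx} = 0 . \]
      % A standard (2+1)-dimensional extension, the KP equation
      % (sigma^2 = +1 or -1 selects KP-II or KP-I):
      \[ \bigl( u_t + 6\,u\,u_x + u_{xxx} \bigr)_x + 3\,\sigma^{2}\, u_{yy} = 0 . \]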

  5. An overview of computational simulation methods for composite structures failure and life analysis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1993-01-01

    Three parallel computational simulation methods are being developed at the LeRC Structural Mechanics Branch (SMB) for composite structures failure and life analysis: progressive fracture CODSTRAN; hierarchical methods for high-temperature composites; and probabilistic evaluation. Results to date demonstrate that these methods are effective in simulating composite structures failure/life/reliability.

  6. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  7. Nodal Analysis Optimization Based on the Use of Virtual Current Sources: A Powerful New Pedagogical Method

    ERIC Educational Resources Information Center

    Chatzarakis, G. E.

    2009-01-01

    This paper presents a new pedagogical method for nodal analysis optimization based on the use of virtual current sources, applicable to any linear electric circuit (LEC), regardless of its complexity. The proposed method leads to straightforward solutions, mostly arrived at by inspection. Furthermore, the method is easily adapted to computer…
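
    The virtual-current-source refinement itself is not reproduced in this record; as background, plain nodal analysis reduces to solving Gv = i for the node voltages, as in this sketch with illustrative component values:

      # Hedged sketch of plain nodal analysis for a small resistive circuit.
      import numpy as np

      # Two-node circuit: R1 from node 1 to ground, R2 between nodes 1 and 2,
      # R3 from node 2 to ground, and a 1 A source injected into node 1.
      R1, R2, R3 = 2.0, 4.0, 8.0
      G = np.array([[1/R1 + 1/R2, -1/R2],
                    [-1/R2,        1/R2 + 1/R3]])   # conductance matrix (S)
      i = np.array([1.0, 0.0])                      # injected node currents (A)

      v = np.linalg.solve(G, i)                     # node voltages (V)
      print(f"v1 = {v[0]:.3f} V, v2 = {v[1]:.3f} V")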

  8. An efficient scan diagnosis methodology according to scan failure mode for yield enhancement

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok

    2008-12-01

    Yield has always been a driving consideration in modern semiconductor fabrication. Statistically, the largest portion of wafer yield loss is due to defective scan failures. This paper presents efficient failure analysis methods, based on scan diagnosis, for initial yield ramp-up and for ongoing products. Our analysis shows that more than 60% of scan failure dies fall into the shift-mode category in very deep submicron (VDSM) devices. However, localizing a scan shift-mode failure is far more difficult than localizing a capture-mode failure, because it is caused by a malfunction of the scan chain itself. Addressing this challenge, we propose the most suitable analysis method for each scan failure mode (capture/shift) for yield enhancement. For capture failure mode, this paper describes a method that integrates the scan diagnosis flow with backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution, efficient backside probing, and a signal analysis method. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of shift failure mode. The combination of these two methods contributes to yield enhancement. We confirm the failure candidates with physical failure analysis (PFA); the direct visualization of defects provides feedback useful for bringing devices to mass production in a shorter time. Experimental data on mass-produced products show that our method yields an average 13.7% reduction in defective SCAN & SRAM-BIST failure rates and an 18.2% improvement in wafer yield rates.

  9. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

    Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs), which showed that meta-analyses can be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, which has the potential to derive brain network connectivity models from activation information in the literature without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas by using the automated anatomical labeling (AAL) atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis of the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful for finding brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for a corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect and thus increasing the efficiency of the dynamic causal modeling analysis.
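
    A minimal sketch of the Apriori step on hypothetical study-by-region co-activation data, using the third-party mlxtend package (the region labels are illustrative AAL-style names, not the paper's data):

      # Hedged sketch: frequent co-activation itemsets across studies.
      import pandas as pd
      from mlxtend.frequent_patterns import apriori, association_rules

      # One row per study; True = region reported active in that study.
      data = pd.DataFrame(
          [[True, True, False, True],
           [True, True, True, False],
           [False, True, True, True],
           [True, True, False, False],
           [True, False, True, True]],
          columns=["IFG_L", "STG_L", "MTG_L", "SMA"],
      )

      itemsets = apriori(data, min_support=0.4, use_colnames=True)
      rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
      print(rules[["antecedents", "consequents", "support", "confidence"]])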

  10. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  11. Principal Component Analysis for pulse-shape discrimination of scintillation radiation detectors

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2016-01-01

    In this paper, we report on the application of Principal Component Analysis (PCA) for pulse-shape discrimination (PSD) with scintillation radiation detectors. The details of the method are described, and its performance is experimentally examined by discriminating between neutrons and gamma-rays with a liquid scintillation detector in a mixed radiation field. The performance of the method is also compared against that of the conventional charge-comparison method, demonstrating superior performance, particularly in the low light output range. PCA has the important advantage of automatically extracting the pulse-shape characteristics, which makes the PSD method directly applicable to various scintillation detectors without the need to adjust a PSD parameter.
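
    A minimal sketch of PCA-based PSD on synthetic pulses with gamma-like and neutron-like decay tails (all pulse shapes, counts, and noise levels are illustrative):

      # Hedged sketch: project digitized pulses onto leading principal
      # components; classes with different tail fractions separate there.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      t = np.arange(200)

      def pulse(tail, n):
          shape = np.exp(-t / 8.0) + tail * np.exp(-t / 60.0)   # fast + slow component
          return shape * rng.uniform(0.5, 1.5, (n, 1)) + 0.01 * rng.standard_normal((n, 200))

      gammas = pulse(0.05, 300)     # fast-component dominated
      neutrons = pulse(0.25, 300)   # enhanced slow component
      pulses = np.vstack([gammas, neutrons])
      pulses /= pulses.sum(axis=1, keepdims=True)   # normalize out amplitude

      scores = PCA(n_components=2).fit_transform(pulses)
      # One of the leading components captures the slow/fast tail ratio, so
      # the classes separate without hand-tuned charge-integration gates.
      print("gamma-like mean scores  :", scores[:300].mean(axis=0))
      print("neutron-like mean scores:", scores[300:].mean(axis=0))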

  12. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  13. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information needed to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of the events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing the use of different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
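
    In the Kaplan & Garrick formalism referenced above, risk is the set of scenario triplets; sketched in LaTeX:

      % Kaplan & Garrick (1981): risk as the set of triplets of scenario s_i,
      % frequency (or probability) p_i, and consequence x_i:
      \[ R = \bigl\{ \langle s_i,\; p_i,\; x_i \rangle \bigr\}, \qquad i = 1,\dots,N . \]
      % Step (4) of the method then evaluates, per scenario class, the
      % conditional probability that a critical design parameter d is exceeded:
      \[ P_i = \Pr\bigl( X > d \mid s_i \bigr) . \]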

  14. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces.

    PubMed

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations as they all build upon the second order statistics of the acquired Electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
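
    A minimal sketch of the baseline CCA detector discussed above, on simulated EEG (the sampling rate, window length, harmonics, and candidate frequencies are illustrative):

      # Hedged sketch: canonical correlation between an EEG window and
      # sine/cosine references; the attended frequency scores highest.
      import numpy as np
      from sklearn.cross_decomposition import CCA

      fs, n_sec, n_ch = 250, 2, 8
      t = np.arange(fs * n_sec) / fs
      rng = np.random.default_rng(5)
      eeg = 0.4 * np.sin(2 * np.pi * 12.0 * t)[:, None] + rng.standard_normal((fs * n_sec, n_ch))

      def reference(f):
          """Sine/cosine reference set (fundamental + 2nd harmonic)."""
          return np.column_stack([np.sin(2*np.pi*h*f*t) for h in (1, 2)] +
                                 [np.cos(2*np.pi*h*f*t) for h in (1, 2)])

      def canon_corr(x, y):
          u, v = CCA(n_components=1).fit_transform(x, y)
          return np.corrcoef(u.ravel(), v.ravel())[0, 1]

      for f in (10.0, 12.0, 15.0):
          print(f"{f:5.1f} Hz -> rho = {canon_corr(eeg, reference(f)):.3f}")
      # The simulated SSVEP is at 12 Hz, which should yield the largest rho.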

  15. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  16. An Evaluation of a Computer-Based Training on the Visual Analysis of Single-Subject Data

    ERIC Educational Resources Information Center

    Snyder, Katie

    2013-01-01

    Visual analysis is the primary method of analyzing data in single-subject methodology, which is the predominant research method used in the fields of applied behavior analysis and special education. Previous research on the reliability of visual analysis suggests that judges often disagree about what constitutes an intervention effect. Considering…

  17. Using Module Analysis for Multiple Choice Responses: A New Method Applied to Force Concept Inventory Data

    ERIC Educational Resources Information Center

    Brewe, Eric; Bruun, Jesper; Bearden, Ian G.

    2016-01-01

    We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…

  18. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  19. Security analysis and improvements to the PsychoPass method.

    PubMed

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective: To perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. Methods: We used brute-force and dictionary attack analyses of the PsychoPass method to outline its weaknesses. Results: The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout as was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. Conclusions: The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
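
    A minimal sketch of the kind of brute-force timing estimate behind these claims; the guess rate and alphabet sizes are illustrative assumptions, and note that for the original method the adjacency structure of successive keys makes the effective search space far smaller than this naive bound suggests:

      # Hedged sketch: worst-case exhaustive-search time for a password space.
      def brute_force_years(alphabet_size, length, guesses_per_second=1e10):
          """Time (years) to exhaust alphabet_size**length guesses."""
          seconds = alphabet_size ** length / guesses_per_second
          return seconds / (3600 * 24 * 365)

      # Naive bound for a 24-character password over ~20 unmodified keys;
      # adjacency constraints mean the real effective space is much smaller.
      print(f"original (naive bound): {brute_force_years(20, 24):.2e} years")
      # Improved variant: SHIFT/ALT-GR enlarge the alphabet; with an assumed
      # ~90 reachable symbols, 10 characters already takes ~1e2 years.
      print(f"improved              : {brute_force_years(90, 10):.2e} years")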

  20. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a family of statistical approaches commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
