Temperature analysis with voltage-current time differential operation of electrochemical sensors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woo, Leta Yar-Li; Glass, Robert Scott; Fitzpatrick, Joseph Jay
A method for temperature analysis of a gas stream. The method includes identifying a temperature parameter of an affected waveform signal. The method also includes calculating a change in the temperature parameter by comparing the affected waveform signal with an original waveform signal. The method also includes generating a value from the calculated change which corresponds to the temperature of the gas stream.
Computational methods for global/local analysis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.
1992-01-01
Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.
Method and apparatus for ceramic analysis
Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr
2003-04-01
The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
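The core quantitative step described here is setting a detection threshold above background noise. A minimal sketch of one common convention (mean plus k standard deviations of noise read counts) follows; the function name, the k = 3 default, and the example counts are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def analytical_threshold(noise_counts, k=3.0):
    """Detection threshold = mean + k*SD of background-noise read counts.

    A generic detection-limit convention; the paper derives its threshold
    from a specific definition of PCR-MPS background noise.
    """
    noise = np.asarray(noise_counts, dtype=float)
    return noise.mean() + k * noise.std(ddof=1)

# Hypothetical non-allelic (noise) read counts at a locus across runs.
print(analytical_threshold([3, 5, 2, 7, 4, 6, 3, 5]))
```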
West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID
2012-05-29
Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.
West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID
2011-09-27
Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.
A Multidimensional Analysis Tool for Visualizing Online Interactions
ERIC Educational Resources Information Center
Kim, Minjeong; Lee, Eunchul
2012-01-01
This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Finnveden, Göran; Björklund, Anna; Moberg, Asa; Ekvall, Tomas
2007-06-01
A large number of methods and approaches that can be used for supporting waste management decisions at different levels in society have been developed. In this paper an overview of methods is provided and preliminary guidelines for the choice of methods are presented. The methods introduced include: Environmental Impact Assessment, Strategic Environmental Assessment, Life Cycle Assessment, Cost-Benefit Analysis, Cost-effectiveness Analysis, Life-cycle Costing, Risk Assessment, Material Flow Accounting, Substance Flow Analysis, Energy Analysis, Exergy Analysis, Entropy Analysis, Environmental Management Systems, and Environmental Auditing. The characteristics used are the types of impacts included, the objects under study and whether the method is procedural or analytical. The different methods can be described as systems analysis methods. Waste management systems thinking is receiving increasing attention. This is, for example, evidenced by the suggested thematic strategy on waste by the European Commission where life-cycle analysis and life-cycle thinking get prominent positions. Indeed, life-cycle analyses have been shown to provide policy-relevant and consistent results. However, it is also clear that the studies will always be open to criticism since they are simplifications of reality and include uncertainties. This is something all systems analysis methods have in common. Assumptions can be challenged and it may be difficult to generalize from case studies to policies. This suggests that if decisions are going to be made, they are likely to be made on a less than perfect basis.
Global/local methods research using the CSM testbed
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.
1990-01-01
Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Gourley, Paul L.; Gourley, Mark F.
1997-01-01
An apparatus and method for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis thereof.
Gourley, P.L.; Gourley, M.F.
1997-03-04
An apparatus and method are disclosed for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis. 20 figs.
Military applications and examples of near-surface seismic surface wave methods (Invited)
NASA Astrophysics Data System (ADS)
Sloan, S.; Stevens, R.
2013-12-01
Although not always widely known or publicized, the military uses a variety of geophysical methods for a wide range of applications--some that are already common practice in the industry while others are truly novel. Some of those applications include unexploded ordnance detection, general site characterization, anomaly detection, countering improvised explosive devices (IEDs), and security monitoring, to name a few. Techniques used may include, but are not limited to, ground penetrating radar, seismic, electrical, gravity, and electromagnetic methods. Seismic methods employed include surface wave analysis, refraction tomography, and high-resolution reflection methods. Although the military employs geophysical methods, that does not necessarily mean that those methods enable or support combat operations--often times they are being used for humanitarian applications within the military's area of operations to support local populations. The work presented here will focus on the applied use of seismic surface wave methods, including multichannel analysis of surface waves (MASW) and backscattered surface waves, often in conjunction with other methods such as refraction tomography or body-wave diffraction analysis. Multiple field examples will be shown, including explosives testing, tunnel detection, pre-construction site characterization, and cavity detection.
USDA-ARS?s Scientific Manuscript database
Analysis of 9 macrolides is presented, including tulathromycin A (Draxxin), in beef, poultry and pork muscle with a simple multi-residue extraction and analysis method using high performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry. The extraction method inv...
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
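As a rough illustration of what "probabilistic response" means in this setting, the sketch below estimates a failure probability by plain Monte Carlo sampling of a simple limit state. NESSUS's advanced mean value and adaptive importance sampling methods are far more efficient than this; all distributions and numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical random inputs: axial load, cross-sectional area, and strength.
load = rng.normal(10_000.0, 1_500.0, n)       # N
area = rng.normal(4.0e-4, 2.0e-5, n)          # m^2
strength = rng.normal(3.0e7, 3.0e6, n)        # Pa

# Limit state g = strength - stress; g < 0 means failure.
g = strength - load / area
print(f"estimated P(failure) = {np.mean(g < 0.0):.4e}")
```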
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
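One simple way to exploit two fidelity levels, in the spirit of the model-management ideas above, is an additive correction: fit the cheap model's discrepancy from a few expensive evaluations, then work with the corrected cheap model. The sketch below uses invented one-dimensional stand-ins for the high- and low-fidelity analyses, not the report's drag codes.

```python
import numpy as np

def f_hi(x):           # expensive analysis (hypothetical stand-in)
    return np.sin(8 * x) + x**2

def f_lo(x):           # cheap, systematically biased analysis
    return 0.9 * np.sin(8 * x) + x**2 - 0.1

# Fit a low-order polynomial to the discrepancy at a few expensive samples,
# then evaluate the corrected low-fidelity model everywhere.
xs = np.linspace(0.0, 1.0, 5)                      # only 5 expensive calls
coef = np.polyfit(xs, f_hi(xs) - f_lo(xs), 2)
surrogate = lambda x: f_lo(x) + np.polyval(coef, x)

xq = np.linspace(0.0, 1.0, 101)
print("max |surrogate - f_hi| =", np.max(np.abs(surrogate(xq) - f_hi(xq))))
```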
Nonlinear analysis of structures. [within framework of finite element method
NASA Technical Reports Server (NTRS)
Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.
1974-01-01
The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on those nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.
NASA Technical Reports Server (NTRS)
Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.
1986-01-01
A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.
Systems and methods for sample analysis
Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng
2015-01-13
The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.
[Recent advances in sample preparation methods of plant hormones].
Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng
2014-04-01
Plant hormones are a group of naturally occurring trace substances that play a crucial role in controlling plant development, growth and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic methods have become widely used for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones (acidic and basic plant hormones, brassinosteroids, and plant polypeptides), the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extractive materials and derivatization reagents for sample preparation in phytohormone analysis, including some related work by our group. Finally, future developments in this field are discussed.
Linnorm: improved statistical analysis for single cell RNA-seq expression data
Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung; Wang, Junwen
2017-01-01
Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noises and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy.
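For readers unfamiliar with the baseline these methods compete against, the sketch below shows the generic library-size normalization plus log transform that such methods refine. It is emphatically not Linnorm's algorithm (Linnorm additionally models technical noise when choosing its transformation parameters); it only fixes ideas about the input and output shapes.

```python
import numpy as np

def naive_normalize(counts, pseudocount=1.0):
    """Library-size (CPM) normalization + log transform for a genes x cells
    count matrix. A generic baseline only -- NOT Linnorm's actual method."""
    counts = np.asarray(counts, dtype=float)
    libsize = counts.sum(axis=0, keepdims=True)   # total counts per cell
    cpm = counts / libsize * 1e6
    return np.log(cpm + pseudocount)

genes_by_cells = np.random.default_rng(0).poisson(5, size=(100, 8))
print(naive_normalize(genes_by_cells).shape)
```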
Kotter, Dale K [Shelley, ID; Rohrbaugh, David T [Idaho Falls, ID
2010-09-07
A frequency selective surface (FSS) and associated methods for modeling, analyzing and designing the FSS are disclosed. The FSS includes a pattern of conductive material formed on a substrate to form an array of resonance elements. At least one aspect of the frequency selective surface is determined by defining a frequency range including multiple frequency values, determining a frequency dependent permittivity across the frequency range for the substrate, determining a frequency dependent conductivity across the frequency range for the conductive material, and analyzing the frequency selective surface using a method of moments analysis at each of the multiple frequency values for an incident electromagnetic energy impinging on the frequency selective surface. The frequency dependent permittivity and the frequency dependent conductivity are included in the method of moments analysis.
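The distinctive point of this patent text is that the substrate permittivity and conductor conductivity are treated as frequency dependent inside the sweep. A sketch of that bookkeeping follows, using a Debye permittivity model and a Drude-type conductivity as assumed stand-ins; the actual dispersion models, band, and parameter values are not specified by the patent, and the method-of-moments solve itself is only indicated by a comment.

```python
import numpy as np

# Frequency range for the sweep (hypothetical 1-30 GHz band).
f = np.linspace(1e9, 30e9, 291)
w = 2 * np.pi * f

# Debye model for a dispersive substrate permittivity (assumed parameters).
eps_inf, eps_s, tau = 2.2, 3.0, 5e-12
eps_r = eps_inf + (eps_s - eps_inf) / (1 + 1j * w * tau)

# Drude-type frequency-dependent conductivity for the conductor (assumed).
sigma_dc, tau_e = 5.8e7, 2.4e-14
sigma = sigma_dc / (1 + 1j * w * tau_e)

# A method-of-moments solver would assemble and solve its impedance matrix
# once per frequency point, using eps_r[i] and sigma[i] at step i.
for i in (0, len(f) // 2, len(f) - 1):
    print(f"{f[i]/1e9:5.1f} GHz  eps_r = {eps_r[i]:.3f}  sigma = {sigma[i]:.3e}")
```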
Three-dimensional Stress Analysis Using the Boundary Element Method
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1984-01-01
The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response indicated.
Algal biomass analysis methods and applications of these methods to different processes. Templeton: an internally funded research project to develop microalgal compositional analysis methods, which included establishing methods for closing mass and component balances around pretreatment, saccharification, and fermentation unit operations.
Design and Analysis Tools for Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.; Folk, Thomas C.
2009-01-01
Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.
Methods for Analysis and Simulation of Ballistic Impact
2017-04-01
ARL-RP-0597, April 2017, US Army Research Laboratory: Methods for Analysis and Simulation of Ballistic Impact, by John D Clayton, Weapons and Materials Research Directorate, ARL. The report addresses experimental, analytical, and numerical methods of ballistics research; similar lengthy references dealing with pertinent aspects include [8, 9].
Text analysis methods, text analysis apparatuses, and articles of manufacture
Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M
2014-10-28
Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.
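A toy version of "identifying a presence of a new topic" is to flag a document that is dissimilar to every known topic profile. The bag-of-words cosine test below is an illustrative stand-in, not the patented analysis; the threshold and example documents are invented.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def is_new_topic(doc: str, known_docs: list[str], threshold: float = 0.2) -> bool:
    """Flag doc as a new topic if it is dissimilar to every known topic."""
    d = Counter(doc.lower().split())
    return all(cosine(d, Counter(k.lower().split())) < threshold
               for k in known_docs)

known = ["finite element stress analysis of structures",
         "gas chromatography mass spectrometry of polymers"]
print(is_new_topic("single cell rna sequencing normalization", known))
```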
Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P; Cowell, Andrew J; Gregory, Michelle L
Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
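To make the weighting step concrete, the sketch below accumulates analyst-weighted log-likelihood ratios for supporting and refuting indicators into a posterior probability. The scoring rule and the numbers are assumptions for illustration, not the method claimed in the patent.

```python
import math

def hypothesis_score(evidence):
    """Combine (likelihood_ratio, weight) pairs into P(hypothesis), assuming
    even prior odds. LR > 1 supports the hypothesis; LR < 1 refutes it."""
    log_odds = sum(w * math.log(lr) for lr, w in evidence)
    return 1.0 / (1.0 + math.exp(-log_odds))

evidence = [(4.0, 1.0),   # strong supporting indicator, full weight
            (0.5, 0.5),   # weak refuting indicator, down-weighted
            (2.0, 0.8)]
print(f"P(hypothesis) = {hypothesis_score(evidence):.2f}")
```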
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1987-01-01
Low-speed experimental force data on a series of thin swept wings with sharp leading edges and leading- and trailing-edge flaps are compared with predictions made using a linearized-theory method which includes estimates of vortex forces. These comparisons were made to assess the effectiveness of linearized-theory methods for use in the design and analysis of flap systems in subsonic flow. Results demonstrate that linearized-theory, attached-flow methods (with approximate representation of vortex forces) can form the basis of a rational system for flap design and analysis. Even attached-flow methods that do not take vortex forces into account can be used for the selection of optimized flap-system geometry, but design-point performance levels tend to be underestimated unless vortex forces are included. Illustrative examples of the use of these methods in the design of efficient low-speed flap systems are included.
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
Price, Jeffery R.; Bingham, Philip R.
2005-11-08
Systems and methods are described for rapid acquisition of fused off-axis illumination direct-to-digital holography. A method of recording a plurality of off-axis object illuminated spatially heterodyne holograms, each of the off-axis object illuminated spatially heterodyne holograms including spatially heterodyne fringes for Fourier analysis, includes digitally recording, with a first illumination source of an interferometer, a first off-axis object illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording, with a second illumination source of the interferometer, a second off-axis object illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-01-01
The set of documents discusses the new draft methods (EPA method 551, EPA method 552) for the analysis of disinfection byproducts contained in drinking water. The methods use the techniques of liquid/liquid extraction and gas chromatography with electron capture detection.
Leonard, Annemarie K; Loughran, Elizabeth A; Klymenko, Yuliya; Liu, Yueying; Kim, Oleg; Asem, Marwa; McAbee, Kevin; Ravosa, Matthew J; Stack, M Sharon
2018-01-01
This chapter highlights methods for visualization and analysis of extracellular matrix (ECM) proteins, with particular emphasis on collagen type I, the most abundant protein in mammals. Protocols described range from advanced imaging of complex in vivo matrices to simple biochemical analysis of individual ECM proteins. The first section of this chapter describes common methods to image ECM components and includes protocols for second harmonic generation, scanning electron microscopy, and several histological methods of ECM localization and degradation analysis, including immunohistochemistry, Trichrome staining, and in situ zymography. The second section of this chapter details both a common transwell invasion assay and a novel live imaging method to investigate cellular behavior with respect to collagen and other ECM proteins of interest. The final section consists of common electrophoresis-based biochemical methods that are used in analysis of ECM proteins. Use of the methods described herein will enable researchers to gain a greater understanding of the role of ECM structure and degradation in development and matrix-related diseases such as cancer and connective tissue disorders.
Design sensitivity analysis with Applicon IFAD using the adjoint variable method
NASA Technical Reports Server (NTRS)
Frederick, Marjorie C.; Choi, Kyung K.
1984-01-01
A numerical method is presented to implement structural design sensitivity analysis using the versatility and convenience of an existing finite element structural analysis program and the theoretical foundation of structural design sensitivity analysis. Conventional design variables, such as thickness and cross-sectional areas, are considered. Structural performance functionals considered include compliance, displacement, and stress. It is shown that calculations can be carried out outside existing finite element codes, using postprocessing data only. That is, design sensitivity analysis software does not have to be embedded in an existing finite element code. The finite element structural analysis program used in the implementation presented is IFAD. Feasibility of the method is shown through analysis of several problems, including built-up structures. Accurate design sensitivity results are obtained without the uncertainty of numerical accuracy associated with selection of a finite difference perturbation.
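The key idea, that sensitivities come from postprocessing alone, is easiest to see for compliance. For C = f^T u with K(t) u = f, the adjoint solution coincides with u, and dC/dt_i = -u^T (dK/dt_i) u, so no additional solves are needed. The two-spring example below illustrates the adjoint variable method in general, not the IFAD implementation; all stiffness values are invented.

```python
import numpy as np

def K(t):
    """Stiffness of a two-DOF spring chain; each spring's stiffness
    scales linearly with its design thickness t[i]."""
    k1, k2 = 100.0 * t[0], 100.0 * t[1]
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

t = np.array([1.0, 1.0])
f = np.array([0.0, 10.0])
u = np.linalg.solve(K(t), f)
C = f @ u                                    # compliance

# Adjoint sensitivities: dC/dt_i = -u^T (dK/dt_i) u (postprocessing only).
dK1 = np.array([[100.0, 0.0], [0.0, 0.0]])
dK2 = np.array([[100.0, -100.0], [-100.0, 100.0]])
grad = np.array([-u @ dK1 @ u, -u @ dK2 @ u])

# Finite-difference check (illustrating the step-size issue noted above).
eps = 1e-6
fd = [(f @ np.linalg.solve(K(t + eps * np.eye(2)[i]), f) - C) / eps
      for i in range(2)]
print(grad, np.array(fd))
```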
Analysis and application of Fourier transform spectroscopy in atmospheric remote sensing
NASA Technical Reports Server (NTRS)
Park, J. H.
1984-01-01
An analysis method for Fourier transform spectroscopy is summarized with applications to various types of distortion in atmospheric absorption spectra. This analysis method includes the fast Fourier transform method for simulating the interferometric spectrum and the nonlinear least-squares method for retrieving the information from a measured spectrum. It is shown that spectral distortions can be simulated quite well and that the correct information can be retrieved from a distorted spectrum by this analysis technique.
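Both halves of the described workflow fit in a few lines: simulate the interferogram as the inverse Fourier transform of a model spectrum, impose an instrument distortion (here, truncation at a finite optical path difference, an assumed stand-in for the paper's distortion cases), and retrieve the line parameters by nonlinear least squares against the same forward model. All line parameters are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

nu = np.linspace(0.0, 100.0, 4096)                 # wavenumber grid (cm^-1)

def spectrum(nu, depth, width):
    """Continuum with one Lorentzian absorption line at 50 cm^-1."""
    return 1.0 - depth * width**2 / ((nu - 50.0)**2 + width**2)

def forward(nu, depth, width):
    """Spectrum -> interferogram (inverse FFT) -> truncation at a finite
    optical path difference -> distorted measured spectrum."""
    igm = np.fft.irfft(spectrum(nu, depth, width))
    igm[len(igm) // 8:] = 0.0                      # finite-OPD truncation
    return np.fft.rfft(igm).real

measured = forward(nu, 0.30, 0.50)                 # true depth and width

# Nonlinear least-squares retrieval using the same forward model.
popt, _ = curve_fit(forward, nu, measured, p0=(0.2, 0.3))
print(f"retrieved depth = {popt[0]:.3f}, width = {popt[1]:.3f}")
```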
Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.
ERIC Educational Resources Information Center
Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.
This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…
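Of the retention rules listed, parallel analysis is the easiest to state algorithmically: retain factors whose observed eigenvalues exceed those expected from random data of the same size. A minimal sketch follows; the simulation count and 95th-percentile variant are common choices, not necessarily the exact variants compared in the study.

```python
import numpy as np

def parallel_analysis(data, n_sims=200, quantile=95, seed=0):
    """Horn's parallel analysis: number of eigenvalues of the observed
    correlation matrix exceeding the given quantile of eigenvalues from
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        x = rng.standard_normal((n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]
    return int(np.sum(obs > np.percentile(rand, quantile, axis=0)))

# Two correlated blocks of variables -> expect 2 retained factors.
rng = np.random.default_rng(1)
g = rng.standard_normal((300, 2))
data = np.repeat(g, 3, axis=1) + 0.5 * rng.standard_normal((300, 6))
print(parallel_analysis(data))
```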
Climate Change Discourse in Mass Media: Application of Computer-Assisted Content Analysis
ERIC Educational Resources Information Center
Kirilenko, Andrei P.; Stepchenkova, Svetlana O.
2012-01-01
Content analysis of mass media publications has become a major scientific method used to analyze public discourse on climate change. We propose a computer-assisted content analysis method to extract prevalent themes and analyze discourse changes over an extended period in an objective and quantifiable manner. The method includes the following: (1)…
Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.
Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H
2016-06-01
Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case only designs (e.g., case-crossover, case-time control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include, negative control method, perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.
Computer program for preliminary design analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1972-01-01
The program method is based on a mean-diameter flow analysis. Input design requirements include power or pressure ratio, flow, temperature, pressure, and speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse). Exit turning vanes can be included in the design. Program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, blading angles, and last-stage critical velocity ratios. The report presents the analysis method, a description of input and output with sample cases, and the program listing.
Gas stream analysis using voltage-current time differential operation of electrochemical sensors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woo, Leta Yar-Li; Glass, Robert Scott; Fitzpatrick, Joseph Jay
A method for analysis of a gas stream. The method includes identifying an affected region of an affected waveform signal corresponding to at least one characteristic of the gas stream. The method also includes calculating a voltage-current time differential between the affected region of the affected waveform signal and a corresponding region of an original waveform signal. The affected region and the corresponding region of the waveform signals have a sensitivity specific to the at least one characteristic of the gas stream. The method also includes generating a value for the at least one characteristic of the gas stream based on the calculated voltage-current time differential.
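One plausible reading of the claimed "voltage-current time differential" is a time shift between the affected and original waveforms inside the sensitivity-specific region. The cross-correlation sketch below is an illustrative realization of that idea, not the patent's actual signal processing; the region, sampling step, and waveforms are invented.

```python
import numpy as np

def vi_time_differential(original, affected, region, dt):
    """Estimate the time shift (s) between two sensor waveforms within a
    region of interest, via the peak of their cross-correlation."""
    o = original[region] - original[region].mean()
    a = affected[region] - affected[region].mean()
    xc = np.correlate(a, o, mode="full")
    lag = np.argmax(xc) - (len(o) - 1)
    return lag * dt

dt = 1e-4
t = np.arange(0.0, 0.1, dt)
original = np.sin(2 * np.pi * 50 * t)
affected = np.sin(2 * np.pi * 50 * (t - 3e-4))   # delayed sensor response
print(vi_time_differential(original, affected, slice(100, 900), dt))
```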
Nonlinear Dynamics, Poor Data, and What to Make of Them?
NASA Astrophysics Data System (ADS)
Ghil, M.; Zaliapin, I. V.
2005-12-01
The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict variability in the geosciences. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this talk we will describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. These fall into two broad categories: (i) methods that try to ferret out regularities of the time series; and (ii) methods aimed at describing the characteristics of irregular processes. The former include singular-spectrum analysis (SSA), the multi-taper method (MTM), and the maximum-entropy method (MEM). The various steps, as well as the advantages and disadvantages of these methods, will be illustrated by their application to several important climatic time series, such as the Southern Oscillation Index (SOI), paleoclimatic time series, and instrumental temperature time series. The SOI index captures major features of interannual climate variability and is used extensively in its prediction. The other time series cover interdecadal and millennial time scales. The second category includes the calculation of fractional dimension, leading Lyapunov exponents, and Hurst exponents. More recently, multi-trend analysis (MTA), binary-decomposition analysis (BDA), and related methods have attempted to describe the structure of time series that include both regular and irregular components. Within the time available, I will try to give a feeling for how these methods work, and how well.
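Among the regularity-oriented methods named in the abstract, singular-spectrum analysis is compact enough to sketch: embed the series in a trajectory matrix, take an SVD, and Hankelize each rank-one piece back into a time series. The window length and the test signal below are arbitrary choices for illustration, not any of the climatic series mentioned.

```python
import numpy as np

def ssa(x, window):
    """Singular-spectrum analysis: returns (components, singular_values)."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(len(s)):
        Xj = s[j] * np.outer(U[:, j], Vt[j])
        # Anti-diagonal averaging back to a length-n series (Hankelization).
        comps.append(np.array([Xj[::-1].diagonal(i - window + 1).mean()
                               for i in range(n)]))
    return np.array(comps), s

t = np.arange(400)
x = np.sin(2 * np.pi * t / 60) + 0.3 * np.random.default_rng(0).standard_normal(400)
comps, s = ssa(x, window=80)
print(s[:4])   # an oscillatory pair should dominate the spectrum
```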
Similitude design for the vibration problems of plates and shells: A review
NASA Astrophysics Data System (ADS)
Zhu, Yunpeng; Wang, You; Luo, Zhong; Han, Qingkai; Wang, Deyou
2017-06-01
Similitude design plays a vital role in the analysis of vibration and shock problems encountered in large engineering equipment. Similitude design, including dimensional analysis and governing equation method, is founded on the dynamic similitude theory. This study reviews the application of similitude design methods in engineering practice and summarizes the major achievements of the dynamic similitude theory in structural vibration and shock problems in different fields, including marine structures, civil engineering structures, and large power equipment. This study also reviews the dynamic similitude design methods for thin-walled and composite material plates and shells, including the most recent work published by the authors. Structure sensitivity analysis is used to evaluate the scaling factors to attain accurate distorted scaling laws. Finally, this study discusses the existing problems and the potential of the dynamic similitude theory for the analysis of vibration and shock problems of structures.
Computational Methods for Structural Mechanics and Dynamics, part 1
NASA Technical Reports Server (NTRS)
Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)
1989-01-01
The structural analysis methods research has several goals. One goal is to develop analysis methods that are general. This goal of generality leads naturally to finite-element methods, but the research will also include other structural analysis methods. Another goal is that the methods be amenable to error analysis; that is, given a physical problem and a mathematical model of that problem, an analyst would like to know the probable error in predicting a given response quantity. The ultimate objective is to specify the error tolerances and to use automated logic to adjust the mathematical model or solution strategy to obtain that accuracy. A third goal is to develop structural analysis methods that can exploit parallel processing computers. The structural analysis methods research will focus initially on three types of problems: local/global nonlinear stress analysis, nonlinear transient dynamics, and tire modeling.
Fasihi, Yasser; Fooladi, Saba; Mohammadi, Mohammad Ali; Emaneini, Mohammad; Kalantar-Neyestanaki, Davood
2017-09-06
Molecular typing is an important tool for control and prevention of infection. A suitable molecular typing method for epidemiological investigation must be easy to perform, highly reproducible, inexpensive, rapid and easy to interpret. In this study, two molecular typing methods, the conventional PCR-sequencing method and high resolution melting (HRM) analysis, were used for staphylococcal protein A (spa) typing of 30 Methicillin-resistant Staphylococcus aureus (MRSA) isolates recovered from clinical samples. Based on the PCR-sequencing results, 16 different spa types were identified among the 30 MRSA isolates. Of these 16 spa types, 14 were separated by the HRM method; two spa types, t4718 and t2894, were not separated from each other. According to our results, spa typing based on HRM analysis is rapid, easy to perform and cost-effective, but the method must be standardized for different regions, spa types, and real-time PCR instruments.
USDA-ARS?s Scientific Manuscript database
The development of genomic selection methodology, with accompanying substantial gains in reliability for low-heritability traits, may dramatically improve the feasibility of genetic improvement of dairy cow health. Many methods for genomic analysis have now been developed, including the “Bayesian Al...
The Effect of Laminar Flow on Rotor Hover Performance
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Martin, Preston B.
2017-01-01
The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade-element/momentum method coupled to an airfoil analysis method that includes the full e^N transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even up to high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
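For readers who want the skeleton of such an analysis, the sketch below implements the classic closed-form blade-element/momentum inflow for hover (as in Leishman's textbook treatment) with linear twist and constant profile drag. It omits the airfoil tables and e^N transition coupling the report actually uses, and every parameter value is illustrative.

```python
import numpy as np

def integrate(y, x):
    """Trapezoidal integration (kept explicit to avoid NumPy version issues)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def hover_bemt(theta0, twist, sigma=0.1, a=5.73, cd0=0.010, nr=400):
    """Hover CT, CP, and figure of merit from combined blade-element/momentum
    theory with the closed-form inflow (uniform chord, linear twist)."""
    r = np.linspace(0.05, 1.0, nr)                # nondimensional radius
    theta = theta0 + twist * r                    # local pitch (rad)
    lam = (sigma * a / 16.0) * (np.sqrt(1.0 + (32.0 / (sigma * a)) * theta * r) - 1.0)
    dct = 0.5 * sigma * a * (theta * r**2 - lam * r)      # dCT/dr
    ct = integrate(dct, r)
    cp = integrate(lam * dct, r) + integrate(0.5 * sigma * cd0 * r**3, r)
    return ct, cp, ct**1.5 / (np.sqrt(2.0) * cp)

ct, cp, fm = hover_bemt(theta0=np.radians(12.0), twist=np.radians(-8.0))
print(f"CT = {ct:.5f}  CP = {cp:.6f}  FM = {fm:.3f}")
```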
Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.
Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M
2017-07-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.
ERIC Educational Resources Information Center
Raymond, Margaret; And Others
1983-01-01
Describes an experiment on the simultaneous determination of chromium and magnesium by spectophotometry modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…
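The GSAM reduces to linear algebra: responses R relate to concentrations C through a sensitivity matrix K (R = CK), the standard additions determine K by least squares, and the initial concentrations follow from the initial responses. A noise-free two-analyte sketch follows; the K matrix and concentrations are invented numbers, not the chromium/magnesium data of the experiment.

```python
import numpy as np

K_true = np.array([[1.00, 0.20],    # analyte 1 response at sensors 1, 2
                   [0.15, 0.80]])   # analyte 2 (off-diagonal = interference)
c0 = np.array([0.30, 0.50])         # unknown initial concentrations
r0 = c0 @ K_true                    # initial measured responses

# Standard additions: known concentration increments, measured response changes.
dC = np.array([[0.1, 0.0],
               [0.0, 0.1],
               [0.1, 0.1]])
dR = dC @ K_true                    # noise-free for the sketch

K_hat, *_ = np.linalg.lstsq(dC, dR, rcond=None)   # calibrate sensitivities
c0_hat = r0 @ np.linalg.inv(K_hat)                # recover concentrations
print(c0_hat)                                      # -> [0.3 0.5]
```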
Advances in Statistical Methods for Substance Abuse Prevention Research
MacKinnon, David P.; Lockwood, Chondra M.
2010-01-01
The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
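As a concrete instance of the mediation analysis mentioned above, the product-of-coefficients estimator multiplies the exposure-to-mediator slope (a) by the mediator-to-outcome slope adjusted for exposure (b). The simulated prevention-style data below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)                      # program exposure
m = 0.5 * x + rng.standard_normal(n)            # mediator (e.g., peer norms)
y = 0.4 * m + 0.1 * x + rng.standard_normal(n)  # outcome (e.g., substance use)

a = np.polyfit(x, m, 1)[0]                      # slope of M ~ X
Xmat = np.column_stack([np.ones(n), x, m])
b = np.linalg.lstsq(Xmat, y, rcond=None)[0][2]  # slope of M in Y ~ X + M
print(f"indirect effect a*b = {a * b:.3f} (true value 0.5 * 0.4 = 0.20)")
```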
NASA Technical Reports Server (NTRS)
Kim, Sang-Wook
1987-01-01
Various experimental, analytical, and numerical analysis methods for flow-solid interaction of a nest of cylinders subjected to cross flows are reviewed. A nest of cylinders subjected to cross flows can be found in numerous engineering applications including the Space Shuttle Main Engine-Main Injector Assembly (SSME-MIA) and nuclear reactor heat exchangers. Despite its extreme importance in engineering applications, understanding of the flow-solid interaction process is quite limited and the design of tube banks is mostly dependent on experiments and/or experimental correlation equations. For future development of major numerical analysis methods for the flow-solid interaction of a nest of cylinders subjected to cross flow, various turbulence models, nonlinear structural dynamics, and existing laminar flow-solid interaction analysis methods are included.
Modal analysis applied to circular, rectangular, and coaxial waveguides
NASA Technical Reports Server (NTRS)
Hoppe, D. J.
1988-01-01
Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handled them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method We simulated 2500 data sets for different scenarios varying the parameters of baseline event rate, treatment effect and number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and inverse variance method with fixed-effects and random-effects models—using bias, root mean square error, length of 95% CI and coverage. Results The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear.
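The include/exclude effect is easy to reproduce with a fixed-effect inverse-variance pool and the usual 0.5 continuity correction, under which a both-armed zero-event study contributes a log odds ratio of zero and pulls the pooled estimate toward the null. The study counts below are invented, and this is only one of the several pooling methods the simulation evaluated.

```python
import numpy as np

def pooled_or(studies, include_ba0e):
    """Fixed-effect inverse-variance pooled OR with 0.5 continuity correction.
    studies: (events_trt, n_trt, events_ctl, n_ctl) tuples."""
    num = den = 0.0
    for a, n1, c, n0 in studies:
        if a == 0 and c == 0 and not include_ba0e:
            continue                        # drop both-armed zero-event study
        a_, b_, c_, d_ = a + 0.5, n1 - a + 0.5, c + 0.5, n0 - c + 0.5
        log_or = np.log(a_ * d_ / (b_ * c_))
        w = 1.0 / (1/a_ + 1/b_ + 1/c_ + 1/d_)
        num, den = num + w * log_or, den + w
    return float(np.exp(num / den))

studies = [(2, 100, 6, 100), (1, 80, 4, 80), (0, 120, 0, 120)]
print(pooled_or(studies, include_ba0e=True),
      pooled_or(studies, include_ba0e=False))
```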
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel
Thermal decomposition of polydimethylsiloxane compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas, including materials analysis; sports medicine, in the detection of designer drugs; and biological research, for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. It also has demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves utilizing a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
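The multivariate step is conceptually simple once the stepped desorption produces a (temperature step) x (signal channel) matrix: mean-center and take an SVD. The sketch below fabricates a two-component mixture to show how the leading principal components recover temperature-dependent off-gassing factors; it stands in for, and is much simpler than, the PCA/varimax/curve-resolution pipeline described.

```python
import numpy as np

rng = np.random.default_rng(0)
temps = np.linspace(50, 300, 12)            # desorption temperature steps

# Two chemical factors that evolve at different temperatures (fabricated).
volatile = np.exp(-((temps - 120) / 40)**2)[:, None] * rng.random(80)
heavy    = np.exp(-((temps - 250) / 40)**2)[:, None] * rng.random(80)
X = volatile + heavy + 0.01 * rng.standard_normal((12, 80))

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.round(explained[:4], 3))           # two factors should dominate
```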
Assessment methods for the evaluation of vitiligo.
Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K
2012-12-01
There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials.
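Among the semi-objective scores, the VASI has a simple arithmetic definition: sum over body regions of the affected area in hand units (one hand unit is roughly 1% of body surface area) times the residual depigmentation fraction. The region values below are invented for illustration.

```python
def vasi(regions):
    """Vitiligo Area Scoring Index:
    sum of (area in hand units) * (depigmentation fraction) over regions."""
    return sum(hand_units * depig for hand_units, depig in regions)

# (hand units, depigmentation fraction) for each affected region.
regions = [(2.0, 0.90),
           (1.0, 0.50),
           (0.5, 1.00)]
print(f"VASI = {vasi(regions):.2f} (theoretical maximum 100)")
```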
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. The availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one-dimensional and two-dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two-dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two-dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
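A minimal numpy sketch of the two-dimensional Fourier transform route is given below; the fringe pattern, carrier frequency, and band-pass window are all invented for illustration, and a real analysis would add careful filtering and phase unwrapping.

```python
import numpy as np

# Synthetic fringe pattern with a horizontal carrier and a smooth test phase.
N = 256
x, y = np.meshgrid(np.arange(N), np.arange(N))
f0 = 32 / N                                    # carrier frequency (cycles/pixel)
phi = 2 * np.pi * 0.002 * ((x - N/2)**2 + (y - N/2)**2) / N
I = 1 + np.cos(2 * np.pi * f0 * x + phi)

# Fourier-transform method: isolate one carrier sideband, shift it to the
# origin, inverse transform, and take the argument to get the wrapped phase.
F = np.fft.fftshift(np.fft.fft2(I))
cx = N // 2 + int(round(f0 * N))               # sideband column
window = np.zeros_like(F)
window[:, cx - 10:cx + 10] = 1.0               # crude band-pass filter
F_side = np.roll(F * window, -int(round(f0 * N)), axis=1)
wrapped_phase = np.angle(np.fft.ifft2(np.fft.ifftshift(F_side)))
```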
Planning for Downtown Circulation Systems. Volume 2. Analysis Techniques.
DOT National Transportation Integrated Search
1983-10-01
This volume contains the analysis and refinement stages of downtown circulator planning. Included are sections on methods for estimating patronage, costs, revenues, and impacts, and a section on methods for performing micro-level analyses.
COMPARISON OF TWO DIFFERENT SOLID PHASE EXTRACTION/LARGE VOLUME INJECTION PROCEDURES FOR METHOD 8270
Two solid phase extraction (SPE) methods and one traditional continuous liquid-liquid extraction method are compared for analysis of Method 8270 SVOCs. Productivity parameters include data quality, sample volume, analysis time, and solvent waste.
One SPE system, unique in the U.S., uses aut...
Airbreathing hypersonic vehicle design and analysis methods
NASA Technical Reports Server (NTRS)
Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.
1996-01-01
The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.
Railroad Classification Yard Technology Manual. Volume I : Yard Design Methods
DOT National Transportation Integrated Search
1981-02-01
This volume documents the procedures and methods associated with the design of railroad classification yards. Subjects include: site location, economic analysis, yard capacity analysis, design of flat yards, overall configuration of hump yards, hump ...
McGarvey, Daniel J.; Falke, Jeffrey A.; Li, Hiram W.; Li, Judith; Hauer, F. Richard; Lamberti, G.A.
2017-01-01
Methods to sample fishes in stream ecosystems and to analyze the raw data, focusing primarily on assemblage-level (all fish species combined) analyses, are presented in this chapter. We begin with guidance on sample site selection, permitting for fish collection, and information-gathering steps to be completed prior to conducting fieldwork. Basic sampling methods (visual surveying, electrofishing, and seining) are presented with specific instructions for estimating population sizes via visual, capture-recapture, and depletion surveys, in addition to new guidance on environmental DNA (eDNA) methods. Steps to process fish specimens in the field including the use of anesthesia and preservation of whole specimens or tissue samples (for genetic or stable isotope analysis) are also presented. Data analysis methods include characterization of size-structure within populations, estimation of species richness and diversity, and application of fish functional traits. We conclude with three advanced topics in assemblage-level analysis: multidimensional scaling (MDS), ecological networks, and loop analysis.
Multi-parametric centrality method for graph network models
NASA Astrophysics Data System (ADS)
Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna
2018-04-01
Graph network models are investigated to determine centrality, weights, and the significance of vertices. For centrality analysis, a typical method applies any one of the properties of the graph vertices. In graph theory, centrality is analyzed in terms of degree, closeness, betweenness, radiality, eccentricity, page-rank, status, Katz centrality, and eigenvector centrality. We have proposed a new method of multi-parametric centrality, which includes a number of basic properties of the network member. The mathematical model of the multi-parametric centrality method is developed. Comparison of the results of the presented method with the existing centrality methods is carried out. To evaluate the results of the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis showed the accuracy of the presented method, which simultaneously includes a number of basic properties of the vertices.
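The paper's exact weighting scheme is not reproduced here; the sketch below, assuming networkx is available, simply averages several rank-normalized single-property centralities to form one hypothetical multi-parametric score.

```python
import networkx as nx
import numpy as np

# Hypothetical composite centrality: average of several rank-normalized
# single-property centralities (the paper's actual weighting may differ).
G = nx.karate_club_graph()
measures = [
    nx.degree_centrality(G),
    nx.closeness_centrality(G),
    nx.betweenness_centrality(G),
    nx.eigenvector_centrality(G, max_iter=1000),
]
nodes = list(G.nodes())
scores = np.zeros(len(nodes))
for m in measures:
    vals = np.array([m[n] for n in nodes])
    # Rank-normalize so each property contributes on a common 0..1 scale.
    scores += vals.argsort().argsort() / (len(nodes) - 1)
multi = dict(zip(nodes, scores / len(measures)))
print(sorted(multi, key=multi.get, reverse=True)[:5])  # most central vertices
```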
Jibson, Randall W.; Jibson, Matthew W.
2003-01-01
Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method for modeling a landslide as a rigid-plastic block sliding on an inclined plane provides a useful method for predicting approximate landslide displacements. Newmark's method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate performing both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2160 strong-motion records from 29 earthquakes are included along with a search interface for selecting records based on a wide variety of record properties. Utilities are available that allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. This program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OSX, Linux, Solaris, etc. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
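A compact sketch of the rigid-block computation is shown below, assuming a one-directional sliding model, an invented critical acceleration, and a synthetic acceleration-time history; the report's Java programs implement the full rigorous and decoupled analyses.

```python
import numpy as np

def newmark_displacement(accel_g, dt, ac_g):
    """Rigid-block Newmark analysis: accumulate relative velocity whenever
    ground acceleration exceeds the critical (yield) acceleration ac_g.
    accel_g and ac_g are in g; dt in seconds; returns displacement in meters."""
    g = 9.81
    v = 0.0   # relative block velocity (m/s)
    d = 0.0   # cumulative downslope displacement (m)
    for a in accel_g:
        if v > 0.0 or a > ac_g:
            v += (a - ac_g) * g * dt   # block decelerates when a < ac_g
            v = max(v, 0.0)            # one-directional: no backward sliding
            d += v * dt
    return d

# Synthetic acceleration-time history; a real analysis would use one of the
# strong-motion records distributed with the report.
t = np.arange(0, 10, 0.005)
record = 0.3 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.2 * t)
print(newmark_displacement(record, dt=0.005, ac_g=0.1))
```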
AN EULERIAN-LAGRANGIAN LOCALIZED ADJOINT METHOD FOR THE ADVECTION-DIFFUSION EQUATION
Many numerical methods use characteristic analysis to accommodate the advective component of transport. Such characteristic methods include Eulerian-Lagrangian methods (ELM), modified method of characteristics (MMOC), and operator splitting methods. A generalization of characteri...
Advanced Productivity Analysis Methods for Air Traffic Control Operations.
DOT National Transportation Integrated Search
1976-12-01
This report gives a description of the Air Traffic Control (ATC) productivity analysis methods developed, implemented, and refined by the Stanford Research Institute (SRI) under the sponsorship of FAA and TSC. Two models are included in the productiv...
Development of new methodologies for evaluating the energy performance of new commercial buildings
NASA Astrophysics Data System (ADS)
Song, Suwon
The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.
Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.
1985-01-01
Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.
NASA Astrophysics Data System (ADS)
Wang, Chenggen; Zhou, Qian; Gao, Shuning; Luo, Jia; Diao, Junchao; Zhao, Haoran; Bu, Jing
2018-04-01
This paper reviews recent studies of Sub-Synchronous Oscillation (SSO) in wind farms. Mechanisms and analysis methods are the main concerns of this article. A classification method is proposed that includes the new types of oscillation occurring between wind farms and HVDC systems and the oscillation caused by Permanent Magnet Synchronous Generators (PMSG). Characteristics of oscillation analysis techniques are summarized.
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on analysis of a multicriteria location problem.
NASA Technical Reports Server (NTRS)
Heldenfels, Richard R
1951-01-01
A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.
Analyzing the Validity of the Adult-Adolescent Parenting Inventory for Low-Income Populations
ERIC Educational Resources Information Center
Lawson, Michael A.; Alameda-Lawson, Tania; Byrnes, Edward
2017-01-01
Objectives: The purpose of this study was to examine the construct and predictive validity of the Adult-Adolescent Parenting Inventory (AAPI-2). Methods: The validity of the AAPI-2 was evaluated using multiple statistical methods, including exploratory factor analysis, confirmatory factor analysis, and latent class analysis. These analyses were…
Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.
2009-01-01
Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that use only cases with precise tumor location information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood-based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject-level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297
Evaluation of Yogurt Microstructure Using Confocal Laser Scanning Microscopy and Image Analysis.
Skytte, Jacob L; Ghita, Ovidiu; Whelan, Paul F; Andersen, Ulf; Møller, Flemming; Dahl, Anders B; Larsen, Rasmus
2015-06-01
The microstructure of protein networks in yogurts defines important physical properties of the yogurt and hereby partly its quality. Imaging this protein network using confocal scanning laser microscopy (CSLM) has shown good results, and CSLM has become a standard measuring technique for fermented dairy products. When studying such networks, hundreds of images can be obtained, and here image analysis methods are essential for using the images in statistical analysis. Previously, methods including gray level co-occurrence matrix analysis and fractal analysis have been used with success. However, a range of other image texture characterization methods exists. These methods describe an image by a frequency distribution of predefined image features (denoted textons). Our contribution is an investigation of the choice of image analysis methods by performing a comparative study of 7 major approaches to image texture description. Here, CSLM images from a yogurt fermentation study are investigated, where production factors including fat content, protein content, heat treatment, and incubation temperature are varied. The descriptors are evaluated through nearest neighbor classification, variance analysis, and cluster analysis. Our investigation suggests that the texton-based descriptors provide a fuller description of the images compared to gray-level co-occurrence matrix descriptors and fractal analysis, while still being as applicable and in some cases as easy to tune. © 2015 Institute of Food Technologists®
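As an illustration of one of the compared descriptor families, the sketch below computes gray-level co-occurrence matrix features with scikit-image on a synthetic image; the offsets, angles, and properties chosen are illustrative, not those of the study.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Synthetic 8-bit stand-in for a CSLM image of a protein network.
rng = np.random.default_rng(1)
img = (rng.random((128, 128)) * 255).astype(np.uint8)

# Gray-level co-occurrence matrix at a one-pixel offset, four directions.
glcm = graycomatrix(img, distances=[1],
                    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)

# Average each texture property over the four directions.
features = {p: graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```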
C4 Software Technology Reference Guide - A Prototype.
1997-01-10
Domain analysis methods include feature-oriented domain analysis (FODA) (see pg. 185), a domain analysis method based upon identifying features, the user-visible aspects or characteristics of the domain [Kang 90]. Reference: Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ADA 235785). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 1990.
Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks
2014-01-01
Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226
A Century of Enzyme Kinetic Analysis, 1913 to 2013
Johnson, Kenneth A.
2013-01-01
This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
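For the steady-state case, a minimal nonlinear least-squares fit of the Michaelis-Menten rate equation might look like the sketch below (made-up data; full progress-curve fitting, as the review describes, additionally requires numerical integration of the rate equations).

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    # v = Vmax * [S] / (Km + [S])
    return vmax * s / (km + s)

# Illustrative substrate concentrations and initial rates (invented data).
s = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
rng = np.random.default_rng(2)
v = michaelis_menten(s, vmax=10.0, km=5.0) + rng.normal(scale=0.2, size=s.size)

popt, pcov = curve_fit(michaelis_menten, s, v, p0=[8.0, 4.0])
perr = np.sqrt(np.diag(pcov))   # standard errors of Vmax and Km
print(popt, perr)
```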
Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area
NASA Astrophysics Data System (ADS)
Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua
2018-04-01
The GF-2 satellite has the highest spatial resolution in the development history of China's remote sensing satellites. In this study, three traditional fusion methods, Brovey, Gram-Schmidt, and Color Normalized (CN), were compared with the newer NNDiffuse fusion method using qualitative assessment and quantitative fusion quality indices, including information entropy, variance, mean gradient, deviation index, and spectral correlation coefficient. Analysis results show that the NNDiffuse method performed best in both the qualitative and quantitative analyses. It is more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.
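A few of the quoted quality indices are easy to sketch in numpy, as below; the formulas shown (histogram entropy, mean gradient, band correlation) are common textbook definitions and may differ in detail from those used in the study.

```python
import numpy as np

def entropy(img, levels=256):
    # Shannon information entropy of the gray-level histogram (bits).
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mean_gradient(img):
    # Average magnitude of local intensity differences (sharpness proxy).
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.sqrt((gx**2 + gy**2) / 2.0))

def spectral_correlation(fused, reference):
    # Correlation coefficient between a fused band and the original band.
    return np.corrcoef(fused.ravel(), reference.ravel())[0, 1]

rng = np.random.default_rng(3)
band = (rng.random((256, 256)) * 255).astype(np.uint8)
fused = np.clip(band + rng.normal(scale=5.0, size=band.shape), 0, 255)
print(entropy(band), mean_gradient(band), spectral_correlation(fused, band))
```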
Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P
2016-06-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
ANALYSIS OF SWINE LAGOONS AND GROUND WATER FOR ENVIRONMENTAL ESTROGENS
A method was developed for analysis of low levels of natural (estradiol, estrone, estriol) and synthetic (ethynylestradiol) estrogens in ground water and swine waste lagoon effluent. The method includes solid phase extraction of the estrogens, preparation of pentafluorobenzyl der...
System safety engineering analysis handbook
NASA Technical Reports Server (NTRS)
Ijams, T. E.
1972-01-01
The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A textbook description of each of the methods is included.
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
ERIC Educational Resources Information Center
Potoczak, Kathryn; Carr, James E.; Michael, Jack
2007-01-01
Two distinct analytic methods have been used to identify the function of problem behavior. The antecedent-behavior-consequence (ABC) method (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994) includes the delivery of consequences for problem behavior. The AB method (Carr & Durand, 1985) does not include consequence delivery, instead relying…
Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin
2017-11-10
The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F
2013-09-01
A comprehensive method for the quantitative residue analysis of trace levels of 22 β-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for β-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. Following detailed isotope labeling and tandem mass spectrometry studies, together with exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all β-lactams studied. The main reaction occurring is the hydrolysis of the β-lactam ring under formation of the piperidine-substituted amide. For some β-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds, with trueness between 80 and 110% and within-laboratory reproducibility below 22% at target level, except for biapenem. For biapenem, the method proved to be suitable for qualitative analysis only.
Buckling analysis and test correlation of hat stiffened panels for hypersonic vehicles
NASA Technical Reports Server (NTRS)
Percy, Wendy C.; Fields, Roger A.
1990-01-01
The paper discusses the design, analysis, and test of hat stiffened panels subjected to a variety of thermal and mechanical load conditions. The panels were designed using data from structural optimization computer codes and finite element analysis. Test methods included the grid shadow moire method and a single gage force stiffness method. The agreement between the test data and analysis provides confidence in the methods that are currently being used to design structures for hypersonic vehicles. The agreement also indicates that post buckled strength may potentially be used to reduce the vehicle weight.
Integrative Analysis of “-Omics” Data Using Penalty Functions
Zhao, Qing; Shi, Xingjie; Huang, Jian; Liu, Jin; Li, Yang; Ma, Shuangge
2014-01-01
In the analysis of omics data, integrative analysis provides an effective way of pooling information across multiple datasets or multiple correlated responses, and can be more effective than single-dataset (response) analysis. Multiple families of integrative analysis methods have been proposed in the literature. The current review focuses on the penalization methods. Special attention is paid to sparse meta-analysis methods that pool summary statistics across datasets, and integrative analysis methods that pool raw data across datasets. We discuss their formulation and rationale. Beyond “standard” penalized selection, we also review contrasted penalization and Laplacian penalization which accommodate finer data structures. The computational aspects, including computational algorithms and tuning parameter selection, are examined. This review concludes with possible limitations and extensions. PMID:25691921
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Laboratory theory and methods for sediment analysis
Guy, Harold P.
1969-01-01
The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.
A manual for inexpensive methods of analyzing and utilizing remote sensor data
NASA Technical Reports Server (NTRS)
Elifrits, C. D.; Barr, D. J.
1978-01-01
Instructions are provided for inexpensive methods of using remote sensor data to assist in meeting the need to observe the earth's surface. When possible, relative costs are included. Equipment needed for analysis of remote sensor data is described, and methods of use of these equipment items are included, as well as advantages and disadvantages of the use of individual items. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, land cover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application and simple techniques of creating an enhanced data item. Techniques described include additive and subtractive (Diazo processes) color techniques and enlargement of photos or images. Applications of these processes, including mappings of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.
Nested PCR and RFLP analysis based on the 16S rRNA gene
USDA-ARS?s Scientific Manuscript database
The current phytoplasma detection and identification method is primarily based on nested PCR followed by restriction fragment length polymorphism analysis and gel electrophoresis. This method can potentially detect and differentiate all phytoplasmas, including those not previously described. The present ...
Topics on Test Methods for Space Systems and Operations Safety: Applicability of Experimental Data
NASA Technical Reports Server (NTRS)
Hirsch, David B.
2009-01-01
This viewgraph presentation reviews topics on test methods for space systems and operations safety through experimentation and analysis. The contents include: 1) Perception of reality through experimentation and analysis; 2) Measurements, methods, and correlations with real life; and 3) Correlating laboratory aerospace materials flammability data with data in spacecraft environments.
Operation Team Spirit: Program Review and Analysis
2009-06-01
This research was strictly qualitative in nature. The specific method being used was exploratory case study analysis... study, method is used for the specific acts of conducting research, while methodology refers to the qualitative nature of research performed. While... the researcher. According to Leedy and Ormrod (2005), some of the types of methods and their respective purposes include: Case study.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize, and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms, and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium, and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.
Validity and consistency assessment of accident analysis methods in the petroleum industry.
Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza
2017-11-17
Accident analysis is the main aspect of accident investigation. It includes the method of connecting different causes in a procedural way. Therefore, it is important to use valid and reliable methods for the investigation of the different causal factors of accidents, especially the noteworthy ones. This study assessed the accuracy (sensitivity index [SI]) and consistency of the six most commonly used accident analysis methods in the petroleum industry. In order to evaluate the methods of accident analysis, two real case studies (process safety and personal accidents) from the petroleum industry were analyzed by 10 assessors. The accuracy and consistency of these methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods gained the greatest SI scores for the personal and process safety accidents, respectively. The best average results for the consistency of a single method (based on 10 independent assessors) were in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree could enhance the sensitivity and consistency of accident analysis.
Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.
Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G
2018-06-01
This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.
USDA-ARS?s Scientific Manuscript database
Visible/near-infrared (Vis/NIR) spectroscopy with wavelength range between 400 and 2500 nm combined with factor analysis method was tested to predict quality attributes of chicken breast fillets. Quality attributes, including color (L*, a*, b*), pH, and drip loss were analyzed using factor analysis ...
Spectroscopic analysis and control
Tate; , James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles
2017-04-18
Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent, and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay, and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating, and drying, have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion but includes practical information about each method and the important assumptions for the application of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (Bouwer and Rice Method; Cooper, Bredehoeft, Papadopulos Method; and van der Kamp Method), the Cooper-Jacob straight-line Method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob Leaky Aquifer Method and Distance-Drawdown Methods. The distance-drawdown method is an equilibrium or steady-state method; thus, storage cannot be estimated.
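For reference, the Theis solution and its Cooper-Jacob straight-line approximation that underlie several of these spreadsheets can be evaluated in a few lines, assuming scipy is available; the parameter values below are invented for illustration.

```python
import numpy as np
from scipy.special import exp1

def theis_drawdown(Q, T, S, r, t):
    """Theis drawdown s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t).
    Q: pumping rate (m^3/d), T: transmissivity (m^2/d), S: storativity,
    r: distance to the observation well (m), t: time since pumping began (d)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)   # W(u) is the exponential integral

def cooper_jacob_drawdown(Q, T, S, r, t):
    # Straight-line approximation, valid for small u (roughly u < 0.01).
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * (-0.5772 - np.log(u))

t = np.array([0.01, 0.1, 1.0, 10.0])   # days
print(theis_drawdown(500.0, 250.0, 1e-4, 50.0, t))
print(cooper_jacob_drawdown(500.0, 250.0, 1e-4, 50.0, t))
```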
Freud: a software suite for high-throughput simulation analysis
NASA Astrophysics Data System (ADS)
Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon
Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
Symbolic algebra approach to the calculation of intraocular lens power following cataract surgery
NASA Astrophysics Data System (ADS)
Hjelmstad, David P.; Sayegh, Samir I.
2013-03-01
We present a symbolic approach based on matrix methods that allows for the analysis and computation of intraocular lens power following cataract surgery. We extend the basic matrix approach corresponding to paraxial optics to include astigmatism and other aberrations. The symbolic approach allows for a refined analysis of the potential sources of errors ("refractive surprises"). We demonstrate the computation of lens powers including toric lenses that correct for both defocus (myopia, hyperopia) and astigmatism. A specific implementation in Mathematica allows an elegant and powerful method for the design and analysis of these intraocular lenses.
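A minimal sketch of the paraxial matrix computation is given below; the corneal power, lens position, vitreous depth, and refractive index are illustrative assumptions, not values from the paper, and the astigmatism/aberration extension would enlarge the matrices.

```python
import numpy as np
from scipy.optimize import brentq

def refraction(P):
    # Thin-element refraction matrix acting on (height y, reduced angle n*u).
    return np.array([[1.0, -P], [0.0, 1.0]])

def translation(d, n):
    # Propagation over distance d (meters) in a medium of index n.
    return np.array([[1.0, 0.0], [d / n, 1.0]])

def retinal_height(P_iol, P_cornea=43.0, acd=0.0045, vitreous=0.019, n=1.336):
    # Distant object: trace a parallel input ray (y=1, nu=0) through the eye;
    # focus on the retina means the output ray height is zero.
    M = translation(vitreous, n) @ refraction(P_iol) @ \
        translation(acd, n) @ refraction(P_cornea)
    return (M @ np.array([1.0, 0.0]))[0]

# Solve for the IOL power (diopters) that brings this model eye to emmetropia.
P = brentq(retinal_height, 5.0, 40.0)
print(round(P, 2))   # about 20 D for these assumed ocular parameters
```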
Comparison of Analysis, Simulation, and Measurement of Wire-to-Wire Crosstalk. Part 2
NASA Technical Reports Server (NTRS)
Bradley, Arthur T.; Yavoich, Brian James; Hodson, Shane M.; Godley, Franklin
2010-01-01
In this investigation, we compare crosstalk analysis, simulation, and measurement results for electrically short configurations. Methods include hand calculations, PSPICE simulations, Microstripes transient field solver, and empirical measurement. In total, four representative physical configurations are examined, including a single wire over a ground plane, a twisted pair over a ground plane, generator plus receptor wires inside a cylindrical conduit, and a single receptor wire inside a cylindrical conduit. Part 1 addresses the first two cases, and Part 2 addresses the final two. Agreement between the analysis methods and test data is shown to be very good.
SeeSway - A free web-based system for analysing and exploring standing balance data.
Clark, Ross A; Pua, Yong-Hao
2018-06-01
Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy to use and platform independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion reference commercial force platforms and three dimensional motion analysis, smartphones, accelerometers and low-cost technology such as Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate between someone with neurological impairment and a healthy control. The goal of SeeSway is to provide a simple yet powerful educational and research tool to explore how standing balance is affected in aging and clinical populations. Copyright © 2018 Elsevier B.V. All rights reserved.
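A small sketch of two of the standard descriptors (path length and root mean square amplitude) is shown below using synthetic center-of-pressure traces; SeeSway itself wraps these and the more advanced methods behind its web interface.

```python
import numpy as np

def sway_metrics(ap, ml, fs):
    """Common posturography descriptors from anterior-posterior (ap) and
    medial-lateral (ml) center-of-pressure traces sampled at fs Hz."""
    dx, dy = np.diff(ml), np.diff(ap)
    path_length = np.sum(np.sqrt(dx**2 + dy**2))      # total excursion
    rms_ap = np.sqrt(np.mean((ap - ap.mean())**2))    # AP variability
    rms_ml = np.sqrt(np.mean((ml - ml.mean())**2))    # ML variability
    duration = len(ap) / fs
    return {"path_length": path_length,
            "mean_velocity": path_length / duration,
            "rms_ap": rms_ap, "rms_ml": rms_ml}

# Synthetic 30 s recording at 100 Hz (random-walk stand-in for real data).
rng = np.random.default_rng(4)
ap = np.cumsum(rng.normal(scale=0.05, size=3000))
ml = np.cumsum(rng.normal(scale=0.05, size=3000))
print(sway_metrics(ap, ml, fs=100))
```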
2011-01-01
Background: Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared, and presented in a unified framework along with an empirical evaluation of the literature. Results: We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression, and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, and they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions: An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms suggests that the studies reporting meta-analysis of haplotypes contain approximately half the number of included studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
High resolution frequency analysis techniques with application to the redshift experiment
NASA Technical Reports Server (NTRS)
Decher, R.; Teuber, D.
1975-01-01
High resolution frequency analysis methods, with application to the gravitational probe redshift experiment, are discussed. For this experiment, a resolution of 0.00001 Hz is required to measure a slowly varying, low-frequency signal of approximately 1 Hz. Major building blocks include the fast Fourier transform, discrete Fourier transform, Lagrange interpolation, golden section search, and adaptive matched filter technique. Accuracy, resolution, and computer effort of these methods are investigated, including test runs on an IBM 360/65 computer.
Computational Methods for Structural Mechanics and Dynamics
NASA Technical Reports Server (NTRS)
Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)
1989-01-01
Topics addressed include: transient dynamics; transient finite element method; transient analysis in impact and crash dynamic studies; multibody computer codes; dynamic analysis of space structures; multibody mechanics and manipulators; spatial and coplanar linkage systems; flexible body simulation; multibody dynamics; dynamical systems; and nonlinear characteristics of joints.
Methods for structural design at elevated temperatures
NASA Technical Reports Server (NTRS)
Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.
1973-01-01
A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is a description, user's manual, and listing for the creep analysis program. The program predicts time to a given creep or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.
Li, Xueqi; Woodman, Michael; Wang, Selina C
2015-08-01
Pheophytins and pyropheophytin are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage conditions and if the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method in which the multi-step sample preparation was replaced by a simple isopropanol dilution before the high-performance liquid chromatography injection. A quaternary solvent gradient method was used to include a fourth strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues on the column from previous analyses. This new method not only reduces analysis cost and time but also shows reliability, repeatability, and improved sensitivity, which is especially important for low-level samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Structure identification methods for atomistic simulations of crystalline materials
Stukowski, Alexander
2012-05-28
Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is a kind of high-spatial-resolution (2 meters GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistic of an image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistic of an image is subsequently recorded as an important metadata item for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence for cloud statistic determination. For post-processing analysis, the box-counting fractal method is implemented. In other words, the cloud statistic is first determined via pre-processing analysis, and the correctness of the cloud statistic for the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 images. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, has successfully extracted the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
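Since thresholding is the critical step, a self-contained numpy version of Otsu's method applied to a synthetic bimodal band is sketched below; the gray-level distributions and the final cloud-fraction step are invented for illustration.

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Return the gray level that maximizes the between-class variance."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    omega = np.cumsum(p)                       # class-0 probability
    mu = np.cumsum(p * np.arange(levels))      # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        # Between-class variance: (mu_t*omega - mu)^2 / (omega*(1-omega)).
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

rng = np.random.default_rng(5)
# Bimodal synthetic band: dark ground pixels plus bright cloud pixels.
band = np.concatenate([rng.normal(60, 10, 5000), rng.normal(200, 15, 2000)])
band = np.clip(band, 0, 255).astype(np.uint8)
t = otsu_threshold(band)
print(t, (band > t).mean())   # threshold and resulting cloud fraction
```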
Recent statistical methods for orientation data
NASA Technical Reports Server (NTRS)
Batschelet, E.
1972-01-01
The application of statistical methods to problems of animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations and tables of data are developed to show the value of information obtained by statistical analysis.
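The basic two-dimensional tools surveyed here, the mean direction, the mean resultant length, and the Rayleigh test of uniformity, can be sketched directly. The bearings below are invented, and the p-value uses a standard small-sample approximation.

```python
import numpy as np

def circular_stats(angles_deg):
    """Mean direction, mean resultant length R, and the Rayleigh test
    of uniformity for a sample of directions (two-dimensional case)."""
    a = np.radians(np.asarray(angles_deg, dtype=float))
    n = a.size
    C, S = np.cos(a).sum(), np.sin(a).sum()
    R = np.hypot(C, S) / n                        # 0 = uniform, 1 = concentrated
    mean_dir = np.degrees(np.arctan2(S, C)) % 360.0
    z = n * R**2                                  # Rayleigh statistic
    # Small-sample corrected p-value (standard approximation).
    p = np.exp(-z) * (1 + (2*z - z**2) / (4*n)
                      - (24*z - 132*z**2 + 76*z**3 - 9*z**4) / (288*n**2))
    return mean_dir, R, p

# Homing-pigeon-style vanishing bearings clustered around 45 degrees.
bearings = [30, 40, 45, 50, 55, 60, 35, 48]
mu, R, p = circular_stats(bearings)
print(f"mean direction = {mu:.1f} deg, R = {R:.2f}, Rayleigh p = {p:.4f}")
```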
Method of multi-dimensional moment analysis for the characterization of signal peaks
Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A
2012-10-23
A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
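The moment bookkeeping underlying such peak scoring can be sketched in one dimension. The code below uses a synthetic peak and is illustrative only (the patent's analysis is two-dimensional): it computes area, centroid, variance, and skewness, and forms the centroid-squared-over-variance ratio commonly used as a Peclet-like figure of merit for peak quality.

```python
import numpy as np

def peak_moments(t, y):
    """Statistical moments of a signal peak y sampled on a uniform grid t:
    area (m0), centroid (m1), variance (m2), dimensionless skewness."""
    dt = t[1] - t[0]
    m0 = y.sum() * dt                             # zeroth moment: peak area
    m1 = (t * y).sum() * dt / m0                  # first moment: centroid
    m2 = ((t - m1)**2 * y).sum() * dt / m0        # second central moment
    m3 = ((t - m1)**3 * y).sum() * dt / m0        # third central moment
    return m0, m1, m2, m3 / m2**1.5

# Slightly tailed Gaussian peak, as from a thermal desorption step.
t = np.linspace(0, 20, 2001)
y = np.exp(-0.5 * ((t - 8.0) / 1.2)**2) + 0.15 * np.exp(-0.5 * ((t - 11.0) / 2.5)**2)
area, centroid, var, skew = peak_moments(t, y)
print(f"area={area:.2f}  centroid={centroid:.2f}  "
      f"Peclet-like m1^2/m2={centroid**2 / var:.0f}  skew={skew:.2f}")
```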
NASA Astrophysics Data System (ADS)
Carpenter, Matthew H.; Jernigan, J. G.
2007-05-01
We present examples of an analysis progression consisting of a synthesis of the Photon Clean Method (Carpenter, Jernigan, Brown, Beiersdorfer 2007) and bootstrap methods to quantify errors and variations in many-parameter models. The Photon Clean Method (PCM) works well for model spaces in which the number of parameters is proportional to the number of photons, so a Monte Carlo paradigm is a natural numerical approach. Consequently PCM, an "inverse Monte Carlo" method, requires a new approach for quantifying errors compared with common analysis methods that fit models of low dimensionality. This presentation explores the methodology and presentation of analysis results derived from a variety of public data sets, including observations with XMM-Newton, Chandra, and other NASA missions. Special attention is given to the visualization of both data and models, including dynamic interactive presentations. This work was performed under the auspices of the Department of Energy under contract No. W-7405-Eng-48. We thank Peter Beiersdorfer and Greg Brown for their support of this technical portion of a larger program related to science with the LLNL EBIT program.
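Bootstrap error estimation, the second ingredient of this progression, is generic enough to sketch. The example below uses toy photon-energy data and a simple statistic; it stands in for, and is much simpler than, the PCM error analysis: resample with replacement, recompute the statistic, and report a percentile confidence interval.

```python
import numpy as np

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=1):
    """Nonparametric bootstrap: resample the data with replacement,
    recompute the statistic, and report a percentile confidence interval."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([statistic(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return statistic(data), reps.std(ddof=1), (lo, hi)

# Photon energies drawn from a toy line-plus-continuum model; estimate the
# line centroid as a crude stand-in for one fitted model parameter.
rng = np.random.default_rng(0)
photons = np.concatenate([rng.normal(6.7, 0.05, 300), rng.uniform(6.0, 7.4, 200)])
est, err, ci = bootstrap_ci(photons, np.median)
print(f"median energy = {est:.3f} keV, bootstrap sigma = {err:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```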
Emerging surface characterization techniques for carbon steel corrosion: a critical brief review.
Dwivedi, D; Lepkova, K; Becker, T
2017-03-01
Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time-of-flight secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscattering spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES, by Amanda Donnelly, a thesis... The work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and... collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies including net assessment, scenarios and...
Summary of research in applied mathematics, numerical analysis, and computer sciences
NASA Technical Reports Server (NTRS)
1986-01-01
The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.
Aeroelastic stability and response of rotating structures
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.
1993-01-01
A summary of the work performed during the progress period is presented. Analysis methods for predicting loads and instabilities of wind turbines were developed. Three new areas of research to aid the Advanced Turboprop Project (ATP) were initiated and developed. These three areas of research are aeroelastic analysis methods for cascades including blade and disk flexibility; stall flutter analysis; and computational aeroelasticity.
Reliability Validation and Improvement Framework
2012-11-01
systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon... embedded software; cyber-physical systems (CPSs), a term indicating that the embedded software interacts with, manages, and controls a physical system [Lee... the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results
U.S. EPA Method 1623 is used to detect and quantify Cryptosporidium spp. oocysts in water. The protocol consists of filtration, immunomagnetic separation (IMS), staining with a fluorescent antibody, and microscopic analysis. Microscopic analysis includes detection by fluorescent ...
Lange, Miranda; Zeng, Yan; Knight, Andrew; Windebank, Anthony; Trushina, Eugenia
2012-01-01
Changes in mitochondrial dynamics and function contribute to progression of multiple neurodegenerative diseases including peripheral neuropathies. The Seahorse Extracellular Flux XF24 analyzer provides a comprehensive assessment of the relative state of glycolytic and aerobic metabolism in live cells making this method instrumental in assessing mitochondrial function. One of the most important steps in the analysis of mitochondrial respiration using the Seahorse XF24 analyzer is plating a uniform monolayer of firmly attached cells. However, culturing of primary dorsal root ganglion (DRG) neurons is associated with multiple challenges, including their propensity to form clumps and detach from the culture plate. This could significantly interfere with proper analysis and interpretation of data. We have tested multiple cell culture parameters including coating substrates, culture medium, XF24 microplate plastics, and plating techniques in order to optimize plating conditions. Here we describe a highly reproducible method to obtain neuron-enriched monolayers of securely attached dissociated primary embryonic (E15) rat DRG neurons suitable for analysis with the Seahorse XF24 platform. PMID:23248613
Practical Use of Computationally Frugal Model Analysis Methods
Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...
2015-03-21
Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
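As a minimal illustration of what a frugal analysis costs, the sketch below computes scaled one-at-a-time local sensitivities with one extra model run per parameter. The "model" here is an invented closed-form drawdown expression standing in for a real groundwater simulator; parameter names and values are purely illustrative.

```python
import numpy as np

def local_sensitivities(model, params, rel_step=0.01):
    """One-at-a-time local sensitivity analysis: perturb each parameter by a
    small relative step and form dimensionless (scaled) sensitivities.
    Costs one model run per parameter -- 'tens of runs' territory."""
    p0 = np.asarray(params, dtype=float)
    y0 = model(p0)
    sens = np.empty_like(p0)
    for i in range(p0.size):
        p = p0.copy()
        dp = rel_step * p0[i]
        p[i] += dp
        sens[i] = (model(p) - y0) / dp * (p0[i] / y0)  # d(ln y)/d(ln p_i)
    return sens

# Hypothetical steady-drawdown model (Jacob-style closed form, illustrative
# only; a real analysis would wrap calls to an external simulator).
def drawdown(p):
    Q, T, S = p                                   # pumping rate, transmissivity, storativity
    return Q / (4 * np.pi * T) * np.log(2.25 * T * 1.0 / (S * 50.0**2))

s = local_sensitivities(drawdown, [500.0, 100.0, 1e-4])
print(dict(zip(["Q", "T", "S"], np.round(s, 3))))
```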
Sayyah, Mehdi; Shirbandi, Kiarash; Saki-Malehi, Amal; Rahim, Fakher
2017-01-01
Objectives: The aim of this systematic review and meta-analysis was to evaluate the problem-based learning (PBL) method as an alternative to conventional educational methods in Iranian undergraduate medical courses. Materials and methods: We systematically searched international databases, including PubMed, Scopus, and Embase, and Iranian databases, including Magiran, IranMedex, IranDoc, and the Scientific Information Database (SID), using appropriate search terms, such as "PBL", "problem-based learning", "based on problems", "active learning", and "learner centered", to identify PBL studies; these were combined with other key terms such as "medical", "undergraduate", "Iranian", "Islamic Republic of Iran", "I.R. of Iran", and "Iran". The search covered the period from 1980 to 2016 with no language limits. Results: Overall, a total of 1,057 relevant studies were initially found, of which 21 were included in the systematic review and meta-analysis. Of the 21 studies, 12 (57.14%) had high methodological quality. Considering the pooled effect size data, there was a significant difference in scores (standardized mean difference [SMD]=0.80, 95% CI [0.52, 1.08], P<0.001) in favor of PBL compared with the lecture-based method. Subgroup analysis revealed that using PBL alone is more favorable than using a mixed model with other learning methods such as lecture-based learning (LBL). Conclusion: The results of this systematic review showed that using PBL may have a positive effect on academic achievement in undergraduate medical courses. The results suggest that teachers and medical education decision makers pay more attention to using this method for effective and proper training. PMID:29042827
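The pooling calculation behind such a meta-analysis can be sketched in a few lines. The example below uses invented study SMDs and standard errors, not the review's data, and applies DerSimonian-Laird random-effects pooling with inverse-variance weights.

```python
import numpy as np

def random_effects_pool(smd, se):
    """DerSimonian-Laird random-effects pooling of standardized mean
    differences (SMDs) with inverse-variance weights."""
    smd, v = np.asarray(smd), np.asarray(se) ** 2
    w = 1.0 / v                                   # fixed-effect weights
    theta_f = np.sum(w * smd) / np.sum(w)
    Q = np.sum(w * (smd - theta_f) ** 2)          # heterogeneity statistic
    df = len(smd) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_r = 1.0 / (v + tau2)                        # random-effects weights
    theta_r = np.sum(w_r * smd) / np.sum(w_r)
    se_r = np.sqrt(1.0 / np.sum(w_r))
    return theta_r, (theta_r - 1.96 * se_r, theta_r + 1.96 * se_r), tau2

# Illustrative SMDs and standard errors for PBL-vs-lecture studies (made up).
smd = [0.9, 0.5, 1.2, 0.7, 0.4]
se = [0.20, 0.15, 0.30, 0.18, 0.25]
est, ci, tau2 = random_effects_pool(smd, se)
print(f"pooled SMD = {est:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.3f}")
```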
Apparatus and system for multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2003-06-24
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = C S^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
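The factorization step of the patented method can be sketched generically. The code below is a minimal unweighted illustration of nonnegativity-constrained alternating least squares for D ≈ C S^T on synthetic two-component data; the patent's weighting and unweighting steps are omitted, and the clip-to-zero constraint is one simple choice among several.

```python
import numpy as np

def constrained_als(D, n_components, n_iter=200, seed=0):
    """Factor D (pixels x channels) into nonnegative C (concentrations) and
    S (spectral shapes) so that D ~ C @ S.T, by alternating least squares
    with a clip-to-zero nonnegativity constraint."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))
    for _ in range(n_iter):
        # Solve C @ S.T = D for S, then D.T = S @ C.T for C; clip each to >= 0.
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S

# Synthetic two-phase data: two spectral shapes mixed in varying proportions.
rng = np.random.default_rng(1)
S_true = np.array([[5, 1, 0, 2], [0, 3, 4, 1]], dtype=float)   # 2 shapes x 4 channels
C_true = rng.random((100, 2))                                   # 100 pixels x 2
D = C_true @ S_true + 0.01 * rng.standard_normal((100, 4))
C, S = constrained_als(D, 2)
print("relative residual:", np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))
```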
NASA/Howard University Large Space Structures Institute
NASA Technical Reports Server (NTRS)
Broome, T. H., Jr.
1984-01-01
Basic research on the engineering behavior of large space structures is presented. Methods of structural analysis, control, and optimization of large flexible systems are examined. Topics of investigation include the Load Correction Method (LCM) modeling technique, stabilization of flexible bodies by feedback control, mathematical refinement of analysis equations, optimization of the design of structural components, deployment dynamics, and the use of microprocessors in attitude and shape control of large space structures. Information on key personnel, budgeting, support plans and conferences is included.
1990-08-01
A deterministic formulation of the method of moments carried out in the spectral domain is extended to include the effects of two-dimensional, two-component current flow in planar transmission line discontinuities on open substrates. The method includes the effects of space...
Bayesian data analysis in population ecology: motivations, methods, and benefits
Dorazio, Robert
2016-01-01
During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
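As a concrete starting point, the simplest Bayesian computation can be written in a few lines. The sketch below uses invented survey numbers and the conjugate beta-binomial pair to estimate a detection probability, first in closed form and then by a grid approximation whose logic generalizes, via MCMC, to the hierarchical models discussed in the article.

```python
import numpy as np
from scipy import stats

# Bayesian estimate of a detection probability from survey data: a beta
# prior is conjugate to the binomial likelihood, so the posterior is
# available in closed form -- the simplest instance of Bayesian updating.
detections, visits = 14, 60                      # successes out of trials (invented)
a0, b0 = 1.0, 1.0                                # uniform Beta(1, 1) prior
posterior = stats.beta(a0 + detections, b0 + visits - detections)

print(f"posterior mean  = {posterior.mean():.3f}")
print(f"95% credible CI = ({posterior.ppf(0.025):.3f}, {posterior.ppf(0.975):.3f})")

# The same posterior by brute-force grid approximation, the pattern that
# generalizes (via MCMC) to hierarchical models used in practice.
p = np.linspace(0, 1, 10001)
dp = p[1] - p[0]
unnorm = stats.binom.pmf(detections, visits, p)  # flat prior absorbed
post = unnorm / (unnorm.sum() * dp)
print(f"grid posterior mean = {(p * post).sum() * dp:.3f}")
```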
Analysis of Environmental Contamination resulting from ...
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illu
Apel, William A.; Thompson, Vicki S; Lacey, Jeffrey A.; Gentillon, Cynthia A.
2016-08-09
A method for determining a plurality of proteins for discriminating and positively identifying an individual from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
Thompson, Vicki S; Lacey, Jeffrey A; Gentillon, Cynthia A; Apel, William A
2015-03-03
A method for determining a plurality of proteins for discriminating and positively identifying an individual from a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
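The statistical step named in these patents, discriminant analysis, can be sketched generically. The example below uses simulated antibody-binding intensities and hypothetical protein indices, not the patents' data: it fits a linear discriminant model with scikit-learn and ranks proteins by the magnitude of their discriminant coefficients.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Simulated antibody-binding intensities: 40 samples x 20 array proteins,
# with the two individuals ("classes") differing on a handful of proteins.
rng = np.random.default_rng(0)
n_per, n_prot = 20, 20
X = rng.normal(1.0, 0.2, (2 * n_per, n_prot))
X[n_per:, [2, 7, 11]] += 0.8                      # the discriminating proteins
labels = np.repeat([0, 1], n_per)                 # two known individuals

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, labels, cv=5).mean()
lda.fit(X, labels)
top = np.argsort(np.abs(lda.coef_[0]))[::-1][:3]  # largest coefficients
print(f"cross-validated accuracy = {acc:.2f}; "
      f"most discriminating proteins: {np.sort(top)}")
```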
Methods for magnetic resonance analysis using magic angle technique
Hu, Jian Zhi [Richland, WA; Wind, Robert A [Kennewick, WA; Minard, Kevin R [Kennewick, WA; Majors, Paul D [Kennewick, WA
2011-11-22
Methods of performing a magnetic resonance analysis of a biological object are disclosed that include placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. In particular embodiments the method includes pulsing the radio frequency to provide at least two of a spatially selective read pulse, a spatially selective phase pulse, and a spatially selective storage pulse. Further disclosed methods provide pulse sequences that provide extended imaging capabilities, such as chemical shift imaging or multiple-voxel data acquisition.
Recent Applications of Higher-Order Spectral Analysis to Nonlinear Aeroelastic Phenomena
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Hajj, Muhammad R.; Dunn, Shane; Strganac, Thomas W.; Powers, Edward J.; Stearman, Ronald
2005-01-01
Recent applications of higher-order spectral (HOS) methods to nonlinear aeroelastic phenomena are presented. Applications include the analysis of data from a simulated nonlinear pitch and plunge apparatus and from F-18 flight flutter tests. A MATLAB model of Texas A&M University's Nonlinear Aeroelastic Testbed Apparatus (NATA) is used to generate aeroelastic transients at various conditions, including limit cycle oscillations (LCO). The Gaussian or non-Gaussian nature of the transients is investigated, related to HOS methods, and used to identify levels of increasing nonlinear aeroelastic response. Royal Australian Air Force (RAAF) F/A-18 flight flutter test data are presented and analyzed. The data include high-quality measurements of forced responses and LCO phenomena. Standard power spectral density (PSD) techniques and HOS methods are applied to the data and presented. The goal of this research is to develop methods that can identify the onset of nonlinear aeroelastic phenomena, such as LCO, during flutter testing.
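Among HOS tools, the bicoherence is the standard detector of the quadratic phase coupling that distinguishes LCO-like responses from linear ones. The sketch below uses synthetic phase-locked tones, not flight data: it estimates segment-averaged bicoherence with the Kim-Powers normalization and contrasts a coupled with an uncoupled signal.

```python
import numpy as np

def bicoherence(x, nperseg=256):
    """Segment-averaged bicoherence b^2(f1, f2) (Kim-Powers normalization).
    Values near 1 flag quadratic phase coupling among f1, f2 and f1 + f2."""
    segs = x[: len(x) // nperseg * nperseg].reshape(-1, nperseg)
    X = np.fft.rfft(segs * np.hanning(nperseg), axis=1)
    nf = nperseg // 4                             # keep f1 + f2 within range
    b2 = np.zeros((nf, nf))
    for i in range(nf):
        for j in range(nf):
            prod = X[:, i] * X[:, j]
            num = np.abs(np.mean(prod * np.conj(X[:, i + j]))) ** 2
            den = np.mean(np.abs(prod) ** 2) * np.mean(np.abs(X[:, i + j]) ** 2)
            b2[i, j] = num / den if den > 0 else 0.0
    return b2

# Two tones at bin-aligned frequencies plus noise; the "coupled" signal adds
# a third tone at the sum frequency with locked phase (an LCO-like signature).
rng = np.random.default_rng(0)
n, i1, i2, nperseg = 256 * 64, 32, 48, 256
t = np.arange(n)
ph1, ph2 = rng.uniform(0, 2 * np.pi, 2)
s1 = np.cos(2 * np.pi * i1 / nperseg * t + ph1)
s2 = np.cos(2 * np.pi * i2 / nperseg * t + ph2)
noise = 0.5 * rng.standard_normal(n)
coupled = s1 + s2 + 0.5 * np.cos(2 * np.pi * (i1 + i2) / nperseg * t + ph1 + ph2) + noise
uncoupled = s1 + s2 + noise

for name, sig in [("coupled", coupled), ("uncoupled", uncoupled)]:
    print(name, f"b^2(f1, f2) = {bicoherence(sig)[i1, i2]:.2f}")
```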
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to implement a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A description of the software and the modifications made to the data analysis software are included, together with the corresponding program listings.
Methods for the Analysis of Protein Phosphorylation-Mediated Cellular Signaling Networks
NASA Astrophysics Data System (ADS)
White, Forest M.; Wolf-Yadlin, Alejandro
2016-06-01
Protein phosphorylation-mediated cellular signaling networks regulate almost all aspects of cell biology, including the responses to cellular stimulation and environmental alterations. These networks are highly complex and comprise hundreds of proteins and potentially thousands of phosphorylation sites. Multiple analytical methods have been developed over the past several decades to identify proteins and protein phosphorylation sites regulating cellular signaling, and to quantify the dynamic response of these sites to different cellular stimuli. Here we provide an overview of these methods, including the fundamental principles governing each method, their relative strengths and weaknesses, and some examples of how each method has been applied to the analysis of complex signaling networks. When applied correctly, each of these techniques can provide insight into the topology, dynamics, and regulation of protein phosphorylation signaling networks.
Freud's psychoanalysis of Edith Banfield Jackson, 1930-1936.
Lynn, David J
2003-01-01
This paper is a historical study of Sigmund Freud's psychoanalysis of Edith Banfield Jackson. It relies on primary sources, including unpublished correspondence, to describe her background, the analysis itself, and her subsequent life. This analysis, which began in 1930, had both clinical and training purposes. Freud's actual methods are contrasted with his published recommendations in terms of anonymity, neutrality, and confidentiality. During this analysis, Sigmund Freud took on a number of roles in Edith Jackson's life, including teacher, commentator, social intermediary, recipient of her translation services, and recipient of her philanthropic donations. These roles are described in detail. The implications of Freud's actual methods in this case are fully discussed. Since Freud did not describe the methods he used in this case, they cannot be replicated, and, for clinical purposes, they are lost to history.
Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.
Clark, R.N.; Roush, T.L.
1984-01-01
Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipeline network is one of the most important facilities for energy transportation, but accidents in such networks may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is well suited to probabilistic analysis, but existing models do not consider all the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and the emergency response, are considered in this model. Moreover, the paper introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
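To make the Bayesian network machinery concrete, the sketch below builds a toy three-node pipeline-accident network and answers a diagnostic query by direct enumeration. The structure and all probabilities are invented for illustration and are far simpler than the paper's model.

```python
import itertools

# Toy network: Corrosion (C) and ThirdPartyDamage (T) are root causes;
# Leak (L) depends on both. All probabilities below are invented.
P_C = {1: 0.10, 0: 0.90}
P_T = {1: 0.05, 0: 0.95}
P_L = {(1, 1): 0.90, (1, 0): 0.40, (0, 1): 0.50, (0, 0): 0.01}  # P(L=1 | C, T)

def joint(c, t, l):
    """Joint probability from the chain rule over the network structure."""
    pl = P_L[(c, t)]
    return P_C[c] * P_T[t] * (pl if l == 1 else 1.0 - pl)

# Diagnostic query by enumeration: P(C = 1 | L = 1).
num = sum(joint(1, t, 1) for t in (0, 1))
den = sum(joint(c, t, 1) for c, t in itertools.product((0, 1), repeat=2))
print(f"P(leak) = {den:.4f}")
print(f"P(corrosion | leak) = {num / den:.3f}")   # posterior, up from the 0.10 prior
```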
Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems
NASA Astrophysics Data System (ADS)
Blonigan, Patrick J.; Wang, Qiqi
2018-02-01
Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.
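The failure mode that motivates shadowing methods is easy to reproduce. The sketch below is illustrative only, not MSS itself: it estimates the sensitivity of the long-time-averaged Lorenz z coordinate to the parameter rho by central finite differences, and because the trajectory is chaotic the estimate is dominated by sampling noise and does not settle as the step shrinks.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz_mean_z(rho, T=200.0, u0=(1.0, 1.0, 25.0)):
    """Long-time average of z for the Lorenz system at parameter rho."""
    f = lambda t, u: [10.0 * (u[1] - u[0]),
                      u[0] * (rho - u[2]) - u[1],
                      u[0] * u[1] - (8.0 / 3.0) * u[2]]
    sol = solve_ivp(f, (0.0, T), u0, max_step=0.01)
    z = sol.y[2]
    return z[len(z) // 5:].mean()                 # discard the transient

# Finite-difference sensitivity d<z>/d(rho) near rho = 28. For a chaotic
# system the estimate is noise-dominated and worsens as the step shrinks --
# the failure mode that shadowing methods such as LSS and MSS address.
for drho in (1.0, 0.1, 0.01):
    g = (lorenz_mean_z(28.0 + drho) - lorenz_mean_z(28.0 - drho)) / (2 * drho)
    print(f"drho = {drho:5.2f} -> finite-difference gradient ~ {g:+.2f}")
```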
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
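The segmented-regression model used in interrupted time series analysis is compact to write down. The sketch below uses simulated monthly indicator data, not the article's study: it fits y = b0 + b1*t + b2*post + b3*t_since with statsmodels OLS to recover a level change and a slope change at the intervention.

```python
import numpy as np
import statsmodels.api as sm

# Simulated monthly quality indicator: 24 pre- and 24 post-intervention
# points with a level drop and a slope change at the intervention.
rng = np.random.default_rng(0)
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
post = (t >= n_pre).astype(float)                 # intervention indicator
t_since = np.where(post == 1, t - n_pre, 0.0)     # time since intervention
y = 50 + 0.1 * t - 5.0 * post - 0.3 * t_since + rng.normal(0, 1.5, t.size)

# Segmented regression: y = b0 + b1*t + b2*post + b3*t_since.
X = sm.add_constant(np.column_stack([t, post, t_since]))
fit = sm.OLS(y, X).fit()
b0, b1, b2, b3 = fit.params
print(f"pre slope = {b1:.2f}, level change = {b2:.2f}, slope change = {b3:.2f}")
```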
A century of enzyme kinetic analysis, 1913 to 2013.
Johnson, Kenneth A
2013-09-02
This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function.
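The steady-state fitting the review begins with is a one-call exercise today. The sketch below uses illustrative rate data, not the paper's, and fits the Michaelis-Menten equation directly by nonlinear least squares, avoiding the classical linearizations.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, Vmax, Km):
    """Steady-state rate law v = Vmax * S / (Km + S)."""
    return Vmax * S / (Km + S)

# Illustrative substrate concentrations (mM) and initial rates (uM/s).
S = np.array([0.5, 1, 2, 4, 8, 16, 32, 64.0])
v = np.array([0.86, 1.50, 2.44, 3.36, 4.33, 4.99, 5.35, 5.62])

# Direct nonlinear least squares, rather than a Lineweaver-Burk linearization.
popt, pcov = curve_fit(michaelis_menten, S, v, p0=[5.0, 2.0])
perr = np.sqrt(np.diag(pcov))                     # 1-sigma parameter errors
print(f"Vmax = {popt[0]:.2f} +/- {perr[0]:.2f}")
print(f"Km   = {popt[1]:.2f} +/- {perr[1]:.2f}")
```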
Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.
2003-01-01
An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high-temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
A method for the design of transonic flexible wings
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1990-01-01
Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate, and unified approach to analyzing combustor burner liners, hollow air cooled turbine blades, and air cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.
Karahalios, Amalia Emily; Salanti, Georgia; Turner, Simon L; Herbison, G Peter; White, Ian R; Veroniki, Areti Angeliki; Nikolakopoulou, Adriani; Mckenzie, Joanne E
2017-06-24
Network meta-analysis, a method to synthesise evidence from multiple treatments, has increased in popularity in the past decade. Two broad approaches are available to synthesise data across networks, namely, arm- and contrast-synthesis models, with a range of models that can be fitted within each. There has been recent debate about the validity of the arm-synthesis models, but to date, there has been limited empirical evaluation comparing results using the methods applied to a large number of networks. We aim to address this gap through the re-analysis of a large cohort of published networks of interventions using a range of network meta-analysis methods. We will include a subset of networks from a database of network meta-analyses of randomised trials that have been identified and curated from the published literature. The subset of networks will include those where the primary outcome is binary, the number of events and participants are reported for each direct comparison, and there is no evidence of inconsistency in the network. We will re-analyse the networks using three contrast-synthesis methods and two arm-synthesis methods. We will compare the estimated treatment effects, their standard errors, treatment hierarchy based on the surface under the cumulative ranking (SUCRA) curve, the SUCRA value, and the between-trial heterogeneity variance across the network meta-analysis methods. We will investigate whether differences in the results are affected by network characteristics and baseline risk. The results of this study will inform whether, in practice, the choice of network meta-analysis method matters, and if it does, in what situations differences in the results between methods might arise. The results from this research might also inform future simulation studies.
Soil analysis based on samples withdrawn from different volumes: correlation versus calibration
Lucian Wielopolski; Kurt Johnsen; Yuen Zhang
2010-01-01
Soil, particularly in forests, is replete with spatial variation with respect to soil C. The present standard chemical method for soil analysis by dry combustion (DC) is destructive, and comprehensive sampling is labor intensive and time consuming. These, among other factors, are contributing to the development of new methods for soil analysis. These include a near...
NASA Technical Reports Server (NTRS)
Junkin, B. G. (Principal Investigator)
1979-01-01
A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
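The slope and aspect derivation at the heart of such processing can be sketched with a finite-difference gradient. The code below is a generic illustration (synthetic ramp, assumed north-up grid), not the original software.

```python
import numpy as np

def slope_aspect(dem, cell_size):
    """Slope (degrees) and downslope aspect (degrees clockwise from north)
    from a gridded DEM via central differences; rows run north to south."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)    # along rows, along columns
    dz_dy = -dz_dy                                # row index increases southward
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0  # 0 = N, 90 = E
    return slope, aspect

# Synthetic ramp rising to the east at a 20% grade on 30 m cells: the terrain
# faces west, so slope ~ 11.3 degrees and aspect ~ 270 away from the edges.
x = np.arange(50) * 30.0
dem = np.tile(0.2 * x, (40, 1))
slope, aspect = slope_aspect(dem, 30.0)
print(f"slope ~ {slope[20, 25]:.1f} deg, aspect ~ {aspect[20, 25]:.0f} deg")
```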
Analysis strategies for longitudinal attachment loss data.
Beck, J D; Elter, J R
2000-02-01
The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site vs person level of analysis, including statistical adjustment for correlated data; the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology to use for periodontal studies with more than one period of follow-up and that the use of studies not employing methods for dealing with complex samples, correlated data, and repeated measures does not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
Arizona Indian Demographic Data: Needs and Recommendations.
ERIC Educational Resources Information Center
Taylor, Benjamin J.; Helmkamp, John
Included in this report on Arizona Indian demographic data are "an evaluation of several recent studies of Indian populations" and "an extensive analysis of methods for obtaining and maintaining accurate data in the future." Recommended methods by which accurate population data for the smaller reservations should be maintained are included in the…
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-Foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations involved, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented as pertains to understanding what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
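A Monte Carlo propagation of this kind is compact to sketch. The example below uses illustrative pressures and uncertainties, not the facility's data system, and propagates total- and static-pressure uncertainty into Mach number through the standard isentropic relation.

```python
import numpy as np

def mach_from_pressures(p0, p):
    """Mach number from total and static pressure via the isentropic
    relation p0/p = (1 + 0.2*M^2)**3.5 for gamma = 1.4."""
    return np.sqrt(5.0 * ((p0 / p) ** (2.0 / 7.0) - 1.0))

# Illustrative measurements (kPa) with assumed 1-sigma instrument uncertainties.
rng = np.random.default_rng(0)
n = 100_000
p0 = rng.normal(120.0, 0.30, n)                   # total pressure samples
p = rng.normal(55.0, 0.25, n)                     # static pressure samples

M = mach_from_pressures(p0, p)
print(f"Mach = {M.mean():.4f} +/- {M.std(ddof=1):.4f} (1-sigma)")
print(f"95% interval: {np.percentile(M, [2.5, 97.5]).round(4)}")
```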
Cristale, Joyce; Lacorte, Silvia
2013-08-30
This study presents a multiresidue method for simultaneous extraction, clean-up and analysis of priority and emerging flame retardants in sediment, sewage sludge and dust. The studied compounds included eight polybrominated diphenyl ether congeners, nine new brominated flame retardants and ten organophosphorus flame retardants. The analytical method was based on ultrasound-assisted extraction with ethyl acetate/cyclohexane (5:2, v/v), clean-up with Florisil cartridges and analysis by gas chromatography coupled to tandem mass spectrometry (GC-EI-MS/MS). The method development and validation protocol included spiked samples, a certified reference material (for dust), and participation in an interlaboratory calibration. The method proved to be efficient and robust for the extraction and determination of the three flame retardant families in the studied solid matrices. The method was applied to river sediment, sewage sludge and dust samples, and allowed detection of 24 of the 27 studied flame retardants. Organophosphate esters, BDE-209 and decabromodiphenyl ethane were the most ubiquitous contaminants detected.
NASA Astrophysics Data System (ADS)
Bohmann, Jonathan A.; Weinhold, Frank; Farrar, Thomas C.
1997-07-01
Nuclear magnetic shielding tensors computed by the gauge including atomic orbital (GIAO) method in the Hartree-Fock self-consistent-field (HF-SCF) framework are partitioned into magnetic contributions from chemical bonds and lone pairs by means of natural chemical shielding (NCS) analysis, an extension of natural bond orbital (NBO) analysis. NCS analysis complements the description provided by alternative localized orbital methods by directly calculating chemical shieldings due to delocalized features in the electronic structure, such as bond conjugation and hyperconjugation. Examples of NCS tensor decomposition are reported for CH4, CO, and H2CO, for which a graphical mnemonic due to Cornwell is used to illustrate the effect of hyperconjugative delocalization on the carbon shielding.
NASA Astrophysics Data System (ADS)
Chan, Y. C.; Vowles, P. D.; McTainsh, G. H.; Simpson, R. W.; Cohen, D. D.; Bailey, G. M.; McOrist, G. D.
This paper describes a method for the simultaneous collection of size-fractionated aerosol samples on several collection substrates, including glass-fibre filter, carbon tape and silver tape, with a commercially available high-volume cascade impactor. This permitted various chemical analysis procedures, including ion beam analysis (IBA), instrumental neutron activation analysis (INAA), carbon analysis and scanning electron microscopy (SEM), to be carried out on the samples.
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
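Step (vi) of the recommended methodology, item analysis, is straightforward to compute. The sketch below uses simulated 0/1-scored responses, not data from any published questionnaire: it derives item difficulty, corrected item-total discrimination, and Cronbach's alpha, the internal-reliability statistic mentioned in step (vii).

```python
import numpy as np

def item_analysis(X):
    """Classical item analysis for a 0/1-scored knowledge questionnaire:
    difficulty = proportion correct; discrimination = corrected item-total
    correlation; plus Cronbach's alpha for internal reliability."""
    X = np.asarray(X, dtype=float)
    n_items = X.shape[1]
    difficulty = X.mean(axis=0)
    total = X.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(n_items)
    ])
    item_var = X.var(axis=0, ddof=1)
    alpha = n_items / (n_items - 1) * (1 - item_var.sum() / total.var(ddof=1))
    return difficulty, discrimination, alpha

# Simulated responses: 200 examinees, 10 items of varying difficulty.
rng = np.random.default_rng(0)
ability = rng.normal(0, 1, (200, 1))
ease = np.linspace(-1.5, 1.5, 10)
X = (rng.normal(0, 1, (200, 10)) < ability + ease).astype(int)

diff, disc, alpha = item_analysis(X)
print("difficulty:", diff.round(2))
print("discrimination:", disc.round(2))
print(f"Cronbach's alpha = {alpha:.2f}")
```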
On-line gas chromatographic analysis of airborne particles
Hering, Susanne V [Berkeley, CA; Goldstein, Allen H [Orinda, CA
2012-01-03
A method and apparatus for the in-situ chemical analysis of an aerosol. The method may include the steps of: collecting an aerosol; thermally desorbing the aerosol into a carrier gas to provide desorbed aerosol material; transporting the desorbed aerosol material onto the head of a gas chromatography column; analyzing the aerosol material using a gas chromatograph; and quantifying the aerosol material as it evolves from the gas chromatography column. The apparatus includes a collection and thermal desorption cell, a gas chromatograph including a gas chromatography column, heated transport lines coupling the cell and the column, and a detector for quantifying aerosol material evolving from the gas chromatography column.
Non-Contacting Compliant Foil Seal for Gas Turbine Engine
NASA Technical Reports Server (NTRS)
Salehi, Mohsen; Heshmat, Hooshang
2002-01-01
The program is aimed at enhancing the existing analysis to include the turbulence effect. Several manufacturing methods are being investigated in order to apply our know-how in building the seal hardware. The contents include: 1) Test Facilities; 2) Analysis Enhancements; 3) Accomplishments/Status; and 4) Materials Study.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
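The quantity at the core of decision curve analysis, net benefit, is simple to compute directly from a data set. The sketch below uses simulated risks and outcomes, not the authors' software or data, and evaluates NB(pt) = TP/n - FP/n * pt/(1 - pt) for a model and for the treat-all strategy.

```python
import numpy as np

def net_benefit(y, p_hat, thresholds):
    """Decision-curve net benefit of a prediction model:
    NB(pt) = TP/n - FP/n * pt / (1 - pt), at each threshold probability pt."""
    y, p_hat = np.asarray(y), np.asarray(p_hat)
    n = y.size
    nb = []
    for pt in thresholds:
        treat = p_hat >= pt
        tp = np.sum(treat & (y == 1))
        fp = np.sum(treat & (y == 0))
        nb.append(tp / n - fp / n * pt / (1 - pt))
    return np.array(nb)

# Simulated risks and outcomes; compare the model against treating everyone.
rng = np.random.default_rng(0)
p_true = rng.beta(2, 5, 500)
y = rng.binomial(1, p_true)
p_hat = np.clip(p_true + rng.normal(0, 0.05, 500), 0.001, 0.999)  # a decent model

pts = np.arange(0.05, 0.55, 0.05)
nb_model = net_benefit(y, p_hat, pts)
nb_all = y.mean() - (1 - y.mean()) * pts / (1 - pts)   # "treat all" curve
for pt, m, a in zip(pts, nb_model, nb_all):
    print(f"pt={pt:.2f}  model NB={m:+.3f}  treat-all NB={a:+.3f}")
```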
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
The U. S. Department of Energy SARP review training program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauck, C.J.
1988-01-01
In support of its radioactive material packaging certification program, the U.S. Department of Energy (DOE) has established a special training workshop. The purpose of the two-week workshop is to develop skills in reviewing Safety Analysis Reports for Packagings (SARPs) and performing confirmatory analyses. The workshop, conducted by the Lawrence Livermore National Laboratory (LLNL) for DOE, is divided into two parts: methods of review and methods of analysis. The sessions covering methods of review are based on the DOE document, ''Packaging Review Guide for Reviewing Safety Analysis Reports for Packagings'' (PRG). The sessions cover relevant DOE Orders and all areas of review in the applicable Nuclear Regulatory Commission (NRC) Regulatory Guides. The technical areas addressed include structural and thermal behavior, materials, shielding, criticality, and containment. The course sessions on methods of analysis provide hands-on experience in the use of calculational methods and codes for reviewing SARPs. Analytical techniques and computer codes are discussed and sample problems are worked. Homework is assigned each night and over the included weekend; at the conclusion, a comprehensive take-home examination is given requiring six to ten hours to complete.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next the fundamental aspects of a probabilistic failure analysis are explored and it is shown that deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
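The closed-form Weibull case mentioned above is easy to illustrate. The sketch below uses invented strength parameters: it evaluates the two-parameter Weibull probability of failure, with a volume-ratio factor for the brittle-material size effect, and checks one value by Monte Carlo sampling.

```python
import numpy as np

def weibull_pof(sigma, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull probability of failure for a brittle component:
    Pf = 1 - exp(-(V/V0) * (sigma/sigma0)^m). volume_ratio = V/V0 captures
    the size effect (a larger stressed volume raises failure probability)."""
    return 1.0 - np.exp(-volume_ratio * (np.asarray(sigma) / sigma0) ** m)

# Illustrative ceramic: characteristic strength 400 MPa, Weibull modulus 10.
stress = np.array([200.0, 300.0, 400.0, 500.0])
for s, pf in zip(stress, weibull_pof(stress, sigma0=400.0, m=10.0)):
    print(f"sigma = {s:5.0f} MPa -> Pf = {pf:.4f}")

# Crude Monte Carlo check of the closed form at 300 MPa.
rng = np.random.default_rng(0)
strengths = 400.0 * rng.weibull(10.0, 1_000_000)   # sampled component strengths
print("MC Pf at 300 MPa:", np.mean(strengths < 300.0))
```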
NASA Technical Reports Server (NTRS)
Heinrichs, J. A.; Fee, J. J.
1972-01-01
Space station and solar array data are presented, together with the analyses performed in support of the integrated dynamic analysis study. The analysis methods were developed and a digital simulation was formulated. Control systems for space station attitude control and solar array orientation control include generic type control systems. These systems have been digitally coded and included in the simulation.
Methods and Devices for Micro-Isolation, Extraction, and/or Analysis of Microscale Components
NASA Technical Reports Server (NTRS)
Wade, Lawrence A. (Inventor); Kartalov, Emil P. (Inventor); Taylor, Clive (Inventor); Shibata, Darryl (Inventor)
2014-01-01
Provided herein are devices and methods for the micro-isolation of biological cellular material. The micro-isolation apparatus described can comprise a photomask that protects regions of interest against DNA-destroying illumination. The micro-isolation apparatus can further comprise photosensitive material defining access wells following illumination and subsequent developing of the photosensitive material. The micro-isolation apparatus can further comprise a chambered microfluidic device comprising channels that provide access to wells defined in photosensitive material. Alternatively, the micro-isolation apparatus can comprise a chambered microfluidic device without access wells defined in photosensitive material, where valves control the flow of gases or liquids through the channels of the microfluidic device. Also included are methods for selectively isolating cellular material using the apparatuses described herein, as are methods for biochemical analysis of individual regions of interest of cellular material using the devices described herein. Further included are methods of making masking arrays useful for the methods described herein.
Immunoaffinity chromatography: an introduction to applications and recent developments
Moser, Annette C
2010-01-01
Immunoaffinity chromatography (IAC) combines the use of LC with the specific binding of antibodies or related agents. The resulting method can be used in assays for a particular target or for purification and concentration of analytes prior to further examination by another technique. This review discusses the history and principles of IAC and the various formats that can be used with this method. An overview is given of the general properties of antibodies and of antibody-production methods. The supports and immobilization methods used with antibodies in IAC and the selection of application and elution conditions for IAC are also discussed. Several applications of IAC are considered, including its use in purification, immunodepletion, direct sample analysis, chromatographic immunoassays and combined analysis methods. Recent developments include the use of IAC with CE or MS, ultrafast immunoextraction methods and the use of immunoaffinity columns in microanalytical systems. PMID:20640220
A Comparative Analysis of Method Books for Class Jazz Instruction
ERIC Educational Resources Information Center
Watson, Kevin E.
2017-01-01
The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…
An investigation on the intra-sample distribution of cotton color by using image analysis
USDA-ARS?s Scientific Manuscript database
The colorimeter principle is widely used to measure cotton color. This method provides the sample's color grade, but the result does not include information about the color distribution or any variation within the sample. We conducted an investigation that used an image analysis method to study the ...
Forestry sector analysis for developing countries: issues and methods.
R.W. Haynes
1993-01-01
A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...
U.S. EPA Methods 1622 and 1623 are used to detect and quantify Cryptosporidium oocysts in water. The protocol consists of filtration, immunomagnetic separation (IMS), staining with a fluorescent antibody, and microscopic analysis. Microscopic analysis includes detection by fluor...
The use of lithiated adducts for structural analysis of acylglycerols by MS-ESI
USDA-ARS?s Scientific Manuscript database
Electrospray ionization mass spectrometry (ESI-MS) using lithium adducts is the method of choice for the analysis of acylglycerols. The method can be used for the identification of the structures of fatty acid constituents, including the number and location of double bonds and hydroxyl groups. The me...
ERIC Educational Resources Information Center
Torreblanca, Maximo
1988-01-01
Discusses the validity of studies of Spanish pronunciation in terms of research methods employed. Topics include data collection in the laboratory vs. in a natural setting; recorded vs. non-recorded data; quality of the recording; aural analysis vs. spectrographic analysis; and transcriber reliability. Suggestions for improving data collection are…
Some Supplementary Methods for the Analysis of the Delis-Kaplan Executive Function System
ERIC Educational Resources Information Center
Crawford, John R.; Garthwaite, Paul H.; Sutherland, David; Borland, Nicola
2011-01-01
Supplementary methods for the analysis of the Delis-Kaplan Executive Function System (Delis, Kaplan, & Kramer, 2001) are made available, including (a) quantifying the number of abnormally low achievement scores exhibited by an individual and accompanying this with an estimate of the percentage of the normative population expected to exhibit at…
Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0
Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.
2012-01-01
Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
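For orientation, the workhorse computation behind many COBRA methods is flux balance analysis: a linear program that maximizes an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The Toolbox itself is a MATLAB package; the following is a generic toy sketch in Python/SciPy with an invented two-metabolite network, not Toolbox code.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A -> B -> biomass, plus an excretion of B.
# Rows are metabolites A and B; columns are the four reactions.
S = np.array([
    [1, -1,  0,  0],   # A: made by uptake, consumed by conversion
    [0,  1, -1, -1],   # B: made by conversion, consumed by biomass/excretion
])
bounds = [(0, 10)] * 4          # all fluxes between 0 and 10 units
c = [0, 0, -1, 0]               # maximize biomass flux v3 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal biomass flux:", -res.fun, "| flux vector:", res.x)
```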
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
Climate Action Benefits: Methods of Analysis
This page provides detailed information on the methods used in the CIRA analyses, including the overall framework, temperature projections, precipitation projections, sea level rise projections, uncertainty, and limitations.
NASA Technical Reports Server (NTRS)
Oswald, J. E.; Siegel, P. H.
1994-01-01
The finite difference time domain (FDTD) method is applied to the analysis of microwave, millimeter-wave and submillimeter-wave filter circuits. In each case, the validity of this method is confirmed by comparison with measured data. In addition, the FDTD calculations are used to design a new ultra-thin coplanar-strip filter for feeding a THz planar-antenna mixer.
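The essence of FDTD is a leapfrog update of staggered electric and magnetic fields. A minimal one-dimensional vacuum sketch follows (a generic textbook scheme; the grid size, Courant number, and source are assumptions, and the paper's filter simulations are of course far more elaborate):

```python
import numpy as np

nz, nt = 200, 400
ez = np.zeros(nz)          # electric field on integer grid points
hy = np.zeros(nz - 1)      # magnetic field on half-integer points
S = 0.5                    # Courant number (normalized units)

for n in range(nt):
    hy += S * np.diff(ez)                            # H update from curl of E
    ez[1:-1] += S * np.diff(hy)                      # E update from curl of H
    ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source

print("peak |Ez| after propagation:", np.abs(ez).max())
```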
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, and Monte Carlo simulation is added as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
Analyzing the dynamics of cell cycle processes from fixed samples through ergodic principles
Wheeler, Richard John
2015-01-01
Tools to analyze cyclical cellular processes, particularly the cell cycle, are of broad value for cell biology. Cell cycle synchronization and live-cell time-lapse observation are widely used to analyze these processes but are not available for many systems. Simple mathematical methods built on the ergodic principle are a well-established, widely applicable, and powerful alternative analysis approach, although they are less widely used. These methods extract data about the dynamics of a cyclical process from a single time-point “snapshot” of a population of cells progressing through the cycle asynchronously. Here, I demonstrate application of these simple mathematical methods to analysis of basic cyclical processes—cycles including a division event, cell populations undergoing unicellular aging, and cell cycles with multiple fission (schizogony)—as well as recent advances that allow detailed mapping of the cell cycle from continuously changing properties of the cell such as size and DNA content. This includes examples using existing data from mammalian, yeast, and unicellular eukaryotic parasite cell biology. Through the ongoing advances in high-throughput cell analysis by light microscopy, electron microscopy, and flow cytometry, these mathematical methods are becoming ever more important and are a powerful complementary method to traditional synchronization and time-lapse cell cycle analysis methods. PMID:26543196
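The simplest version of the ergodic mapping is worth stating explicitly. In a steady-state, exponentially growing population with doubling time T, observing a fraction F of cells past a cell-cycle landmark places the landmark at t = T(1 − log2(1 + F)) into the cycle. A sketch with illustrative numbers (the paper develops several further variants, e.g., for multiple fission and for continuously changing properties):

```python
import numpy as np

def landmark_time(frac_past, doubling_time):
    """Ergodic snapshot mapping for an asynchronous, exponentially growing
    population: a fraction F of cells past a landmark places the landmark
    at t = T * (1 - log2(1 + F)) into a cycle of length T."""
    return doubling_time * (1.0 - np.log2(1.0 + frac_past))

# Example: 40% of snapshot cells are past the landmark; 8 h doubling time
print(f"landmark occurs {landmark_time(0.40, 8.0):.2f} h into the cycle")
```

Sanity checks: F = 0 gives t = T (no cell has reached the landmark until the end of the cycle), and F = 1 gives t = 0.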
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
1992-01-01
An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMV) which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMV program emphasized four areas including: airframe finite-element modeling, difficult components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details to improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.
The coordinate-based meta-analysis of neuroimaging data
Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E.; Johnson, Timothy D.
2017-01-01
Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research. PMID:29545671
High temperature flow-through device for rapid solubilization and analysis
West, Jason A. A. [Castro Valley, CA; Hukari, Kyle W [San Ramon, CA; Patel, Kamlesh D [Dublin, CA; Peterson, Kenneth A [Albuquerque, NM; Renzi, Ronald F [Tracy, CA
2009-09-22
Devices and methods for thermally lysing of biological material, for example vegetative bacterial cells and bacterial spores, are provided. Hot solution methods for solubilizing bacterial spores are described. Systems for direct analysis are disclosed including thermal lysers coupled to sample preparation stations. Integrated systems capable of performing sample lysis, labeling and protein fingerprint analysis of biological material, for example, vegetative bacterial cells, bacterial spores and viruses are provided.
Safo, Sandra E; Li, Shuzhao; Long, Qi
2018-03-01
Integrative analysis of high dimensional omics data is becoming increasingly popular. At the same time, incorporating known functional relationships among variables in the analysis of omics data has been shown to help elucidate underlying mechanisms for complex diseases. In this article, our goal is to assess association between transcriptomic and metabolomic data from a Predictive Health Institute (PHI) study that includes healthy adults at a high risk of developing cardiovascular diseases. Adopting a strategy that is both data-driven and knowledge-based, we develop statistical methods for sparse canonical correlation analysis (CCA) with incorporation of known biological information. Our proposed methods use prior network structural information among genes and among metabolites to guide the selection of relevant genes and metabolites in sparse CCA, providing insight into the molecular underpinnings of cardiovascular disease. Our simulations demonstrate that the structured sparse CCA methods outperform several existing sparse CCA methods in selecting relevant genes and metabolites when structural information is informative, and are robust to mis-specified structural information. Our analysis of the PHI study reveals that a number of gene and metabolic pathways, including some known to be associated with cardiovascular diseases, are enriched in the set of genes and metabolites selected by our proposed approach. © 2017, The International Biometric Society.
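Plain sparse CCA (without the network-structured penalties that are this paper's contribution, since those require the prior graphs) is often approximated by alternating L1-penalized regressions. A generic sketch on synthetic data; the dimensions and penalty value are assumptions:

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_cca(X, Y, alpha=0.05, iters=50, seed=0):
    """Crude sparse CCA by alternating L1-penalized regressions:
    fix v and regress Y @ v on X to update u, then update v symmetrically."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=Y.shape[1]); v /= np.linalg.norm(v)
    u = np.zeros(X.shape[1])
    for _ in range(iters):
        u = Lasso(alpha=alpha, fit_intercept=False).fit(X, Y @ v).coef_
        if np.linalg.norm(u) == 0: break
        u /= np.linalg.norm(u)
        v = Lasso(alpha=alpha, fit_intercept=False).fit(Y, X @ u).coef_
        if np.linalg.norm(v) == 0: break
        v /= np.linalg.norm(v)
    return u, v

rng = np.random.default_rng(1)
n = 100
z = rng.normal(size=n)                      # shared latent signal
X = np.outer(z, [1, 1, 0, 0, 0]) + rng.normal(scale=0.5, size=(n, 5))  # "genes"
Y = np.outer(z, [0, 0, 1, 1]) + rng.normal(scale=0.5, size=(n, 4))     # "metabolites"
u, v = sparse_cca(X, Y)
print("gene weights:", np.round(u, 2), "| metabolite weights:", np.round(v, 2))
```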
Interdisciplinary research on patient-provider communication: a cross-method comparison.
Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin
2011-01-01
Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively- and qualitatively-oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods through the same data set. Methods included: (1) dichotomized SP ratings on students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparisons with the goal of informing future assessment of clinical encounters.
Brenn, T; Arnesen, E
1985-01-01
For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
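The basic idea can be sketched in a few lines: estimate the a and b paths by median (0.5-quantile) regression so that heavy-tailed or heteroscedastic errors do not distort the indirect effect a·b. The data below are synthetic, and the paper's estimator and inference procedure are more complete than this sketch.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.standard_t(df=3, size=n)        # heavy-tailed errors
y = 0.7 * m + 0.2 * x + rng.standard_t(df=3, size=n)

# Path a: median regression of M on X; path b: median regression of Y on M and X
a = QuantReg(m, sm.add_constant(x)).fit(q=0.5).params[1]
b = QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5).params[1]
print(f"a = {a:.3f}, b = {b:.3f}, indirect (mediated) effect a*b = {a * b:.3f}")
```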
Development of an SPE/CE method for analyzing HAAs
Zhang, L.; Capel, P.D.; Hozalski, R.M.
2007-01-01
The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.
Model reduction methods for control design
NASA Technical Reports Server (NTRS)
Dunipace, K. R.
1988-01-01
Several model reduction methods are developed, and detailed implementation information is provided for each. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
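The report's methods and command files are not reproduced here, but one of the simplest techniques in this family, modal truncation, conveys the flavor: keep only the slowest modes of a state-space model. A generic sketch (the example system and retained order are arbitrary):

```python
import numpy as np

def modal_truncation(A, B, C, k):
    """Keep the k slowest-decaying modes of x' = Ax + Bu, y = Cx.
    Plain modal truncation; reduced matrices may be complex, since
    conjugate mode pairs are not recombined in this sketch."""
    w, V = np.linalg.eig(A)
    order = np.argsort(-w.real)          # slowest-decaying eigenvalues first
    Vk = V[:, order[:k]]
    Wk = np.linalg.pinv(Vk)              # left projector
    return Wk @ A @ Vk, Wk @ B, C @ Vk   # reduced (Ar, Br, Cr)

# Example: reduce a random stable 6-state model to 2 states
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
A = A - (np.max(np.linalg.eigvals(A).real) + 1) * np.eye(6)  # make A stable
B = rng.normal(size=(6, 1)); C = rng.normal(size=(1, 6))
Ar, Br, Cr = modal_truncation(A, B, C, 2)
print("retained eigenvalues:", np.linalg.eigvals(Ar))
```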
Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1989-01-01
An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants, i.e., temporary intraoral skeletal anchorage devices (TISADs), in anchorage reinforcement during en-masse retraction in relation to conventional methods of anchorage. A search of PubMed, Embase, Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines and the data extracted for statistical analysis. A meta-analysis of raw mean differences concerning anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally resulted in 14 articles including 616 patients (451 female, 165 male) for detailed analysis. Quality of the included studies was assessed as moderate. Meta-analysis showed that use of TISADs facilitates better anchorage reinforcement compared with conventional methods. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods (P <0.001). The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of nearly 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable drawing more reliable conclusions. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
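A meta-analysis of raw mean differences of this kind typically pools per-study estimates with inverse-variance weights. A minimal fixed-effect sketch, using hypothetical numbers rather than the review's data:

```python
import numpy as np

# Per-study raw mean differences (mm of anchorage loss) and standard errors
md = np.array([1.5, 2.1, 1.7, 2.4])     # hypothetical study estimates
se = np.array([0.4, 0.5, 0.3, 0.6])

w = 1.0 / se**2                          # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)      # fixed-effect pooled mean difference
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled MD = {pooled:.2f} mm, 95% CI = ({lo:.2f}, {hi:.2f})")
```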
Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection
Matisová, Eva; Hrouzková, Svetlana
2012-01-01
Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing. This also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Consequently, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric (MS) detection have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is placed on the separation method, the mode of MS detection and ionization, and the limits of detection and quantification obtained. Analysis time is one of the most important aspects to consider in the choice of analytical methods for routine analysis. Therefore, the benefits of the fast GC methods developed are important. PMID:23202677
Cabinetmaker. Occupational Analysis Series.
ERIC Educational Resources Information Center
Chinien, Chris; Boutin, France
This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
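The initial classical least squares (CLS) step of such a hybrid amounts to solving a linear mixture model against known pure-component spectra; an inverse method (e.g., PLS) would then be trained on the residuals to absorb un-calibrated spectral shapes. A toy sketch with synthetic spectra, not the patented procedure itself:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = 100
K = np.abs(rng.normal(size=(wavelengths, 3)))    # pure-component spectra (columns)
c_true = np.array([0.2, 0.5, 0.3])               # true concentrations
mixture = K @ c_true + rng.normal(scale=0.01, size=wavelengths)

# CLS step: estimate concentrations from the known spectral shapes
c_hat, *_ = np.linalg.lstsq(K, mixture, rcond=None)
print("estimated concentrations:", np.round(c_hat, 3))

# A hybrid scheme would next model the CLS residuals (un-calibrated components,
# temperature drift, spectrometer shifts) with an inverse method such as PLS.
residual = mixture - K @ c_hat
print("residual norm:", np.linalg.norm(residual))
```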
Lin, Zhichao; Wu, Zhongyu
2009-05-01
A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10%, at the 95% confidence level.
Off-axis illumination direct-to-digital holography
Thomas, Clarence E.; Price, Jeffery R.; Voelkl, Edgar; Hanson, Gregory R.
2004-06-08
Systems and methods are described for off-axis illumination direct-to-digital holography. A method of recording an off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis, includes: reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object at an angle with respect to an optical axis defined by a focusing lens; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; digitally recording the off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; Fourier analyzing the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes by transforming axes of the recorded off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes in Fourier space to sit on top of a heterodyne carrier frequency defined as an angle between the reference beam and the object beam; applying a digital filter to cut off signals around an original origin; and then performing an inverse Fourier transform.
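The claimed Fourier steps correspond to standard digital processing of an off-axis hologram: transform, move the sideband at the heterodyne carrier frequency to the spectral origin, apply a digital filter cutting off the terms around the original origin, and inverse transform. A generic numerical sketch with a synthetic hologram (the carrier and filter radius are assumptions, not values from the patent):

```python
import numpy as np

def demodulate(hologram, carrier, radius):
    """FFT the hologram, roll the chosen sideband to the spectral center,
    low-pass filter around the new center, and inverse FFT."""
    ny, nx = hologram.shape
    F = np.fft.fftshift(np.fft.fft2(hologram))
    F = np.roll(F, (-carrier[0], -carrier[1]), axis=(0, 1))   # sideband -> center
    y, x = np.ogrid[:ny, :nx]
    mask = (y - ny // 2) ** 2 + (x - nx // 2) ** 2 < radius ** 2
    return np.fft.ifft2(np.fft.ifftshift(F * mask))

# Synthetic hologram: unit-amplitude phase object plus tilted plane reference
ny = nx = 256
y, x = np.mgrid[:ny, :nx]
phase = 2 * np.pi * ((x - nx / 2) ** 2 + (y - ny / 2) ** 2) / nx ** 2
obj = np.exp(1j * phase)
ref = np.exp(2j * np.pi * 64 * x / nx)        # 64 fringes across the frame
hologram = np.abs(obj + ref) ** 2

# The obj * conj(ref) sideband sits at spatial frequency (0, -64)
recovered = demodulate(hologram, carrier=(0, -64), radius=20)
print("rms phase error:", np.std(np.angle(recovered * np.exp(-1j * phase))))
```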
An analysis method for two-dimensional transonic viscous flow
NASA Technical Reports Server (NTRS)
Bavitz, P. C.
1975-01-01
A method for the approximate calculation of transonic flow over airfoils, including shock waves and viscous effects, is described. Numerical solutions are obtained by use of a computer program which is discussed in the appendix. The importance of including the boundary layer in the analysis is clearly demonstrated, as well as the need to improve on existing procedures near the trailing edge. Comparisons between calculations and experimental data are presented for both conventional and supercritical airfoils, emphasis being on the surface pressure distribution, and good agreement is indicated.
NASA Astrophysics Data System (ADS)
Vaks, V. L.; Domracheva, E. G.; Chernyaeva, M. B.; Pripolzin, S. I.; Revin, L. S.; Tretyakov, I. V.; Anfertyev, V. A.; Yablokov, A. A.; Lukyanenko, I. A.; Sheikov, Yu. V.
2018-02-01
We show the promise of high-resolution terahertz spectroscopy for continuous analysis of the gas-phase decomposition products of energetic materials (including short-lived species) over a wide temperature range. The experimental setup, which includes a terahertz spectrometer for studying thermal decomposition reactions, is described. The results of analysis of the gaseous decomposition products of energetic materials are presented for the example of ammonium nitrate heated from room temperature to 167°C.
Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1989-12-01
On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the method are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200 °C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).
Exploring Mission Concepts with the JPL Innovation Foundry A-Team
NASA Technical Reports Server (NTRS)
Ziemer, John K.; Ervin, Joan; Lang, Jared
2013-01-01
The JPL Innovation Foundry has established a new approach for exploring, developing, and evaluating early concepts called the A-Team. The A-Team combines innovative collaborative methods with subject matter expertise and analysis tools to help mature mission concepts. Science, implementation, and programmatic elements are all considered during an A-Team study. Methods are grouped by Concept Maturity Level (CML), from 1 through 3, including idea generation and capture (CML 1), initial feasibility assessment (CML 2), and trade space exploration (CML 3). Methods used for each CML are presented, and the key team roles are described from two points of view: innovative methods and technical expertise. A-Team roles for providing innovative methods include the facilitator, study lead, and assistant study lead. A-Team roles for providing technical expertise include the architect, lead systems engineer, and integration engineer. In addition to these key roles, each A-Team study is uniquely staffed to match the study topic and scope including subject matter experts, scientists, technologists, flight and instrument systems engineers, and program managers as needed. Advanced analysis and collaborative engineering tools (e.g. cost, science traceability, mission design, knowledge capture, study and analysis support infrastructure) are also under development for use in A-Team studies and will be discussed briefly. The A-Team facilities provide a constructive environment for innovative ideas from all aspects of mission formulation to eliminate isolated studies and come together early in the development cycle when they can provide the biggest impact. This paper provides an overview of the A-Team, its study processes, roles, methods, tools and facilities.
On wiping the interior walls of 37-mm closed-face cassettes: an OSHA perspective.
Hendricks, Warren; Stones, Fern; Lillquist, Dean
2009-12-01
As early as 1976, Occupational Safety and Health Administration (OSHA) methods for analyzing metal samples collected using 37-mm polystyrene closed-face cassettes specified that any loose dust be transferred from the cassette to the digestion vessel, that the cassette be rinsed, and that, if necessary, the cassette be wiped out to help ensure that all particles that enter the cassette are included along with the filter as part of the sample for analysis. OSHA analytical methods for metal analysis were recently revised to explicitly require cassette wiping for all metal samples. This change was based on policy that any material entering the collection device constitutes part of the sample and on OSHA Salt Lake Technical Center research showing that invisible residue on the cassette walls can significantly contribute to the total sample results reported. OSHA procedures are consistent with guidance given in the NIOSH Manual of Analytical Methods. This guidance concludes that internal deposits in sampling cassettes should be included in the analysis and that one way to accomplish this would be to wipe or wash the internal surfaces of the cassette and include the material along with the filter for analysis.
Sample detection and analysis techniques for electrophoretic separation
NASA Technical Reports Server (NTRS)
Falb, R. D.; Hughes, K. E.; Powell, T. R.
1975-01-01
Methods for detecting and analyzing biological agents suitable for space flight operations were studied, primarily through literature searches on cell separation techniques. Detection methods discussed include: photometric, electric, radiometric, micrometry, ultrasonic, microscopic, and photographic. A bibliography and a directory of vendors are included, along with an index of commercial hardware.
Kansas's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; W. Keith Moser; Charles J. Barnett
2011-01-01
The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...
NASA Technical Reports Server (NTRS)
Ehret, R. M.
1974-01-01
The concepts explored in a state-of-the-art review of the engineering fracture mechanics considered most applicable to the space shuttle vehicle include fracture toughness, precritical flaw growth, failure mechanisms, inspection methods (including proof test logic), and crack growth predictive analysis techniques.
Functional Multiple-Set Canonical Correlation Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Jung, Kwanghee; Takane, Yoshio; Woodward, Todd S.
2012-01-01
We propose functional multiple-set canonical correlation analysis for exploring associations among multiple sets of functions. The proposed method includes functional canonical correlation analysis as a special case when only two sets of functions are considered. As in classical multiple-set canonical correlation analysis, computationally, the…
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval methods and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
Operations planning and analysis handbook for NASA/MSFC phase B development projects
NASA Technical Reports Server (NTRS)
Batson, Robert C.
1986-01-01
Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) operations personnel support Program Development (PD) task teams. The intimate relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate and allocate criteria such as reliability, maintainability, and operations and support cost.
School-Based Mentoring for Adolescents: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Wood, Sarah; Mayo-Wilson, Evan
2012-01-01
Objectives: To evaluate the impact of school-based mentoring for adolescents (11-18 years) on academic performance, attendance, attitudes, behavior, and self-esteem. Method: A systematic review and meta-analysis. The authors searched 12 databases from 1980 to 2011. Eight studies with 6,072 participants were included, 6 were included in…
ERIC Educational Resources Information Center
Herbert, Patrick C.; Lohrmann, David K.
2011-01-01
Background: This study explores the relationship between instructional strategies and effective health education curricula. Methods: A content analysis of 10 health education curricula was conducted to determine if they include common instructional strategies for actively engaging students in acquisition of health skills. To be included, the…
Effects of problem-based learning in Chinese radiology education
Zhang, Song; Xu, Jiancheng; Wang, Hongwei; Zhang, Dong; Zhang, Qichuan; Zou, Liguang
2018-01-01
Abstract Background: In recent years, the problem-based learning (PBL) teaching method has been extensively applied as an experimental educational method in Chinese radiology education. However, the results of individual studies were inconsistent and inconclusive. A meta-analysis was performed to evaluate the effects of PBL on radiology education in China. Methods: Databases of Chinese and English languages were searched from inception up to November 2017. The standardized mean difference (SMD) with its 95% confidence interval (95% CI) was used to determine the overall effect of PBL compared with the traditional teaching method. Results: Seventeen studies involving 1487 participants were included in this meta-analysis. Of them, 16 studies provided sufficient data for the pooled analysis and showed that the PBL teaching method had a positive effect on achieving higher theoretical scores compared with the traditional teaching method (SMD = 1.20, 95% CI [0.68, 1.71]). Thirteen studies provided sufficient data on skill scores, and a significant difference in favor of PBL was also observed (SMD = 2.10, 95% CI [1.38, 2.83]). Questionnaire surveys were applied in most of the included studies and indicated positive effects of PBL on students' learning interest, scope of knowledge, team spirit, and oral expression. Conclusion: The results show that PBL appears to be more effective for radiology education than the traditional teaching method in China. However, the heterogeneity of the included studies cannot be neglected. Further well-designed studies on this topic are needed to confirm the above findings. PMID:29489669
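For reference, the SMD behind such pooled estimates is usually Cohen's d with Hedges' small-sample correction; a short sketch using hypothetical group summaries, not data from the included studies:

```python
import numpy as np

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    s_pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    J = 1 - 3 / (4 * (n1 + n2) - 9)      # small-sample correction factor
    return J * d

# Hypothetical PBL vs. traditional theoretical scores (mean, SD, n per arm)
print(f"SMD = {hedges_g(82.0, 6.5, 45, 75.5, 7.0, 44):.2f}")
```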
DNA-PCR analysis of bloodstains sampled by the polyvinyl-alcohol method.
Schyma, C; Huckenbeck, W; Bonte, W
1999-01-01
Among the usual techniques for sampling gunshot residues (GSR), the polyvinyl-alcohol method (PVAL) offers the advantage of embedding all particles, foreign bodies, and stains on the surface of the shooter's hand in an exact and reproducible topographic localization. The aim of the present study on ten persons killed by firearms was to check the possibility of DNA-PCR typing of blood traces embedded in the PVAL gloves in a second step following GSR analysis. The results of these examinations verify that the PVAL technique does not introduce factors that inhibit successful PCR typing. Thus the PVAL method can be recommended as a combination technique to secure and preserve inorganic and biological traces at the same time.
An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method
NASA Astrophysics Data System (ADS)
Tang, J.
2012-01-01
Multiple signal classification (MUSIC) algorithms are introduced for estimating the period of variation of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of its frequency resolution using analog signals are included. From the literature, we collected effective observational data of the BL Lac object S5 0716+714 in the V, R, and I bands from 1994 to 2008. The light variation periods of S5 0716+714 are obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. There exist two major periods, (3.33±0.08) years and (1.24±0.01) years, for all bands. The period estimates from the MUSIC-based algorithm are compared with those from the periodogram-based algorithm. MUSIC is a super-resolution algorithm that works with small data lengths and could be used to detect the period of variation of weak signals.
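A generic MUSIC sketch (not the authors' code): embed the light curve in lag space, eigendecompose the sample covariance matrix, and evaluate the noise-subspace pseudospectrum over a grid of trial frequencies, whose peaks give the periods. The test series, embedding dimension, and component count below are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def music_spectrum(x, p, freqs, m):
    """MUSIC pseudospectrum of a real series x with p sinusoidal components
    (2 signal eigenvectors per real sinusoid) and embedding dimension m."""
    snaps = np.lib.stride_tricks.sliding_window_view(x - x.mean(), m)
    R = snaps.T @ snaps / len(snaps)          # lag-covariance estimate
    w, V = np.linalg.eigh(R)                  # eigenvalues in ascending order
    En = V[:, : m - 2 * p]                    # noise subspace
    n = np.arange(m)
    return np.array([1.0 / np.linalg.norm(En.T @ np.exp(2j * np.pi * f * n)) ** 2
                     for f in freqs])

# Two close periods (25 and 21 samples) buried in noise
t = np.arange(400)
x = (np.sin(2 * np.pi * t / 25) + 0.8 * np.sin(2 * np.pi * t / 21)
     + 0.5 * np.random.default_rng(0).normal(size=t.size))
freqs = np.linspace(0.01, 0.1, 2000)
P = music_spectrum(x, p=2, freqs=freqs, m=40)
pk, _ = find_peaks(P)
top2 = pk[np.argsort(P[pk])[-2:]]
print("estimated periods (samples):", np.sort(1.0 / freqs[top2]).round(1))
```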
Structural optimization: Status and promise
NASA Astrophysics Data System (ADS)
Kamat, Manohar P.
Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)
Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly
2015-12-18
This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by gas chromatography with flame ionization detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis. These factors included: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min), and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting, and the method was successfully applied for fast TPH analysis of Bunker C oil-contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved cleanup efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
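Two-level screening designs of the kind described estimate each factor's main effect as the difference between the mean response at its high and low settings. A toy sketch using a full (rather than fractional) factorial over three of the six factors, with hypothetical analysis-time responses:

```python
import numpy as np
from itertools import product

# Two-level full-factorial screening of three illustrative GC factors
factors = ["flow_rate", "oven_ramp", "injection_temp"]
runs = np.array(list(product([-1, 1], repeat=3)))   # coded design matrix, 8 runs

# Hypothetical measured responses (analysis time, min) for the 8 runs
y = np.array([20.1, 15.8, 18.9, 14.2, 19.7, 15.1, 18.2, 13.6])

# Main effect of each factor: mean(high level) - mean(low level)
for j, name in enumerate(factors):
    effect = y[runs[:, j] == 1].mean() - y[runs[:, j] == -1].mean()
    print(f"{name:15s} effect = {effect:+.2f} min")
```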
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript provides a review of the actual state and the most recent advances as well as current trends and future prospects in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.
Bigler, Erin D
2015-09-01
Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative and quantitative methods to relate brain structure to neuropsychological outcome, and these are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures, including volume, thickness, and shape. The challenge for neuropsychology is which metric to use, for which disorder, and when in the clinical course image analysis methods should be applied to assess brain structure and pathology. A basic overview is provided of the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered, including factors related to the similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.
Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes
Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin
2012-01-01
Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
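A concrete (and deliberately generic) illustration of two of the reviewed steps, zero-phase temporal filtering and baseline drift removal; the sampling rate, cutoff frequency, and median window below are assumptions, not recommendations from the review:

```python
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

fs = 1000.0                      # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
# Synthetic optical action potentials with noise and slow baseline drift
aps = (np.sin(2 * np.pi * 4 * t) > 0.95).astype(float)   # brief 4 Hz upstrokes
noisy = aps + 0.2 * np.random.default_rng(0).normal(size=t.size) + 0.5 * t

# Temporal filtering: zero-phase low-pass to avoid shifting activation times
b, a = butter(3, 100 / (fs / 2), btype="low")
filtered = filtfilt(b, a, noisy)

# Baseline drift removal: subtract a long-window median estimate of the baseline
baseline = medfilt(filtered, kernel_size=501)
corrected = filtered - baseline

print("std of estimated baseline (drift removed):", baseline.std().round(3))
```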
Detection of incipient defects in cables by partial discharge signal analysis
NASA Astrophysics Data System (ADS)
Martzloff, F. D.; Simmon, E.; Steiner, J. P.; Vanbrunt, R. J.
1992-07-01
As one of the objectives of a program aimed at assessing test methods for in-situ detection of incipient defects in cables due to aging, a laboratory test system was implemented to demonstrate that the partial discharge analysis method can be successfully applied to low-voltage cables. Previous investigations generally involved cables rated 5 kV or higher, while the objective of the program focused on the lower voltages associated with the safety systems of nuclear power plants. The defect detection system implemented for the project was based on commercially available signal analysis hardware and software packages, customized for the specific purposes of the project. The test specimens included several cables of the type found in nuclear power plants, with artificial defects introduced at various points along the cable. The results indicate that partial discharge analysis is indeed capable of detecting incipient defects in low-voltage cables. There are, however, some limitations of a technical and non-technical nature that need further exploration before this method can be accepted in the industry.
Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling
Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; ...
2014-07-14
Here, time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies are orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation; detection of neoclassical tearing modes, including locked mode precursors; automatic clustering of modes; and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations, and fishbones.
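In the same spirit, though vastly simplified relative to the deployed DIII-D algorithms, a subspace sketch shows how frequencies emerge as eigenvalue angles of a small local linear model rather than from FFT bins:

```python
import numpy as np

def subspace_modes(y, order, rows=20):
    """Output-only subspace sketch: estimate modal frequencies of a scalar
    series by fitting a local linear state-space model (simplified SSI flavor)."""
    N = len(y) - rows
    H = np.column_stack([y[i:i + rows] for i in range(N)])   # block Hankel matrix
    U, s, _ = np.linalg.svd(H, full_matrices=False)
    O = U[:, :order]                          # observability-like subspace
    A = np.linalg.pinv(O[:-1]) @ O[1:]        # shift invariance gives the dynamics
    lam = np.linalg.eigvals(A)
    return np.abs(np.angle(lam)) / (2 * np.pi)   # frequencies in cycles/sample

t = np.arange(2000)
y = np.sin(2 * np.pi * 0.07 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
print("mode frequencies:", np.unique(np.round(subspace_modes(y, order=2), 4)))
```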
Cervical vertebral maturation as a biologic indicator of skeletal maturity.
Santiago, Rodrigo César; de Miranda Costa, Luiz Felipe; Vitral, Robert Willer Farinazzo; Fraga, Marcelo Reis; Bolognese, Ana Maria; Maia, Lucianne Cople
2012-11-01
To identify and review the literature regarding the reliability of cervical vertebrae maturation (CVM) staging to predict the pubertal spurt. The selection criteria included cross-sectional and longitudinal descriptive studies in humans that evaluated qualitatively or quantitatively the accuracy and reproducibility of the CVM method on lateral cephalometric radiographs, as well as the correlation with a standard method established by hand-wrist radiographs. The searches retrieved 343 unique citations. Twenty-three studies met the inclusion criteria. Six articles had moderate to high scores, while 17 of 23 had low scores. Analysis also showed a moderate to high statistically significant correlation between CVM and hand-wrist maturation methods. There was a moderate to high reproducibility of the CVM method, and only one specific study investigated the accuracy of the CVM index in detecting peak pubertal growth. This systematic review has shown that the studies on CVM method for radiographic assessment of skeletal maturation stages suffer from serious methodological failures. Better-designed studies with adequate accuracy, reproducibility, and correlation analysis, including studies with appropriate sensitivity-specificity analysis, should be performed.
Diagnostics of Tree Diseases Caused by Phytophthora austrocedri Species.
Mulholland, Vincent; Elliot, Matthew; Green, Sarah
2015-01-01
We present methods for the detection and quantification of four Phytophthora species that are pathogenic on trees: Phytophthora ramorum, Phytophthora kernoviae, Phytophthora lateralis, and Phytophthora austrocedri. Nucleic acid extraction methods are presented for phloem tissue from trees, for soil, and for pure cultures on agar plates. Real-time PCR methods are presented, including primer and probe sets for each species and general advice on real-time PCR setup and data analysis. A method for sequence-based identification, useful for pure cultures, is also included.
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
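Of these approaches, sampling-based Monte Carlo propagation is the most widely used; below is a generic sketch for an ingestion-type exposure model in which every distribution and parameter value is a hypothetical placeholder, not a figure from the review:

```python
import numpy as np

# Monte Carlo propagation of uncertainty through a generic ingestion-risk model:
#   lifetime risk = (C * IR * EF * ED) / (BW * AT) * SF, with uncertain inputs
rng = np.random.default_rng(0)
n = 100_000
C  = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)   # DBP conc. (mg/L)
IR = rng.normal(2.0, 0.3, size=n)                          # water intake (L/day)
BW = rng.normal(70.0, 10.0, size=n)                        # body weight (kg)
SF = 0.0062                                                # slope factor (per mg/kg-day)
EF, ED, AT = 350, 30, 70 * 365                             # days/yr, yrs, days

risk = (C * IR * EF * ED) / (BW * AT) * SF
print("median risk: %.2e, 95th percentile: %.2e"
      % (np.median(risk), np.percentile(risk, 95)))
```

The output percentiles characterize the uncertainty in the risk estimate directly, which is what distinguishes this approach from a single-point deterministic calculation.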
1991-03-21
discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we...estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear...with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong motion data), were
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
Charting the Learning Journey of a Group of Adults Returning to Education
ERIC Educational Resources Information Center
Mooney, Des
2011-01-01
Using a qualitative case study method, the researcher studied a group of adult returning students completing a childcare course. Methods used included focus groups, a questionnaire, and observations. Using a holistic analysis approach (Yin, 2003) to the case, the researcher then focused on a number of key issues. From this analysis the themes of…
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence, and a histogram. The merits of each method are compared. (Contains 7…
Benefits of Using Planned Comparisons Rather Than Post Hoc Tests: A Brief Review with Examples.
ERIC Educational Resources Information Center
DuRapau, Theresa M.
The rationale behind analysis-of-variance methods (including analysis of covariance and multivariate analyses of variance and covariance) is reviewed, and unplanned and planned methods of evaluating differences between means are briefly described. Two advantages of using planned or a priori tests over unplanned or post hoc tests are presented. In…
Analysis of High School English Curriculum Materials through Rasch Measurement Model and Maxqda
ERIC Educational Resources Information Center
Batdi, Veli; Elaldi, Senel
2016-01-01
The purpose of the study is to analyze high school English curriculum materials (ECM) through FACETS analysis and MAXQDA-11 programs. A mixed methods approach, combining quantitative and qualitative methods, was used with three samples, including English teachers in Elazig during the 2014-2015 academic year. While the quantitative phase of the study…
Integrated method for chaotic time series analysis
Hively, Lee M.; Ng, Esmond G.
1998-01-01
Methods and apparatus are described for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
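A minimal sketch of that pipeline, acquire data, compute a nonlinear measure, track its time-serial trend, and compare windows, is given below. The correlation-sum statistic and all parameters are generic stand-ins, not the patented algorithm.

```python
import numpy as np

def correlation_sum(x, dim=3, tau=5, radius=0.2):
    # A simple nonlinear measure: fraction of time-delay-embedded point
    # pairs closer than `radius`.
    m = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + m] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(m, k=1)
    return np.mean(d[iu] < radius)

def trend(x, win=500, step=250, **kw):
    # Time-serial trend of the nonlinear measure over sliding windows;
    # comparing windows flags transitions between similar-but-different states.
    return [correlation_sum(x[s:s + win], **kw)
            for s in range(0, len(x) - win + 1, step)]

# Hypothetical monitor data: a noisy oscillation standing in for process data.
t = np.linspace(0, 100, 3000)
x = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
x = (x - x.mean()) / x.std()
print(np.round(trend(x), 3))
```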
Sight-Singing Pedagogy: A Content Analysis of Choral Methods Textbooks
ERIC Educational Resources Information Center
Floyd, Eva G.; Haning, Marshall A.
2015-01-01
The purpose of this study was to examine the sight-singing pedagogy content of choral methods textbooks, with the intent of determining what elements of sight-singing pedagogy are most commonly included in these resources. A content analysis was conducted to analyze information related to sight-singing pedagogy in 10 textbooks that are commonly…
US EPA SW-846 methods have typically relied on dual column gas chromatography coupled with electron capture detection (GC-ECD) for analysis of low concentrations of organochlorine pesticides, including toxaphene, in environmental samples. Toxaphene is one of the most widely appl...
2013-06-24
Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication.
Atkins, Salla; Launiala, Annika; Kagaha, Alexander; Smith, Helen
2012-04-30
Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research.
40 CFR 60.715 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... base film (i.e., the sample shall include any dilution solvent or other VOC added during the... for gas analysis. (f) Method 4 is used for stack gas moisture. (g) Methods 2, 2A, 2C, 2D, 3, and 4...
CRITERIA FOR EVALUATION OF PROPOSED PROTOZOAN DETECTION METHODS
There has been a proliferation of techniques and methods reported for analysis of water samples to determine the presence of the protozoan pathogens Cryptosporidium parvum and Giardia lamblia. Many of the proposed methods are presented as complete procedures, which include sampli...
Analytical study of comet nucleus samples
NASA Technical Reports Server (NTRS)
Albee, A. L.
1989-01-01
Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectrometry.
[Therapeutic strategies targeting brain tumor stem cells].
Toda, Masahiro
2009-07-01
Progress in stem cell research reveals cancer stem cells to be present in a variety of malignant tumors. Because they exhibit resistance to anticancer drugs and radiotherapy, analysis of their properties has been rapidly carried forward as an important target for the treatment of intractable malignancies, including brain tumors. In fact, brain cancer stem cells (BCSCs) have been isolated from brain tumor tissue and brain tumor cell lines by using neural stem cell culture methods and isolation methods for side population (SP) cells, which have high drug-efflux capacity. Although analysis of the properties of BCSCs is the most important step toward developing treatments against them, purification methods for BCSCs are still lacking, and establishing them should be taken up as an important research task in the immediate future. Thus far, there are no effective treatment methods for BCSCs, and several treatment methods have been proposed based on the cell biology characteristics of BCSCs. In this article, I outline potential treatment methods for damaging treatment-resistant BCSCs, including immunotherapy, which is currently a topic of our research.
Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-01-01
Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder, with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling, as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone, in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based, operationalized, but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414
Ultrasound for Distal Forearm Fracture: A Systematic Review and Diagnostic Meta-Analysis
Douma-den Hamer, Djoke; Blanker, Marco H.; Edens, Mireille A.; Buijteweg, Lonneke N.; Boomsma, Martijn F.; van Helden, Sven H.; Mauritz, Gert-Jan
2016-01-01
Study Objective: To determine the diagnostic accuracy of ultrasound for detecting distal forearm fractures. Methods: A systematic review and diagnostic meta-analysis was performed according to the PRISMA statement. We searched MEDLINE, Web of Science, and the Cochrane Library from inception to September 2015. All prospective studies of the diagnostic accuracy of ultrasound versus radiography as the reference standard were included. We excluded studies with a retrospective design and those with evidence of verification bias. We assessed the methodological quality of the included studies with the QUADAS-2 tool. We performed a meta-analysis of studies evaluating ultrasound to calculate the pooled sensitivity and specificity with 95% confidence intervals (CI95%) using a bivariate model with random effects. Subgroup and sensitivity analyses were used to examine the effect of methodological differences and other study characteristics. Results: Out of 867 publications we included 16 studies with 1,204 patients and 641 fractures. The pooled test characteristics for ultrasound were: sensitivity 97% (CI95% 93–99%), specificity 95% (CI95% 89–98%), positive likelihood ratio (LR) 20.0 (8.5–47.2), and negative LR 0.03 (0.01–0.08). The corresponding pooled diagnostic odds ratio (DOR) was 667 (142–3,133). Apparent differences were shown for method of viewing, with the 6-view method showing higher specificity, positive LR, and DOR compared to the 4-view method. Conclusion: The present meta-analysis showed that ultrasound has high accuracy for the diagnosis of distal forearm fractures in children when a proper viewing method is used. Based on this, ultrasound should be considered a reliable alternative, with the advantage of being radiation-free. PMID:27196439
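The relationships among the pooled estimates can be verified directly, since LR+ = sens/(1 − spec), LR− = (1 − sens)/spec, and DOR = LR+/LR−. A quick check using the point estimates quoted above (the published figures come from the bivariate model, so they differ slightly from this shortcut):

```python
# Likelihood ratios and diagnostic odds ratio implied by the pooled
# sensitivity/specificity point estimates from the abstract above.
sens, spec = 0.97, 0.95

lr_pos = sens / (1 - spec)          # positive likelihood ratio
lr_neg = (1 - sens) / spec          # negative likelihood ratio
dor = lr_pos / lr_neg               # diagnostic odds ratio

print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.3f}, DOR = {dor:.0f}")
# -> LR+ = 19.4, LR- = 0.032, DOR = 614: close to the reported 20.0,
#    0.03, and 667, which were pooled with the bivariate model instead.
```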
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
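For reference, the kappa agreement coefficient used in studies like this one is straightforward to compute; a sketch with simulated progression calls (not the study's data):

```python
import numpy as np

def cohens_kappa(a, b):
    # Cohen's kappa for two raters' categorical calls on the same cases.
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    p_obs = np.mean(a == b)
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Simulated progression calls (True = progressing) for 37 eyes by two methods.
rng = np.random.default_rng(1)
expert = rng.random(37) < 0.18
gpa_event = expert.copy()
gpa_event[:2] ^= True              # two discordant eyes
print(f"kappa = {cohens_kappa(expert, gpa_event):.2f}")
```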
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, needs have arisen to incorporate noise considerations in the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include: single point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
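A minimal sketch of the response-surface idea: fit a quadratic polynomial surrogate to a handful of full-model evaluations, then explore the trade space on the cheap surrogate. The two design variables and the training data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training data: two design variables and a noise metric that a
# full (expensive) noise model would normally supply.
x1 = rng.uniform(0, 1, 60)
x2 = rng.uniform(0, 1, 60)
y = 95 + 6 * x1 - 4 * x2 + 3 * x1 * x2 + 2 * x2**2 + rng.normal(0, 0.2, 60)

# Quadratic response surface: y ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x1*x2 + c5*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Rapid trade-space exploration on the cheap polynomial surrogate.
g1, g2 = np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))
basis = np.stack([np.ones_like(g1), g1, g2, g1**2, g1 * g2, g2**2])
surf = np.tensordot(coef, basis, axes=1)
i, j = np.unravel_index(surf.argmin(), surf.shape)
print(f"surrogate minimum {surf[i, j]:.1f} dB at x1={g1[i, j]:.2f}, x2={g2[i, j]:.2f}")
```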
Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John
2014-02-21
The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially with regard to small signal stability, and is designed to provide a research environment for examining such data. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
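For readers unfamiliar with ringdown analysis, a bare-bones Prony-style fit (linear prediction plus polynomial rooting) is sketched below in Python; it recovers modal frequency and damping ratio from a synthetic ringdown and is a generic illustration, not the MATLAB DSI Toolbox code.

```python
import numpy as np

def prony_modes(y, dt, order=4):
    # Basic Prony/linear-prediction fit: y[k] = a1*y[k-1] + ... + ap*y[k-p].
    n = len(y)
    A = np.column_stack([y[order - 1 - i : n - 1 - i] for i in range(order)])
    b = y[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    z = np.roots(np.r_[1.0, -a])            # discrete-time poles
    s = np.log(z) / dt                      # continuous-time eigenvalues
    freq = s.imag / (2 * np.pi)             # modal frequency in Hz
    zeta = -s.real / np.abs(s)              # damping ratio
    keep = s.imag > 0                       # one of each conjugate pair
    return freq[keep], zeta[keep]           # may include spurious noise modes

# Synthetic ringdown: a 0.5 Hz mode at 5% damping plus measurement noise.
dt = 0.1
t = np.arange(0, 20, dt)
wd = 2 * np.pi * 0.5
sigma = -0.05 * wd / np.sqrt(1 - 0.05**2)
y = np.exp(sigma * t) * np.cos(wd * t)
y += 0.01 * np.random.default_rng(7).standard_normal(t.size)
print(prony_modes(y, dt))
```

The fit order is set slightly above the expected mode count so noise has somewhere to go; a real tool screens out the resulting low-energy spurious modes.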
Nurse competence: a concept analysis.
Smith, Sarah A
2012-10-01
The purpose of this analysis was to explore the concept of nurse competence. Data sources include EBSCOhost, Gale PowerSearch, ProQuest, PubMed Medline, Google Scholar, and the Online Journal of Issues in Nursing. This paper utilizes Rodgers' evolutionary method to analyze the concept of nurse competence. Antecedents to nurse competence include personal and external motivations. Attributes include integrating knowledge into practice, experience, critical thinking, proficient skills, caring, communication, environment, motivation, and professionalism. Consequences include confidence, safe practice, and holistic care. Implications for nursing responsibility in defining and ensuring nurse competence need to be identified. More research is needed to determine the best evaluation methods for the different facets of nurse competence. © 2012, The Author. International Journal of Nursing Knowledge © 2012, NANDA International.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core, fuels design, and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA toolkit for the US (LOTUS). This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-11-26
Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models, and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits, and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis, including correction for overfit, confidence intervals, application to censored data (including competing risk), and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
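The net-benefit computation at the core of a decision curve is short: NB(pt) = TP/n − (FP/n)·pt/(1 − pt) at threshold probability pt. A sketch with simulated predictions follows; the paper's extensions (overfit correction, censored data, and so on) are not shown.

```python
import numpy as np

def net_benefit(p_hat, y, thresholds):
    # NB(pt) = TP/n - FP/n * pt / (1 - pt), acting when predicted risk >= pt.
    n = len(y)
    out = []
    for pt in thresholds:
        act = p_hat >= pt
        tp = np.sum(act & (y == 1))
        fp = np.sum(act & (y == 0))
        out.append(tp / n - fp / n * pt / (1 - pt))
    return np.array(out)

# Simulated outcomes and predicted probabilities from some model.
rng = np.random.default_rng(5)
y = (rng.random(2000) < 0.2).astype(int)
p_hat = np.clip(0.2 + 0.25 * (y - 0.5) + 0.15 * rng.standard_normal(2000), 0.01, 0.99)

pts = np.linspace(0.05, 0.5, 10)
print(np.round(net_benefit(p_hat, y, pts), 3))
print(np.round(y.mean() - (1 - y.mean()) * pts / (1 - pts), 3))  # treat-all line
```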
Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark
2015-01-01
This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to the hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
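To give the flavor of one of the separation methods, here is a minimal fixed-interval sketch in the spirit of HYSEP. It is not the USGS implementation, which, among other refinements, derives the interval width from drainage area.

```python
import numpy as np

def fixed_interval_baseflow(q, width=5):
    # Fixed-interval separation: base flow within each block of `width` days
    # is the minimum streamflow observed in that block.
    q = np.asarray(q, dtype=float)
    base = np.empty_like(q)
    for s in range(0, len(q), width):
        base[s:s + width] = q[s:s + width].min()
    return base

# Hypothetical daily streamflow: slow recession plus three storm peaks.
rng = np.random.default_rng(11)
days = np.arange(120)
q = 30 * np.exp(-days / 90) + 5
for peak in (20, 55, 90):
    q += 40 * np.exp(-0.5 * ((days - peak) / 2.0) ** 2)
base = fixed_interval_baseflow(q)
print(f"base-flow index ~ {base.sum() / q.sum():.2f}")
```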
Advances in explosives analysis—part II: photon and neutron methods
Brown, Kathryn E.; Greenfield, Margo T.; McGrane, Shawn D.; ...
2015-10-07
The number and capability of explosives detection and analysis methods have increased dramatically since publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis [Moore DS, Goodpaster JV, Anal Bioanal Chem 395:245–246, 2009]. Here we review and critically evaluate the latest (the past five years) important advances in explosives detection, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. Our review consists of two parts. Part I discussed methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. In Part II, we review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.
NASA Technical Reports Server (NTRS)
Bourke, M.; Balme, M.; Beyer, R. A.; Williams, K. K.
2004-01-01
Methods traditionally used to estimate the relative height of surface features on Mars include photoclinometry, shadow length, and stereography. The MOLA data set enables a more accurate assessment of the surface topography of Mars. However, many small-scale aeolian bedforms remain below the sample resolution of the MOLA data set. In response to this, a number of research teams have adopted and refined existing methods and applied them to high resolution (2-6 m/pixel) narrow angle MOC satellite images. Collectively, the methods provide data on a range of morphometric parameters, many not previously available for dunes on Mars, including dune height, width, length, surface area, volume, and longitudinal and cross profiles. These data will facilitate a more accurate analysis of aeolian bedforms on Mars. In this paper we undertake a comparative analysis of methods used to determine the height of aeolian dunes and ripples.
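The shadow-length method mentioned above reduces to simple trigonometry when the surrounding terrain is level: h = L·tan(solar elevation). A sketch, with invented numbers:

```python
import numpy as np

def height_from_shadow(shadow_len_m, sun_elevation_deg):
    # Relative height from shadow length, assuming level surrounding terrain.
    return shadow_len_m * np.tan(np.radians(sun_elevation_deg))

# Invented example: a 12-pixel shadow in a 3 m/pixel MOC image at 25 deg
# solar elevation.
print(f"{height_from_shadow(12 * 3.0, 25.0):.1f} m")   # ~16.8 m
```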
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but a direct evaluation of the degree of their fulfilment by the developed variant solutions can be very difficult. In practice, there are different methods which enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach, which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of graphic interpretation of results. The method has been tested on a variety of building and development projects.
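As a deliberately simple example of scoring variants against weighted criteria, a weighted-sum aggregation might look like the following; the criteria, scores, and weights are invented, and the paper's systemic approach compares several such methods rather than relying on one.

```python
import numpy as np

# Hypothetical variant ratings (rows) on four investor criteria (columns),
# each already on a common 1-10 scale, plus criterion weights summing to 1.
criteria = ["cost", "schedule", "function", "environment"]
scores = np.array([
    [7, 6, 8, 5],   # variant A
    [5, 8, 6, 7],   # variant B
    [8, 5, 7, 6],   # variant C
])
weights = np.array([0.4, 0.2, 0.25, 0.15])

totals = scores @ weights   # weighted-sum aggregation per variant
for name, total in zip("ABC", totals):
    print(f"variant {name}: {total:.2f}")
print("best:", "ABC"[int(totals.argmax())])
```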
Advances in explosives analysis—part I. animal, chemical, ion, and mechanical methods
Brown, Kathryn E.; Greenfield, Margo T.; McGrane, Shawn D.; ...
2015-10-13
The number and capability of explosives detection and analysis methods have increased substantially since the publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis (Moore and Goodpaster, Anal Bioanal Chem 395(2):245–246, 2009). We review and critically evaluate the latest (the past five years) important advances in explosives detection, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. The review consists of two parts. Part I reviews methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. Part II will review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.
Practical techniques for enhancing the high-frequency MASW method
USDA-ARS?s Scientific Manuscript database
For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
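For readers who want the arithmetic behind those defaults, a sketch of the DerSimonian-Laird estimate with the Knapp-Hartung interval follows; the six effect sizes are invented, and this is not Meta-Essentials' own code.

```python
import numpy as np
from scipy import stats

def dl_knapp_hartung(y, v):
    # Random-effects mean via DerSimonian-Laird tau^2, with the
    # Knapp-Hartung adjusted 95% confidence interval.
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v
    yw = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - yw) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    ws = 1.0 / (v + tau2)
    mu = np.sum(ws * y) / np.sum(ws)
    # Knapp-Hartung: weighted residual variance with a t(k-1) reference.
    var_kh = np.sum(ws * (y - mu) ** 2) / ((k - 1) * np.sum(ws))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
    return mu, tau2, (mu - half, mu + half)

# Invented effect sizes (e.g., Hedges' g) and variances from six studies.
g = [0.30, 0.12, 0.45, 0.25, 0.60, 0.18]
v = [0.02, 0.03, 0.05, 0.01, 0.08, 0.04]
print(dl_knapp_hartung(g, v))
```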
Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camins, I.; Shinn, J.H.
We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.
Teresa E. Jordan
2015-10-22
The files included in this submission contain all data pertinent to the methods and results of this task's output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, the Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in the zip file). Shapefiles and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wunschel, David S.; Kreuzer-Martin, Helen W.; Antolick, Kathryn C.
2009-12-01
This report describes method development and preliminary evaluation for analyzing castor samples for signatures of purifying ricin. Ricin purification from the source castor seeds is essentially a problem of protein purification using common biochemical methods. Indications of protein purification will likely manifest themselves as removal of the non-protein fractions of the seed. Two major non-protein types of biochemical constituents in the seed are the castor oil and various carbohydrates. The oil comprises roughly half the seed weight, while the carbohydrate component comprises roughly half of the remaining "mash" left after oil and hull removal. Different castor oil and carbohydrate components can serve as indicators of specific toxin processing steps. Ricinoleic acid is a relatively unique fatty acid in nature and is the most abundant component of castor oil. The loss of ricinoleic acid indicates a step to remove oil from the seeds. The relative amounts of carbohydrates and carbohydrate-like compounds detected in the sample, including arabinose, xylose, myo-inositol, fucose, rhamnose, glucosamine, and mannose, can also indicate specific processing steps. For instance, the differential loss of arabinose relative to mannose and N-acetyl glucosamine indicates enrichment for the protein fraction of the seed using protein precipitation. The methods developed in this project center on fatty acid and carbohydrate extraction from castor samples followed by derivatization to permit analysis by gas chromatography-mass spectrometry (GC-MS). Method descriptions herein include the source and preparation of castor materials used for method evaluation, the equipment and description of the procedure required for chemical derivatization, and the instrument parameters used in the analysis. Two derivatization methods are described for the analysis of carbohydrates and one for the analysis of fatty acids. Two types of GC-MS analysis are included in the method development: one employing a quadrupole MS system for compound identification, and one employing an isotope ratio MS for measuring the stable isotope ratios of deuterium and hydrogen (D/H) in fatty acids. Finally, the method for analyzing the compound abundance data is included. This study indicates that removal of ricinoleic acid is a conserved consequence of each processing step we tested. Furthermore, the stable isotope D/H ratio of ricinoleic acid distinguished between two of the three castor seed sources. Concentrations of arabinose, xylose, mannose, glucosamine, and myo-inositol differentiated between crude or acetone-extracted samples and samples produced by protein precipitation. Taken together, these data illustrate the ability to distinguish between processes used to purify a ricin sample as well as, potentially, the source seeds.
NASA Astrophysics Data System (ADS)
Hoang, Khoa; Pophristic, Milan; Horan, Andrew J.; Johnston, Murray V.; McEwen, Charles N.
2016-10-01
First results are reported using a simple, fast, and reproducible matrix-assisted ionization (MAI) sample introduction method that provides substantial improvements relative to previously published MAI methods. The sensitivity of the new MAI method, which requires no laser, high voltage, or nebulizing gas, is comparable to those reported for MALDI-TOF and n-ESI. High resolution full acquisition mass spectra having low chemical background are acquired from low nanoliters of solution using only a few femtomoles of analyte. The limit-of-detection for angiotensin II is less than 50 amol on an Orbitrap Exactive mass spectrometer. Analyses of peptides, including a bovine serum albumin digest, and of drugs, including drugs in urine without a purification step, are reported using a 1 μL zero dead volume syringe in which only the analyte solution wetting the walls of the syringe needle is used in the analysis.
Inutan, Ellen D.; Trimpin, Sarah
2013-01-01
The introduction of electrospray ionization (ESI) and matrix-assisted laser desorption/ionization (MALDI) for the mass spectrometric analysis of peptides and proteins had a dramatic impact on biological science. We now report that a wide variety of compounds, including peptides, proteins, and protein complexes, are transported directly from a solid-state small molecule matrix to gas-phase ions when placed into the vacuum of a mass spectrometer without the use of high voltage, a laser, or added heat. This ionization process produces ions having charge states similar to ESI, making the method applicable for high performance mass spectrometers designed for atmospheric pressure ionization. We demonstrate highly sensitive ionization using intermediate pressure MALDI and modified ESI sources. This matrix and vacuum assisted soft ionization method is suitable for the direct surface analysis of biological materials, including tissue, via mass spectrometry. PMID:23242551
Van Berkel, Gary J.
2015-10-06
A system and method for analyzing a chemical composition of a specimen are described. The system can include at least one pin; a sampling device configured to contact a liquid with a specimen on the at least one pin to form a testing solution; and a stepper mechanism configured to move the at least one pin and the sampling device relative to one another. The system can also include an analytical instrument for determining a chemical composition of the specimen from the testing solution. In particular, the systems and methods described herein enable the chemical analysis of specimens, such as tissue, in a manner in which the spatial resolution is limited by the size of the pins used to obtain tissue samples, not by the size of the sampling device used to solubilize the samples coupled to the pins.
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narasimha S.
2012-01-01
In this paper a modeling method based on data reductions is investigated, which includes pre-analyzed MERRA atmospheric fields, for quantitative estimates of the uncertainties introduced in integrated path differential absorption methods for the sensing of various molecules including CO2. This approach represents an extension of our previously developed lidar modeling framework and allows effective on- and off-line wavelength optimizations and weighting function analysis to minimize interference effects such as those due to temperature sensitivity and water vapor absorption. The new simulation methodology differs from the previous implementation in that it allows analysis of atmospheric effects over annual spans and the entire Earth coverage, which was achieved through the data reduction methods employed. The effectiveness of the proposed simulation approach is demonstrated with application to the mixing ratio retrievals for the future ASCENDS mission. Independent analysis of multiple accuracy-limiting factors, including the temperature and water vapor interferences and selected system parameters, is further used to identify favorable spectral regions as well as wavelength combinations facilitating the reduction of total errors in the retrieved XCO2 values.
Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
2008-01-01
Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
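A minimal sketch of one initiation criterion plus ply discounting is given below; the allowables and degradation factor are illustrative placeholders, and the paper's user-defined material model is considerably more elaborate.

```python
import numpy as np

# Hypothetical ply allowables (MPa): fiber-direction tension/compression,
# transverse tension/compression, and in-plane shear. Illustrative values.
XT, XC, YT, YC, S = 1500.0, 1200.0, 50.0, 200.0, 70.0

def max_stress_failure(s1, s2, s12):
    # Maximum-stress initiation check for one ply; returns the exceeded
    # failure mode(s), mirroring one criterion from the set listed above.
    modes = []
    if s1 > XT or s1 < -XC:
        modes.append("fiber")
    if s2 > YT or s2 < -YC:
        modes.append("matrix")
    if abs(s12) > S:
        modes.append("shear")
    return modes

def degrade(props, modes, factor=1e-3):
    # Ply discounting: knock down the constitutive coefficients tied to each
    # failed mode (E1 for fiber, E2 for matrix, G12 for shear).
    out = dict(props)
    if "fiber" in modes:
        out["E1"] *= factor
    if "matrix" in modes:
        out["E2"] *= factor
    if "shear" in modes:
        out["G12"] *= factor
    return out

ply = {"E1": 140e3, "E2": 10e3, "G12": 5e3}   # MPa
modes = max_stress_failure(s1=800.0, s2=60.0, s12=30.0)
print(modes, degrade(ply, modes))             # transverse (matrix) failure
```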
Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi
2015-01-01
Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that, it is required to simultaneously analyze a pair of two outcome measures such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models including the bivariate model and the hierarchical summary receiver operating characteristic model are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
Coformer screening using thermal analysis based on binary phase diagrams.
Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Terada, Katsuhide
2014-08-01
The advent of cocrystals has demonstrated a growing need for efficient and comprehensive coformer screening in search of better development forms, including salt forms. Here, we investigated a coformer screening system for salts and cocrystals based on binary phase diagrams using thermal analysis and examined the effectiveness of the method. Indomethacin and tenoxicam were used as models of active pharmaceutical ingredients (APIs). Physical mixtures of an API and 42 kinds of coformers were analyzed using differential scanning calorimetry (DSC) and X-ray DSC. We also conducted coformer screening using a conventional slurry method and compared those results with the results from the thermal analysis method and previous studies. Compared with the slurry method, the thermal analysis method was a high-performance screening system, particularly for APIs with low solubility and/or a propensity to form solvates. However, this method faces hurdles when a coformer-API combination is subject to kinetic hindrance of salt or cocrystal formation during heating, or when degradation occurs near the metastable eutectic temperature. The thermal analysis and slurry methods are considered complementary to each other for coformer screening. The feasibility of the thermal analysis method in drug discovery practice is ensured given its small scale and high throughput.
North Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; David E. Haugen; Charles J. Barnett
2011-01-01
The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...
South Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Ronald J. Piva; Charles J. Barnett
2011-01-01
The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...
ERIC Educational Resources Information Center
Frankel, Elaine B.; Hutchinson, Nancy L.; Burbidge, Julie; Minnes, Patricia
2014-01-01
This mixed methods study reports on the perspectives of 143 preservice early childhood educators (ECE) and 208 elementary teacher candidates (TC) on teaching children with developmental disabilities and delays (DDD) in inclusive classrooms. A questionnaire was administered which included items on demographic characteristics, experience, knowledge,…
Willke, Richard J; Zheng, Zhiyuan; Subedi, Prasun; Althin, Rikard; Mullins, C Daniel
2012-12-13
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented toward the "intermediate" outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological, and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.
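As one concrete item from this menu, regression-based subgroup analysis estimates HTE through a treatment-by-covariate interaction term; a sketch with simulated trial data:

```python
import numpy as np

rng = np.random.default_rng(13)
n = 400
treat = rng.integers(0, 2, n)      # randomized assignment
old = rng.integers(0, 2, n)        # candidate effect modifier (e.g., age group)
y = 1.0 + 0.5 * treat + 0.2 * old + 0.4 * treat * old + rng.normal(0, 1, n)

# The coefficient on treat*old captures heterogeneity of the treatment effect.
X = np.column_stack([np.ones(n), treat, old, treat * old])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
print(f"interaction estimate {beta[3]:.2f} (SE {se[3]:.2f})")
```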
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
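A toy version of the rejection-sampling reinterpretation fits in a few lines; here crude Monte Carlo stands in for the FORM/IS/SuS machinery the BUS framework would actually use, and every number is invented.

```python
import numpy as np

rng = np.random.default_rng(17)
n = 200_000

# Prior on an uncertain capacity parameter (invented numbers throughout).
theta = rng.normal(5.0, 1.0, n)

# Hypothetical measurement: capacity observed as 5.8 with noise sd 0.5.
like = np.exp(-0.5 * ((5.8 - theta) / 0.5) ** 2)

# BUS-style rejection step: accept theta with probability likelihood / max.
accept = rng.random(n) < like / like.max()
post = theta[accept]

# Rare event: random demand exceeds capacity. Crude Monte Carlo here; the
# framework would estimate these probabilities with FORM, IS, or SuS.
p_prior = np.mean(rng.normal(3.0, 1.0, n) > theta)
p_post = np.mean(rng.normal(3.0, 1.0, post.size) > post)
print(f"P(failure): prior ~ {p_prior:.1e}, posterior ~ {p_post:.1e}")
```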
Environmental analysis of higher brominated diphenyl ethers and decabromodiphenyl ethane.
Kierkegaard, Amelie; Sellström, Ulla; McLachlan, Michael S
2009-01-16
Methods for environmental analysis of higher brominated diphenyl ethers (PBDEs), in particular decabromodiphenyl ether (BDE209), and the recently discovered environmental contaminant decabromodiphenyl ethane (deBDethane) are reviewed. The extensive literature on analysis of BDE209 has identified several critical issues, including contamination of the sample, degradation of the analyte during sample preparation and GC analysis, and the selection of appropriate detection methods and surrogate standards. The limited experience with the analysis of deBDethane suggests that there are many commonalities with BDE209. The experience garnered from the analysis of BDE209 over the last 15 years will greatly facilitate progress in the analysis of deBDethane.
Patterns of Contraceptive Adoption, Continuation, and Switching after Delivery among Malawian Women.
Kopp, Dawn M; Rosenberg, Nora E; Stuart, Gretchen S; Miller, William C; Hosseinipour, Mina C; Bonongwe, Phylos; Mwale, Mwawi; Tang, Jennifer H
2017-01-01
Women who report use of postpartum family planning may not continue their initial method or use it consistently. Understanding the patterns of method uptake, discontinuation, and switching among women after delivery is important to promote uptake and continuation of effective methods of contraception. This is a secondary analysis of 634 Malawian women enrolled into a prospective cohort study after delivery. They completed baseline surveys upon enrollment and follow-up telephone surveys 3, 6, and 12 months post-delivery. Women were included in this analysis if they had completed at least the 3- and 6-month post-delivery surveys. Descriptive statistics were used to assess contraceptive method mix and patterns of switching, whereas Pearson's χ2 tests were used for bivariable analyses to compare characteristics of women who continued and discontinued their initial post-delivery contraceptive method. Among the 479 women included in this analysis, the use of abstinence/traditional methods decreased and the use of long-acting and permanent methods (LAPM) increased over time. Almost half (47%) discontinued the contraceptive method reported at 3-months post-delivery; women using injectables or LAPM at 3-months post-delivery were significantly more likely to continue their method than those using non-modern methods (p<0.001). Of the 216 women who switched methods, 82% switched to a more or equally effective method. The change in contraceptive method mix and high rate of contraceptive switching in the first 12 months postpartum highlights a need to assist women in accessing effective contraceptives soon after delivery.
Exploratory Analysis in Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B
2016-01-01
We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase, which will be presented here. The Theory Construction Phase will include: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of: Stress Process; Social Isolation; Social Exclusion; Social Services; Social Capital; Acculturation Theory; and global-economic level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress, and context is presented that includes examples of mechanisms at psychological, social, cultural, and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes, including depression, anxiety, and health-harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction, including the stressors of isolation and expectations and the buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis. The findings will be applied to the development of a middle range theory and subsequent programme theory for local perinatal child and family interventions.
Ultra-short wavelength x-ray system
Umstadter, Donald [Ann Arbor, MI; He, Fei [Ann Arbor, MI; Lau, Yue-Ying [Potomac, MD
2008-01-22
A method and apparatus to generate a beam of coherent light including x-rays or XUV by colliding a high-intensity laser pulse with an electron beam that is accelerated by a synchronized laser pulse. Applications include x-ray and EUV lithography, protein structural analysis, plasma diagnostics, x-ray diffraction, crack analysis, non-destructive testing, surface science and ultrafast science.
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system; only marginal optimization of parameters was necessary. The automation, relying on an x-y-z robot after manual protein precipitation, includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices like urine, different tissues, and heart blood for cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods, proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To our best knowledge, this application is the first one reported in the literature employing this sample preparation system.
Kuang, Ming-Jie; Du, Yuren; Ma, Jian-Xiong; He, Weiwei; Fu, Lin; Ma, Xin-Long
2017-04-01
Total knee arthroplasty (TKA) is gradually emerging as the treatment of choice for end-stage osteoarthritis. Earlier studies showed that liposomal bupivacaine delivered by periarticular injection (PAI) reduced pain and opioid consumption after surgery. However, some recent studies have reported that liposomal bupivacaine by PAI did not improve pain control and functional recovery in patients undergoing TKA. Therefore, this meta-analysis was conducted to determine whether liposomal bupivacaine provides better pain relief and functional recovery after TKA. Web of Science, PubMed, Embase, and the Cochrane Library were comprehensively searched. Randomized controlled trials, controlled clinical trials, and cohort studies were included in our meta-analysis. Eleven studies that compared liposomal bupivacaine using the PAI technique with the conventional PAI method were included. The preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines and the Cochrane Handbook were applied to assess the quality of the results published in all included studies to ensure that the results of our meta-analysis were reliable and valid. Our pooled data analysis demonstrated that liposomal bupivacaine was as effective as the control group in terms of visual analog scale score at 24 hours (P = .46), 48 hours (P = .43), 72 hours (P = .21), total amount of opioid consumption (P = .25), range of motion (P = .28), length of hospital stay (P = .53), postoperative nausea (P = .34), and ambulation distance (P = .07). Compared with the conventional PAI method, liposomal bupivacaine shows similar pain control and functional recovery after TKA. Considering the cost for pain control, liposomal bupivacaine cannot be recommended as a long-acting alternative analgesic agent for the PAI method. Copyright © 2016 Elsevier Inc. All rights reserved.
Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N
2007-10-01
Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
Comparing direct and iterative equation solvers in a large structural analysis software system
NASA Technical Reports Server (NTRS)
Poole, E. L.
1991-01-01
Two direct Choleski equation solvers and two iterative preconditioned conjugate gradient (PCG) equation solvers used in a large structural analysis software system are described. The two direct solvers are implementations of the Choleski method for variable-band matrix storage and sparse matrix storage. The two iterative PCG solvers include the Jacobi conjugate gradient method and an incomplete Choleski conjugate gradient method. The performance of the direct and iterative solvers is compared by solving several representative structural analysis problems. Some key factors affecting the performance of the iterative solvers relative to the direct solvers are identified.
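As a concrete point of reference for the iterative solvers mentioned above, the following is a minimal sketch of a Jacobi (diagonal) preconditioned conjugate gradient solver in Python/NumPy. The test matrix, tolerance, and iteration limit are illustrative assumptions, not values from the paper, which worked with large structural stiffness matrices.

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b (A symmetric positive definite) by conjugate
    gradients with a Jacobi (diagonal) preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x
    m_inv = 1.0 / np.diag(A)          # Jacobi preconditioner: M = diag(A)
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = m_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system (illustrative only)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(jacobi_pcg(A, b))               # ~[0.0909, 0.6364]
```

An incomplete Choleski PCG differs only in the preconditioner: an approximate sparse factorization of A replaces its diagonal.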
Systems and methods for analyzing building operations sensor data
Mezic, Igor; Eisenhower, Bryan A.
2015-05-26
Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein to a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
NASA Technical Reports Server (NTRS)
Davis, J. E.; Bonnett, W. S.; Medan, R. T.
1976-01-01
A computer program known as SOLN was developed as an independent segment of the NASA-Ames three-dimensional potential flow analysis system (POTFAN) for solving systems of linear algebraic equations. Methods used include: LU decomposition, Householder's method, a partitioning scheme, and a block successive relaxation method. Due to the independent modular nature of the program, it may be used by itself and not necessarily in conjunction with other segments of the POTFAN system.
Survey of Existing and Promising New Methods of Surface Preparation
1982-04-01
and abroad, a description and analysis are given of applicable methods including: • Equipment employing recycled steel shot and grit. • Wet blast...requirements that must be met by these methods. 23. Barrillom, P., "Preservation of Materials in the Marine Environment—Analysis of Replies to the Enquiry on...conditions, can hydrolyze or give sulfuric acid, causing renewed corrosion. Wet blasting or the use of high-pressure water jets appears to be useful in
Pedigree data analysis with crossover interference.
Browning, Sharon
2003-01-01
We propose a new method for calculating probabilities for pedigree genetic data that incorporates crossover interference using the chi-square models. Applications include relationship inference, genetic map construction, and linkage analysis. The method is based on importance sampling of unobserved inheritance patterns conditional on the observed genotype data and takes advantage of fast algorithms for no-interference models while using reweighting to allow for interference. We show that the method is effective for arbitrarily many markers with small pedigrees. PMID:12930760
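The reweighting step described above follows the generic importance-sampling identity E_p[f] = E_q[f p/q]: sample from a tractable proposal (here, the fast no-interference model) and reweight toward the target (the chi-square interference model). A minimal, domain-agnostic sketch with stand-in densities, not the pedigree likelihoods themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in densities: target p is Exp(mean 1), proposal q is Exp(mean 2).
def p_pdf(x):
    return np.exp(-x)

def q_pdf(x):
    return 0.5 * np.exp(-0.5 * x)

x = rng.exponential(scale=2.0, size=100_000)  # draws from the cheap model q
w = p_pdf(x) / q_pdf(x)                       # importance weights p/q
f = x**2                                      # any statistic of interest
print(np.mean(f * w))                         # estimates E_p[f]; exact value is 2
```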
Solution of elastic-plastic stress analysis problems by the p-version of the finite element method
NASA Technical Reports Server (NTRS)
Szabo, Barna A.; Actis, Ricardo L.; Holzer, Stefan M.
1993-01-01
The solution of small strain elastic-plastic stress analysis problems by the p-version of the finite element method is discussed. The formulation is based on the deformation theory of plasticity and the displacement method. Practical realization of controlling discretization errors for elastic-plastic problems is the main focus. Numerical examples which include comparisons between the deformation and incremental theories of plasticity under tight control of discretization errors are presented.
Dynamic analysis environment for nuclear forensic analyses
NASA Astrophysics Data System (ADS)
Stork, C. L.; Ummel, C. C.; Stuart, D. S.; Bodily, S.; Goldblum, B. L.
2017-01-01
A Dynamic Analysis Environment (DAE) software package is introduced to facilitate group inclusion/exclusion method testing, evaluation and comparison for pre-detonation nuclear forensics applications. Employing DAE, the multivariate signatures of a questioned material can be compared to the signatures for different, known groups, enabling the linking of the questioned material to its potential process, location, or fabrication facility. Advantages of using DAE for group inclusion/exclusion include built-in query tools for retrieving data of interest from a database, the recording and documentation of all analysis steps, a clear visualization of the analysis steps intelligible to a non-expert, and the ability to integrate analysis tools developed in different programming languages. Two group inclusion/exclusion methods are implemented in DAE: principal component analysis, a parametric feature extraction method, and k nearest neighbors, a nonparametric pattern recognition method. Spent Fuel Isotopic Composition (SFCOMPO), an open source international database of isotopic compositions for spent nuclear fuels (SNF) from 14 reactors, is used to construct PCA and KNN models for known reactor groups, and 20 simulated SNF samples are utilized in evaluating the performance of these group inclusion/exclusion models. For all 20 simulated samples, PCA in conjunction with the Q statistic correctly excludes a large percentage of reactor groups and correctly includes the true reactor of origination. Employing KNN, 14 of the 20 simulated samples are classified to their true reactor of origination.
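For the PCA-based inclusion/exclusion step, one common formulation scores a questioned sample by its Q statistic (squared residual off the retained principal-component subspace) and excludes a group when the sample exceeds that group's threshold. A minimal sketch under that interpretation, with synthetic data and an empirical threshold standing in for the analytical Q control limit:

```python
import numpy as np

def q_excludes(train, query, n_pc=2, q_quantile=0.99):
    """Fit PCA on a known group and test whether a questioned sample's
    Q statistic (residual off the PC subspace) is consistent with it."""
    mu = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mu, full_matrices=False)
    P = vt[:n_pc].T                                   # retained loadings

    def q_stat(x):
        r = (x - mu) - P @ (P.T @ (x - mu))           # off-subspace residual
        return r @ r

    # Empirical threshold from training Q values (a stand-in for the
    # analytical Q-statistic control limit)
    limit = np.quantile([q_stat(row) for row in train], q_quantile)
    return q_stat(query) > limit                      # True -> exclude group

rng = np.random.default_rng(1)
group = rng.normal(size=(50, 6))      # synthetic stand-in for one reactor group
sample = rng.normal(size=6)           # questioned material's signature
print(q_excludes(group, sample))
```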
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... methods for medical device comparative analyses, best practices and best design and analysis methods. II... the performance of medical devices (including comparative effectiveness studies). The centers...
ERIC Educational Resources Information Center
Krishen, Anoop
1989-01-01
This review covers methods for identification, characterization, and determination of rubber and materials in rubber. Topics include: general information, nuclear magnetic resonance spectroscopy, infrared spectroscopy, thermal methods, gel permeation chromatography, size exclusion chromatography, analysis related to safety and health, and…
Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.
Blake, Christopher J
2007-09-01
Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for analysis of B vitamins. However they are no longer considered to be the gold standard in vitamins analysis as many studies have shown up their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is made of multivitamin extractions and analyses for foods and supplements.
NATIONAL RESPONSE TEAM TECHNICAL ASSISTANCE ...
This document provides technical information on a wide range of activities to aid in the response to an intentional release of anthrax in urban environments. It includes initial actions when a potential release is discovered, health and safety issues for responders, sampling and analysis methods, decontamination technologies, decontamination waste disposal, and communication with the public.
NASA Technical Reports Server (NTRS)
Badler, N. I.
1985-01-01
Human motion analysis is the task of converting actual human movements into computer readable data. Such movement information may be obtained through active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing decouples the position measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-dimensional tracking systems, and image processing systems based on multiple views and photogrammetric calculations.
Activity recognition using Video Event Segmentation with Text (VEST)
NASA Astrophysics Data System (ADS)
Holloway, Hillary; Jones, Eric K.; Kaluzniacki, Andrew; Blasch, Erik; Tierno, Jorge
2014-06-01
Multi-Intelligence (multi-INT) data includes video, text, and signals that require analysis by operators. Analysis methods include information fusion approaches such as filtering, correlation, and association. In this paper, we discuss the Video Event Segmentation with Text (VEST) method, which provides event boundaries of an activity to compile related messages and video clips for future interest. VEST infers meaningful activities by clustering multiple streams of time-sequenced multi-INT intelligence data and derived fusion products. We discuss exemplar results that segment raw full-motion video (FMV) data by using extracted commentary message timestamps, FMV metadata, and user-defined queries.
Robinson, César Leyton; Caballero, Andrés Díaz
2007-01-01
This article is an experimental methodological reflection on the use of medical images as useful documents for constructing the history of medicine. A method is used that is based on guidelines or analysis topics that include different ways of viewing documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that enhance the proposal for the reader: rediscovering the worlds in society that harbor these medical photographic archives to obtain a new theoretical approach to the construction of the history of medical science.
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It provide...
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...
This compendium contains seven SOPs developed by Food and Drug Administration (FDA) laboratories for methods of analyzing trace metals in dietary samples collected using Total Diet study procedures. The SOPs include the following: (1) Quality Control for Analysis of NHEXAS Food o...
Integrated method for chaotic time series analysis
Hively, L.M.; Ng, E.G.
1998-09-29
Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
Cho, Il Haeng; Park, Kyung S; Lim, Chang Joo
2010-02-01
In this study, we described the characteristics of five different biological age (BA) estimation algorithms, including (i) multiple linear regression, (ii) principal component analysis, and the somewhat unique methods developed by (iii) Hochschild, (iv) Klemera and Doubal, and (v) a variant of Klemera and Doubal's method. The objective of this study is to find the most appropriate method of BA estimation by examining the association between the Work Ability Index (WAI) and the differences of each algorithm's estimates from chronological age (CA). The WAI was found to be a measure that reflects an individual's current health status rather than deterioration strictly dependent on age. Experiments were conducted on 200 Korean male participants using a BA estimation system designed to be non-invasive, simple to operate, and based on human function. Using the empirical data, BA estimation as well as various analyses, including correlation analysis and discriminant function analysis, were performed. As a result, the empirical data confirmed that Klemera and Doubal's method with uncorrelated variables from principal component analysis produces relatively reliable and acceptable BA estimates. 2009 Elsevier Ireland Ltd. All rights reserved.
Methods for collection and analysis of geopressured geothermal and oil field waters
Lico, Michael S.; Kharaka, Yousif K.; Carothers, William W.; Wright, Victoria A.
1982-01-01
Present methods are described for the collection, preservation, and chemical analysis of waters produced from geopressured geothermal and petroleum wells. Detailed procedures for collection include precautions and equipment necessary to ensure that the sample is representative of the water produced. Procedures for sample preservation include filtration, acidification, dilution for silica, methyl isobutyl ketone (MIBK) extraction of aluminum, addition of potassium permanganate to preserve mercury, and precipitation of carbonate species as strontium carbonate for stable carbon isotopes and total dissolved carbonate analysis. Characteristics determined at the well site are sulfide, pH, ammonia, and conductivity. Laboratory procedures are given for the analysis of lithium, sodium, potassium, rubidium, cesium, magnesium, calcium, strontium, barium, iron, manganese, zinc, lead, aluminum, and mercury by atomic absorption and flame emission spectroscopy. Chloride is determined by silver nitrate titration and fluoride by ion-specific electrode. Bromide and iodide concentrations are determined by the hypochlorite oxidation method. Sulfate is analyzed by titration using barium chloride with thorin indicator after pretreatment with alumina. Boron and silica are determined colorimetrically by the carmine and molybdate-blue methods, respectively. Aliphatic acid anions (C2 through C5) are determined by gas chromatography after separation and concentration in a chloroform-butanol mixture.
Toward patient-specific articular contact mechanics
Ateshian, Gerard A.; Henak, Corinne R.; Weiss, Jeffrey A.
2015-01-01
The mechanics of contacting cartilage layers is fundamentally important to understanding the development, homeostasis and pathology of diarthrodial joints. Because of the highly nonlinear nature of both the materials and the contact problem itself, numerical methods such as the finite element method are typically incorporated to obtain solutions. Over the course of five decades, we have moved from an initial qualitative understanding of articular cartilage material behavior to the ability to perform complex, three-dimensional contact analysis, including multiphasic material representations. This history includes the development of analytical and computational contact analysis methods that now provide the ability to perform highly nonlinear analyses. Numerical implementations of contact analysis based on the finite element method are rapidly advancing and will soon enable patient-specific analysis of joint contact mechanics using models based on medical image data. In addition to contact stress on the articular surfaces, these techniques can predict variations in strain and strain through the cartilage layers, providing the basis to predict damage and failure. This opens up exciting areas for future research and application to patient-specific diagnosis and treatment planning applied to a variety of pathologies that affect joint function and cartilage homeostasis. PMID:25698236
Analysis of digital communication signals and extraction of parameters
NASA Astrophysics Data System (ADS)
Al-Jowder, Anwar
1994-12-01
The signal classification performance of four types of electronics support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNR's).
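To make the detection comparison concrete, the sketch below adds white Gaussian noise to a stand-in narrowband signal at a prescribed SNR and applies FFT-based narrowband energy detection; the tone, threshold rule, and parameters are illustrative assumptions, not the thesis's configurations:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 1e4, 4096
t = np.arange(n) / fs
signal = np.cos(2 * np.pi * 1000 * t)         # stand-in narrowband signal

def add_noise(sig, snr_db):
    """Add white Gaussian noise at a prescribed SNR (dB)."""
    p_noise = np.mean(sig**2) / 10**(snr_db / 10)
    return sig + rng.normal(scale=np.sqrt(p_noise), size=sig.shape)

def narrowband_energy_detect(x, fs, factor=5.0):
    """FFT-based detection: flag frequency bins whose power exceeds a
    multiple of the median bin power."""
    psd = np.abs(np.fft.rfft(x))**2 / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs[psd > factor * np.median(psd)]

x = add_noise(signal, snr_db=0.0)
print(narrowband_energy_detect(x, fs))        # bins near 1 kHz should appear
```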
Classical least squares multivariate spectral analysis
Haaland, David M.
2002-01-01
An improved classical least squares multivariate spectral analysis method (prediction-augmented classical least squares, PACLS) that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of PACLS is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
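The core CLS mechanics are compact enough to sketch: calibrate pure-component spectra from known concentrations, then, at prediction time, optionally augment them with the spectral shape of an un-modeled effect, which is the PACLS-style idea described above. A minimal NumPy illustration with synthetic spectra, not the patent's data or exact algorithm:

```python
import numpy as np

def cls_calibrate(C, A):
    """Classical least squares calibration: A ~ C @ K, where rows of A are
    calibration spectra and C holds the known concentrations."""
    K, *_ = np.linalg.lstsq(C, A, rcond=None)
    return K                                   # estimated pure-component spectra

def cls_predict(a, K, extra_shapes=None):
    """Predict concentrations for spectrum a, optionally augmenting the
    pure-component matrix with shapes of non-calibrated effects."""
    Kp = K if extra_shapes is None else np.vstack([K, extra_shapes])
    c, *_ = np.linalg.lstsq(Kp.T, a, rcond=None)
    return c[:K.shape[0]]                      # report calibrated components only

rng = np.random.default_rng(3)
K_true = rng.random((2, 50))                   # two components, 50 channels
C_cal = rng.random((10, 2))
K = cls_calibrate(C_cal, C_cal @ K_true)

drift = rng.random(50)                         # un-modeled interferent shape
a_unknown = np.array([0.3, 0.7]) @ K_true + 0.5 * drift
print(cls_predict(a_unknown, K))               # biased by the interferent
print(cls_predict(a_unknown, K, extra_shapes=drift))  # recovers ~[0.3, 0.7]
```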
NASA Technical Reports Server (NTRS)
1994-01-01
General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.
An Integrated Analysis-Test Approach
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to offer a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allow 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept sine and random tests, shaker shock mode tests, shaker base-driven model survey tests, and acoustic tests.
Assessment of environmental impacts part one. Intervention analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian
The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)
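In modern terms, a Box-Jenkins intervention analysis can be run as an ARIMA regression with a step (intervention) indicator as an exogenous input. A minimal sketch using statsmodels on synthetic data; the AR order, shift size, and intervention time are illustrative assumptions:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
n, t0 = 200, 120                               # intervention at time t0
step = (np.arange(n) >= t0).astype(float)      # step intervention indicator

# Synthetic AR(1) series with a -2.0 level shift after the intervention
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + rng.normal(scale=0.5)
y += -2.0 * step

res = ARIMA(y, exog=step, order=(1, 0, 0)).fit()
print(res.params)          # the exog coefficient estimates the shift (~ -2.0)
```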
The analysis of verbal interaction sequences in dyadic clinical communication: a review of methods.
Connor, Martin; Fletcher, Ian; Salmon, Peter
2009-05-01
To identify methods available for sequential analysis of dyadic verbal clinical communication and to review their methodological and conceptual differences. Critical review, based on literature describing sequential analyses of clinical and other relevant social interaction. Dominant approaches are based on analysis of communication according to its precise position in the series of utterances that constitute event-coded dialogue. For practical reasons, methods focus on very short-term processes, typically the influence of one party's speech on what the other says next. Studies of longer-term influences are rare. Some analyses have statistical limitations, particularly in disregarding heterogeneity between consultations, patients or practitioners. Additional techniques, including ones that can use information about timing and duration of speech from interval-coding are becoming available. There is a danger that constraints of commonly used methods shape research questions and divert researchers from potentially important communication processes including ones that operate over a longer-term than one or two speech turns. Given that no one method can model the complexity of clinical communication, multiple methods, both quantitative and qualitative, are necessary. Broadening the range of methods will allow the current emphasis on exploratory studies to be balanced by tests of hypotheses about clinically important communication processes.
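The dominant short-term approach the review describes, modeling how one party's utterance shapes what the other says next, reduces in its simplest form to a lag-1 transition matrix over event codes. A minimal sketch with invented codes, not a published coding scheme:

```python
import pandas as pd

# Event-coded dialogue; codes are invented stand-ins, not a published scheme.
codes = ["Dr:open_q", "Pt:concern", "Dr:empathy", "Pt:concern",
         "Dr:closed_q", "Pt:info", "Dr:closed_q", "Pt:info", "Dr:empathy"]

pairs = pd.DataFrame({"given": codes[:-1], "next": codes[1:]})
transition = pd.crosstab(pairs["given"], pairs["next"], normalize="index")
print(transition.round(2))     # P(next utterance code | current code), lag 1
```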
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
NASA Astrophysics Data System (ADS)
Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.
2012-01-01
This work is the first attempt to quantify trace elements in the Catha edulis plant (khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, and trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements (Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn) were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.
NASA Astrophysics Data System (ADS)
Wang, Hongliang; Liu, Baohua; Ding, Zhongjun; Wang, Xiangxin
2017-02-01
Absorption-based optical sensors have been developed for the determination of water pH. In this paper, based on the preparation of a transparent sol-gel thin film with a phenol red (PR) indicator, several calculation methods, including simple linear regression analysis, quadratic regression analysis and dual-wavelength absorbance ratio analysis, were used to calculate water pH. Results of MSSRR show that dual-wavelength absorbance ratio analysis can improve the calculation accuracy of water pH in long-term measurement.
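Of the three calculation methods named, the dual-wavelength absorbance ratio is the easiest to illustrate: take the ratio of absorbances at the indicator's basic and acidic peaks and calibrate it against known pH. The wavelengths, absorbance values, and log-linear calibration below are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Illustrative calibration data: film absorbance at the phenol red basic
# (~560 nm) and acidic (~430 nm) peaks for buffers of known pH.
pH = np.array([6.0, 6.5, 7.0, 7.5, 8.0])
A560 = np.array([0.10, 0.18, 0.33, 0.52, 0.68])
A430 = np.array([0.70, 0.62, 0.48, 0.33, 0.22])

# Fit pH against the log of the dual-wavelength absorbance ratio
coef = np.polyfit(np.log10(A560 / A430), pH, deg=1)

def ph_from_ratio(a560, a430):
    return np.polyval(coef, np.log10(a560 / a430))

print(ph_from_ratio(0.33, 0.48))    # ~7.0 at the mid calibration point
```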
NASA Technical Reports Server (NTRS)
Malroy, Eric T.
2010-01-01
This paper presents the analysis examining the feasibility of interstellar travel using laser sail probes ranging in size from the nano to the macro scale. The relativistic differential equations of motion for a laser sail are set up and solved using the Pasic method. The limitations of the analysis are presented and discussed. The requirements for the laser system are examined, including the thermal analysis of the laser sails. Black holes, plasma fields, atmospheric collisions and sunlight are several methods discussed to enable the deceleration of the interstellar probe. A number of novel mission scenarios are presented, including the embryonic transport of plant life as a precursor to the arrival of space colonies.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
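The proposed approach, estimating power as the proportion of simulated datasets in which a bootstrap confidence interval for the indirect effect excludes zero, can be sketched compactly. This stand-in uses normal data and simple slope regressions rather than the bmem package; replication and bootstrap counts are kept small for speed:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(n, a=0.3, b=0.3):
    """One dataset from a simple mediation model X -> M -> Y."""
    x = rng.normal(size=n)
    m = a * x + rng.normal(size=n)
    y = b * m + rng.normal(size=n)       # no direct X effect in this model
    return x, m, y

def indirect(x, m, y):
    a_hat = np.polyfit(x, m, 1)[0]       # slope of M on X
    b_hat = np.polyfit(m, y, 1)[0]       # slope of Y on M (valid here since
    return a_hat * b_hat                 # Y has no direct X path)

def power(n, n_rep=100, n_boot=200, alpha=0.05):
    hits = 0
    for _ in range(n_rep):
        x, m, y = simulate(n)
        idx = rng.integers(0, n, size=(n_boot, n))
        boots = [indirect(x[i], m[i], y[i]) for i in idx]
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += not (lo <= 0.0 <= hi)    # CI excludes zero -> effect detected
    return hits / n_rep

print(power(n=100))
```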
Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals
Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.
2016-01-01
Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
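A minimal sketch of the MSE computation, coarse-graining the series at each scale and then computing sample entropy, is shown below; the tolerance r, embedding dimension m, and test signal are conventional illustrative choices, not the paper's EEG settings, and the match counting is simplified slightly relative to the canonical definition:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """SampEn = -ln(A/B): A and B count template matches of length m+1
    and m within tolerance r (in units of the signal SD)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def matches(mm):
        tpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(tpl[:, None] - tpl[None, :]), axis=2)
        return np.sum(d <= tol) - len(tpl)        # exclude self-matches

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6), m=2, r=0.15):
    """Coarse-grain at each scale by non-overlapping averaging, then SampEn."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out

rng = np.random.default_rng(6)
x = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.3 * rng.normal(size=1000)
print(np.round(multiscale_entropy(x), 3))
```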
Lausberg, Hedda; Sloetjes, Han
2016-09-01
As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.
Neutron activation analysis of certified samples by the absolute method
NASA Astrophysics Data System (ADS)
Kadem, F.; Belouadah, N.; Idiri, Z.
2015-07-01
The nuclear reactions analysis technique is mainly based on the relative method or the use of activation cross sections. In order to validate nuclear data for the calculated cross section evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples for animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently performing a measurement of the activity of the sample. The fundamental equation of the activation connects several physical parameters including the cross section that is essential for the quantitative determination of the different elements composing the sample without resorting to the use of standard sample. Called the absolute method, it allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as the relative method requiring the use of standard sample for each element to be quantified.
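The fundamental activation equation referred to above relates measured activity to the number of target atoms, which is what lets the absolute method skip standard samples. A sketch solving it for element mass; every numerical input below is an illustrative placeholder, not a value from the paper:

```python
import numpy as np

N_A = 6.022e23      # Avogadro's number (1/mol)

def mass_from_activity(A, M, theta, sigma_cm2, phi, half_life_s, t_irr, t_dec):
    """Solve the activation equation for the element mass m (g):
    A = (m*N_A*theta/M) * sigma * phi * (1 - exp(-lam*t_irr)) * exp(-lam*t_dec)
    """
    lam = np.log(2) / half_life_s
    saturation = 1.0 - np.exp(-lam * t_irr)   # activity buildup during irradiation
    decay = np.exp(-lam * t_dec)              # decay between irradiation and counting
    atoms_per_gram = N_A * theta / M          # theta: isotopic abundance fraction
    return A / (atoms_per_gram * sigma_cm2 * phi * saturation * decay)

# Hypothetical inputs: 1e3 Bq measured, sigma in cm^2, flux in n/cm^2/s
m = mass_from_activity(A=1e3, M=56.0, theta=0.9, sigma_cm2=2.6e-24,
                       phi=1e13, half_life_s=2.2e4, t_irr=3600.0, t_dec=600.0)
print(f"{m:.3e} g")
```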
Underlying risk factors for prescribing errors in long-term aged care: a qualitative study.
Tariq, Amina; Georgiou, Andrew; Raban, Magdalena; Baysari, Melissa Therese; Westbrook, Johanna
2016-09-01
To identify system-related risk factors perceived to contribute to prescribing errors in Australian long-term care settings, that is, residential aged care facilities (RACFs). The study used qualitative methods to explore factors that contribute to unsafe prescribing in RACFs. Data were collected at three RACFs in metropolitan Sydney, Australia between May and November 2011. Participants included RACF managers, doctors, pharmacists and RACF staff actively involved in prescribing-related processes. Methods included non-participant observations (74 h), in-depth semistructured interviews (n=25) and artefact analysis. Detailed process activity models were developed for observed prescribing episodes supplemented by triangulated analysis using content analysis methods. System-related factors perceived to increase the risk of prescribing errors in RACFs were classified into three overarching themes: communication systems, team coordination and staff management. Factors associated with communication systems included limited point-of-care access to information, inadequate handovers, information storage across different media (paper, electronic and memory), poor legibility of charts, information double handling, multiple faxing of medication charts and reliance on manual chart reviews. Team factors included lack of established lines of responsibility, inadequate team communication and limited participation of doctors in multidisciplinary initiatives like medication advisory committee meetings. Factors related to staff management and workload included doctors' time constraints and their accessibility, lack of trained RACF staff and high RACF staff turnover. The study highlights several system-related factors including laborious methods for exchanging medication information, which often act together to contribute to prescribing errors. Multiple interventions (eg, technology systems, team communication protocols) are required to support the collaborative nature of RACF prescribing. Published by the BMJ Publishing Group Limited.
This compendium includes descriptions of methods for analyzing metals, pesticides and volatile organic compounds (VOCs) in water. The individual methods covered are these: (1) Method 200.8: determination of trace elements in waters and wastes by inductively coupled plasma-mass s...
A Phase-Only technique for enhancing the high-frequency MASW method
USDA-ARS?s Scientific Manuscript database
For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...
Al-Badr, Abdullah A; Alodhaib, Mansour M
2016-01-01
Dacarbazine is a cell cycle nonspecific antineoplastic alkylating agent used in the treatment of metastatic malignant melanoma. This chapter contains the descriptions of the drug: nomenclature, formulae, chemical structure, elemental composition, and appearance. The uses and applications of dacarbazine and the methods that were used for its preparation are reported. The methods which were used for the physical characterization of the drug are ionization constant, solubility, X-ray powder diffraction pattern, crystal structure, melting point, and differential scanning calorimetry. The profile contains the spectra of the drug: ultraviolet spectrum, vibrational spectrum, nuclear magnetic resonance spectra, and mass spectrum. The compendial methods of analysis for dacarbazine include the United States Pharmacopeia methods, British Pharmacopeia methods, and International Pharmacopeia methods. Other reported methods that are used for the analysis of the drug are high-performance liquid chromatography, high-performance liquid chromatography-mass spectrometry, and polarography. Metabolism, pharmacokinetics, and stability studies on dacarbazine are also included. Reviews of some analytical methods and physicochemical properties of the drug as well as the most important enzymes that are involved in the prodrug activation are provided. Sixty-four references are listed at the end of this monograph. © 2016 Elsevier Inc. All rights reserved.
Ideology Awareness Project: An Exercise in Item Unit Content Analysis.
ERIC Educational Resources Information Center
Simon, David R.
1981-01-01
Describes an exercise in the content analysis of political ideologies. Advantages of the exercise include that it teaches students to employ content analysis as a method of research and that it introduces them to the ideological statements of America's leading social critics. (DB)
Advances in the Use of Thermography to Inspect Composite Tanks for Liquid Fuel Propulsion Systems
NASA Technical Reports Server (NTRS)
Lansing, Matthew D.; Russell, Samuel S.; Walker, James L.; Jones, Clyde S. (Technical Monitor)
2001-01-01
This viewgraph presentation gives an overview of advances in the use of thermography to inspect composite tanks for liquid fuel propulsion systems. Details are given on the thermographic inspection system, thermographic analysis method (includes scan and defect map, method of inspection, and inclusions, ply wrinkle, and delamination defects), graphite composite cryogenic feedline (including method, image map, and deep/shallow inclusions and resin rich area defects), and material degradation nondestructive evaluation.
A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis
NASA Astrophysics Data System (ADS)
Jokhio, G. A.; Izzuddin, B. A.
2015-05-01
This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.
Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio
2018-01-01
Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
System of Systems Analytic Workbench - 2017
2017-08-31
and transitional activities with key collaborators. The tools include: System Operational Dependency Analysis/System Developmental Dependency Analysis...in the methods of the SoS-AWB involve the following: 1. System Operability Dependency Analysis (SODA)/System Development Dependency Analysis...available f. Development of standard dependencies with combinations of low-medium-high parameters
Report No. SERC-2017-TR-111
Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.
1999-01-01
A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important which may cause failure mechanisms such as debonding or delaminations.
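The failure check and ply-discount degradation loop at the heart of such an analysis can be outlined briefly. The sketch below uses the maximum strain criterion with invented allowables and a stand-in for the laminate stress analysis that the real method (in COMET) performs at each load step:

```python
import numpy as np

# Allowable ply strains (illustrative material data, not from the paper)
ALLOW = {"e1t": 0.011, "e1c": 0.009, "e2t": 0.005, "e2c": 0.014, "g12": 0.017}

def max_strain_failed(e1, e2, g12):
    """Maximum strain criterion: fail when any component exceeds its
    tensile, compressive, or shear allowable."""
    return (e1 > ALLOW["e1t"] or -e1 > ALLOW["e1c"] or
            e2 > ALLOW["e2t"] or -e2 > ALLOW["e2c"] or
            abs(g12) > ALLOW["g12"])

plies = [{"E1": 140e9, "E2": 10e9, "G12": 5e9, "failed": False} for _ in range(4)]
for load_step in range(1, 30):
    # ... the real method re-runs the nonlinear laminate analysis here;
    # the ply strains below are an invented stand-in that grows with load ...
    strains = [(6e-4 * load_step, 3e-4 * load_step, 4e-4 * load_step)] * len(plies)
    for ply, (e1, e2, g12) in zip(plies, strains):
        if not ply["failed"] and max_strain_failed(e1, e2, g12):
            # ply-discount degradation: knock down the failed ply's stiffness
            ply.update({k: ply[k] * 0.01 for k in ("E1", "E2", "G12")}, failed=True)
    if all(p["failed"] for p in plies):
        print(f"final failure at load step {load_step}")
        break
```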
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1995-01-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, and other text documents whose original source is people who are knowledgeable about, and participate in, the domain in question. To test the method, it is applied here to a report describing a remote sensing project within the scope of the Earth Observing System (EOS). The method has the potential to improve the designs of domain-related computer systems and software by quickly providing developers with explicit and objective models of the domain in a form which is useful for design. Results of the analysis include a network model of the domain, and an object-oriented relational analysis report which describes the nodes and relationships in the network model. Other products include a database of relationships in the domain, and an interactive concordance. The analysis method utilizes a newly developed relational metric, a proximity-weighted frequency of co-occurrence. The metric is applied to relations between the most frequently occurring terms (words or multiword entities) in the domain text, and the terms found within the contexts of these terms. Contextual scope is selectable. Because of the discriminating power of the metric, data reduction from the association matrix to the network is simple. In addition to their value for design, the models produced by the method are also useful for understanding the domains themselves. They can, for example, be interpreted as models of presence in the domain.
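The report does not publish the exact weighting, but a proximity-weighted frequency of co-occurrence can plausibly be sketched as counts that decay with the distance between two terms inside a context window; the 1/d decay and window size here are assumptions:

```python
from collections import defaultdict

def proximity_weighted_cooccurrence(tokens, window=5):
    """Co-occurrence counts in which each pairing within the context
    window contributes a weight decaying with term distance (1/d)."""
    scores = defaultdict(float)
    for i, term in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                scores[(term, tokens[j])] += 1.0 / abs(i - j)
    return scores

text = ("remote sensing data supports earth science analysis and "
        "remote sensing instruments produce earth observation data").split()
ranked = sorted(proximity_weighted_cooccurrence(text).items(),
                key=lambda kv: -kv[1])
for pair, score in ranked[:5]:
    print(pair, round(score, 2))
```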
Wafer characteristics via reflectometry
Sopori, Bhushan L.
2010-10-19
Various exemplary methods (800, 900, 1000, 1100) are directed to determining wafer thickness and/or wafer surface characteristics. An exemplary method (900) includes measuring reflectance of a wafer and comparing the measured reflectance to a calculated reflectance or a reflectance stored in a database. Another exemplary method (800) includes positioning a wafer on a reflecting support to extend a reflectance range. An exemplary device (200) has an input (210), analysis modules (222-228) and optionally a database (230). Various exemplary reflectometer chambers (1300, 1400) include radiation sources positioned at a first altitudinal angle (1308, 1408) and at a second altitudinal angle (1312, 1412). An exemplary method includes selecting radiation sources positioned at various altitudinal angles. An exemplary element (1650, 1850) includes a first aperture (1654, 1854) and a second aperture (1658, 1858) that can transmit reflected radiation to a fiber and an imager, respectively.
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Hodges, Dewey H.
1990-01-01
A regular perturbation analysis is presented. Closed-loop simulations were performed with a first-order correction including all of the atmospheric terms. In addition, a method was developed for independently checking the accuracy of the analysis and the rather extensive programming required to implement the complete first-order correction with all of the aerodynamic effects included. This amounted to developing an equivalent Hamiltonian computed from the first-order analysis. A second-order correction was also completed for the neglected spherical-Earth and back-pressure effects. Finally, an analysis was begun on a method for dealing with control inequality constraints. The results on including higher-order corrections do show some improvement for this application; however, it is not known at this stage whether significant improvement will result when the aerodynamic forces are included. The weak formulation for solving optimal control problems was extended to account for state inequality constraints. The formulation was tested on three example problems, and numerical results were compared to the exact solutions. Development of a general-purpose computational environment for the solution of a large class of optimal control problems is under way. An example, along with the necessary input and the output, is given.
High throughput protein production screening
Beernink, Peter T [Walnut Creek, CA]; Coleman, Matthew A [Oakland, CA]; Segelke, Brent W [San Ramon, CA]
2009-09-08
Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages, and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.
Research in Computational Astrobiology
NASA Technical Reports Server (NTRS)
Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.
2003-01-01
We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.
Determination of micro amounts of iron, aluminum, and alkaline earth metals in silicon carbide
NASA Technical Reports Server (NTRS)
Hirata, H.; Arai, M.
1978-01-01
A colorimetric method for analysis of micro components in silicon carbide used as the raw material for varistors is described. The microcomponents analyzed included iron soluble in hydrochloric acid, iron, aluminum, calcium and magnesium. Samples were analyzed by the method, and the results for iron and aluminum agreed well with the N.B.S. standard values and the values obtained by the other company. The method can therefore be applied to the analysis of actual samples.
On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods
NASA Technical Reports Server (NTRS)
Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.
2003-01-01
Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
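One simple way to exploit cheap sensitivity derivatives, in the spirit of the paper, is to use a first-order Taylor expansion of the response as a control variate whose exact mean is known. A minimal Monte Carlo sketch with a stand-in response function, not the paper's structural model or exact estimator:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in "expensive" response and its cheap derivative at a nominal point
f = lambda x: np.exp(0.3 * x) + 0.1 * x**2
x0 = 0.0
f0, df0 = f(x0), 0.3 * np.exp(0.3 * x0)       # value and sensitivity at x0

x = rng.normal(size=2_000)                    # random input, x ~ N(0, 1)
fx = f(x)

# Control variate: first-order Taylor expansion g(x) = f0 + df0*(x - x0),
# whose exact mean under N(0, 1) is f0 - df0*x0.
g = f0 + df0 * (x - x0)
g_mean = f0 - df0 * x0
beta = np.cov(fx, g)[0, 1] / np.var(g)        # near-optimal CV coefficient
cv = np.mean(fx - beta * (g - g_mean))

print("plain MC:", np.mean(fx))
print("with CV :", cv)                        # same target, lower variance
```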
Aeroelastic Stability and Response of Rotating Structures
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Reddy, Tondapu
2004-01-01
A summary of the work performed under this NASA grant is presented. More details can be found in the cited references. This grant led to the development of relatively fast aeroelastic analysis methods for predicting flutter and forced response in fans, compressors, and turbines using computational fluid dynamics (CFD) methods. These methods are based on linearized two- and three-dimensional, unsteady, nonlinear aerodynamic equations. During the period of the grant, an aeroelastic analysis that includes the effects of uncertainties in the design variables was also developed.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computer-intensive computer calculations. A computer program has been developed to implement the PFTA.
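As a baseline for what the PFTA computes, the sketch below evaluates a small AND/OR fault tree by plain Monte Carlo; the tree and bottom-event probabilities are invented, and the paper's adaptive importance sampling, which targets rare failures far more efficiently, is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(8)

# Bottom-event failure probabilities (invented for illustration)
p = {"b1": 0.02, "b2": 0.05, "b3": 0.01, "b4": 0.03}

def top_event(e):
    """Example fault tree: TOP = (b1 AND b2) OR (b3 AND (b2 OR b4))."""
    return (e["b1"] & e["b2"]) | (e["b3"] & (e["b2"] | e["b4"]))

n = 1_000_000
events = {k: rng.random(n) < pk for k, pk in p.items()}
print(f"estimated system failure probability: {top_event(events).mean():.2e}")
```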
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method used to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
HydroClimATe: hydrologic and climatic analysis toolkit
Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.
2014-01-01
The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
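Two of the toolkit's stages, preprocessing and DFT-based spectral analysis, chain together naturally and are easy to sketch. The detrending and standardization below are simple stand-ins for HydroClimATe's spline fitting and normal standardization; the synthetic series and its 12-month cycle are invented:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic monthly hydrologic series: trend + 12-month cycle + noise
n = 360
t = np.arange(n)
y = 0.01 * t + np.sin(2 * np.pi * t / 12) + 0.5 * rng.normal(size=n)

# Preprocessing: remove a fitted low-order trend, then standardize
trend = np.polyval(np.polyfit(t, y, 2), t)
z = (y - trend) / np.std(y - trend)

# Spectral analysis via the discrete Fourier transform (periodogram)
power = np.abs(np.fft.rfft(z))**2 / n
freqs = np.fft.rfftfreq(n, d=1.0)             # cycles per month
peak = freqs[np.argmax(power[1:]) + 1]        # skip the zero-frequency bin
print(f"dominant period: {1 / peak:.1f} months")
```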
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration), provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to variability in blade thickness, modulus of elasticity, Poisson's ratio, and density. Modulus of elasticity contributed significantly to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing which parameters to model as random is provided.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
Technologies for Clinical Diagnosis Using Expired Human Breath Analysis
Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji
2015-01-01
This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath and their related effects. Additionally, various medical monitoring techniques used for breath analysis are discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for engineering problems, and preliminary design; this establishes the basis for the innovative redesign of existing products.
Estimation of ovular fiber production in cotton
Van't Hof, Jack
1998-09-01
The present invention is a method for rendering post-anthesis, pre-harvest cotton fiber cells available for analysis of their physical properties. The method includes the steps of hydrolyzing cotton fiber cells and separating cotton fiber cells from cotton ovules, thereby rendering the cells available for analysis. The analysis of the fiber cells may be performed by any suitable means, e.g., visual inspection. Visual inspection of the cells can be accomplished by placing the cells under an instrument for detection, such as a microscope.
Incorporating general race and housing flexibility and deadband in rolling element bearing analysis
NASA Technical Reports Server (NTRS)
Davis, R. R.; Vallance, C. S.
1989-01-01
Methods for including the effects of general race and housing compliance and outer race-to-housing deadband (clearance) in rolling element bearing mechanics analysis are presented. It is shown that these effects can cause significant changes in bearing stiffness characteristics, which are of major importance in the rotordynamic response of turbomachinery and other rotating systems. Preloading analysis is demonstrated with the finite element/contact mechanics hybrid method applied to a 45 mm angular contact ball bearing.
Casciotti, K.L.; Sigman, D.M.; Hastings, M. Galanter; Böhlke, J.K.; Hilkert, A.
2002-01-01
We report a novel method for measurement of the oxygen isotopic composition (18O/16O) of nitrate (NO3-) from both seawater and freshwater. The denitrifier method, based on the isotope ratio analysis of nitrous oxide generated from sample nitrate by cultured denitrifying bacteria, has been described elsewhere for its use in nitrogen isotope ratio (15N/14N) analysis of nitrate [1]. Here, we address the additional issues associated with 18O/16O analysis of nitrate by this approach, which include (1) the oxygen isotopic difference between the nitrate sample and the N2O analyte due to isotopic fractionation associated with the loss of oxygen atoms from nitrate and (2) the exchange of oxygen atoms with water during the conversion of nitrate to N2O. Experiments with 18O-labeled water indicate that water exchange contributes less than 10%, and frequently less than 3%, of the oxygen atoms in the N2O product for Pseudomonas aureofaciens. In addition, both oxygen isotope fractionation and oxygen atom exchange are consistent within a given batch of analyses. The analysis of appropriate isotopic reference materials can thus be used to correct the measured 18O/16O ratios of samples for both effects. This is the first method tested for 18O/16O analysis of nitrate in seawater. Benefits of this method, relative to published freshwater methods, include higher sensitivity (tested down to 10 nmol and 1 μM NO3-), lack of interference by other solutes, and ease of sample preparation.
Using Paradigm Case Analysis To Foster Instructor Development.
ERIC Educational Resources Information Center
Peregrym, Jill; And Others
Paradigm Case Analysis (PCA) is a method of increasing instructor effectiveness through the gathering of narratives of critical teaching incidents and experiences from proficient instructors and their analysis in group discussions. Critical Incidents (CI's) may include those in which the instructor's intervention made a significant difference in…
ERIC Educational Resources Information Center
Bessemer, David W.; Shrage, Jules H.
Recommendations for an alternative plan, based on typological analysis techniques, for the evaluation of student characteristics related to media, presentation design, and academic performance are presented. Difficulties with present evaluation plans are discussed, and different methods of typological analysis are described. Included are…
Hossain, Ahmed; Beyene, Joseph
2014-01-01
This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR approach. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant on chromosome 3 that is associated with blood pressure, using simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate of the 3 methods at identifying the known single-nucleotide polymorphism, but that linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.
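As a hedged illustration of approach (a), the following Python sketch fits a random-intercept linear mixed model testing a single variant against a baseline phenotype. Real GRAMMAR-style analyses use the full kinship matrix; grouping by a family identifier here is a simplifying assumption, and all column names and effect sizes are hypothetical.

```python
# Minimal random-intercept LMM sketch for one variant (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "family": np.repeat(np.arange(100), 4),       # 100 families of 4
    "snp": rng.integers(0, 3, n),                 # genotype coded 0/1/2
    "age": rng.normal(50, 10, n),
})
fam_effect = rng.normal(0, 2, 100)[df["family"]]  # shared family effect
df["sbp"] = (120 + 1.5 * df["snp"] + 0.2 * df["age"]
             + fam_effect + rng.normal(0, 5, n))

# Fixed effects: snp and age; random intercept: family.
model = smf.mixedlm("sbp ~ snp + age", df, groups=df["family"]).fit()
print(model.summary())       # inspect the snp coefficient and its p-value
```

In a genome-wide setting this model would be refit (or the decorrelation reused, GRAMMAR-style) for each variant in turn.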
ERIC Educational Resources Information Center
McBride, Karen Anne
2012-01-01
This study is a sequential mixed methods study that also includes three mini-case studies for the purpose of portraying contemporary Thai views on the internationalization of higher education in Thailand and reflections on its status as an international education hub in the Southeast Asian region. An inductive analysis technique is used whereby…
Analysis for Non-Traditional Security Challenges: Methods and Tools
2006-11-20
Excerpt from a Military Operations Research Society (MORS) workshop report, "Analysis for Non-Traditional Security Challenges: Methods and Tools," 20-23 November 2006. The recoverable fragments note that PMESII modeling is challenged where the domain is large, nebulous, and complex, and where data is not available to support the model that would aid decisions, and that meeting non-traditional challenges includes enlisting the aid of the inter-agency and alliance/coalition communities.
Extended time-interval analysis
NASA Astrophysics Data System (ADS)
Fynbo, H. O. U.; Riisager, K.
2014-01-01
Several extensions of the half-life analysis method recently suggested by Horvat and Hardy are put forward. Goodness-of-fit testing is included, and the method is extended to cases where more information is available for each decay event, which also allows applications to, e.g., γ-decay data. The results are tested with Monte Carlo simulations and are applied to the decays of 64Cu and 56Mn.
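The following hedged Python sketch shows a maximum-likelihood half-life fit with a goodness-of-fit check, in the spirit of the analysis described above; it does not reproduce the Horvat and Hardy estimator, and the event times are simulated.

```python
# Minimal half-life fit: MLE for a pure exponential plus a KS check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_half_life = 12.7 * 3600                # 64Cu half-life, in seconds
lam = np.log(2) / true_half_life
t = rng.exponential(1 / lam, size=5000)     # simulated decay-event times

lam_hat = 1 / t.mean()                      # MLE for a pure exponential
print(f"fitted half-life: {np.log(2) / lam_hat / 3600:.2f} h")

# Goodness of fit: KS test against the fitted exponential. (Because the
# rate was estimated from the same data, the p-value is only approximate.)
ks = stats.kstest(t, "expon", args=(0, 1 / lam_hat))
print(f"KS statistic = {ks.statistic:.4f}, p ~ {ks.pvalue:.3f}")
```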
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). The pr...
[Herbalism, botany and components analysis study on original plants of frankincense].
Sun, Lei; Xu, Jimin; Jin, Hongyu; Tian, Jingai; Lin, Ruichao
2011-01-01
In order to clarify the original plants of the traditional Chinese medicine (TCM) frankincense, a GC method for the determination of essential oils and an HPLC method for the determination of boswellic acids were carried out, together with an analysis of the herbalism, botany, components, and pharmacology literature on frankincense. It was concluded that the original plants of TCM frankincense include at least Boswellia sacra, B. papyrifera and B. serrata.
NASA Astrophysics Data System (ADS)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
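The maximum-entropy step with fractional-moment constraints can be written compactly as follows; this is the standard formulation, with notation assumed rather than copied from the paper.

```latex
% Maximum-entropy estimate of the performance-function PDF p(g), constrained
% by a few fractional moments M_{\alpha_k}:
\begin{aligned}
\max_{p}\; & H[p] = -\int p(g)\,\ln p(g)\,\mathrm{d}g \\
\text{s.t.}\; & \int g^{\alpha_k}\,p(g)\,\mathrm{d}g = M_{\alpha_k},
\qquad k = 0,1,\dots,m \quad (\alpha_0 = 0,\; M_{\alpha_0} = 1),
\end{aligned}
\qquad
p(g) = \exp\!\Bigl(-\lambda_0 - \sum_{k=1}^{m} \lambda_k\, g^{\alpha_k}\Bigr),
\qquad
P_f = \int_{-\infty}^{0} p(g)\,\mathrm{d}g .
```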
NASA Astrophysics Data System (ADS)
Bhartia, R.; Wanger, G.; Orphan, V. J.; Fries, M.; Rowe, A. R.; Nealson, K. H.; Abbey, W. J.; DeFlores, L. P.; Beegle, L. W.
2014-12-01
Detection of in situ biosignatures on terrestrial and planetary missions is becoming increasingly important. Missions that target the Earth's deep biosphere, Mars, the moons of Jupiter (including Europa), the moons of Saturn (Titan and Enceladus), and small bodies such as asteroids or comets require methods that enable both in-situ analysis that preserves context and selection of high-priority samples for return to Earth. In situ instrumentation for biosignature detection spans a wide range of analytical and spectroscopic methods that capitalize on amino acid distribution, chirality, lipid composition, isotopic fractionation, or textures that persist in the environment. Many of the existing analytical instruments are bulk analysis methods and, while highly sensitive, these require sample acquisition and sample processing. However, by combining them with triaging spectroscopic methods, biosignatures can be targeted on a surface while preserving spatial context (including mineralogy, textures, and organic distribution). To provide spatially correlated chemical analysis at multiple spatial scales (meters to microns), we have employed a dual spectroscopic approach that capitalizes on high-sensitivity deep UV native fluorescence detection and high-specificity deep UV Raman analysis. Recently selected as a payload on the Mars 2020 mission, SHERLOC incorporates these optical methods for potential biosignature detection on Mars. We present data from both Earth analogs, which serve as our only known examples of biosignatures, and meteorite samples, which provide an example of abiotic organic formation, and demonstrate how provenance affects the spatial distribution and composition of organics.
Neural networks: Application to medical imaging
NASA Technical Reports Server (NTRS)
Clarke, Laurence P.
1994-01-01
The research mission is the development of computer assisted diagnostic (CAD) methods for improved diagnosis of medical images including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high convergence neural networks for feature detection and VLSI implementation of neural networks for real time analysis. Other missions include (1) implementation of CAD methods on hospital based picture archiving computer systems (PACS) and information networks for central and remote diagnosis and (2) collaboration with defense and medical industry, NASA, and federal laboratories in the area of dual use technology conversion from defense or aerospace to medicine.
2012-01-01
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the “intermediate” outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research. PMID:23234603
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
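The rejection-sampling reinterpretation at the heart of BUS can be sketched in a few lines; the following Python example uses crude Monte Carlo rather than FORM/IS/SuS, and the prior, likelihood, and limit state are toy assumptions chosen so that it runs quickly.

```python
# Hedged sketch of the BUS idea: Bayesian updating via rejection sampling,
# followed by a rare-event probability estimate from the posterior samples.
import numpy as np

rng = np.random.default_rng(4)

def likelihood(theta):                 # one noisy observation y_obs = 4.5
    return np.exp(-0.5 * ((4.5 - theta) / 0.5) ** 2)

def failure(theta):                    # rare event: capacity below demand
    return theta + rng.normal(0, 1, size=theta.shape) < 1.0

n = 1_000_000
theta = rng.normal(3.0, 1.0, n)        # prior N(3, 1)
c = 1.0                                # c >= max likelihood (Gaussian peak)
accepted = rng.uniform(0, 1, n) < likelihood(theta) / c

post = theta[accepted]                 # samples from the posterior
pf_post = failure(post).mean()         # updated rare-event probability
print(f"accepted {accepted.sum()} samples; posterior P(F) = {pf_post:.3e}")
```

In BUS proper, the acceptance condition is recast as an auxiliary limit state, so that FORM, IS, or SuS can attack the combined updating-plus-rare-event problem efficiently.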
Giles, Emma Louise; Adams, Jean M
2015-01-01
Capturing public opinion toward public health topics is important to ensure that services, policy, and research are aligned with the beliefs and priorities of the general public. A number of approaches can be used to capture public opinion. We are conducting a program of work on the effectiveness and acceptability of health promoting financial incentive interventions. We have captured public opinion on financial incentive interventions using three methods: a systematic review, focus group study, and analysis of online user-generated comments to news media reports. In this short editorial-style piece, we compare and contrast our experiences with these three methods. Each of these methods had their advantages and disadvantages. Advantages include tailoring of the research question for systematic reviews, probing of answers during focus groups, and the ability to aggregate a large data set using online user-generated content. However, disadvantages include needing to update systematic reviews, participants conforming to a dominant perspective in focus groups, and being unable to collect respondent characteristics during analysis of user-generated online content. That said, analysis of user-generated online content offers additional time and resource advantages, and we found it elicited similar findings to those obtained via more traditional methods, such as systematic reviews and focus groups. A number of methods for capturing public opinions on public health topics are available. Public health researchers, policy makers, and practitioners should choose methods appropriate to their aims. Analysis of user-generated online content, especially in the context of news media reports, may be a quicker and cheaper alternative to more traditional methods, without compromising on the breadth of opinions captured.
LFSTAT - Low-Flow Analysis in R
NASA Astrophysics Data System (ADS)
Koffler, Daniel; Laaha, Gregor
2013-04-01
The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation have been well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software enabling a fast and standardized calculation of low-flow statistics has been missing. We present the new software package lfstat to fill this obvious gap. Our software package is based on the statistical open-source software R and extends it to analyse daily stream-flow data records with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R-Commander, an easy-to-use graphical user interface (GUI) for R based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment-based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow object). These objects are ordinary R data frames containing date, flow, hydrological year, and possibly baseflow information. Once these objects are created, analyses can be performed by mouse-click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-flow Estimation and Prediction [1]. Future plans include a dynamic low-flow report in odt file format using odf-weave, which allows automatic updates if the data or analysis change. We hope to offer a tool that eases and structures the analysis of stream-flow data with a focus on low flows, and that makes analyses transparent and communicable. The package can also be used to teach students the first steps in low-flow hydrology. The software can be installed from CRAN (latest stable version) and R-Forge: http://r-forge.r-project.org (development version). References: [1] Gustard, Alan; Demuth, Siegfried (eds.) Manual on Low-flow Estimation and Prediction. Geneva, Switzerland, World Meteorological Organization (Operational Hydrology Report No. 50, WMO-No. 1029).
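For readers unfamiliar with low-flow indices, here is a hedged sketch of two classic ones of the kind lfstat computes, written in Python rather than R purely for illustration (this is not lfstat code): Q95 from the flow-duration curve and the mean annual 7-day minimum (MAM7). The daily series is synthetic.

```python
# Minimal low-flow index sketch: Q95 and MAM7 on a synthetic daily series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
days = pd.date_range("2000-01-01", "2009-12-31", freq="D")
seasonal = 5 + 3 * np.sin(2 * np.pi * days.dayofyear / 365.25)
flow = pd.Series(np.maximum(seasonal + rng.normal(0, 1, len(days)), 0.1),
                 index=days)

q95 = flow.quantile(0.05)              # flow exceeded 95% of the time
mam7 = (flow.rolling(7).mean()         # 7-day moving average ...
            .groupby(flow.index.year).min()   # ... annual minima ...
            .mean())                   # ... averaged over the years
print(f"Q95 = {q95:.2f}, MAM7 = {mam7:.2f}")
```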
Analysis of Nonstationary Time Series for Biological Rhythms Research.
Leise, Tanya L
2017-06-01
This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
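As a hedged illustration of the wavelet methods mentioned above, the following numpy-only Python sketch computes a minimal Morlet wavelet power map for a simulated record whose period drifts from 24 h to 26 h; the simulated data and all parameters are assumptions, not the article's examples.

```python
# Minimal Morlet CWT sketch for a nonstationary rhythm (numpy only).
import numpy as np

dt = 0.5                                          # sampling step, hours
t = np.arange(0, 24 * 20, dt)                     # twenty days of data
period = np.linspace(24, 26, t.size)              # slowly lengthening period
phase = 2 * np.pi * np.cumsum(dt / period)
rng = np.random.default_rng(6)
x = np.cos(phase) + 0.3 * rng.normal(size=t.size)

def morlet_power(x, dt, periods, w0=6.0):
    """|CWT|^2 at each trial period, via convolution with Morlet wavelets."""
    power = np.empty((len(periods), len(x)))
    for i, p in enumerate(periods):
        scale = p * (w0 + np.sqrt(2 + w0 ** 2)) / (4 * np.pi)  # period->scale
        u = np.arange(-4 * scale, 4 * scale, dt) / scale
        wavelet = np.pi ** -0.25 * np.exp(1j * w0 * u - u ** 2 / 2)
        conv = np.convolve(x, np.conj(wavelet)[::-1], mode="same") * dt
        power[i] = np.abs(conv / np.sqrt(scale)) ** 2
    return power

trial_periods = np.arange(20, 30, 0.25)
pw = morlet_power(x, dt, trial_periods)
best = trial_periods[pw.argmax(axis=0)]           # dominant period vs. time
i1, i2 = np.searchsorted(t, [120, 360])           # two interior time points
print(f"period ~ {best[i1]:.2f} h at day 5, ~ {best[i2]:.2f} h at day 15")
```

The point of the exercise is exactly the one the article makes: a single Fourier spectrum would blur the drifting period, whereas the wavelet power map tracks it over time.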
One-year test-retest reliability of intrinsic connectivity network fMRI in older adults
Guo, Cong C.; Kurth, Florian; Zhou, Juan; Mayer, Emeran A.; Eickhoff, Simon B; Kramer, Joel H.; Seeley, William W.
2014-01-01
“Resting-state” or task-free fMRI can assess intrinsic connectivity network (ICN) integrity in health and disease, suggesting a potential for use of these methods as disease-monitoring biomarkers. Numerous analytical options are available, including model-driven ROI-based correlation analysis and model-free, independent component analysis (ICA). High test-retest reliability will be a necessary feature of a successful ICN biomarker, yet available reliability data remains limited. Here, we examined ICN fMRI test-retest reliability in 24 healthy older subjects scanned roughly one year apart. We focused on the salience network, a disease-relevant ICN not previously subjected to reliability analysis. Most ICN analytical methods proved reliable (intraclass coefficients > 0.4) and could be further improved by wavelet analysis. Seed-based ROI correlation analysis showed high map-wise reliability, whereas graph theoretical measures and temporal concatenation group ICA produced the most reliable individual unit-wise outcomes. Including global signal regression in ROI-based correlation analyses reduced reliability. Our study provides a direct comparison between the most commonly used ICN fMRI methods and potential guidelines for measuring intrinsic connectivity in aging control and patient populations over time. PMID:22446491
AACSD: An atomistic analyzer for crystal structure and defects
NASA Astrophysics Data System (ADS)
Liu, Z. R.; Zhang, R. F.
2018-01-01
We have developed an efficient command-line program named AACSD (Atomistic Analyzer for Crystal Structure and Defects) for the post-analysis of atomic configurations generated by various atomistic simulation codes. The program has implemented not only the traditional filter methods like the excess potential energy (EPE), the centrosymmetry parameter (CSP), the common neighbor analysis (CNA), the common neighborhood parameter (CNP), the bond angle analysis (BAA), and the neighbor distance analysis (NDA), but also newly developed ones including the modified centrosymmetry parameter (m-CSP), the orientation imaging map (OIM) and the local crystallographic orientation (LCO). The newly proposed OIM and LCO methods have been extended to all three crystal structures: face centered cubic, body centered cubic and hexagonal close packed. More specifically, AACSD can be easily used for the atomistic analysis of metallic nanocomposites with each phase analyzed independently, which provides a unique pathway to capture the dynamic evolution of various defects on the fly. In this paper, we provide not only a thorough overview of the various theoretical methods and their implementation in the AACSD program, but also critical evaluations, specific tests, and applications demonstrating the capability of the program for each functionality.
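To make one of these filters concrete, here is a hedged Python sketch of the centrosymmetry parameter in its standard Kelchner et al. form (not AACSD's exact implementation): pair each neighbor vector with its most nearly opposite partner and sum |r_i + r_j|²; the value is near zero for a perfect centrosymmetric (e.g., fcc) environment and grows near defects.

```python
# Minimal centrosymmetry parameter (CSP) sketch with greedy pairing.
import numpy as np
from itertools import combinations

def csp(neighbors):
    """neighbors: (N, 3) array of vectors from the atom to its N neighbors."""
    pairs = sorted(combinations(range(len(neighbors)), 2),
                   key=lambda ij: np.sum((neighbors[ij[0]]
                                          + neighbors[ij[1]]) ** 2))
    used, total = set(), 0.0
    for i, j in pairs:                  # greedy pairing, best matches first
        if i not in used and j not in used:
            used.update((i, j))
            total += np.sum((neighbors[i] + neighbors[j]) ** 2)
    return total

# Perfect fcc 12-neighbor shell (lattice constant a = 1): CSP should be ~0.
shell = np.array([[i, j, 0] for i in (-.5, .5) for j in (-.5, .5)] +
                 [[i, 0, k] for i in (-.5, .5) for k in (-.5, .5)] +
                 [[0, j, k] for j in (-.5, .5) for k in (-.5, .5)])
print(f"fcc CSP       = {csp(shell):.3f}")        # ~0.0
print(f"perturbed CSP = {csp(shell + 0.1):.3f}")  # uniform shift -> > 0
```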
Analysis and synthesis of (SAR) waveguide phased array antennas
NASA Astrophysics Data System (ADS)
Visser, H. J.
1994-02-01
This report describes work performed under ESA contract No. 101 34/93/NL/PB. The work starts with a literature study of dual-polarized waveguide radiators, resulting in the choice of the open-ended square waveguide. After a thorough description of the mode-matching infinite waveguide array analysis method, including finiteness effects, that forms the basis for all further described analysis and synthesis methods, the accuracy of the analysis software is validated by comparison with measurements on two realized antennas. These antennas have centered irises in the waveguide apertures and a dielectric wide-angle impedance matching sheet in front of the antenna. A synthesis method using simulated annealing and the downhill simplex algorithm is described next, and different antenna designs, based on the analysis of a single element in an infinite array environment, are presented. Next, designs of subarrays are presented, showing the paramount importance of including the array environment in the design of a subarray. A microstrip patch waveguide exciter and a subarray feeding network are discussed, and the depth of the waveguide radiator is estimated. A rectangular grid array with waveguides of 2.5 cm depth, without irises and without a dielectric sheet, grouped in linear 8-element subarrays, is chosen.
Patterns of Contraceptive Adoption, Continuation, and Switching after Delivery among Malawian Women
Rosenberg, Nora E.; Stuart, Gretchen S.; Miller, William C.; Hosseinipour, Mina C.; Bonongwe, Phylos; Mwale, Mwawi; Tang, Jennifer H.
2017-01-01
Women who report use of postpartum family planning may not continue their initial method or use it consistently. Understanding the patterns of method uptake, discontinuation, and switching among women after delivery is important to promote uptake and continuation of effective methods of contraception. This is a secondary analysis of 634 Malawian women enrolled into a prospective cohort study after delivery. They completed baseline surveys upon enrollment and follow-up telephone surveys 3, 6, and 12 months post-delivery. Women were included in this analysis if they had completed at least the 3- and 6-month post-delivery surveys. Descriptive statistics were used to assess contraceptive method mix and patterns of switching, whereas Pearson’s χ2 tests were used for bivariable analyses to compare characteristics of women who continued and discontinued their initial post-delivery contraceptive method. Among the 479 women included in this analysis, the use of abstinence/traditional methods decreased and the use of long-acting and permanent methods (LAPM) increased over time. Almost half (47%) discontinued the contraceptive method reported at 3-months post-delivery; women using injectables or LAPM at 3-months post-delivery were significantly more likely to continue their method than those using non-modern methods (p<0.001). Of the 216 women who switched methods, 82% switched to a more or equally effective method. The change in contraceptive method mix and high rate of contraceptive switching in the first 12 months postpartum highlights a need to assist women in accessing effective contraceptives soon after delivery. PMID:28107404
NASA Technical Reports Server (NTRS)
Bamber, M J
1935-01-01
General methods of theoretical analysis of airplane spinning characteristics have been available for some time. Some of these methods of analysis might be used by designers to predict the spinning characteristics of proposed airplane designs if the necessary aerodynamic data were known. The present investigation, to determine the spinning characteristics of wings, is planned to include variations in airfoil sections, plan forms, and tip shapes of monoplane wings and variations in stagger, gap, and decalage for biplane cellules. The first series of tests, made on a rectangular Clark Y monoplane wing, are reported in reference 1. That report also gives an analysis of the data for predicting the probable effects of various important parameters on the spin for normal airplanes using such a wing. The present report is the second of the series. It gives the aerodynamic characteristics of a rectangular Clark Y biplane cellule in spinning attitudes and includes a discussion of the data, using the method of analysis given in reference 1.
Tools for T-RFLP data analysis using Excel.
Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie
2014-11-08
Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1989-01-01
Chapter A1 of the laboratory manual contains methods used by the U.S. Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, the total recoverable and total of constituents in water-suspended sediment samples, and the recoverable and total concentrations of constituents in samples of bottom material. The introduction to the manual includes essential definitions and a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including the accuracy and precision of analyses, the use of standard-reference water samples, and the operation of an effective quality-assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods of these techniques are arranged alphabetically by constituent. For each method, the general topics covered are the application, the principle of the method, the interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 126 methods are given for the determination of 70 inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1985-01-01
Chapter A1 of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method given, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods
Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.
2017-01-01
The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
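The doubly-stochastic structure described above can be written as follows; this is the standard LGCP notation, assumed rather than copied from the paper.

```latex
\begin{aligned}
Z(\cdot) &\sim \mathcal{GP}\bigl(\mu(\cdot),\, k(\cdot,\cdot)\bigr)
  && \text{(latent Gaussian process)} \\
\lambda(s) &= \exp\{Z(s)\}
  && \text{(random intensity)} \\
N(B) \mid \lambda &\sim \mathrm{Poisson}\Bigl(\textstyle\int_{B} \lambda(s)\,\mathrm{d}s\Bigr)
  && \text{(Poisson counts given the intensity)}
\end{aligned}
```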
Musci, Marilena; Yao, Shicong
2017-12-01
Pu-erh tea is a post-fermented tea that has recently gained popularity worldwide, due to potential health benefits related to the antioxidant activity resulting from its high polyphenolic content. The Folin-Ciocalteu method is a simple, rapid, and inexpensive assay widely applied for the determination of total polyphenol content. Over the past years, it has been subjected to many modifications, often without any systematic optimization or validation. In our study, we sought to optimize the Folin-Ciocalteu method, evaluate quality parameters including linearity, precision and stability, and then apply the optimized model to determine the total polyphenol content of 57 Chinese teas, including green tea and aged and ripened Pu-erh tea. Our optimized Folin-Ciocalteu method reduced analysis time and allowed us to analyze a large number of samples, to discriminate among the different teas, and to assess the effect of the post-fermentation process on polyphenol content.
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
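The central objects here are minimal cut sets of link failures. As a hedged illustration only, the following Python sketch enumerates them by brute force over link subsets for a toy graph; this is feasible only for tiny networks and is emphatically not the patented search algorithm, which works directly from the connectivity diagram.

```python
# Brute-force minimal cut sets (all-terminal) for a toy undirected network.
from itertools import combinations

def connected(nodes, links):
    """DFS connectivity check over the surviving links."""
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(v for u, v in links if u == n and v not in seen)
        stack.extend(u for u, v in links if v == n and u not in seen)
    return seen == set(nodes)

def minimal_cut_sets(nodes, links):
    cuts = []
    for r in range(1, len(links) + 1):        # smallest cuts found first
        for subset in combinations(links, r):
            removed = set(subset)
            if any(c <= removed for c in cuts):
                continue                      # contains a smaller cut
            if not connected(nodes, [l for l in links if l not in removed]):
                cuts.append(removed)
    return cuts

nodes = {"A", "B", "C", "D"}
links = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("A", "C")]
for cut in minimal_cut_sets(nodes, links):
    print(sorted(cut))
```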
Du, Yongzhao; Fu, Yuqing; Zheng, Lixin
2016-12-20
A real-time complex amplitude reconstruction method for determining the dynamic beam quality M2 factor, based on a Mach-Zehnder self-referencing interferometer wavefront sensor, is developed. By using the proposed complex amplitude reconstruction method, a full characterization of the laser beam, including amplitude (intensity profile) and phase information, can be reconstructed from a single interference pattern with the Fourier fringe pattern analysis method in a one-shot measurement. With the reconstructed complex amplitude, the beam field at any position z along the propagation direction can be obtained by applying diffraction integral theory, and the beam quality M2 factor of the dynamic beam is then calculated according to the method specified in the ISO 11146 standard. The feasibility of the proposed method is demonstrated through theoretical analysis and experiments covering both static and dynamic beam processes. The method is simple, fast, operates without movable parts, and allows investigation of laser beams under conditions inaccessible to existing methods.
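For reference, the ISO 11146 second-moment route to M2 rests on the following standard formulas (assumed here, not quoted from the paper):

```latex
W^2(z) = W_0^2 + \left(M^2\,\frac{\lambda}{\pi W_0}\right)^{2}(z - z_0)^2,
\qquad
W^2(z) = 4\,\sigma_x^2(z),
\qquad
\sigma_x^2(z) = \frac{\iint (x-\bar{x})^2\, I(x,y,z)\,\mathrm{d}x\,\mathrm{d}y}
                     {\iint I(x,y,z)\,\mathrm{d}x\,\mathrm{d}y},
```

and fitting the measured W^2(z) hyperbola yields W_0, z_0, and M^2.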
Barteneva, Natasha S; Vorobjev, Ivan A
2018-01-01
In this paper, we review some of the recent advances in cellular heterogeneity and single-cell analysis methods. In modern research on cellular heterogeneity, there are four major approaches: analysis of pooled samples, single-cell analysis, high-throughput single-cell analysis, and, lately, integrated analysis of cellular populations at a single-cell level. Recently developed high-throughput single-cell genetic analysis methods such as RNA-Seq require a purification step and destruction of the analyzed cell, often providing only a snapshot of the investigated cell without spatiotemporal context. Correlative analysis of multiparameter morphological, functional, and molecular information is important for differentiation of more uniform groups within the spectrum of different cell types. Simplified distributions (histograms and 2D plots) can underrepresent biologically significant subpopulations. Future directions may include the development of nondestructive methods for dissecting molecular events in intact cells, simultaneous correlative cellular analysis of phenotypic and molecular features by hybrid technologies such as imaging flow cytometry, and further progress in supervised and unsupervised statistical analysis algorithms.
Criteria for Developing a Successful Privatization Project
1989-05-01
conceptualization and planning are required when pursuing privatization projects. In fact, privatization project proponents need to know how to...selection of projects for analysis, methods of acquiring information about these projects, and the analysis framework. Chapter IV includes the analysis. A...performed an analysis to determine common conceptual and creative approaches and lessons learned. This analysis was then used to develop criteria for
1992-06-01
system capabilities \\Jch as memory management and network communications are provided by a virtual machine-type operating environment. Various human ...thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical...the form of identifying: the data entity itself; its aliases (including how the data is presented th programs or human users in the form of copy
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
Structural weights analysis of advanced aerospace vehicles using finite element analysis
NASA Technical Reports Server (NTRS)
Bush, Lance B.; Lentz, Christopher A.; Rehder, John J.; Naftel, J. Chris; Cerro, Jeffrey A.
1989-01-01
A conceptual/preliminary level structural design system has been developed for structural integrity analysis and weight estimation of advanced space transportation vehicles. The system includes a three-dimensional interactive geometry modeler, a finite element pre- and post-processor, a finite element analyzer, and a structural sizing program. Inputs to the system include the geometry, surface temperature, material constants, construction methods, and aerodynamic and inertial loads. The results are a sized vehicle structure capable of withstanding the static loads incurred during assembly, transportation, operations, and missions, and a corresponding structural weight. An analysis of the Space Shuttle external tank is included in this paper as a validation and benchmark case of the system.
Inverse transonic airfoil design methods including boundary layer and viscous interaction effects
NASA Technical Reports Server (NTRS)
Carlson, L. A.
1979-01-01
The development and incorporation into TRANDES of a fully conservative analysis method utilizing the artificial compressibility approach is described. The method allows for lifting cases and finite thickness airfoils and utilizes a stretched coordinate system. Wave drag and massive separation studies are also discussed.
SER-LARS, Volume 10. Instructional Methods I. 1975-76 Edition.
ERIC Educational Resources Information Center
Montgomery County Intermediate Unit 23, Blue Bell, PA.
The book briefly describes several hundred instructional methods from the Special Education Resources Location Analysis and Retrieval System (SER-LARS), which are intended for use in developing and carrying out individualized programs for handicapped children. Each teaching method includes an accession number; title; author; source; teacher tasks;…
SER-LARS, Volume 11. Instructional Methods II. 1975-76 Edition.
ERIC Educational Resources Information Center
Montgomery County Intermediate Unit 23, Blue Bell, PA.
The book briefly describes several hundred instructional methods from the Special Education Resources Location Analysis and Retrieval System (SER-LARS), which are intended for use in developing and carrying out individualized programs for handicapped children. Each teaching method includes an accession number; title; author; source; teacher tasks;…
SER-LARS, Volume 12. Instructional Methods III. 1975-76 Edition.
ERIC Educational Resources Information Center
Montgomery County Intermediate Unit 23, Blue Bell, PA.
The book briefly describes several hundred instructional methods from the Special Education Resources Location Analysis and Retrieval System (SER-LARS), which are intended for use in developing and carrying out individualized programs for handicapped children. Each teaching method includes an accession number; title; author; source; teacher tasks;…
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
Wang, Mei; Wang, Yan-Hong; Avula, Bharathi; Radwan, Mohamed M; Wanas, Amira S; Mehmedic, Zlatko; van Antwerp, John; ElSohly, Mahmoud A; Khan, Ikhlas A
2017-05-01
Ultra-high-performance supercritical fluid chromatography (UHPSFC) is an efficient analytical technique that has not been fully employed for the analysis of cannabis. Here, a novel method was developed for the analysis of 30 cannabis plant extracts and preparations using UHPSFC/PDA-MS. Nine of the most abundant cannabinoids, viz. CBD, ∆8-THC, THCV, ∆9-THC, CBN, CBG, THCA-A, CBDA, and CBGA, were quantitatively determined (RSDs < 6.9%). Unlike GC methods, no derivatization or decarboxylation was required prior to UHPSFC analysis. The UHPSFC chromatographic separation of cannabinoids displayed an inverse elution order compared with UHPLC. Combined with PDA-MS detection, this orthogonality is valuable for discrimination of cannabinoids in complex matrices. The developed method was validated, and the quantification results were compared with a standard UHPLC method. The RSDs of these two methods were within ±13.0%. Finally, chemometric analyses including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) were used to differentiate between cannabis samples. © 2016 American Academy of Forensic Sciences.
A dimension-wise analysis method for the structural-acoustic system with interval parameters
NASA Astrophysics Data System (ADS)
Xu, Menghui; Du, Jianke; Wang, Chong; Li, Yunlong
2017-04-01
The interval structural-acoustic analysis is mainly accomplished by interval and subinterval perturbation methods. Potential limitations of these intrusive methods include overestimation or the interval translation effect for the former and prohibitive computational cost for the latter. In this paper, a dimension-wise analysis method is thus proposed to overcome these potential limitations. In this method, a sectional curve of the system response surface along each input dimension is first extracted, and its minimal and maximal points are identified based on its Legendre polynomial approximation. Two input vectors, i.e., the minimal and maximal input vectors, are then assembled dimension by dimension from the minimal and maximal points of all sectional curves. Finally, the lower and upper bounds of the system response are computed by deterministic finite element analysis at these two input vectors. Two numerical examples are studied to demonstrate the effectiveness of the proposed method and show that, compared with the interval and subinterval perturbation methods, the proposed method achieves better accuracy without much compromise in efficiency, especially for nonlinear problems with large interval parameters.
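The per-dimension step can be illustrated with a hedged Python sketch: fit a Legendre polynomial to a one-dimensional sectional curve, then locate its extrema on the interval. The toy response function and the interval bounds are assumptions for illustration, not from the paper, and a real application would evaluate the sectional curve with the finite element model.

```python
# Minimal dimension-wise step: Legendre fit of a sectional curve, then
# identify its minimal and maximal points on the interval.
import numpy as np
from numpy.polynomial import legendre as L

def section_extrema(f, lo, hi, deg=6):
    """Min/max points of f on [lo, hi] via a Legendre polynomial fit."""
    x = np.linspace(lo, hi, 4 * deg)             # sample the sectional curve
    series = L.Legendre.fit(x, f(x), deg)
    crit = series.deriv().roots()
    crit = crit[np.isreal(crit)].real            # keep real roots ...
    crit = crit[(crit >= lo) & (crit <= hi)]     # ... inside the interval
    cand = np.concatenate(([lo, hi], crit))      # include the endpoints
    vals = f(cand)
    return cand[vals.argmin()], cand[vals.argmax()]

# Toy sectional curve over an interval parameter x1 in [0.2, 1.2]:
f = lambda x1: np.sin(3 * x1) + 0.5 * x1 ** 2
x_min, x_max = section_extrema(f, 0.2, 1.2)
print(f"minimal point: {x_min:.4f}, maximal point: {x_max:.4f}")
```

Repeating this for every input dimension yields the assembled minimal and maximal input vectors at which the two bounding finite element analyses are run.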
ERIC Educational Resources Information Center
Parry-Jones, R.
1980-01-01
Described are some new uses and procedures of titration. Topics included are titration in non-aqueous solvents, thermometric titration and catalytic methods for end-point detection, the titration finish in organic elemental analysis, and sub-micro analysis and automatic titration procedures. (CS)
Results of the first provisional technical secretariat interlaboratory comparison test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuff, J.R.; Hoffland, L.
1995-06-01
The principal task of this laboratory in the first Provisional Technical Secretariat (PTS) Interlaboratory Comparison Test was to verify and test the extraction and preparation procedures outlined in the Recommended Operating Procedures for Sampling and Analysis in the Verification of Chemical Disarmament, in addition to our own laboratory extraction and analysis methods. Sample preparation began on 16 May 1994 and analysis was completed on 12 June 1994. The analytical methods used included NMR (¹H and ³¹P), GC/AED, GC/MS (EI and methane CI), GC/IRD, HPLC/IC, HPLC/TSP/MS, MS/MS (electrospray), and CZE.
2012-01-01
Background Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages in the policy making process, including reviews of qualitative research. The inclusion of mixed methods studies in systematic reviews is increasing, but these studies pose particular challenges to methods of review. This article examines the quality of the reporting of mixed methods and qualitative-only studies. Methods We used two completed systematic reviews to generate a sample of qualitative studies and mixed method studies in order to make an assessment of how the quality of reporting and rigor of qualitative-only studies compares with that of mixed-methods studies. Results Overall, the reporting of qualitative studies in our sample was consistently better when compared with the reporting of mixed methods studies. We found that mixed methods studies are less likely to provide a description of the research conduct or qualitative data analysis procedures and less likely to be judged credible or provide rich data and thick description compared with standalone qualitative studies. Our time-related analysis shows that for both types of study, papers published since 2003 are more likely to report on the study context, describe analysis procedures, and be judged credible and provide rich data. However, the reporting of other aspects of research conduct (i.e. descriptions of the research question, the sampling strategy, and data collection methods) in mixed methods studies does not appear to have improved over time. Conclusions Mixed methods research makes an important contribution to health research in general, and could make a more substantial contribution to systematic reviews. Through our careful analysis of the quality of reporting of mixed methods and qualitative-only research, we have identified areas that deserve more attention in the conduct and reporting of mixed methods research. PMID:22545681
Classical methods and modern analysis for studying fungal diversity
John Paul Schmit
2005-01-01
In this chapter, we examine the use of classical methods to study fungal diversity. Classical methods rely on the direct observation of fungi, rather than sampling fungal DNA. We summarize a wide variety of classical methods, including direct sampling of fungal fruiting bodies, incubation of substrata in moist chambers, culturing of endophytes, and particle plating. We...
Evaluation of the finite element fuel rod analysis code (FRANCO)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, K.; Feltus, M.A.
1994-12-31
Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing temperature distribution and mechanical deformation of a single light water reactor fuel rod.
Sensitivity analysis of discrete structural systems: A survey
NASA Technical Reports Server (NTRS)
Adelman, H. M.; Haftka, R. T.
1984-01-01
Methods for calculating sensitivity derivatives for discrete structural systems are surveyed, primarily covering literature published during the past two decades. Methods are described for calculating derivatives of static displacements and stresses, eigenvalues and eigenvectors, transient structural response, and derivatives of optimum structural designs with respect to problem parameters. The survey is focused on publications addressed to structural analysis, but also includes a number of methods developed in nonstructural fields such as electronics, controls, and physical chemistry which are directly applicable to structural problems. Most notable among the nonstructural-based methods are the adjoint variable technique from control theory, and the Green's function and FAST methods from physical chemistry.
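As a concrete instance of the surveyed derivative calculations, here is a short sketch of the direct differentiation method for static displacements: from K(p)u = f, differentiation gives K du/dp = df/dp − (dK/dp)u. The 2-DOF spring system below is an invented illustration, not an example from the survey.

```python
import numpy as np

def displacement_sensitivity(K, dK_dp, f, df_dp):
    """Direct method: solve K u = f, then K du/dp = df/dp - (dK/dp) u."""
    u = np.linalg.solve(K, f)
    du_dp = np.linalg.solve(K, df_dp - dK_dp @ u)
    return u, du_dp

# toy 2-DOF spring system whose stiffness depends linearly on parameter p
p = 2.0
K = np.array([[2 + p, -p], [-p, p]])
dK_dp = np.array([[1.0, -1.0], [-1.0, 1.0]])
f = np.array([0.0, 1.0])
df_dp = np.zeros(2)
u, du_dp = displacement_sensitivity(K, dK_dp, f, df_dp)

# finite-difference check of the analytical sensitivity
h = 1e-6
K2 = np.array([[2 + p + h, -(p + h)], [-(p + h), p + h]])
fd = (np.linalg.solve(K2, f) - u) / h
print(du_dp, fd)  # the two should agree closely
```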
2012-01-01
Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event, following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias and use of individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery, among others. Many of the reviewed references included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of reviews (70/166), observational studies only in 33.7% (56/166), and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, were found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms. Conclusions Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
A general framework for the use of logistic regression models in meta-analysis.
Simmonds, Mark C; Higgins, Julian Pt
2016-12-01
Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
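A minimal sketch of the central idea above: maximizing the exact binomial likelihood of summary contingency-table data rather than assuming normally distributed effect estimates. For brevity this version uses fixed trial intercepts and a common log odds ratio; the paper's one-stage model adds random effects. The trial counts are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Trial-level 2x2 data: events r and sample sizes n per arm (invented).
r_ctrl = np.array([5, 12, 3]);  n_ctrl = np.array([100, 150, 60])
r_trt  = np.array([2,  7, 1]);  n_trt  = np.array([102, 148, 58])

def neg_log_lik(theta):
    """Binomial log-likelihood with one intercept per trial and a common
    log odds ratio beta (a fixed-effect simplification of the one-stage
    random-effects logistic regression discussed above)."""
    *alphas, beta = theta
    alphas = np.array(alphas)
    p0 = expit(alphas)          # control-arm event probabilities
    p1 = expit(alphas + beta)   # treatment-arm event probabilities
    ll = (r_ctrl * np.log(p0) + (n_ctrl - r_ctrl) * np.log1p(-p0)
          + r_trt * np.log(p1) + (n_trt - r_trt) * np.log1p(-p1))
    return -ll.sum()

res = minimize(neg_log_lik, x0=np.zeros(4), method="BFGS")
print("pooled log odds ratio:", res.x[-1])
```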
Building a Practical Natural Laminar Flow Design Capability
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Lynde, Michelle N.
2017-01-01
A preliminary natural laminar flow (NLF) design method that has been developed and applied to supersonic and transonic wings with moderate-to-high leading-edge sweeps at flight Reynolds numbers is further extended and evaluated in this paper. The modular design approach uses a knowledge-based design module linked with different flow solvers and boundary layer stability analysis methods to provide a multifidelity capability for NLF analysis and design. An assessment of the effects of different options for stability analysis is included using pressures and geometry from an NLF wing designed for the Common Research Model (CRM). Several extensions to the design module are described, including multiple new approaches to design for controlling attachment line contamination and transition. Finally, a modification to the NLF design algorithm that allows independent control of Tollmien-Schlichting (TS) and cross flow (CF) modes is proposed. A preliminary evaluation of the TS-only option applied to the design of an NLF nacelle for the CRM is performed that includes the use of a low-fidelity stability analysis directly in the design module.
Wang, Monan; Zhang, Kai; Yang, Ning
2018-04-09
To help doctors choose a treatment from the perspective of mechanical analysis, this work built a computer-assisted optimization system for the treatment of femoral neck fracture oriented to clinical application. The system encompassed three parts: a preprocessing module, a finite element mechanical analysis module, and a post-processing module. The preprocessing module included parametric modeling of bone, parametric modeling of the fracture face, parametric modeling of the fixation screws and their positions, and input and transmission of model parameters. The finite element mechanical analysis module included grid division, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting and batch processing operation. The post-processing module included extraction and display of batch processing results, image generation from batch processing, optimization program operation and optimal result display. The system implemented the whole workflow from input of fracture parameters to output of the optimal fixation plan according to a specific patient's real fracture parameters and optimization rules, which demonstrated the effectiveness of the system. The system also had a friendly interface and simple operation, and its functionality could be improved quickly by modifying single modules.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
Analysis of continuous-time switching networks
NASA Astrophysics Data System (ADS)
Edwards, R.
2000-11-01
Models of a number of biological systems, including gene regulation and neural networks, can be formulated as switching networks, in which the interactions between the variables depend strongly on thresholds. An idealized class of such networks, in which the switching takes the form of Heaviside step functions but variables still change continuously in time, has been proposed as a useful simplification to gain analytic insight. These networks, called here Glass networks after their originator, are simple enough mathematically to allow significant analysis without restricting the range of dynamics found in analogous smooth systems. A number of results have been obtained previously, particularly regarding existence and stability of periodic orbits in such networks, but important cases were not considered. Here we present a coherent method of analysis that summarizes previous work and fills in some of the gaps, as well as including some new results. Furthermore, we apply this analysis to a number of examples, including surprisingly long and complex limit cycles involving sequences of hundreds of threshold transitions. Finally, we show how the above methods can be extended to investigate aperiodic behaviour in specific networks, though a complete analysis will have to await new results in matrix theory and symbolic dynamics.
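A minimal simulation sketch of a Glass network, assuming the common formulation in which each variable decays toward a focal point determined by the Heaviside step pattern of the variables. The 2-variable weights below are chosen to produce a periodic orbit; they are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Glass network: each variable decays toward a focal point that depends only
# on the Boolean pattern of which variables exceed their thresholds.
W = np.array([[0.0, -2.0], [2.0, 0.0]])   # x1 activates x2, x2 inhibits x1
theta = np.zeros(2)                        # switching thresholds
bias = np.array([1.0, -1.0])

def rhs(t, x):
    s = (x > theta).astype(float)          # Heaviside step of each variable
    return -x + bias + W @ s               # focal point is piecewise constant

sol = solve_ivp(rhs, (0.0, 40.0), [0.5, -0.5], max_step=0.01)
# With these weights the two variables chase each other around the threshold
# lines, producing the kind of periodic orbit analysed in the paper.
print(sol.y[:, -5:])
```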
Methods for Mediation Analysis with Missing Data
ERIC Educational Resources Information Center
Zhang, Zhiyong; Wang, Lijuan
2013-01-01
Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
Examining, Documenting, and Modeling the Problem Space of a Variable Domain
2002-06-14
Domain engineering methods that informed the development of this proposed process include Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], and family-oriented approaches that capture configuration knowledge using generators [2]. FODA is discussed among existing methods of domain engineering.
Analysis of live cell images: Methods, tools and opportunities.
Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens
2017-02-15
Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.
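A small sketch of the preprocessing-plus-segmentation stage discussed in the review, using scikit-image: Gaussian smoothing, Otsu thresholding and connected-component labelling. The synthetic blobs stand in for fluorescently labelled cells.

```python
import numpy as np
from skimage import filters, measure

def segment_cells(image):
    """Minimal preprocessing + segmentation: smooth, threshold with Otsu's
    method, label connected components, then measure each region."""
    smooth = filters.gaussian(image, sigma=2)        # suppress noise
    mask = smooth > filters.threshold_otsu(smooth)   # global threshold
    labels = measure.label(mask)                     # one label per cell
    props = measure.regionprops(labels)
    return labels, [(p.label, p.area, p.centroid) for p in props]

# synthetic frame with two bright blobs standing in for cells
img = np.zeros((64, 64))
img[10:20, 10:20] = 1.0
img[40:52, 35:47] = 1.0
labels, cells = segment_cells(img)
print(cells)
```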
Analysis of Electrowetting Dynamics with Level Set Method
NASA Astrophysics Data System (ADS)
Park, Jun Kwon; Hong, Jiwoo; Kang, Kwan Hyoung
2009-11-01
Electrowetting is a versatile tool to handle tiny droplets and forms a backbone of digital microfluidics. Numerical analysis is necessary to fully understand the dynamics of electrowetting, especially in designing electrowetting-based liquid lenses and reflective displays. We developed a numerical method to analyze the general contact-line problems, incorporating dynamic contact angle models. The method was applied to the analysis of spreading process of a sessile droplet for step input voltages in electrowetting. The result was compared with experimental data and analytical result which is based on the spectral method. It is shown that contact line friction significantly affects the contact line motion and the oscillation amplitude. The pinning process of contact line was well represented by including the hysteresis effect in the contact angle models.
Oka, Megan; Whiting, Jason
2013-01-01
In Marriage and Family Therapy (MFT), as in many clinical disciplines, concern surfaces about the clinician/researcher gap. This gap includes a lack of accessible, practical research for clinicians. MFT clinical research often borrows from the medical tradition of randomized control trials, which typically use linear methods, or follow procedures distanced from "real-world" therapy. We review traditional research methods and their use in MFT and propose increased use of methods that are more systemic in nature and more applicable to MFTs: process research, dyadic data analysis, and sequential analysis. We will review current research employing these methods, as well as suggestions and directions for further research. © 2013 American Association for Marriage and Family Therapy.
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
APPLE - An aeroelastic analysis system for turbomachines and propfans
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Srivastava, R.; Mehmed, Oral
1992-01-01
This paper reviews aeroelastic analysis methods for propulsion elements (advanced propellers, compressors and turbines) being developed and used at NASA Lewis Research Center. These aeroelastic models include both structural and aerodynamic components. The structural models include the typical section model, the beam model with and without disk flexibility, and the finite element blade model with plate bending elements. The aerodynamic models are based on the solution of equations ranging from the two-dimensional linear potential equation for a cascade to the three-dimensional Euler equations for multi-blade configurations. Typical results are presented for each aeroelastic model. Suggestions for further research are indicated. All the available aeroelastic models and analysis methods are being incorporated into a unified computer program named APPLE (Aeroelasticity Program for Propulsion at LEwis).
Molinski, Tadeusz F.; Reynolds, Kirk A.; Morinaka, Brandon I.
2012-01-01
The absolute stereostructures of the components of symplocin A (3), a new N,N-dimethyl-terminated peptide from the Bahamian cyanobacterium Symploca sp., were assigned from spectroscopic analysis, including MS, 2D NMR and Marfey's analysis. The complete absolute configuration of symplocin A, including the unexpected D-configurations of the terminal N,N-dimethylisoleucine and valic acid residues, was assigned by chiral-phase HPLC of the corresponding 2-naphthacyl esters, a highly sensitive, complementary strategy for assignment of N-blocked peptide residues where Marfey's method is ineffectual or other methods fall short. Symplocin A exhibited potent activity as an inhibitor of cathepsin E (IC50 = 300 pM). PMID:22360587
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated, such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO
NASA Technical Reports Server (NTRS)
Olds, John R.
1994-01-01
This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.
An evaluation method for nanoscale wrinkle
NASA Astrophysics Data System (ADS)
Liu, Y. P.; Wang, C. G.; Zhang, L. M.; Tan, H. F.
2016-06-01
In this paper, a spectrum-based wrinkling analysis method via two-dimensional Fourier transformation is proposed to address the difficulty of nanoscale wrinkle evaluation. It evaluates wrinkle characteristics, including wrinkling wavelength and direction, simply using a single wrinkling image. The evaluation results for nanoscale wrinkle characteristics agree with published experimental results within an error of 6%. The method is also verified to be appropriate for macroscale wrinkle evaluation, without scale limitations. Spectrum-based wrinkling analysis is thus an effective method for nanoscale evaluation, which helps reveal the mechanism of nanoscale wrinkling.
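A sketch of the core spectral idea, assuming the straightforward reading of the method: locate the peak of the 2D Fourier amplitude spectrum of a wrinkling image and convert it to a wavelength and direction. The synthetic sinusoidal image is for illustration only.

```python
import numpy as np

def wrinkle_spectrum(image, pixel_size=1.0):
    """Estimate dominant wrinkle wavelength and direction from the peak of
    the 2D Fourier amplitude spectrum of a single wrinkling image."""
    F = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    amp = np.abs(F)
    ny, nx = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_size))
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_size))
    iy, ix = np.unravel_index(np.argmax(amp), amp.shape)
    f = np.hypot(fx[ix], fy[iy])                 # spatial frequency of peak
    wavelength = 1.0 / f
    direction = np.degrees(np.arctan2(fy[iy], fx[ix]))  # normal to the crests
    return wavelength, direction

# synthetic wrinkles: sinusoid of wavelength 8 units rotated by 30 degrees
x, y = np.meshgrid(np.arange(128), np.arange(128))
k = 2 * np.pi / 8.0
img = np.sin(k * (x * np.cos(np.radians(30)) + y * np.sin(np.radians(30))))
print(wrinkle_spectrum(img))   # approximately (8.0, 30.0), up to symmetry
```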
Methods for Human Dehydration Measurement
NASA Astrophysics Data System (ADS)
Trenz, Florian; Weigel, Robert; Hagelauer, Amelie
2018-03-01
The aim of this article is to give a broad overview of current methods for the identification and quantification of the human dehydration level. Starting from the most common clinical setups, including vital parameters and general patient appearance, more quantifiable results from chemical laboratory and electromagnetic measurement methods are reviewed. Different analysis methods throughout the electromagnetic spectrum, ranging from direct current (DC) conductivity measurements up to neutron activation analysis (NAA), are discussed on the basis of published results. Finally, promising technologies that allow for the integration of a dehydration assessment system in a compact and portable way are highlighted.
CFD Methods and Tools for Multi-Element Airfoil Analysis
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; George, Michael W. (Technical Monitor)
1995-01-01
This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary-layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.
Azeem, Hafiz Abdul; Martinsson, Johan; Stenström, Kristina Eriksson; Swietlicki, Erik; Sandahl, Margareta
2017-07-01
Air-starved combustion of biomass and fossil fuels releases aerosols, including airborne carbonaceous particles, causing negative climatic and health effects. Radiocarbon analysis of the elemental carbon (EC) fraction can help apportion sources of its emission, which is greatly constrained by the challenges in isolation of EC from organic compounds in atmospheric aerosols. The isolation of EC using thermo-optical analysis is however biased by the presence of interfering compounds that undergo pyrolysis during the analysis. EC is considered insoluble in all acidic, basic, and organic solvents. Based on this insolubility, a sample preparation method using supercritical CO₂ and methanol as co-solvent was developed to remove interfering organic compounds. The efficiency of the method was studied by varying the density of supercritical carbon dioxide by means of temperature and pressure and by varying the methanol content. Supercritical CO₂ with 10% methanol by volume at a temperature of 60 °C, a pressure of 350 bar and 20 min static-mode extraction were found to be the most suitable conditions for the removal of 59 ± 3% of organic carbon, including compounds responsible for pyrolysis, with 78 ± 16% EC recovery. The results indicate that the method has potential for the estimation and isolation of EC from OC for subsequent analysis methods and source apportionment studies.
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
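For orientation, here is a hedged sketch of summary-statistic approximations of the kind catalogued by this review: the classical range/4 rule for a missing SD, an IQR-based SD estimate assuming near-normality, and the quartile-based mean estimate. These are commonly cited forms, not necessarily the exact formulas the review evaluated.

```python
import numpy as np

def sd_from_range(minimum, maximum):
    """Rough SD when only the range is reported: the classical rule of
    thumb divides the range by 4 (finer variants adjust the divisor by n)."""
    return (maximum - minimum) / 4.0

def sd_from_iqr(q1, q3):
    """Approximate SD from the interquartile range assuming near-normality:
    the IQR of a normal distribution spans about 1.35 SDs."""
    return (q3 - q1) / 1.35

def mean_from_quartiles(q1, median, q3):
    """Approximate mean from the median and quartiles (reasonable when the
    underlying distribution is not too skewed)."""
    return (q1 + median + q3) / 3.0

# illustrative check against a simulated trial arm
x = np.random.default_rng(1).normal(10, 2, size=200)
q1, med, q3 = np.percentile(x, [25, 50, 75])
print(mean_from_quartiles(q1, med, q3), x.mean())
print(sd_from_iqr(q1, q3), x.std(ddof=1))
```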
Galaxy two-point covariance matrix estimation for next generation surveys
NASA Astrophysics Data System (ADS)
Howlett, Cullan; Percival, Will J.
2017-12-01
We perform a detailed analysis of the covariance matrix of the spherically averaged galaxy power spectrum and present a new, practical method for estimating this within an arbitrary survey without the need for running mock galaxy simulations that cover the full survey volume. The method uses theoretical arguments to modify the covariance matrix measured from a set of small-volume cubic galaxy simulations, which are computationally cheap to produce compared to larger simulations and match the measured small-scale galaxy clustering more accurately than is possible using theoretical modelling. We include prescriptions to analytically account for the window function of the survey, which convolves the measured covariance matrix in a non-trivial way. We also present a new method to include the effects of super-sample covariance and modes outside the small simulation volume which requires no additional simulations and still allows us to scale the covariance matrix. As validation, we compare the covariance matrix estimated using our new method to that from a brute-force calculation using 500 simulations originally created for analysis of the Sloan Digital Sky Survey Main Galaxy Sample. We find excellent agreement on all scales of interest for large-scale structure analysis, including those dominated by the effects of the survey window, and on scales where theoretical models of the clustering normally break down, but the new method produces a covariance matrix with significantly better signal-to-noise ratio. Although only formally correct in real space, we also discuss how our method can be extended to incorporate the effects of redshift space distortions.
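For context, the brute-force baseline that the new method is validated against can be sketched in a few lines: the sample covariance of the power spectrum across mocks, with the well-known Hartlap factor applied when inverting a noisy covariance estimate. This is the standard estimator, not the paper's new scaling method.

```python
import numpy as np

def covariance_from_mocks(pk_mocks):
    """Brute-force covariance of the power spectrum from mock measurements;
    pk_mocks has shape (n_mocks, n_bins)."""
    return np.cov(pk_mocks, rowvar=False)

def precision_matrix(cov, n_mocks):
    """Inverse covariance with the Hartlap correction factor
    (n_mocks - n_bins - 2) / (n_mocks - 1), assuming Gaussian-distributed
    measurements; without it the inverse of a noisy covariance is biased."""
    n_bins = cov.shape[0]
    hartlap = (n_mocks - n_bins - 2) / (n_mocks - 1)
    return hartlap * np.linalg.inv(cov)

# toy example: 500 mocks of a 20-bin power spectrum
rng = np.random.default_rng(0)
mocks = rng.normal(size=(500, 20)) @ np.diag(np.linspace(1.0, 0.2, 20))
cov = covariance_from_mocks(mocks)
prec = precision_matrix(cov, n_mocks=500)
print(prec.shape)
```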
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
A critical methodological review of discourse and conversation analysis studies of family therapy.
Tseliou, Eleftheria
2013-12-01
Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment. © FPI, Inc.
Ertefaie, Ashkan; Flory, James H; Hennessy, Sean; Small, Dylan S
2017-06-15
Instrumental variable (IV) methods provide unbiased treatment effect estimation in the presence of unmeasured confounders under certain assumptions. To provide valid estimates of treatment effect, treatment effect confounders that are associated with the IV (IV-confounders) must be included in the analysis, and excluding observations with missing values may lead to bias. Missing covariate data are particularly problematic when the probability that a value is missing is related to the value itself, which is known as nonignorable missingness. In such cases, imputation-based methods are biased. Using health-care provider preference as an IV, we propose a 2-step procedure to estimate a valid treatment effect in the presence of baseline variables with nonignorable missing values. First, the provider preference IV value is estimated by performing a complete-case analysis using a random-effects model that includes IV-confounders. Second, the treatment effect is estimated using a 2-stage least squares IV approach that excludes IV-confounders with missing values. Simulation results are presented, and the method is applied to an analysis comparing the effects of sulfonylureas versus metformin on body mass index, where the variables baseline body mass index and glycosylated hemoglobin have missing values. Our result supports the association of sulfonylureas with weight gain. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
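A bare-bones illustration of the second step, two-stage least squares, written out with ordinary least-squares algebra. The simulated provider-preference instrument and unmeasured confounder are invented for the demonstration; the first-step random-effects estimation of the IV is not shown.

```python
import numpy as np

def two_stage_least_squares(y, treatment, instrument, covariates):
    """Plain 2SLS: regress treatment on the instrument (plus covariates),
    then regress the outcome on the fitted treatment values."""
    n = len(y)
    ones = np.ones((n, 1))
    Z = np.column_stack([ones, instrument, covariates])   # first-stage design
    t_hat = Z @ np.linalg.lstsq(Z, treatment, rcond=None)[0]
    X = np.column_stack([ones, t_hat, covariates])        # second stage
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return beta[1]   # coefficient on treatment

# simulated data with an unmeasured confounder u
rng = np.random.default_rng(2)
n = 5000
u = rng.normal(size=n)                 # unmeasured confounder
z = rng.normal(size=n)                 # instrument (e.g. provider preference)
x = rng.normal(size=n)                 # measured covariate
t = (z + u + rng.normal(size=n) > 0).astype(float)
y = 1.5 * t + u + 0.5 * x + rng.normal(size=n)
print(two_stage_least_squares(y, t, z, x[:, None]))  # near the true effect 1.5
```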
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and the current challenges to generate fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
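A one-function sketch of the thermodynamic constraint described above: the Gibbs energy of a reaction under physiological metabolite concentrations, ΔG' = ΔG°' + RT ln Q. Values are illustrative; a real analysis would also propagate errors in ΔG°', as the abstract emphasizes.

```python
import numpy as np

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

def gibbs_energy(delta_g0, product_conc, substrate_conc, T=310.15):
    """Gibbs energy of a reaction at given (molar) metabolite
    concentrations: dG' = dG0' + R T ln(Q), with Q the mass-action ratio.
    A negative dG' means the forward direction is thermodynamically allowed."""
    Q = np.prod(product_conc) / np.prod(substrate_conc)
    return delta_g0 + R * T * np.log(Q)

# toy example: a reaction with dG0' = +5 kJ/mol becomes feasible (dG' < 0)
# when the substrate is kept far above the product.
print(gibbs_energy(5.0, [1e-6], [1e-3]))
```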
Vidyasagar, Mathukumalli
2015-01-01
This article reviews several techniques from machine learning that can be used to study the problem of identifying a small number of features, from among tens of thousands of measured features, that can accurately predict a drug response. Prediction problems are divided into two categories: sparse classification and sparse regression. In classification, the clinical parameter to be predicted is binary, whereas in regression, the parameter is a real number. Well-known methods for both classes of problems are briefly discussed. These include the SVM (support vector machine) for classification and various algorithms such as ridge regression, LASSO (least absolute shrinkage and selection operator), and EN (elastic net) for regression. In addition, several well-established methods that do not directly fall into machine learning theory are also reviewed, including neural networks, PAM (pattern analysis for microarrays), SAM (significance analysis for microarrays), GSEA (gene set enrichment analysis), and k-means clustering. Several references indicative of the application of these methods to cancer biology are discussed.
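A minimal sparse-regression example of the kind reviewed above, using scikit-learn's LASSO to pick a handful of predictive features out of thousands, on simulated data.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sparse regression: thousands of measured features, only a handful truly
# predictive of the (real-valued) drug response.
rng = np.random.default_rng(0)
n_samples, n_features = 120, 5000
X = rng.normal(size=(n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:4] = [3.0, -2.0, 1.5, 1.0]        # only 4 informative features
y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)

model = Lasso(alpha=0.2).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("features selected:", selected)         # ideally indices 0..3
```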
Parasites as valuable stock markers for fisheries in Australasia, East Asia and the Pacific Islands.
Lester, R J G; Moore, B R
2015-01-01
Over 30 studies in Australasia, East Asia and the Pacific Islands region have collected and analysed parasite data to determine the ranges of individual fish, many leading to conclusions about stock delineation. Parasites used as biological tags have included both those known to have long residence times in the fish and those thought to be relatively transient. In many cases the parasitological conclusions have been supported by other methods especially analysis of the chemical constituents of otoliths, and to a lesser extent, genetic data. In analysing parasite data, authors have applied multiple different statistical methodologies, including summary statistics, and univariate and multivariate approaches. Recently, a growing number of researchers have found non-parametric methods, such as analysis of similarities and cluster analysis, to be valuable. Future studies into the residence times, life cycles and geographical distributions of parasites together with more robust analytical methods will yield much important information to clarify stock structures in the area.
Content-based fused off-axis object illumination direct-to-digital holography
Price, Jeffery R.
2006-05-02
Systems and methods are described for content-based fused off-axis illumination direct-to-digital holography. A method includes calculating an illumination angle with respect to an optical axis defined by a focusing lens as a function of data representing a Fourier analyzed spatially heterodyne hologram; reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object the object beam incident upon the object at the illumination angle; focusing the reference beam and the object beam at a focal plane of a digital recorder to from the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording the content based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.
A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.
Tipton, Elizabeth; Shuster, Jonathan
2017-10-15
Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of the differences between the two measures is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
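For reference, the single-study Bland-Altman quantities that feed such a meta-analysis can be computed as below (bias, SD of the differences, and 95% limits of agreement). The pooling step with robust variance estimation proposed in the paper is not reproduced here.

```python
import numpy as np

def limits_of_agreement(measure_a, measure_b):
    """Single-study Bland-Altman summary: bias (mean difference), SD of the
    differences, and the 95% limits of agreement bias +/- 1.96 SD."""
    d = np.asarray(measure_a) - np.asarray(measure_b)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

# toy comparison of a new measure against a gold standard
rng = np.random.default_rng(3)
gold = rng.normal(100, 15, size=80)
new = gold + rng.normal(1.0, 4.0, size=80)   # bias ~1, SD of differences ~4
print(limits_of_agreement(new, gold))
```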
NASA Astrophysics Data System (ADS)
Byrne, A. R.; Benedik, L.
1999-01-01
Neutron activation analysis (NAA), being essentially an isotopic and not an elemental method of analysis, is capable of determining a number of important radionuclides of radioecological interest by transformation into another, more easily quantifiable radionuclide. The nuclear characteristics which favour this technique may be summarized in an advantage factor relative to radiometric analysis of the original radioanalyte. Well-known or lesser-known examples include ²³⁵U, ²³⁸U, ²³²Th, ²³⁰Th, ¹²⁹I, ⁹⁹Tc, ²³⁷Np and ²³¹Pa; a number of these are discussed and illustrated in the analysis of real samples of environmental and biological origin. In particular, determination of ²³¹Pa by RNAA was performed using both postirradiation and preseparation methods. Application of INAA to enable the use of ²³⁸U and ²³²Th as endogenous (internal) radiotracers in alpha spectrometric analyses of uranium and thorium radioisotopes in radioecological studies is described, also allowing independent data sets to be obtained for quality control.
Network analysis of mesoscale optical recordings to assess regional, functional connectivity.
Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H
2015-10-01
With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
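A compact sketch of the regional, functional network construction described above: correlate regional time series, threshold the correlation matrix into a graph, and read off network properties with networkx. The threshold and toy signals are illustrative choices.

```python
import numpy as np
import networkx as nx

def functional_network(signals, threshold=0.6):
    """Build a regional functional network: nodes are regions, edges connect
    region pairs whose time-series correlation exceeds the threshold."""
    corr = np.corrcoef(signals)               # regions x regions
    n = corr.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] > threshold:
                g.add_edge(i, j, weight=corr[i, j])
    return g

# toy data: 6 regions forming two correlated clusters
rng = np.random.default_rng(4)
base = rng.normal(size=(2, 500))
signals = np.vstack([base[0] + 0.3 * rng.normal(size=(3, 500)),
                     base[1] + 0.3 * rng.normal(size=(3, 500))])
g = functional_network(signals)
print(nx.degree_centrality(g), nx.average_clustering(g))
```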
ERIC Educational Resources Information Center
Ma, T. S.; Wang, C. Y.
1984-01-01
Presents a literature review on methods used to analyze organic elements. Topic areas include methods for: (1) analyzing carbon, hydrogen, and nitrogen; (2) analyzing oxygen, sulfur, and halogens; (3) analyzing other elements; (4) simultaneously determining several elements; and (5) determining trace elements. (JN)
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background analysis of sensor-based approaches, their limitations, and recent advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been supplemented by sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, including terminal voltage sensing, third-harmonic voltage integration, terminal current sensing, back-EMF integration and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the sliding-mode observer, extended Kalman filter, model reference adaptive system, adaptive observers (full-order and pseudo-reduced-order) and artificial neural networks.
Modeling Predictors of Duties Not Including Flying Status.
Tvaryanas, Anthony P; Griffith, Converse
2018-01-01
The purpose of this study was to reuse available datasets to conduct an analysis of potential predictors of U.S. Air Force aircrew nonavailability in terms of being in "duties not to include flying" (DNIF) status. This study was a retrospective cohort analysis of U.S. Air Force aircrew on active duty during the period from 2003-2012. Predictor variables included age, Air Force Specialty Code (AFSC), clinic location, diagnosis, gender, pay grade, and service component. The response variable was DNIF duration. Nonparametric methods were used for the exploratory analysis and parametric methods were used for model building and statistical inference. Out of a set of 783 potential predictor variables, 339 variables were identified from the nonparametric exploratory analysis for inclusion in the parametric analysis. Of these, 54 variables had significant associations with DNIF duration in the final model fitted to the validation data set. The predicted results of this model for DNIF duration had a correlation of 0.45 with the actual number of DNIF days. Predictor variables included age, 6 AFSCs, 7 clinic locations, and 40 primary diagnosis categories. Specific demographic (i.e., age), occupational (i.e., AFSC), and health (i.e., clinic location and primary diagnosis category) DNIF drivers were identified. Subsequent research should focus on the application of primary, secondary, and tertiary prevention measures to ameliorate the potential impact of these DNIF drivers where possible.Tvaryanas AP, Griffith C Jr. Modeling predictors of duties not including flying status. Aerosp Med Hum Perform. 2018; 89(1):52-57.
Testing alternative ground water models using cross-validation and other methods
Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.
2007-01-01
Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
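As a pointer to how the efficient discrimination statistics above are computed, here is a hedged sketch of AIC, small-sample corrected AICc and BIC from least-squares residuals under a Gaussian likelihood; the observation values and parameter counts are invented.

```python
import numpy as np

def information_criteria(residuals, n_params):
    """Model-discrimination statistics for a least-squares calibration:
    AIC, AICc and BIC computed from the residual sum of squares under a
    Gaussian likelihood assumption."""
    n = len(residuals)
    rss = np.sum(np.square(residuals))
    aic = n * np.log(rss / n) + 2 * n_params
    aicc = aic + 2 * n_params * (n_params + 1) / (n - n_params - 1)
    bic = n * np.log(rss / n) + n_params * np.log(n)
    return aic, aicc, bic

# compare two hypothetical models of the same head observations
obs = np.array([10.2, 11.1, 9.8, 10.7, 11.5, 10.1, 9.9, 10.4])
model_a = obs + np.array([0.1, -0.2, 0.15, -0.05, 0.1, -0.1, 0.05, -0.05])
model_b = obs + np.array([0.4, -0.5, 0.3, -0.2, 0.45, -0.3, 0.25, -0.4])
for name, sim, k in [("A", model_a, 3), ("B", model_b, 2)]:
    print(name, information_criteria(obs - sim, k))
```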
Lee, Dae-Hee; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Bin, Seong-Il
2016-12-01
Although three-dimensional computed tomography (3D-CT) has been used to compare femoral tunnel position following transtibial and anatomical anterior cruciate ligament (ACL) reconstruction, no consensus has been reached on which technique results in a more anatomical position because methods of quantifying femoral tunnel position on 3D-CT have not been consistent. This meta-analysis was therefore performed to compare femoral tunnel location following transtibial and anatomical ACL reconstruction, in both the low-to-high and deep-to-shallow directions. This meta-analysis included all studies that used 3D-CT to compare femoral tunnel location, using quadrant or anatomical coordinate axis methods, following transtibial and anatomical (AM portal or OI) single-bundle ACL reconstruction. Six studies were included in the meta-analysis. Femoral tunnel location was 18 % higher in the low-to-high direction, but was not significant in the deep-to-shallow direction, using the transtibial technique than the anatomical methods, when measured using the anatomical coordinate axis method. When measured using the quadrant method, however, femoral tunnel positions were significantly higher (21 %) and shallower (6 %) with transtibial than anatomical methods of ACL reconstruction. The anatomical ACL reconstruction techniques led to a lower femoral tunnel aperture location than the transtibial technique, suggesting the superiority of anatomical techniques for creating new femoral tunnels during revision ACL reconstruction in femoral tunnel aperture location in the low-to-high direction. However, the mean difference in the deep-to-shallow direction differed by method of measurement. Meta-analysis, Level II.
Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-02-01
This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.
Methods for assessing the stability of slopes during earthquakes-A retrospective
Jibson, R.W.
2011-01-01
During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
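A minimal sketch of Newmark's sliding-block calculation in its common one-directional simplification: integrate relative velocity whenever ground acceleration exceeds the critical (yield) acceleration, and accumulate displacement until the block comes to rest. The input motion below is synthetic.

```python
import numpy as np

def newmark_displacement(accel, dt, a_crit):
    """Rigid sliding-block analysis: the block slides relative to the slope
    whenever ground acceleration exceeds the critical acceleration a_crit;
    relative velocity is integrated until it returns to zero, and the
    accumulated relative displacement is the Newmark index."""
    vel, disp = 0.0, 0.0
    for a in accel:
        if a > a_crit or vel > 0.0:        # sliding episode
            vel += (a - a_crit) * dt       # relative acceleration integrates
            vel = max(vel, 0.0)            # block cannot slide upslope
            disp += vel * dt
    return disp

# toy acceleration time history: decaying pulses above a_crit = 1.0 m/s^2
t = np.arange(0, 10, 0.01)
accel = 2.5 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
print("Newmark displacement (m):", newmark_displacement(accel, 0.01, 1.0))
```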
ERIC Educational Resources Information Center
Thatcher, L. L.; And Others
Analytical methods for determining important components of fission and natural radioactivity found in water are reported. The discussion of each method includes conditions for application of the method, a summary of the method, interferences, required apparatus, procedures, calculations and estimation of precision. Isotopes considered are…
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
Device and method for enhanced collection and assay of chemicals with high surface area ceramic
Addleman, Raymond S.; Li, Xiaohong Shari; Chouyyok, Wilaiwan; Cinson, Anthony D.; Bays, John T.; Wallace, Krys
2016-02-16
A method and device for enhanced capture of target analytes is disclosed. This invention relates to the collection of chemicals for separations and analysis. More specifically, this invention relates to a solid phase microextraction (SPME) device with improved capability for chemical collection and analysis, including better physical stability, greater capacity for chemical collection, flexible surface chemistry, and high affinity for target analytes.
Optimum Repair Level Analysis (ORLA) for the Space Transportation System (STS)
NASA Technical Reports Server (NTRS)
Henry, W. R.
1979-01-01
A repair level analysis method applied to a space shuttle scenario is presented. The method determines the most cost-effective level of repair for reparable hardware and the location for the repair, defining a system that accrues minimum total support costs within operational and technical constraints over the system design life. The method includes cost equations for comparison of selected costs to completion for assumed repair alternates.
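The cost-comparison logic can be illustrated with a toy calculation; the alternates, cost elements, and dollar figures below are hypothetical placeholders, not values from the STS analysis.

```python
# Hypothetical repair alternates with illustrative cost elements;
# none of these names or numbers come from the ORLA report.
alternates = {
    "discard":      {"spares": 9.0e6, "labor": 0.0,   "facilities": 0.0},
    "depot_repair": {"spares": 2.5e6, "labor": 3.0e6, "facilities": 1.2e6},
    "local_repair": {"spares": 1.8e6, "labor": 4.1e6, "facilities": 2.6e6},
}

def total_support_cost(costs):
    # cost to completion = sum of the selected cost elements
    return sum(costs.values())

best = min(alternates, key=lambda k: total_support_cost(alternates[k]))
for name, costs in alternates.items():
    print(f"{name:13s} ${total_support_cost(costs) / 1e6:5.2f}M")
print("most cost-effective level of repair:", best)
```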
Profiling a Periodicals Collection
ERIC Educational Resources Information Center
Bolgiano, Christina E.; King, Mary Kathryn
1978-01-01
Libraries need solid information upon which to base collection development decisions. Specific evaluative methods for determining scope, access, and usefullness are described. Approaches used for data collection include analysis of interlibrary loan requests, comparison with major bibliographies, and analysis of accessibility through available…
NASA Astrophysics Data System (ADS)
Bonhivers, Jean-Christophe
The increase in production of goods over the last decades has led to the need for improved management of natural resources and more efficient processes. As a consequence, heat integration methods for industry have been developed. These have been successful for the design of new plants: the integration principles are widely employed, and energy intensity has decreased dramatically in many processes. Although progress has also been achieved in integration methods for retrofit, these methods still need further conceptual development. Furthermore, methodological difficulties increase when trying to retrofit heat exchange networks that are closely interrelated with water networks, as is the case in pulp and paper mills. The pulp and paper industry seeks to increase its profitability by reducing production costs and optimizing supply chains. Recent process developments in forestry biorefining give this industry the opportunity to diversify into bio-products, increasing potential profit margins, and at the same time to modernize its energy systems. Identification of energy strategies for a mill in a changing environment, including the possibility of adding a biorefinery process on the industrial site, requires better integration methods for retrofit situations. The objective of this thesis is to develop an energy integration method for the retrofit of industrial systems and the transformation of pulp and paper mills, and to demonstrate the method in case studies. Energy is conserved and degraded in a process. Heat can be converted into electricity, stored as chemical energy, or rejected to the environment. A systematic analysis of the successive degradations of energy from the hot utilities to the environment, through process operations and existing heat exchangers, is essential in order to reduce heat consumption. In this thesis, the "Bridge Method" for energy integration by heat exchanger network retrofit has been developed. This method is the first to consider the analysis of these degradations. The fundamental mechanism for reducing heat consumption in an existing network has been made explicit; it is the basis of the developed method. The Bridge Method includes the definition of "a bridge", which is a set of modifications leading to heat reduction in a heat exchanger network. It is proven that, for a given set of streams, only bridges can lead to heat savings. The Bridge Method also includes (1) a global procedure for heat exchanger network retrofit, (2) a procedure to systematically enumerate the bridges, (3) "a network table" to evaluate them easily, and (4) an "energy transfer diagram" showing the effects of the first two principles of thermodynamics (energy conservation and degradation) in industrial processes in order to identify energy-saving opportunities. The Bridge Method can be used for the analysis of networks including several types of heat transfer, and for site-wide analysis. The Bridge Method has been applied in case studies retrofitting networks composed of indirect-contact heat exchangers, including the network of a kraft pulp mill, and also networks of direct-contact heat exchangers, including the hot water production system of a pulp mill. The method has finally been applied to the evaluation of a biorefinery process, alone or hosted in a kraft pulp mill. Results show that use of the method significantly reduces the search space and leads to identification of the relevant solutions.
The necessity of a bridge to reduce the inputs and outputs of a process is a consequence of the first two principles of thermodynamics: energy conservation and the increase of entropy. The concept of a bridge alone can also be used as a tool for process analysis, and in numerical optimization-based approaches to energy integration.
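A toy illustration of what systematically enumerating bridges might look like; the network, the exchanger names, and the simplified definition of a bridge used here (one cooler linked to one heater through optional relay exchangers) are assumptions for the sketch, not the thesis's actual procedure or network table.

```python
from itertools import combinations

# Toy network: exchangers rejecting heat to utility ("coolers"),
# exchangers supplied by hot utility ("heaters"), and process
# exchangers that could relay heat between them. All names are
# illustrative placeholders.
coolers = ["C1", "C2"]
heaters = ["H1", "H2"]
process_exchangers = ["E1", "E2", "E3"]

def enumerate_bridges(coolers, heaters, relays, max_relays=2):
    """List candidate bridges: each links one cooler to one heater
    through zero or more intermediate exchangers, so heat currently
    rejected to cooling water can displace hot-utility heat."""
    bridges = []
    for c in coolers:
        for h in heaters:
            for n in range(max_relays + 1):
                for relay in combinations(relays, n):
                    bridges.append((c, *relay, h))
    return bridges

for b in enumerate_bridges(coolers, heaters, process_exchangers):
    print(" -> ".join(b))
```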
NASA Technical Reports Server (NTRS)
Allen, G.
1972-01-01
The use of the theta-operator method and generalized hypergeometric functions in obtaining solutions to nth-order linear ordinary differential equations is explained. For completeness, the analysis of the differential equation to determine whether the point of expansion is an ordinary point or a regular singular point is included. The superiority of these two methods over the standard method is demonstrated by using all three methods to work out several examples. Also included is a compendium of formulae and properties of the theta operator and generalized hypergeometric functions which is complete enough to make the report self-contained.
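As a short worked illustration (not taken from the report), the theta operator θ = x d/dx turns an Euler equation into an algebraic factorization:

```latex
% With \theta = x\,d/dx one has x y' = \theta y and
% x^2 y'' = \theta(\theta-1)y, so an Euler equation factors in \theta:
\[
x^{2}y'' + x y' - y = 0
\;\Longrightarrow\;
\bigl[\theta(\theta-1) + \theta - 1\bigr]y = (\theta^{2}-1)\,y = 0
\;\Longrightarrow\;
y = c_{1}\,x + c_{2}\,x^{-1}.
\]
```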
Analysis of Urinary Calculi Using Infrared Spectroscopic Imaging
NASA Astrophysics Data System (ADS)
Sablinskas, Valdas; Lesciute, Daiva; Hendrixson, Vaiva
2009-06-01
Kidney stone disease is a cosmopolitan disease, occurring in both industrialized and developing countries and mainly affecting adults aged 20-60 years. The formation of kidney stones is a process that involves many factors. Its primary and contributing pathogenic factors are genetic, nutritional and environmental, but also include personal habits. Information about the chemical structure of kidney stones is of great importance to the treatment of kidney diseases. The usefulness of such information was first recognized in the early 1950s. Analysis of urinary stones by various chemical methods, polarization microscopy, x-ray diffraction, porosity determination, solid-phase NMR, and thermoanalytical procedures has been widely used. Unfortunately, no single method is sufficient to provide all the clinically useful information about the structure and composition of the stones. Infrared spectroscopy can be considered a relatively new method of kidney stone analysis. It allows identification of the organic and inorganic molecular constituents of kidney stones. So far this method had never been used to collect information about kidney stone component patterns in Lithuania. Since no epidemiological studies have been performed in this field, the medical treatment of kidney stone disease is empirical and often ineffective in hospitals around the country. The aim of this paper is to present results of the analysis of kidney stones extracted from local patients using FTIR spectroscopic microscopy.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.
1995-01-01
This guide describes the input data required for using ECAP2D (Euler Cascade Aeroelastic Program-Two Dimensional). ECAP2D can be used for steady or unsteady aerodynamic and aeroelastic analysis of two dimensional cascades. Euler equations are used to obtain aerodynamic forces. The structural dynamic equations are written for a rigid typical section undergoing pitching (torsion) and plunging (bending) motion. The solution methods include harmonic oscillation method, influence coefficient method, pulse response method, and time integration method. For harmonic oscillation method, example inputs and outputs are provided for pitching motion and plunging motion. For the rest of the methods, input and output for pitching motion only are given.
NASA Astrophysics Data System (ADS)
Hano, Mitsuo; Hotta, Masashi
A new multigrid method based on high-order vector finite elements is proposed in this paper. Low-level discretizations in this method are obtained by using low-order vector finite elements on the same mesh. The Gauss-Seidel method is used as a smoother, and the linear equation at the lowest level is solved by the ICCG method. However, it is often found that multigrid solutions do not converge to the ICCG solutions. An algorithm for eliminating the constant term using a null space of the coefficient matrix is also described. For three-dimensional magnetostatic field analysis, the convergence time and iteration count of this multigrid method are compared with those of the conventional ICCG method.
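A toy two-level cycle for the 1-D Poisson equation illustrates the structure described above: Gauss-Seidel smoothing plus an exact solve at the lowest level (a dense solve and scalar elements stand in here for the paper's ICCG solver and vector finite elements).

```python
import numpy as np

def gauss_seidel(u, f, h, sweeps):
    # smoother: lexicographic Gauss-Seidel for -u'' = f
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def two_grid(u, f, h, sweeps=3):
    """Smooth, restrict the residual, solve the coarse problem
    exactly, interpolate the correction back, and smooth again."""
    u = gauss_seidel(u, f, h, sweeps)
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2  # residual
    rc = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]  # full weighting
    nc = (len(u) - 1) // 2
    A = (np.diag(2 * np.ones(nc - 1)) - np.diag(np.ones(nc - 2), 1)
         - np.diag(np.ones(nc - 2), -1)) / (2 * h) ** 2
    ec = np.linalg.solve(A, rc)                 # exact coarse-grid solve
    e = np.zeros_like(u)
    e[2:-1:2] = ec                              # coarse points
    e[1:-1:2] = 0.5 * (e[0:-2:2] + e[2::2])    # linear interpolation
    u += e
    return gauss_seidel(u, f, h, sweeps)

n = 64
h = 1.0 / n
x = np.linspace(0, 1, n + 1)
f = np.pi**2 * np.sin(np.pi * x)     # -u'' = f with u(0) = u(1) = 0
u = np.zeros(n + 1)
for _ in range(10):
    u = two_grid(u, f, h)
print("max error vs exact solution:", np.abs(u - np.sin(np.pi * x)).max())
```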
Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diprete, D.; McCabe, D.
2016-09-28
The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.
Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S
2016-03-01
Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. Multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS) in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative tools of analysis as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
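A minimal sketch of the plain classical least squares (CLS) step that the NAP, OSC, and DOSC variants build on, using simulated spectra; the band shapes and concentrations are invented, and no signal-correction preprocessing is applied.

```python
import numpy as np

# Simulated three-component system: rows of K are pure-component
# spectra (arbitrary Gaussian bands), concentrations are made up.
rng = np.random.default_rng(0)
wl = np.linspace(0, 1, 200)
def band(center, width):
    return np.exp(-((wl - center) / width) ** 2)
K = np.vstack([band(0.3, 0.05), band(0.5, 0.07), band(0.7, 0.06)])

C_train = rng.uniform(0.1, 1.0, size=(15, 3))              # known mixtures
A_train = C_train @ K + rng.normal(0, 1e-3, (15, 200))     # noisy absorbances

# CLS calibration: estimate the pure spectra from training mixtures,
# then invert to predict concentrations of a new sample.
K_hat = np.linalg.pinv(C_train) @ A_train
c_true = np.array([0.4, 0.2, 0.7])
a_new = c_true @ K + rng.normal(0, 1e-3, 200)
c_pred = a_new @ np.linalg.pinv(K_hat)
print(np.round(c_pred, 3))   # should be close to [0.4, 0.2, 0.7]
```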
Static aeroelastic analysis and tailoring of a single-element racing car wing
NASA Astrophysics Data System (ADS)
Sadd, Christopher James
This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce coefficient of C_L = -1.377, compared with C_L = -1.265 for the original wing, and that the drag-reducing wing has a drag coefficient of C_D = 0.115, compared with C_D = 0.143 for the original wing.
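The iterative coupling scheme can be caricatured with a one-degree-of-freedom fixed-point loop; the lift-slope, stiffness, and geometry numbers below are placeholders, and simple algebra stands in for the thesis's RANS CFD and Finite Element analyses.

```python
# Toy fixed-point coupling: aerodynamic load depends on the wing's
# twist, and twist depends on the load. All numbers are illustrative.
q = 2000.0          # dynamic pressure (Pa), assumed
dCl_dtheta = 0.1    # lift-curve slope w.r.t. twist (1/deg), assumed
Cl_rigid = -1.2     # rigid-wing lift coefficient, assumed
area = 0.5          # wing planform area (m^2), assumed
k_theta = 900.0     # torsional stiffness (N*m/deg), assumed
arm = 0.05          # moment arm of lift about the flexural axis (m)

theta = 0.0
for it in range(50):
    Cl = Cl_rigid + dCl_dtheta * theta       # "CFD": load for this shape
    load = q * area * Cl                     # lift force (N)
    theta_new = load * arm / k_theta         # "FE": twist under this load
    if abs(theta_new - theta) < 1e-8:        # converged aeroelastic state
        break
    theta = 0.5 * theta + 0.5 * theta_new    # under-relaxed update
print(f"iteration {it}: Cl = {Cl:.4f}, twist = {theta:.4f} deg")
```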
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
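A minimal sketch of the core probabilistic computation, assuming an invented strength-stress limit state; note that the program described above also offers faster algorithms (advanced mean value, adaptive importance sampling) than the brute-force Monte Carlo shown here.

```python
import numpy as np

# Monte Carlo estimate of probability of failure for a limit state
# g = strength - stress. Both distributions are invented for
# illustration and do not correspond to any NESSUS demonstration.
rng = np.random.default_rng(1)
n = 1_000_000
strength = rng.normal(400.0, 25.0, n)              # MPa
stress = rng.lognormal(np.log(330.0), 0.08, n)     # MPa, skewed load effect

g = strength - stress                              # failure when g < 0
pf = np.mean(g < 0.0)
se = np.sqrt(pf * (1.0 - pf) / n)                  # sampling standard error
print(f"P(failure) ~ {pf:.4e} +/- {1.96 * se:.1e}")
```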
Analysis of New Composite Architectures
NASA Technical Reports Server (NTRS)
Whitcomb, John D.
1996-01-01
Efficient and accurate specialty finite element methods to analyze textile composites were developed and are described. Textile composites present unique challenges to the analyst because of their large, complex 'microstructure'. The geometry of the microstructure is difficult to model and it introduces unusual free surface effects. The size of the microstructure complicates the use of traditional homogenization methods. The methods developed constitute considerable progress in addressing the modeling difficulties. The details of the methods, and the results obtained from them, are described in the various chapters included in Part 1 of the report. Specific conclusions and computer codes generated are included in Part 2 of the report.
Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C
2016-08-05
Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi-capillary column ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
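A sketch of morphological Top-hat baseline removal on a synthetic one-dimensional signal, using SciPy's white_tophat; the spectrum and the structuring-element size are invented, and real MCC-IMS data are two-dimensional.

```python
import numpy as np
from scipy.ndimage import white_tophat

# White top-hat = signal minus its morphological opening: the opening
# tracks the slowly varying baseline, so the difference keeps only
# the narrow peaks. Synthetic spectrum, not MCC-IMS data.
x = np.linspace(0, 10, 2000)
baseline = 0.5 + 0.3 * np.sin(0.4 * x)                      # broad drift
peaks = (np.exp(-((x - 3) / 0.03) ** 2)
         + 0.7 * np.exp(-((x - 7) / 0.05) ** 2))            # narrow peaks
spectrum = baseline + peaks

# structuring element much wider than any peak, narrower than the drift
corrected = white_tophat(spectrum, size=150)
print(f"max deviation from true peaks: {np.abs(corrected - peaks).max():.3f}")
```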
Text analysis devices, articles of manufacture, and text analysis methods
Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C
2013-05-28
Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.
Project delay analysis of HRSG
NASA Astrophysics Data System (ADS)
Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo
2017-08-01
Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target date written in the contract. A delay in the fabrication process can cause several disadvantages for the fabricator, including forfeit payments, delay of HRSG construction, and ultimately delay of HRSG trials. In this paper, the authors apply a semi-quantitative analysis to HRSG pressure-part fabrication delay, for a plant configuration of 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis combines FTA (fault tree analysis) and ETA (event tree analysis) to develop the risk matrix of the HRSG. The results of the FTA are used as threats for preventive measures, and the results of the ETA are used as the impacts of fabrication delay.
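A toy numeric bow-tie, with invented probabilities: an OR-gate fault tree feeds an event tree whose branches are mitigation barriers. The event names and values are illustrative, not from the paper.

```python
# FTA side: basic-event probabilities (invented)
p_material_late = 0.10
p_welding_rework = 0.05
p_design_change = 0.08

# OR gate over independent basic events: P = 1 - prod(1 - p_i)
p_top = 1 - (1 - p_material_late) * (1 - p_welding_rework) * (1 - p_design_change)

# ETA side: two successive mitigation barriers (invented)
p_schedule_recovery = 0.6     # barrier 1 succeeds
p_penalty_waived = 0.3        # barrier 2 succeeds if barrier 1 fails
outcomes = {
    "recovered in-schedule": p_top * p_schedule_recovery,
    "late, penalty waived":  p_top * (1 - p_schedule_recovery) * p_penalty_waived,
    "late with forfeit":     p_top * (1 - p_schedule_recovery) * (1 - p_penalty_waived),
}
for name, p in outcomes.items():
    print(f"{name:22s} {p:.4f}")
print(f"check: outcome sum {sum(outcomes.values()):.4f} = top event {p_top:.4f}")
```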
Self organising maps for visualising and modelling
2012-01-01
The paper describes the motivation of SOMs (Self Organising Maps) and how they have become more accessible owing to the wide availability of modern, more powerful, cost-effective computers. Their advantages compared to Principal Components Analysis and Partial Least Squares are discussed. These allow application to non-linear data, are less dependent on least squares solutions and normality of errors, and are less influenced by outliers. In addition there is a wide variety of intuitive methods for visualisation that allow full use of the map space. Modern problems in analytical chemistry, including applications to cultural heritage studies and to environmental, metabolomic and biological problems, result in complex datasets. Methods for visualising maps are described, including best matching units, hit histograms, unified distance matrices and component planes. Supervised SOMs for classification, including multifactor data and variable selection, are discussed, as is their use in Quality Control. The paper is illustrated using four case studies, namely the Near Infrared of food, the thermal analysis of polymers, metabolomic analysis of saliva using NMR, and on-line HPLC for pharmaceutical process monitoring. PMID:22594434
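A minimal NumPy self-organising map on synthetic two-dimensional data shows the two operations behind the visualisations discussed (best-matching-unit search and neighbourhood update) and builds a hit histogram; the map size, training schedule, and data are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
data = np.vstack([rng.normal(0, 0.3, (100, 2)) + c
                  for c in ([0, 0], [2, 2], [0, 2])])   # three clusters
rows, cols = 8, 8
weights = rng.uniform(-0.5, 2.5, (rows, cols, 2))
grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))

for epoch in range(30):
    lr = 0.5 * (1 - epoch / 30)                  # decaying learning rate
    sigma = 3.0 * (1 - epoch / 30) + 0.5         # shrinking neighbourhood
    for x in rng.permutation(data):
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)   # best matching unit
        gdist = np.linalg.norm(grid - np.array(bmu), axis=2)
        h = np.exp(-gdist**2 / (2 * sigma**2))          # neighbourhood kernel
        weights += lr * h[..., None] * (x - weights)

# hit histogram: how many samples map to each unit
hits = np.zeros((rows, cols), dtype=int)
for x in data:
    d = np.linalg.norm(weights - x, axis=2)
    hits[np.unravel_index(np.argmin(d), d.shape)] += 1
print(hits)
```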
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, R.H.; Calabro, D.S.
1969-11-01
The two methods normally used for the analysis of NOx are the Saltzman and the phenoldisulfonic acid techniques. This paper describes an evaluation of these wet chemical methods to determine their practical application to engine exhaust gas analysis. Parameters considered for the Saltzman method included bubbler collection efficiency, NO to NO2 conversion efficiency, masking effect of other contaminants usually present in exhaust gases, and the time-temperature effect of these contaminants on stored developed solutions. Collection efficiency and the effects of contaminants were also considered for the phenoldisulfonic acid method. Test results indicated satisfactory collection and conversion efficiencies for the Saltzman method, but contaminants seriously affected the measurement accuracy, particularly if the developed solution was stored for a number of hours at room temperature before analysis. Storage at 32°F minimized this effect. The standard procedure for the phenoldisulfonic acid method gave good results, but the process was found to be too time-consuming for routine analysis and measured only total NOx. 3 references, 9 tables.
2010-01-01
Background Cluster analysis, and in particular hierarchical clustering, is widely used to extract information from gene expression data. The aim is to discover new classes, or sub-classes, of either individuals or genes. Performing a cluster analysis commonly involves decisions on how to handle missing values, standardize the data and select genes. In addition, pre-processing, involving various types of filtration and normalization procedures, can have an effect on the ability to discover biologically relevant classes. Here we consider cluster analysis in a broad sense and perform a comprehensive evaluation that covers several aspects of cluster analyses, including normalization. Results We evaluated 2780 cluster analysis methods on seven publicly available 2-channel microarray data sets with common reference designs. Each cluster analysis method differed in data normalization (5 normalizations were considered), missing value imputation (2), standardization of data (2), gene selection (19) or clustering method (11). The cluster analyses are evaluated using known classes, such as cancer types, and the adjusted Rand index. The performances of the different analyses vary between the data sets and it is difficult to give general recommendations. However, normalization, gene selection and clustering method are all variables that have a significant impact on the performance. In particular, gene selection is important and it is generally necessary to include a relatively large number of genes in order to get good performance. Selecting genes with high standard deviation or using principal component analysis are shown to be the preferred gene selection methods. Hierarchical clustering using Ward's method, k-means clustering and Mclust are the clustering methods considered in this paper that achieve the highest adjusted Rand index. Normalization can have a significant positive impact on the ability to cluster individuals, and there are indications that background correction is preferable, in particular if the gene selection is successful. However, this is an area that needs to be studied further in order to draw any general conclusions. Conclusions The choice of cluster analysis, and in particular gene selection, has a large impact on the ability to cluster individuals correctly based on expression profiles. Normalization has a positive effect, but the relative performance of different normalizations is an area that needs more research. In summary, although clustering, gene selection and normalization are considered standard methods in bioinformatics, our comprehensive analysis shows that selecting the right methods, and the right combinations of methods, is far from trivial and that much is still unexplored in what is considered to be the most basic analysis of genomic data. PMID:20937082
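A sketch of the evaluation loop on simulated data: select genes by standard deviation, cluster with two of the methods named above, and score each partition against known classes with the adjusted Rand index. scikit-learn stands in for the study's own pipeline, and the data are simulated rather than the microarray sets evaluated.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(3)
classes = np.repeat([0, 1, 2], 30)                         # "cancer types"
X = rng.normal(0, 1, (90, 200)) + classes[:, None] * 1.5   # 200 "genes"

# gene selection by highest standard deviation (one preferred option)
keep = np.argsort(X.std(axis=0))[-50:]
Xs = X[:, keep]

for name, model in [
    ("Ward hierarchical", AgglomerativeClustering(n_clusters=3, linkage="ward")),
    ("k-means", KMeans(n_clusters=3, n_init=10, random_state=0)),
]:
    labels = model.fit_predict(Xs)
    print(f"{name:18s} adjusted Rand = {adjusted_rand_score(classes, labels):.2f}")
```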
A review of the handling of missing longitudinal outcome data in clinical trials
2014-01-01
The aim of this review was to establish the frequency with which trials take into account missingness, and to discover what methods trialists use for adjustment in randomised controlled trials with longitudinal measurements. Failing to address the problems that can arise from missing outcome data can result in misleading conclusions. Missing data methods should be applied as a sensitivity analysis of the complete case analysis results. One hundred publications of randomised controlled trials with longitudinal measurements were selected randomly from trial publications from the years 2005 to 2012. Information was extracted from these trials, including whether reasons for dropout were reported, what methods were used for handling the missing data, whether there was any explanation of the methods for missing data handling, and whether a statistician was involved in the analysis. The main focus of the review was on missing data post dropout rather than missing interim data. Of all the papers in the study, 9 (9%) had no missing data. More than half of the papers included in the study failed to make any attempt to explain the reasons for their choice of missing data handling method. Of the papers with clear missing data handling methods, 44 papers (50%) used adequate methods of missing data handling, whereas 30 (34%) of the papers used missing data methods which may not have been appropriate. In the remaining 17 papers (19%), it was difficult to assess the validity of the methods used. An imputation method was used in 18 papers (20%). Multiple imputation methods were introduced in 1987 and are an efficient way of accounting for missing data in general, and yet only 4 papers used these methods. Out of the 18 papers which used imputation, only 7 displayed the results as a sensitivity analysis of the complete case analysis results. 61% of the papers that used an imputation explained the reasons for their chosen method. Just under a third of the papers made no reference to reasons for missing outcome data. There was little consistency in reporting of missing data within longitudinal trials. PMID:24947664
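A small sketch of the recommended sensitivity-analysis practice on simulated longitudinal data; last-observation-carried-forward and simple mean imputation stand in here for the more principled multiple-imputation methods the review recommends, and all trial numbers are invented.

```python
import numpy as np
import pandas as pd

# Simulated trial: 200 patients, 4 visits, outcome drifts upward,
# and each patient may drop out after visit 2 or 3.
rng = np.random.default_rng(4)
n, visits = 200, 4
outcome = rng.normal(50, 8, (n, visits)) + np.arange(visits) * 1.5
dropout = rng.integers(2, visits + 1, n)   # first missing visit (4 = none)
for i, d in enumerate(dropout):
    outcome[i, d:] = np.nan

df = pd.DataFrame(outcome, columns=[f"visit{v}" for v in range(visits)])

complete = df.dropna()                 # complete-case analysis
locf = df.ffill(axis=1)                # last observation carried forward
meanimp = df.fillna(df.mean())         # mean imputation

# compare the final-visit estimate under each handling method
for name, d in [("complete case", complete), ("LOCF", locf), ("mean", meanimp)]:
    print(f"{name:13s} final-visit mean = {d['visit3'].mean():.2f}")
```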
Pulse analysis of acoustic emission signals. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Houghton, J. R.
1976-01-01
A method for the signature analysis of pulses in the frequency domain and the time domain is presented. Fourier spectrum, Fourier transfer function, shock spectrum and shock spectrum ratio are examined in the frequency domain analysis, and pulse shape deconvolution is developed for use in the time domain analysis. To demonstrate the relative sensitivity of each of the methods to small changes in the pulse shape, signatures of computer modeled systems with analytical pulses are presented. Optimization techniques are developed and used to indicate the best design parameter values for deconvolution of the pulse shape. Several experiments are presented that test the pulse signature analysis methods on different acoustic emission sources. These include acoustic emissions associated with: (1) crack propagation, (2) ball dropping on a plate, (3) spark discharge and (4) defective and good ball bearings.
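A sketch of the frequency-domain part of such a signature analysis on synthetic pulses: the Fourier spectrum of a source pulse and the transfer function to a received pulse. The resonance frequency and pulse shapes are invented, not acoustic-emission data.

```python
import numpy as np

dt = 1e-6
t = np.arange(0, 2e-3, dt)
source = np.exp(-((t - 1e-4) / 1e-5) ** 2)              # short input pulse
h = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 25e3 * t)    # damped 25 kHz resonance
received = np.convolve(source, h)[: len(t)] * dt        # simulated response

f = np.fft.rfftfreq(len(t), dt)
S, R = np.fft.rfft(source), np.fft.rfft(received)
band = np.abs(S) > 0.01 * np.abs(S).max()   # only where the source has energy
H = np.zeros_like(R)
H[band] = R[band] / S[band]                 # Fourier transfer function
print(f"transfer-function peak near {f[np.argmax(np.abs(H))] / 1e3:.1f} kHz")
```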
An R package for state-trace analysis.
Prince, Melissa; Hawkins, Guy; Love, Jonathon; Heathcote, Andrew
2012-09-01
State-trace analysis (Bamber, Journal of Mathematical Psychology, 19, 137-181, 1979) is a graphical analysis that can determine whether one or more than one latent variable mediates an apparent dissociation between the effects of two experimental manipulations. State-trace analysis makes only ordinal assumptions and so is not confounded by range effects that plague alternative methods, especially when performance is measured on a bounded scale (such as accuracy). We describe and illustrate the application of a freely available GUI driven package, StateTrace, for the R language. StateTrace automates many aspects of a state-trace analysis of accuracy and other binary response data, including customizable graphics and the efficient management of computationally intensive Bayesian methods for quantifying evidence about the outcomes of a state-trace experiment, developed by Prince, Brown, and Heathcote (Psychological Methods, 17, 78-99, 2012).
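The graphical idea can be sketched in a few lines, using plain matplotlib rather than the StateTrace package and invented accuracies: if a single latent variable drives both measures, the conditions trace out one monotonic curve.

```python
import numpy as np
import matplotlib.pyplot as plt

# One latent strength per condition, mapped monotonically onto two
# accuracy measures; the mappings and values are illustrative only.
latent = np.linspace(0.2, 0.9, 6)
acc_A = 1 / (1 + np.exp(-6 * (latent - 0.5)))   # measure A (logistic map)
acc_B = latent ** 0.5                            # measure B (power map)

plt.plot(acc_A, acc_B, "o-", label="conditions (one latent variable)")
plt.xlabel("accuracy, measure A")
plt.ylabel("accuracy, measure B")
plt.legend()
plt.savefig("state_trace.png")   # points on a single monotonic curve
```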
2009-01-01
Background Decisions about interim analysis and early stopping of clinical trials, as based on recommendations of Data Monitoring Committees (DMCs), have far reaching consequences for the scientific validity and clinical impact of a trial. Our aim was to evaluate the frequency and quality of the reporting on DMC composition and roles, interim analysis and early termination in pediatric trials. Methods We conducted a systematic review of randomized controlled clinical trials published from 2005 to 2007 in a sample of four general and four pediatric journals. We used full-text databases to identify trials which reported on DMCs, interim analysis or early termination, and included children or adolescents. Information was extracted on general trial characteristics, risk of bias, and a set of parameters regarding DMC composition and roles, interim analysis and early termination. Results 110 of the 648 pediatric trials in this sample (17%) reported on DMC or interim analysis or early stopping, and were included; 68 from general and 42 from pediatric journals. The presence of DMCs was reported in 89 of the 110 included trials (81%); 62 papers, including 46 of the 89 that reported on DMCs (52%), also presented information about interim analysis. No paper adequately reported all DMC parameters, and nine (15%) reported all interim analysis details. Of 32 trials which terminated early, 22 (69%) did not report predefined stopping guidelines and 15 (47%) did not provide information on statistical monitoring methods. Conclusions Reporting on DMC composition and roles, on interim analysis results and on early termination of pediatric trials is incomplete and heterogeneous. We propose a minimal set of reporting parameters that will allow the reader to assess the validity of trial results. PMID:20003383
NASA Astrophysics Data System (ADS)
Sun, Li; Wang, Deyu
2011-09-01
A new multi-level analysis method incorporating super-element modeling, derived from the multi-level analysis method first proposed by O. F. Hughes, is proposed in this paper to reduce the high time cost of adopting a rational-based optimal design method for ship structural design. The method was verified by its effective application to optimization of the mid-ship section of a container ship. A full 3-D FEM model of a ship, subjected to static and quasi-static loads, was used to evaluate the structural performance of the mid-ship module, including static strength and buckling performance. Research results reveal that this new method can substantially reduce the computational cost of the rational-based optimization problem without decreasing its accuracy, which increases the feasibility and economic efficiency of using a rational-based optimal design method in ship structural design.
El-Yazbi, Amira F
2017-07-01
Sofosbuvir (SOFO) was approved by the U.S. Food and Drug Administration in 2013 for the treatment of hepatitis C virus infection with enhanced antiviral potency compared with earlier analogs. Notwithstanding, all current editions of the pharmacopeias still do not present any analytical methods for the quantification of SOFO. Thus, rapid, simple, and ecofriendly methods for the routine analysis of commercial formulations of SOFO are desirable. In this study, five accurate methods for the determination of SOFO in pharmaceutical tablets were developed and validated. These methods include HPLC, capillary zone electrophoresis, HPTLC, and UV spectrophotometric and derivative spectrometry methods. The proposed methods proved to be rapid, simple, sensitive, selective, and accurate analytical procedures that were suitable for the reliable determination of SOFO in pharmaceutical tablets. An analysis of variance test with P-value > 0.05 confirmed that there were no significant differences between the proposed assays. Thus, any of these methods can be used for the routine analysis of SOFO in commercial tablets.
Fan, Xinghua; Kubwabo, Cariton; Wu, Fang; Rasmussen, Pat E
2018-06-26
Background: Ingestion of house dust has been demonstrated to be an important exposure pathway to several contaminants in young children. These compounds include bisphenol A (BPA), alkylphenols (APs), and alkylphenol ethoxylates (APEOs). Analysis of these compounds in house dust is challenging because of the complex composition of the sample matrix. Objective: The objective was to develop a simple and sensitive method to measure BPA, APs, and APEOs in indoor house dust. Methods: An integrated method that involved solvent extraction using sonication, sample cleanup by solid-phase extraction, derivatization by 2,2,2-trifluoro-N-methyl-N-(trimethylsilyl)acetamide, and analysis by GC coupled with tandem MS was developed for the simultaneous determination of BPA, APs, and APEOs in NIST Standard Reference Material (SRM) 2585 (Organic contaminants in house dust) and in settled house dust samples. Results: Target analytes included BPA, 4-tert-octylphenol (OP), OP monoethoxylate, OP diethoxylate, 4-n-nonylphenol (4nNP), 4nNP monoethoxylate (4nNP1EO), branched nonylphenol (NP), NP monoethoxylate, NP diethoxylate, NP triethoxylate, and NP tetraethoxylate. The method was sensitive, with method detection limits ranging from 0.05 to 5.1 μg/g, and average recoveries between 82 and 115%. All target analytes were detected in SRM 2585 and house dust except 4nNP and 4nNP1EO. Conclusions: The method is simple and fast, with high sensitivity and good reproducibility. It is applicable to the analysis of target analytes in similar matrixes, such as sediments, soil, and biosolids. Highlights: Values measured in SRM 2585 will be useful for future research in method development and method comparison.
ERIC Educational Resources Information Center
Thomas, Barbara Schalk
Studied was the effect of instructional method on educational outcomes in an interdisciplinary science unit taught to 143 eighth grade students of earth science. Compared were the didactic and guided discovery methods of teaching. Also analyzed were the interactions of methods with student characteristics including sex, intelligence, creativity,…
Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools.
Siettos, Constantinos; Starke, Jens
2016-09-01
The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single neuron dynamics over the behavior of groups of neurons to neuronal network activity. Thus, the connection between the microscopic scale (single neuron activity) and macroscopic behavior (emergent behavior of the collective dynamics), and vice versa, is a key to understanding the brain in its complexity. In this work, we attempt a review of a wide range of approaches, ranging from the modeling of single neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models. The discussed models include Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), Integrate and Fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods in multiscale modeling and reconstruction of the causal connectivity are sketched. The methods include linear and nonlinear tools from statistics, data analysis, and time series analysis up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), manifold learning algorithms such as ISOMAP and diffusion maps, and equation-free techniques. WIREs Syst Biol Med 2016, 8:438-458. doi: 10.1002/wsbm.1348 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
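As a concrete instance of one of the single-neuron models listed, a forward-Euler integration of the FitzHugh-Nagumo equations with a common textbook parameter choice:

```python
import numpy as np

# FitzHugh-Nagumo: v' = v - v^3/3 - w + I,  w' = (v + a - b*w)/tau.
# With I = 0.5 the model sits in its oscillatory (spiking) regime.
a, b, tau, I = 0.7, 0.8, 12.5, 0.5
dt, steps = 0.01, 100_000
v, w = -1.0, 1.0
trace = np.empty(steps)
for k in range(steps):
    dv = v - v**3 / 3 - w + I
    dw = (v + a - b * w) / tau
    v += dt * dv
    w += dt * dw
    trace[k] = v
print(f"membrane variable range: [{trace.min():.2f}, {trace.max():.2f}]")
```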
Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei
2015-01-01
A new method of uniform sampling is evaluated in this paper. Evaluation items and indexes were adopted to assess the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility, and can be put into use in drug analysis.
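A sketch of the grey correlation (grey relational) computation behind such a comprehensive evaluation, with invented index values; the distinguishing coefficient ρ = 0.5 is the customary choice.

```python
import numpy as np

rho = 0.5   # distinguishing coefficient (customary value)

# rows: sampling methods; columns: evaluation indexes, pre-scaled so
# that 1.0 is ideal. All values are invented for the sketch.
methods = np.array([
    [1.00, 0.98, 0.95, 0.97],   # uniform sampling
    [0.80, 0.90, 0.93, 0.90],   # alternative method A
    [0.60, 0.85, 0.90, 0.88],   # alternative method B
])
reference = np.ones(4)          # ideal/standard reference sequence

delta = np.abs(methods - reference)
# grey relational coefficient: (d_min + rho*d_max) / (d + rho*d_max)
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = coeff.mean(axis=1)     # grey relational grade per method
print(np.round(grades, 3))      # the largest grade ranks best
```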
Modelling Coastal Cliff Recession Based on the GIM-DDD Method
NASA Astrophysics Data System (ADS)
Gong, Bin; Wang, Shanyong; Sloan, Scott William; Sheng, Daichao; Tang, Chun'an
2018-04-01
The unpredictable and instantaneous collapse behaviour of coastal rocky cliffs may cause damage that extends significantly beyond the area of failure. Gravitational movements that occur during coastal cliff recession involve two major stages: the small deformation stage and the large displacement stage. In this paper, a method of simulating the entire progressive failure process of coastal rocky cliffs is developed based on the gravity increase method (GIM), the rock failure process analysis method and the discontinuous deformation analysis method, and it is referred to as the GIM-DDD method. The small deformation stage, which includes crack initiation, propagation and coalescence processes, and the large displacement stage, which includes block translation and rotation processes during the rocky cliff collapse, are modelled using the GIM-DDD method. In addition, acoustic emissions, stress field variations, crack propagation and failure mode characteristics are further analysed to provide insights that can be used to predict, prevent and minimize potential economic losses and casualties. The calculation and analytical results are consistent with previous studies, which indicate that the developed method provides an effective and reliable approach for performing rocky cliff stability evaluations and coastal cliff recession analyses and has considerable potential for improving the safety and protection of seaside cliff areas.
Healing X-ray scattering images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jiliang; Lhermitte, Julien; Tian, Ye
X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
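A toy version of the symmetry step on a synthetic image: gap pixels are filled from their point-symmetric (180°-rotated) counterparts when those were measured. The real method adds the statistical tests, averaging, and interpolation described above.

```python
import numpy as np

rng = np.random.default_rng(8)
img = rng.random((201, 201)) + 5.0            # synthetic "scattering" image
mask = np.ones_like(img, dtype=bool)          # True = measured pixel
mask[40:51, :] = False                        # horizontal detector-module gap

healed = img.copy()
sym = img[::-1, ::-1]                         # 180-degree rotation about center
sym_mask = mask[::-1, ::-1]
fill = ~mask & sym_mask                       # gap pixels with a measured mirror
healed[fill] = sym[fill]                      # copy the Friedel-mate intensity
print(f"unfilled pixels remaining: {np.sum(~mask & ~sym_mask)}")
```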
Healing X-ray scattering images
Liu, Jiliang; Lhermitte, Julien; Tian, Ye; ...
2017-05-24
X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
Analysis of the Interaction of Student Characteristics with Method in Micro-Teaching.
ERIC Educational Resources Information Center
Chavers, Katherine; And Others
A study examined the comparative effects on microteaching performance of (1) eight different methods of teacher training and (2) the interaction of method with student characteristics. Subjects, 71 enrollees in an educational psychology course, were randomly assigned to eight treatment groups (including one control group). Treatments consisted of…
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
Proximate Composition Analysis.
2016-01-01
The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods of moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are put together in a lucid manner.
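A worked example of the gravimetric moisture calculation (the weights are invented): dry a weighed sample to constant weight and express the loss as a percentage of the wet weight.

```python
# Gravimetric moisture determination, illustrative numbers only.
wet_weight = 5.000    # g, sample before oven drying
dry_weight = 4.385    # g, constant weight after drying

moisture_pct = 100.0 * (wet_weight - dry_weight) / wet_weight
dry_matter_pct = 100.0 - moisture_pct
print(f"moisture: {moisture_pct:.1f}%  dry matter: {dry_matter_pct:.1f}%")
```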
Synthesis of aircraft structures using integrated design and analysis methods
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Goetz, R. C.
1978-01-01
Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate structural sizing and the associated active control system, which is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.
Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw
2017-01-01
Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes.
Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw
2017-01-01
Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes. PMID:29250096
Guidelines for the design of subsurface drainage systems for highway structural sections
DOT National Transportation Integrated Search
1972-06-01
Design criteria and a design method for pavement subsurface drainage systems include inflow-outflow method of analysis, open graded drainage layers, collector drains, pipe outlets and markers. Design examples are given for embankment sections, cut se...
Temporal radiographic texture analysis in the detection of periprosthetic osteolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilkie, Joel R.; Giger, Maryellen L.; Chinander, Michael R.
2008-01-15
Periprosthetic osteolysis is one of the most serious long-term problems in total hip arthroplasty. It has been primarily attributed to the body's inflammatory response to submicron polyethylene particles worn from the hip implant, and it leads to bone loss and structural deterioration in the surrounding bone. It was previously demonstrated that radiographic texture analysis (RTA) has the ability to distinguish between osteolysis and normal cases at the time of clinical detection of the disease; however, that analysis did not take into account the changes in texture over time. The goal of this preliminary analysis was to assess the ability of temporal radiographic texture analysis (tRTA) to distinguish between patients who develop osteolysis and normal cases. Two tRTA methods were used in the study: the RTA feature change from baseline at various follow-up intervals, and the slope of the best-fit line to the RTA data series. These tRTA methods included Fourier-based and fractal-based features calculated from digitized images of 202 total hip replacement cases, including 70 that developed osteolysis. Results show that separation between the osteolysis and normal groups increased over time for the feature difference method, as the disease progressed, with area under the curve (AUC) values from receiver operating characteristic analysis of 0.65 to 0.72 at 15 years postsurgery. Separation for the slope method was also evident, with AUC values ranging from 0.65 to 0.76 for the task of distinguishing between osteolysis and normal cases. The results suggest that tRTA methods have the ability to measure changes in trabecular structure, and may be useful in the early detection of periprosthetic osteolysis.
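A sketch of the slope variant of tRTA on simulated feature series: fit a least-squares line to each case's longitudinal texture feature and score the slopes with ROC analysis. The drift and noise levels are invented, not fitted to the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
years = np.arange(0, 15, 3, dtype=float)   # follow-up radiograph times

def slope(series):
    return np.polyfit(years, series, 1)[0]   # least-squares trend

normal = [rng.normal(1.0, 0.05, len(years)) for _ in range(60)]
osteolysis = [rng.normal(1.0, 0.05, len(years)) + 0.01 * years
              for _ in range(60)]            # slow drift in texture feature

scores = np.array([slope(s) for s in normal + osteolysis])
labels = np.array([0] * 60 + [1] * 60)
print(f"AUC for the slope feature: {roc_auc_score(labels, scores):.2f}")
```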
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
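A Monte Carlo caricature of such an integrated fault tree, with invented probabilities and downtimes: quantity failure as an OR gate over subsystem failures, quality failure as an AND gate of hazard and barrier failure, and CML as the downtime-weighted expectation.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000   # simulated system-years

# basic events (all per-year probabilities are invented)
source_fail = rng.random(n) < 0.02
treatment_fail = rng.random(n) < 0.01
network_fail = rng.random(n) < 0.03
hazard_present = rng.random(n) < 0.10
barrier_fail = rng.random(n) < 0.05

quantity = source_fail | treatment_fail | network_fail   # OR gate
quality = hazard_present & barrier_fail                   # AND gate

downtime_min = rng.lognormal(np.log(600), 0.5, n)  # minutes per failure year
cml = ((quantity | quality) * downtime_min).mean() # expected CML per consumer-year
print(f"P(quantity failure) = {quantity.mean():.3f}")
print(f"P(quality failure)  = {quality.mean():.4f}")
print(f"expected Customer Minutes Lost per consumer-year = {cml:.0f}")
```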
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.
1987-01-01
A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two-dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three-dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.
Analysis of Glycosaminoglycans Using Mass Spectrometry
Staples, Gregory O.; Zaia, Joseph
2015-01-01
The glycosaminoglycans (GAGs) are linear polysaccharides expressed on animal cell surfaces and in extracellular matrices. Their biosynthesis is under complex control and confers a domain structure that is essential to their ability to bind to protein partners. Key to understanding the functions of GAGs are methods to determine accurately and rapidly patterns of sulfation, acetylation and uronic acid epimerization that correlate with protein binding or other biological activities. Mass spectrometry (MS) is particularly suitable for the analysis of GAGs for biomedical purposes. Using modern ionization techniques it is possible to accurately determine molecular weights of GAG oligosaccharides and their distributions within a mixture. Methods for direct interfacing with liquid chromatography have been developed to permit online mass spectrometric analysis of GAGs. New tandem mass spectrometric methods for fine structure determination of GAGs are emerging. This review summarizes MS-based approaches for analysis of GAGs, including tissue extraction and chromatographic methods compatible with LC/MS and tandem MS. PMID:25705143
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.
1990-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
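The diagonally preconditioned conjugate gradient solver described above is straightforward to sketch. A minimal Python version (illustrative, not the authors' vectorized implementation; the test matrix is a generic tridiagonal SPD system):

    import numpy as np

    def pcg(A, b, tol=1e-10, max_iter=1000):
        """Conjugate gradients with a diagonal (Jacobi) preconditioner,
        mirroring the first preconditioner described above."""
        M_inv = 1.0 / np.diag(A)          # diagonal preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for k in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                return x, k + 1
            z = M_inv * r                 # re-apply preconditioner
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x, max_iter

    # Small SPD test system (illustrative)
    n = 50
    A = (np.diag(2.0 * np.ones(n))
         + np.diag(-np.ones(n - 1), 1)
         + np.diag(-np.ones(n - 1), -1))
    b = np.ones(n)
    x, iters = pcg(A, b)
    print(f"converged in {iters} iterations, "
          f"residual {np.linalg.norm(b - A @ x):.2e}")

A stronger preconditioner, such as the sparse incomplete factorization the abstract alludes to, reduces the iteration count at the cost of more work per iteration.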
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. D., Jr.
1992-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
Estimation of ovular fiber production in cotton
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van`t Hof, J.
1998-09-01
The present invention is a method for rendering cotton fiber cells that are post-anthesis and pre-harvest available for analysis of their physical properties. The method includes the steps of hydrolyzing cotton fiber cells and separating them from cotton ovules, thereby rendering the cells available for analysis. The fiber cells can be analyzed by any suitable means, e.g., visual inspection, which can be accomplished by placing the cells under a detection instrument such as a microscope. 4 figs.
NASA Astrophysics Data System (ADS)
Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang
2018-06-01
In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)), and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) and SDDG methods can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of these three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in adjoint-based adaptation for simulating compressible flows.
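The adjoint-based adaptation idea can be illustrated without any DG machinery. For a discrete linear problem A u = f with output J(u) = g^T u, the adjoint solution weights the residual of an approximate solution to estimate the output error. A minimal sketch (synthetic data, not the paper's Navier-Stokes setting):

    import numpy as np

    # Discrete model problem (illustrative): A u = f with output J(u) = g^T u.
    rng = np.random.default_rng(0)
    n = 40
    A = np.eye(n) * 4 + rng.standard_normal((n, n)) * 0.1
    f = rng.standard_normal(n)
    g = rng.standard_normal(n)

    u = np.linalg.solve(A, f)                 # exact discrete solution
    u_h = u + rng.standard_normal(n) * 1e-3   # stand-in for a coarse approximation

    # Adjoint (dual) problem: A^T psi = g. The dual-weighted residual
    # psi^T (f - A u_h) estimates the output error J(u) - J(u_h); for a
    # linear problem the estimate is exact, which is why adjoint
    # consistency of the discretization matters so much.
    psi = np.linalg.solve(A.T, g)
    eta = psi @ (f - A @ u_h)
    print(f"true output error: {g @ (u - u_h): .3e}")
    print(f"adjoint estimate : {eta: .3e}")

In an h-adaptive method, elementwise contributions to this weighted residual mark which cells to refine.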
Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri
2017-06-01
Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, what methods of data analysis were used, and the extent to which the geospatial and qualitative methods were integrated. Fifteen studies were included; they used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about the methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation.
Developments in Sampling and Analysis Instrumentation for Stationary Sources
ERIC Educational Resources Information Center
Nader, John S.
1973-01-01
Instrumentation for the measurement of pollutant emissions is considered, including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. Measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)
The Weberian Legacy of Thom Greenfield.
ERIC Educational Resources Information Center
Samier, Eugenie
1996-01-01
Traces through Thomas Greenfield's work his use of Max Weber's interpretive social analysis, including Weber's view of the individual unit of analysis, value typologies, comparative history methods, and analytical ideal types. Compares Greenfield's and Weber's metaphysical assumptions, ontological perspectives, and epistemological frameworks.…
Use of Language Sample Analysis by School-Based SLPs: Results of a Nationwide Survey
ERIC Educational Resources Information Center
Pavelko, Stacey L.; Owens, Robert E., Jr.; Ireland, Marie; Hahs-Vaughn, Debbie L.
2016-01-01
Purpose: This article examines use of language sample analysis (LSA) by school-based speech-language pathologists (SLPs), including characteristics of language samples, methods of transcription and analysis, barriers to LSA use, and factors affecting LSA use, such as American Speech-Language-Hearing Association certification, number of years'…
Preventing Child Abuse: A Meta-Analysis of Parent Training Programs
ERIC Educational Resources Information Center
Lundahl, Brad W.; Nimer, Janelle; Parsons, Bruce
2006-01-01
Objective: A meta-analysis was conducted to evaluate the ability of parent training programs to reduce parents' risk of abusing a child. Method: A total of 23 studies were submitted to a meta-analysis. Outcomes of interest included parents' attitudes toward abuse, emotional adjustment, child-rearing skills, and actual abuse. Conclusions:…
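For readers unfamiliar with the pooling step such meta-analyses rely on, here is a minimal sketch of inverse-variance random-effects pooling (DerSimonian-Laird); the effect sizes and variances below are hypothetical, not values from the 23 studies:

    import numpy as np

    def random_effects_pool(effects, variances):
        """Inverse-variance pooling with the DerSimonian-Laird
        estimate of between-study variance tau^2."""
        effects, variances = np.asarray(effects), np.asarray(variances)
        w = 1.0 / variances                      # fixed-effect weights
        mu_fe = np.sum(w * effects) / np.sum(w)
        # DerSimonian-Laird tau^2 from Cochran's Q
        Q = np.sum(w * (effects - mu_fe) ** 2)
        k = len(effects)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (variances + tau2)          # random-effects weights
        mu = np.sum(w_re * effects) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return mu, se, tau2

    # Hypothetical standardized mean differences and their variances
    d = [0.45, 0.30, 0.62, 0.15, 0.50]
    v = [0.02, 0.05, 0.04, 0.03, 0.06]
    mu, se, tau2 = random_effects_pool(d, v)
    print(f"pooled d = {mu:.2f} "
          f"(95% CI {mu - 1.96*se:.2f} to {mu + 1.96*se:.2f}), tau^2 = {tau2:.3f}")

The pooled estimate and its confidence interval are what support conclusions about whether, on average, the programs reduced abuse risk.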
Manual of phosphoric acid fuel cell power plant cost model and computer program
NASA Technical Reports Server (NTRS)
Lu, C. Y.; Alkasab, K. A.
1984-01-01
Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimation of system capital costs, and an economic analysis which determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
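The levelized annual cost computation described typically combines an annualized capital charge with recurring costs. A minimal Python sketch using a standard capital recovery factor (all plant figures are hypothetical, not from the manual):

    def capital_recovery_factor(rate, years):
        """CRF = r(1+r)^n / ((1+r)^n - 1): converts a capital cost into
        an equivalent uniform annual payment over the economic life."""
        f = (1 + rate) ** years
        return rate * f / (f - 1)

    def levelized_annual_cost(capital, rate, years, annual_om, annual_fuel):
        """Levelized annual cost = annualized capital + O&M + fuel."""
        return capital * capital_recovery_factor(rate, years) + annual_om + annual_fuel

    # Hypothetical plant figures (illustrative only)
    cost = levelized_annual_cost(
        capital=18e6,       # installed capital cost, $
        rate=0.10,          # discount rate
        years=20,           # economic life
        annual_om=0.6e6,    # operation and maintenance, $/yr
        annual_fuel=2.4e6,  # fuel, $/yr
    )
    print(f"levelized annual cost: ${cost / 1e6:.2f} million/yr")

Dividing the result by annual energy output gives a levelized cost per kilowatt-hour, the usual figure of merit for comparing power plant options.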
A Meta-Analysis of Cross Sectional Studies Investigating Language in Maltreated Children
ERIC Educational Resources Information Center
Lum, Jarrad A. G.; Powell, Martine; Timms, Lydia; Snow, Pamela
2015-01-01
Purpose: In this review article, meta-analysis was used to summarize research investigating language skills in maltreated children. Method: A systematic search of published studies was undertaken. Studies were included in the meta-analysis if they investigated language skills in groups comprising maltreated and nonmaltreated children. Studies were…
For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impact analysis, and documents the RIA methods and results for the 2010 rules.
40 CFR Appendix A-7 to Part 60 - Test Methods 19 through 25E
Code of Federal Regulations, 2010 CFR
2010-07-01
... equations for Fw if %H and %O include the unavailable hydrogen and oxygen in the form of H2O.) 12.3.2.2Use... during the averaging period. 12.5.2.1Solid Fossil (Including Waste) Fuel/Sampling and Analysis. Note: For... for each gross sample. 12.5.2.2Liquid Fossil Fuel-Sampling and Analysis. See Note under Section 12.5.2...
40 CFR Appendix A-7 to Part 60 - Test Methods 19 through 25E
Code of Federal Regulations, 2012 CFR
2012-07-01
... equations for Fw if %H and %O include the unavailable hydrogen and oxygen in the form of H2O.) 12.3.2.2Use... during the averaging period. 12.5.2.1Solid Fossil (Including Waste) Fuel/Sampling and Analysis. Note: For... basis for each gross sample. 12.5.2.2Liquid Fossil Fuel-Sampling and Analysis. See Note under Section 12...
40 CFR Appendix A-7 to Part 60 - Test Methods 19 through 25E
Code of Federal Regulations, 2013 CFR
2013-07-01
... equations for Fw if %H and %O include the unavailable hydrogen and oxygen in the form of H2O.) 12.3.2.2Use... during the averaging period. 12.5.2.1Solid Fossil (Including Waste) Fuel/Sampling and Analysis. Note: For... basis for each gross sample. 12.5.2.2Liquid Fossil Fuel-Sampling and Analysis. See Note under Section 12...
40 CFR Appendix A-7 to Part 60 - Test Methods 19 through 25E
Code of Federal Regulations, 2011 CFR
2011-07-01
... equations for Fw if %H and %O include the unavailable hydrogen and oxygen in the form of H2O.) 12.3.2.2Use... during the averaging period. 12.5.2.1Solid Fossil (Including Waste) Fuel/Sampling and Analysis. Note: For... for each gross sample. 12.5.2.2Liquid Fossil Fuel-Sampling and Analysis. See Note under Section 12.5.2...
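These Method 19 excerpts concern fuel sampling, analysis, and F-factor calculations from ultimate analysis. As a rough illustration only, the dry F-factor can be computed from a fuel's ultimate analysis and gross calorific value; the coefficients below are those commonly cited for the EPA Method 19 dry F-factor equation (English units), and any regulatory use must be verified against the actual CFR text. As the excerpts note for Fw, the hydrogen and oxygen percentages must exclude unavailable hydrogen and oxygen bound as H2O.

    def f_factor_dry(pct_h, pct_c, pct_s, pct_n, pct_o, gcv_btu_per_lb):
        """Dry F-factor Fd [dscf per 10^6 Btu] from ultimate analysis
        (percent by weight) and gross calorific value [Btu/lb], using
        coefficients commonly cited for EPA Method 19 (illustrative;
        verify against the CFR). %H and %O must exclude hydrogen and
        oxygen bound as H2O."""
        return 1e6 * (3.64 * pct_h + 1.53 * pct_c + 0.57 * pct_s
                      + 0.14 * pct_n - 0.46 * pct_o) / gcv_btu_per_lb

    # Typical bituminous coal (illustrative values)
    print(f"Fd = {f_factor_dry(5.0, 75.0, 1.5, 1.5, 7.0, 13000):.0f} dscf/MMBtu")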