Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
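For readers unfamiliar with segmented regression, the interrupted time series model behind this comparison can be sketched in a few lines. The sketch below uses synthetic data and hypothetical names; it is a generic illustration, not the article's decision-support evaluation.

```python
import numpy as np

def segmented_regression(y, t0):
    """Interrupted time series fit with a level change and a slope change
    at intervention index t0 -- a minimal numpy sketch of segmented
    regression (synthetic data, hypothetical variable names)."""
    t = np.arange(len(y), dtype=float)
    d = (t >= t0).astype(float)                  # post-intervention indicator
    X = np.column_stack([np.ones_like(t), t, d, d * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, baseline slope, level change, slope change]

# Synthetic series: baseline slope 0.5, level drop of 3.0 at t0 = 10.
t = np.arange(20)
y = 2.0 + 0.5 * t - 3.0 * (t >= 10)
beta = segmented_regression(y, 10)
```

The fitted level-change coefficient recovers the intervention effect directly, which is the quantity an SPC chart would instead flag as a shift in the process mean.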
Improved methods of vibration analysis of pretwisted, airfoil blades
NASA Technical Reports Server (NTRS)
Subrahmanyam, K. B.; Kaza, K. R. V.
1984-01-01
Vibration analysis of pretwisted blades of asymmetric airfoil cross section is performed by using two mixed variational approaches. Numerical results obtained from these two methods are compared to those obtained from an improved finite difference method and also to those given by the ordinary finite difference method. The relative merits, convergence properties and accuracies of all four methods are studied and discussed. The effects of asymmetry and pretwist on natural frequencies and mode shapes are investigated. The improved finite difference method is shown to be far superior to the conventional finite difference method in several respects. Close lower bound solutions are provided by the improved finite difference method for untwisted blades with a relatively coarse mesh while the mixed methods have not indicated any specific bound.
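As a toy illustration of why mesh refinement drives such convergence comparisons, consider the ordinary central finite-difference method applied to a model vibration eigenvalue problem (a fixed-fixed string, not the pretwisted-blade equations; the setup is purely illustrative):

```python
import numpy as np

def lowest_eigenvalue(n):
    """Lowest eigenvalue of -u'' = lam*u with u(0)=u(1)=0, discretized by
    the ordinary central finite-difference method on n interior points.
    The exact answer is pi**2; the discrete value converges from below."""
    h = 1.0 / (n + 1)
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
    return np.linalg.eigvalsh(A)[0]             # smallest eigenvalue

coarse = lowest_eigenvalue(10)
fine = lowest_eigenvalue(40)
exact = np.pi ** 2
```

On this model problem the ordinary scheme gives lower-bound approximations whose error shrinks as the mesh is refined, mirroring the coarse-mesh lower-bound behavior noted for the improved method above.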
An Improved Manual Method for NOx Emission Measurement.
ERIC Educational Resources Information Center
Dee, L. A.; And Others
The current manual NOx sampling and analysis method was evaluated, and improved time-integrated sampling and rapid analysis methods were developed. In the new method, the sample gas is drawn through a heated bed of uniquely active, crystalline PbO2, where NOx is quantitatively absorbed. Nitrate ion is later extracted with water and the…
Some New Mathematical Methods for Variational Objective Analysis
NASA Technical Reports Server (NTRS)
Wahba, G.; Johnson, D. R.
1984-01-01
New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.
Linnorm: improved statistical analysis for single cell RNA-seq expression data
Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung
2017-01-01
Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noise and simultaneously preserve biological variation in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748
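As context for what a normalization-and-transformation step does, here is a deliberately simple stand-in: per-cell library-size scaling followed by a log transform. This is NOT the Linnorm algorithm; the toy count matrix and the CPM scaling are illustrative assumptions only.

```python
import numpy as np

def normalize_log(counts):
    """Generic library-size normalization (counts per million) plus a
    log1p transform -- a crude stand-in for what methods like Linnorm
    do far more carefully (this is not the Linnorm algorithm)."""
    counts = np.asarray(counts, dtype=float)
    lib = counts.sum(axis=0, keepdims=True)      # total counts per cell
    cpm = counts / lib * 1e6                     # counts per million
    return np.log1p(cpm)

# Toy genes-by-cells count matrix: 2 genes, 2 cells.
genes_by_cells = np.array([[10.0, 0.0],
                           [90.0, 50.0]])
x = normalize_log(genes_by_cells)
```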
Adapting Job Analysis Methodology to Improve Evaluation Practice
ERIC Educational Resources Information Center
Jenkins, Susan M.; Curtin, Patrick
2006-01-01
This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…
EHR Improvement Using Incident Reports.
Teame, Tesfay; Stålhane, Tor; Nytrø, Øystein
2017-01-01
This paper discusses reactive improvement of clinical software using methods for incident analysis. We used the "Five Whys" method because we had only descriptive data and depended on a domain expert for the analysis. The analysis showed that there are two major root causes for EHR software failure, and that they are related to human and organizational errors. A main identified improvement is allocating more resources to system maintenance and user training.
NASA Astrophysics Data System (ADS)
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) is becoming the predominant technique; in some special cases, however, PLSR produces considerable errors. To solve this problem, the traditional principal component regression (PCR) method is improved in this paper by using the principle of PLSR. The experimental results show that, for some special data sets, the improved PCR method performs better than PLSR. PCR and PLSR are the focus of this paper. First, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted by using the principle of PLSR. Second, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components are obtained. Finally, the same water spectral data set is calculated by PLSR and by the improved PCR, and the two results are analyzed and compared: the improved PCR and PLSR are similar for most data, but the improved PCR is better than PLSR for data near the detection limit. Both PLSR and the improved PCR can be used in UV spectral analysis of water, but for data near the detection limit, the improved PCR gives better results.
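The core of principal component regression is compact enough to sketch directly: project the spectra onto the leading principal components, then regress on the scores. The low-rank synthetic "spectra" below are illustrative; the paper's PLSR-guided component selection is not reproduced here.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: project the (centered) spectra onto
    the top-k principal components via the SVD, then solve ordinary least
    squares on the scores. A minimal numpy sketch."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k]                                   # top-k loading vectors
    scores = Xc @ V.T
    A = np.column_stack([np.ones(len(y)), scores])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, V

# Hypothetical low-rank data standing in for UV absorbance spectra.
rng = np.random.default_rng(1)
T = rng.normal(size=(50, 3))                     # latent constituent levels
X = T @ rng.normal(size=(3, 12))                 # 50 spectra, 12 wavelengths
y = T @ np.array([0.8, -1.5, 0.3])               # analyte concentration

coef, V = pcr_fit(X, y, k=3)
y_hat = coef[0] + ((X - X.mean(axis=0)) @ V.T) @ coef[1:]
```

PLSR differs only in how the components are chosen: it uses the covariance with y rather than the variance of X alone, which is exactly the principle the improved PCR borrows.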
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby
2017-01-01
Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
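A minimal sketch of the sub-model idea, with ordinary linear fits standing in for PLS and made-up one-dimensional data. The regime boundary, the blending ramp and all numbers are illustrative assumptions, not the ChemCam calibration.

```python
import numpy as np

def fit_linear(X, y):
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return coef[0] + X @ coef[1:]

# Hypothetical data: one "spectral" feature, two composition regimes.
rng = np.random.default_rng(2)
X = rng.uniform(0, 10, size=(200, 1))
y = np.where(X[:, 0] < 5, 2.0 * X[:, 0], 1.0 + 1.5 * X[:, 0])

full = fit_linear(X, y)                         # full-range model routes samples
low = fit_linear(X[X[:, 0] < 5], y[X[:, 0] < 5])
high = fit_linear(X[X[:, 0] >= 5], y[X[:, 0] >= 5])

def blended_predict(x):
    """Route each sample with the full-range model, then blend the two
    sub-model predictions with a linear ramp near the regime boundary."""
    rough = predict(full, x)
    w = np.clip((rough - 8.0) / 4.0, 0.0, 1.0)  # 0 -> low model, 1 -> high model
    return (1.0 - w) * predict(low, x) + w * predict(high, x)

y_hat = blended_predict(X)
```

Away from the boundary each sub-model is used alone and fits its own regime far better than the single full-range line could, which is the source of the RMSEP improvement described above.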
Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.
Echinaka, Yuki; Ozeki, Yukiyasu
2016-10-01
The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. On the basis of this method, the bootstrap method is introduced and a numerical criterion for discriminating the transition type is proposed.
Coplen, T.B.; Wildman, J.D.; Chen, J.
1991-01-01
Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor performance of the catalyst, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analysis of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
Errors arise in calculating the fatigue damage of details in liquid cargo tanks with the traditional spectral analysis method, which is based on a linear system, because of the nonlinear relationship between the dynamic stress and the ship acceleration. An improved spectral analysis method for assessing the fatigue damage of liquid cargo tank details is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the calculated damage with results from the time-domain method. The proposed method is thus more accurate for calculating the fatigue damage of details in ship liquid cargo tanks.
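The central quantity in any such spectral fatigue approach is the stress PSD. A bare-bones estimate from a sampled stress history is sketched below on a toy two-component signal (no windowing or averaging; purely illustrative, not the paper's Fourier-series construction):

```python
import numpy as np

def stress_psd(stress, fs):
    """One-sided power spectral density of a stress history via the FFT.
    Simplified periodogram: no window, no segment averaging."""
    n = len(stress)
    S = np.fft.rfft(stress)
    psd = (np.abs(S) ** 2) / (fs * n)
    psd[1:-1] *= 2.0                       # fold in negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

fs = 100.0                                 # sampling rate, Hz
t = np.arange(0, 20, 1.0 / fs)
stress = 3.0 * np.sin(2 * np.pi * 1.5 * t) + 1.0 * np.sin(2 * np.pi * 4.0 * t)
freqs, psd = stress_psd(stress, fs)
peak = freqs[np.argmax(psd)]               # dominant stress frequency
```

Integrating this PSD over frequency recovers the stress variance, which is what spectral fatigue formulas consume.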
NASA Astrophysics Data System (ADS)
Sivasubramaniam, Kiruba
This thesis makes advances in three-dimensional finite element analysis of electrical machines and the quantification of their parameters and performance. The principal objectives of the thesis are: (1) the development of a stable and accurate method of nonlinear three-dimensional field computation and application to electrical machinery and devices; and (2) improvement in the accuracy of determination of performance parameters, particularly forces and torque computed from finite elements. Contributions are made in two general areas: a more efficient formulation for three-dimensional finite element analysis which saves time and improves accuracy, and new post-processing techniques to calculate flux density values from a given finite element solution. A novel three-dimensional magnetostatic solution based on a modified scalar potential method is implemented. This method has significant advantages over the traditional total scalar, reduced scalar or vector potential methods. The new method is applied to a 3D geometry of an iron core inductor and a permanent magnet motor. The results obtained are compared with those obtained from traditional methods, in terms of accuracy and speed of computation. A technique which has been observed to improve force computation in two-dimensional analysis using a local solution of Laplace's equation in the airgap of machines is investigated and a similar method is implemented in the three-dimensional analysis of electromagnetic devices. A new integral formulation to improve force calculation from a smoother flux-density profile is also explored and implemented. Comparisons are made and conclusions drawn as to how much improvement is obtained and at what cost. This thesis also demonstrates the use of finite element analysis to analyze torque ripples due to rotor eccentricity in permanent magnet BLDC motors.
A new method for analyzing torque harmonics based on data obtained from a time stepping finite element analysis of the machine is explored and implemented.
[Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].
Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping
2014-06-01
The stability and adaptability of a qualitative analysis model for near-infrared spectra were studied. Separate modeling can significantly improve the stability and adaptability of the model, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability of the model but also its stability; at the same time, compared with separate modeling, it shortens the modeling time, reduces the modeling workload, extends the term of validity of the model, and improves modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet the requirements of application, while the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the joint model are better than those of the separate model, and the method has good application value.
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Tickner, James; Ganly, Brianna; Lovric, Bojan; O'Dwyer, Joel
2017-04-01
Mining companies rely on chemical analysis methods to determine concentrations of gold in mineral ore samples. As gold is often mined commercially at concentrations around 1 part-per-million, it is necessary for any analysis method to provide good sensitivity as well as high absolute accuracy. We describe work to improve both the sensitivity and accuracy of the gamma activation analysis (GAA) method for gold. We present analysis results for several suites of ore samples and discuss the design of a GAA facility designed to replace conventional chemical assay in industrial applications. Copyright © 2017. Published by Elsevier Ltd.
Methods for consistent forewarning of critical events across multiple data channels
Hively, Lee M.
2006-11-21
This invention teaches further method improvements to forewarn of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves conversion of time-serial data into equiprobable symbols. A second improvement is a method to maximize the channel-consistent total-true rate of forewarning from a plurality of data channels over multiple data sets from the same patient or process. This total-true rate requires resolution of the forewarning indications into true positives, true negatives, false positives and false negatives. A third improvement is the use of various objective functions, as derived from the phase-space dissimilarity measures, to give the best forewarning indication. A fourth improvement uses various search strategies over the phase-space analysis parameters to maximize said objective functions. A fifth improvement shows the usefulness of the method for various biomedical and machine applications.
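The first improvement, converting time-serial data into equiprobable symbols, can be sketched directly: cut the amplitude range at empirical quantiles so that each symbol occurs equally often. The symbol count and synthetic data below are arbitrary illustrative choices, not the patent's parameters.

```python
import numpy as np

def equiprobable_symbols(x, n_symbols):
    """Partition the amplitude range of a series at empirical quantiles so
    each of the n_symbols symbols is (approximately) equally likely --
    a generic sketch of equiprobable symbolization."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(edges, x, side="right")

rng = np.random.default_rng(3)
x = rng.normal(size=10000)                 # stand-in time-serial data
s = equiprobable_symbols(x, 4)
counts = np.bincount(s, minlength=4)       # roughly 2500 per symbol
```

Equiprobable partitioning makes the downstream phase-space statistics insensitive to the amplitude distribution of the raw channel, which is why it helps when comparing dissimilarity measures across data sets.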
NASA Astrophysics Data System (ADS)
Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru
In recent years, the increasing number of heavy rainfall events, such as unpredictable cloudbursts, has made it necessary to improve the safety of the embankments of small earth dams. However, the severe financial condition of the government and local autonomous bodies requires the cost of improving them to be reduced. This study concerns the development of a method for evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of the areas downstream of small earth dams at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and the life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, use of the life cycle cost evaluation method will lead to a technique for selecting the optimal improvement or countermeasures against heavy rainfall.
Improved nonlinear prediction method
NASA Astrophysics Data System (ADS)
Adenan, Nur Hamiza; Md Noorani, Mohd Salmi
2014-06-01
The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed for application in various areas, such as weather forecasting, financial markets and hydrological phenomena, involving data that are contaminated by noise. Therefore, various techniques have been introduced to improve the analysis and prediction of time series data. Given the importance of analysis and the accuracy of the prediction result, a study was undertaken to test the effectiveness of the improved nonlinear prediction method on data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. The phase space reconstruction is then performed on the composite (one-dimensional) data to reconstruct a number of space dimensions, and finally the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that, using the improved method, the predictions are in close agreement with the observed values; the correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improvement for analyzing noisy data, without involving any noise reduction method, was introduced to predict the time series data.
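The phase-space prediction step can be sketched with a simple nearest-neighbour variant: embed the series with delay coordinates, find past states close to the current one, and average their successors. This is a zeroth-order stand-in for the paper's method; the composite difference series and the local linear fit are not reproduced, and the logistic map data below are the noise-free case only.

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def predict_next(x, dim=3, k=5):
    """Nearest-neighbour phase-space prediction: embed the series, find
    the k past states closest to the current state, and average the
    values that followed them."""
    E = delay_embed(x, dim)
    current, past = E[-1], E[:-1]
    successors = x[dim:]                   # value following each past state
    d = np.linalg.norm(past - current, axis=1)
    idx = np.argsort(d)[:k]
    return successors[idx].mean()

# Noise-free logistic map series as test data.
x = np.empty(500)
x[0] = 0.3
for i in range(499):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])

# One-step-ahead errors over the last 20 points.
errors = [abs(predict_next(x[:i]) - x[i]) for i in range(480, 500)]
```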
Security Analysis and Improvements to the PsychoPass Method
2013-01-01
Background: In a recent paper, Pietro Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective: To perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. Methods: We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. Results: The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout as was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, it is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. Conclusions: The proposed improved PsychoPass method yields passwords that could be broken only in hundreds of years given current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength. PMID:23942458
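The password-strength arithmetic behind such claims can be made concrete with a back-of-the-envelope entropy estimate. The per-key choice counts below are rough assumptions for illustration, not figures taken from the paper.

```python
import math

def entropy_bits(choices_per_key, n_keys):
    """Entropy of a password scheme in bits, assuming each key press is an
    independent choice among `choices_per_key` options (an idealization)."""
    return n_keys * math.log2(choices_per_key)

# Adjacent-key scheme: each next key has roughly 8 neighbours to choose
# from, so 24 characters carry far less entropy than their length suggests.
walk_24 = entropy_bits(8, 24)        # 24 * 3 = 72 bits
random_10 = entropy_bits(94, 10)     # 10 keys over printable ASCII, ~65.5 bits
```

Under these assumptions, 10 unconstrained keys already rival 24 neighbour-constrained ones, which is the shape of the trade-off the improved method exploits.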
Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A
NASA Technical Reports Server (NTRS)
Perez, Reinaldo J.
1990-01-01
A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the suggested testing methods used for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Suggestions on present analysis and test methods are introduced to harmonize and bring the analysis and testing tools in MIL-STD-1541A into closer agreement. It is suggested that test procedures in MIL-STD-1541A must be improved by providing alternatives to the present use of shielded enclosures as the primary site for such tests. In addition, the alternate use of anechoic chambers and open field test sites must be considered.
NASA Astrophysics Data System (ADS)
Vinh, T.
1980-08-01
There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. Convenient analytical lightning models adequate for engineering use are needed. In this study, analytical lightning models were developed, along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high voltage substations from direct strokes.
Efficient multiscale magnetic-domain analysis of iron-core material under mechanical stress
NASA Astrophysics Data System (ADS)
Nishikubo, Atsushi; Ito, Shumpei; Mifune, Takeshi; Matsuo, Tetsuji; Kaido, Chikara; Takahashi, Yasuhito; Fujiwara, Koji
2018-05-01
For an efficient analysis of magnetization, a partial-implicit solution method is improved using an assembled domain structure model with six-domain mesoscopic particles exhibiting pinning-type hysteresis. The quantitative analysis of non-oriented silicon steel succeeds in predicting the stress dependence of hysteresis loss with computation times greatly reduced by using the improved partial-implicit method. The effect of cell division along the thickness direction is also evaluated.
Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang
2017-03-01
This study aimed to propose a continual improvement strategy based on quality by design (QbD). An ultra-high performance liquid chromatography (UPLC) method was developed to accomplish the method transfer from HPLC to UPLC for Panax notoginseng saponins (PNS), taken as an example of QbD-based continual improvement. Plackett-Burman screening design and Box-Behnken optimization design were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs), and the Bayesian design space was then built. The separation degree of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) was over 2.0 and the analysis time was less than 17 min for a method chosen from the design space, with 20% initial acetonitrile concentration, 10 min isocratic time and a gradient slope of 6%•min⁻¹. Finally, the optimum method was validated by accuracy profile. Based on the same analytical target profile (ATP), a comparison of HPLC and UPLC, covering the chromatographic method, CMA identification, the CMP-CMA model and the system suitability test (SST), indicated that the UPLC method could shorten the analysis time, improve the critical separation and satisfy the requirements of the SST. In all, the HPLC method could be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.
Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polly, B.
2011-09-01
This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA involves preparing inputs to the PROMETHEE methods (identifying the alternatives, defining the criteria, deriving criteria weights using the analytic hierarchy process (AHP), defining the probability distributions of the criteria weights, and conducting Monte Carlo simulation (MCS)); running the PROMETHEE methods with these inputs; and conducting a sensitivity analysis. A case study at a mine site was presented to demonstrate the improved framework. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on the selection. Copyright © 2013 Elsevier Ltd. All rights reserved.
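The deterministic core of PROMETHEE II fits in a few lines: pairwise preference indices aggregated into net outranking flows. The sketch below uses the "usual" preference function and hypothetical alternatives; the study's AHP-derived weights, thresholds and Monte Carlo layer are not reproduced.

```python
import numpy as np

def promethee_ii(M, weights, maximize):
    """Minimal PROMETHEE II: with the 'usual' preference function, an
    alternative gains a criterion's full weight whenever it strictly beats
    another on that criterion; net flow = positive minus negative flow."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    pi = np.zeros((n, n))                       # aggregated preference indices
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pi[a, b] = sum(
                w for j, w in enumerate(weights)
                if M[a, j] != M[b, j] and (M[a, j] > M[b, j]) == maximize[j]
            )
    return (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)

# Hypothetical remedial alternatives scored on cost (minimize)
# and effectiveness (maximize); values and weights are made up.
M = [[3.0, 0.7],     # e.g. a soil cover
     [9.0, 0.9],     # e.g. collect-and-treat
     [5.0, 0.8]]     # e.g. a passive barrier
phi = promethee_ii(M, weights=[0.6, 0.4], maximize=[False, True])
```

Ranking alternatives by descending net flow `phi` gives the complete PROMETHEE II preorder; the probabilistic variant repeats this with weights drawn from the assumed distributions.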
López-Pacheco, María G; Sánchez-Fernández, Luis P; Molina-Lozano, Herón
2014-01-15
Noise levels of common sources such as vehicles, whistles, sirens, car horns and crowd sounds are mixed in urban soundscapes. Nowadays, environmental acoustic analysis is performed on mixture signals recorded by monitoring systems. These mixed signals hamper the individual analysis that is useful for taking actions to reduce and control environmental noise. This paper aims at separating the noise sources individually from recorded mixtures in order to evaluate the noise level of each estimated source. A method based on blind deconvolution and blind source separation in the wavelet domain is proposed. This approach provides a basis for improving results obtained in monitoring and analysis of common noise sources in urban areas. The method is validated through experiments based on knowledge of the predominant noise sources in urban soundscapes. Actual recordings of common noise sources are used to acquire mixture signals with a microphone array in semi-controlled environments. The developed method has demonstrated large performance improvements in the identification, analysis and evaluation of common urban sources. © 2013 Elsevier B.V. All rights reserved.
Methods for Improving Information from ’Undesigned’ Human Factors Experiments.
Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
NASA Astrophysics Data System (ADS)
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low accuracy when evaluating the performance of mining projects, a performance evaluation model for mineral projects founded on an improved entropy method is proposed. First, a new weight-assignment model is established, based on compatibility matrix analysis from the analytic hierarchy process (AHP) combined with the entropy value method: once the compatibility matrix analysis meets the consistency requirement, any difference between the subjective and objective weights is resolved by moderately adjusting their proportions, and on this basis the fuzzy evaluation matrix for performance evaluation is constructed. Simulation experiments show that, compared with the traditional entropy and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method achieves higher assessment accuracy.
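The two building blocks of such a model, entropy-based objective weights and a moderated blend with subjective (AHP-style) weights, can be sketched as follows; the indicator matrix, the subjective weights and the blending share are made-up illustrations, not the paper's data:

```python
import numpy as np

def entropy_weights(X):
    """Objective indicator weights from Shannon entropy: indicators whose
    values differ more across projects carry more weight. Rows of X are
    projects, columns are (strictly positive) indicator values."""
    P = X / X.sum(axis=0)                              # column-wise proportions
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - E                                        # degree of diversification
    return d / d.sum()

def blended_weights(w_subjective, w_objective, alpha=0.5):
    """Moderately combine AHP-style subjective weights with entropy-based
    objective weights; alpha is the share given to the subjective side."""
    w = alpha * np.asarray(w_subjective) + (1 - alpha) * np.asarray(w_objective)
    return w / w.sum()

X = np.array([[0.6, 0.3],
              [0.4, 0.7]])                 # two projects, two indicators
w_obj = entropy_weights(X)                 # second indicator varies more -> heavier
w = blended_weights([0.7, 0.3], w_obj, alpha=0.5)
```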
ERIC Educational Resources Information Center
Hallam, Rena; Hooper, Alison; Bargreen, Kaitlin; Buell, Martha; Han, Myae
2017-01-01
Research Findings: The current study is a mixed-methods investigation of family child care provider participation in voluntary Quality Rating and Improvement Systems (QRIS) in 2 states. Study 1 is an analysis of matched QRIS and child care licensing administrative data extracted from both states in May 2014. Poverty and population density…
USDA-ARS?s Scientific Manuscript database
The development of genomic selection methodology, with accompanying substantial gains in reliability for low-heritability traits, may dramatically improve the feasibility of genetic improvement of dairy cow health. Many methods for genomic analysis have now been developed, including the “Bayesian Al...
NASA Technical Reports Server (NTRS)
Jordan, F. L., Jr.
1980-01-01
As part of basic research to improve aerial applications technology, methods were developed at the Langley Vortex Research Facility to simulate and measure deposition patterns of aerially-applied sprays and granular materials by means of tests with small-scale models of agricultural aircraft and dynamically-scaled test particles. Interactions between the aircraft wake and the dispersed particles are being studied with the objective of modifying wake characteristics and dispersal techniques to increase swath width, improve deposition pattern uniformity, and minimize drift. The particle scaling analysis, test methods for particle dispersal from the model aircraft, visualization of particle trajectories, and measurement and computer analysis of test deposition patterns are described. An experimental validation of the scaling analysis and test results that indicate improved control of chemical drift by use of winglets are presented to demonstrate test methods.
Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam
2015-01-01
Introduction: Given the importance of learning disabilities and the need for interventions that ameliorate these disorders in order to prevent future problems, this study used a meta-analysis model to examine the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Methods: Using the meta-analysis method to integrate the results of various studies, this study estimates the effect of psychological and educational interventions. To this end, 57 studies with acceptable methodology were selected and subjected to meta-analysis. The research instrument was a meta-analysis checklist. Results: The effect sizes of psychological-educational interventions on improving the academic performance of students with mathematics disorder, impaired writing and dyslexia were 0.57, 0.50 and 0.55, respectively. Conclusions: The meta-analysis showed that, according to Cohen's table, the effect size is above average; it can be said that educational and psychological interventions improve the academic performance of students with learning disabilities. PMID:26430685
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
Improved omit set displacement recoveries in dynamic analysis
NASA Technical Reports Server (NTRS)
Allen, Tom; Cook, Greg; Walls, Bill
1993-01-01
Two related methods for improving the dependent (OMIT set) displacements after performing a Guyan reduction are presented. The theoretical bases for the methods are derived. The NASTRAN DMAP ALTERs used to implement the methods in a NASTRAN execution are described. Data are presented that verify the methods and the NASTRAN DMAP ALTERs.
Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis
NASA Astrophysics Data System (ADS)
Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei
2018-01-01
In order to evaluate the development level of the low-voltage distribution network objectively and scientifically, a hierarchical analysis method is utilized to construct the evaluation index model of the low-voltage distribution network. Based on principal component analysis and the approximately logarithmic distribution of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The algorithm decorrelates and reduces the dimensions of the evaluation model, and the comprehensive score shows a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, realizing a stratified evaluation of the courts. An example is given to verify the objectivity and scientificity of the evaluation method.
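The logarithmic-centralization idea can be sketched as a log transform before an ordinary SVD-based PCA; the synthetic log-normal index data below are an assumption for illustration, not the paper's grid data:

```python
import numpy as np

def log_centered_pca(X, k=2):
    """PCA applied after a log transform and column centering -- a sketch of
    the logarithmic centralization idea for roughly log-normally distributed
    index data. X must be strictly positive."""
    Z = np.log(X)
    Z = Z - Z.mean(axis=0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)   # SVD-based PCA
    scores = Z @ Vt[:k].T                              # projections on first k components
    explained = S**2 / np.sum(S**2)                    # variance ratio per component
    return scores, explained[:k]

rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(50, 4))   # synthetic index data
scores, explained = log_centered_pca(X, k=2)
```

The component scores are uncorrelated by construction, which is the decorrelation property the abstract refers to; a clustering step could then be run on `scores`.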
NASA Astrophysics Data System (ADS)
Zhao, Shijia; Liu, Zongwei; Wang, Yue; Zhao, Fuquan
2017-01-01
Subjectivity usually causes large fluctuations in evaluation results. Many scholars attempt to establish new mathematical methods to make evaluation results consistent with actual objective situations. An improved catastrophe progression method (ICPM) is constructed to overcome the defects of the original method. The improved method combines the merits of principal component analysis (information coherence) and the catastrophe progression method (no subjective index weights), giving a highly objective comprehensive evaluation. Through a systematic analysis of the factors influencing the automotive industry's core technology capacity, a comprehensive evaluation model is established with a hierarchical structure according to the different roles that different indices play in evaluating the overall goal. Moreover, the ICPM is applied to evaluate the core technology capacity of the automotive industries of seven typical countries, which demonstrates the effectiveness of the method.
On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods
NASA Technical Reports Server (NTRS)
Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.
2003-01-01
Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
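One common way to exploit cheap sensitivity derivatives for variance reduction, not necessarily the paper's exact scheme, is to use the first-order Taylor expansion of the response as a control variate whose mean is known exactly; the response function, distribution and parameters below are illustrative assumptions:

```python
import numpy as np

def mean_with_gradient_cv(f, grad_f, mu, sigma, n=20000, seed=0):
    """Monte Carlo estimate of E[f(X)], X ~ N(mu, diag(sigma^2)), using the
    first-order Taylor surrogate g(x) = f(mu) + grad_f(mu).(x - mu) as a
    control variate: g is nearly free once sensitivity derivatives exist,
    and its exact mean is f(mu)."""
    rng = np.random.default_rng(seed)
    X = mu + sigma * rng.standard_normal((n, len(mu)))
    fx = np.array([f(x) for x in X])
    gx = f(mu) + (X - mu) @ grad_f(mu)                 # linear surrogate samples
    beta = np.cov(fx, gx)[0, 1] / np.var(gx, ddof=1)   # optimal CV coefficient
    return fx.mean() - beta * (gx.mean() - f(mu))      # corrected estimate

f = lambda x: x[0]**2 + np.sin(x[1])
grad_f = lambda x: np.array([2 * x[0], np.cos(x[1])])
mu = np.array([1.0, 0.0])
sigma = np.array([0.3, 0.3])
est = mean_with_gradient_cv(f, grad_f, mu, sigma)
# exact mean here: E[x0^2] = 1 + 0.09 and E[sin(x1)] = 0, i.e. 1.09
```

Because the control variate removes the linear part of the response, only the (much smaller) nonlinear remainder contributes sampling noise.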
An interactive method based on the live wire for segmentation of the breast in mammography images.
Zewei, Zhang; Tianyue, Wang; Li, Guo; Tingting, Wang; Lu, Xu
2014-01-01
In order to improve the accuracy of computer-aided diagnosis of breast lumps, the authors introduce an improved interactive segmentation method based on Live Wire. Gabor filters and the FCM clustering algorithm are introduced into the definition of the Live Wire cost function. Using FCM analysis of the image for edge enhancement, the method eliminates the interference of weak edges and obtains clear segmentation results for the external features of breast lumps; the improved Live Wire was applied to two cases of breast segmentation data. Compared with traditional image segmentation methods, experimental results show that the method achieves more accurate segmentation of breast lumps and provides a more accurate objective basis for quantitative and qualitative analysis of breast lumps.
Maikusa, Norihide; Yamashita, Fumio; Tanaka, Kenichiro; Abe, Osamu; Kawaguchi, Atsushi; Kabasawa, Hiroyuki; Chiba, Shoma; Kasahara, Akihiro; Kobayashi, Nobuhisa; Yuasa, Tetsuya; Sato, Noriko; Matsuda, Hiroshi; Iwatsubo, Takeshi
2013-06-01
Serial magnetic resonance imaging (MRI) images acquired from multisite and multivendor MRI scanners are widely used in measuring longitudinal structural changes in the brain. Precise and accurate measurements are important in understanding the natural progression of neurodegenerative disorders such as Alzheimer's disease. However, geometric distortions in MRI images decrease the accuracy and precision of volumetric or morphometric measurements. To solve this problem, the authors suggest a commercially available phantom-based distortion correction method that accommodates the variation in geometric distortion within MRI images obtained with multivendor MRI scanners. The authors' method is based on image warping using a polynomial function. The method detects fiducial points within a phantom image using phantom analysis software developed by the Mayo Clinic and calculates warping functions for distortion correction. To quantify the effectiveness of the authors' method, the authors corrected phantom images obtained from multivendor MRI scanners and calculated the root-mean-square (RMS) of fiducial errors and the circularity ratio as evaluation values. The authors also compared the performance of the authors' method with that of a distortion correction method based on a spherical harmonics description of the generic gradient design parameters. Moreover, the authors evaluated whether this correction improves the test-retest reproducibility of voxel-based morphometry in human studies. A Wilcoxon signed-rank test with uncorrected and corrected images was performed. The root-mean-square errors and circularity ratios for all slices significantly improved (p < 0.0001) after the authors' distortion correction. Additionally, the authors' method was significantly better than a distortion correction method based on a description of spherical harmonics in improving the distortion of root-mean-square errors (p < 0.001 and 0.0337, respectively). 
Moreover, the authors' method reduced the RMS error arising from gradient nonlinearity more than gradwarp methods. In human studies, the coefficient of variation of voxel-based morphometry analysis of the whole brain improved significantly from 3.46% to 2.70% after distortion correction of the whole gray matter using the authors' method (Wilcoxon signed-rank test, p < 0.05). The authors proposed a phantom-based distortion correction method to improve reproducibility in longitudinal structural brain analysis using multivendor MRI. The authors evaluated the authors' method for phantom images in terms of two geometrical values and for human images in terms of test-retest reproducibility. The results showed that distortion was corrected significantly using the authors' method. In human studies, the reproducibility of voxel-based morphometry analysis for the whole gray matter significantly improved after distortion correction using the authors' method.
Mornkham, Tanupat; Wangsomnuk, Preeya Puangsomlee; Fu, Yong-Bi; Wangsomnuk, Pinich; Jogloy, Sanun; Patanothai, Aran
2013-04-29
Jerusalem artichoke (Helianthus tuberosus L.) is an important tuber crop. However, Jerusalem artichoke seeds contain high levels of starch and lipid, making the extraction of high-quality RNA extremely difficult and gene expression analysis challenging. This study aimed to improve existing methods for extracting total RNA from Jerusalem artichoke dry seeds and to assess the applicability of the improved method in other plant species. Five RNA extraction methods were evaluated on Jerusalem artichoke seeds and two were modified. The modified method with the significant improvement was applied to assay seeds of diverse Jerusalem artichoke accessions, sunflower, rice, maize, peanut and marigold. The effectiveness of the improved method in extracting total RNA from seeds was assessed using qPCR analysis of four selected genes. The improved method of Ma and Yang (2011) yielded maximum RNA solubility and removed most interfering substances. The improved protocol generated 29 to 41 µg RNA/30 mg fresh weight, and A260/A280 ratios of 1.79 to 2.22 confirmed RNA purity. The extracted RNA was effective for downstream applications such as first-strand cDNA synthesis, cDNA cloning and qPCR. The improved method was also effective for extracting total RNA from seeds of sunflower, rice, maize and peanut, which are rich in polyphenols, lipids and polysaccharides.
Improved score statistics for meta-analysis in single-variant and gene-level association studies.
Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo
2018-06-01
Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they suffer substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss of the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods that perform equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
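The generic score-based meta-analysis that the improved statistics build on can be sketched as follows; this is the textbook fixed-effects combination, not the authors' corrected version, and the study numbers are hypothetical:

```python
import math

def meta_score_test(U_list, V_list):
    """Fixed-effects combination of per-study score statistics: U_k is the
    sum of score contributions and V_k the Fisher information in study k.
    Under the null, Z = sum(U_k) / sqrt(sum(V_k)) is approximately N(0, 1)."""
    U = sum(U_list)
    V = sum(V_list)
    z = U / math.sqrt(V)
    return z, z * z          # Z statistic and its 1-df chi-square

z, chi2 = meta_score_test([2.0, 1.0], [4.0, 1.0])   # two hypothetical studies
```

The paper's contribution is, in effect, a better approximation of the joint U and V from summary statistics when case-control ratios differ across studies.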
Modification of a successive corrections objective analysis for improved higher order calculations
NASA Technical Reports Server (NTRS)
Achtemeier, Gary L.
1988-01-01
The use of objectively analyzed fields of meteorological data for the initialization of numerical prediction models and for complex diagnostic studies places the requirements upon the objective method that derivatives of the gridded fields be accurate and free from interpolation error. A modification was proposed for an objective analysis developed by Barnes that provides improvements in analysis of both the field and its derivatives. Theoretical comparisons, comparisons between analyses of analytical monochromatic waves, and comparisons between analyses of actual weather data are used to show the potential of the new method. The new method restores more of the amplitudes of desired wavelengths while simultaneously filtering more of the amplitudes of undesired wavelengths. These results also hold for the first and second derivatives calculated from the gridded fields. Greatest improvements were for the Laplacian of the height field; the new method reduced the variance of undesirable very short wavelengths by 72 percent. Other improvements were found in the divergence of the gridded wind field and near the boundaries of the field of data.
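The uncorrected Barnes scheme that such a modification builds on can be sketched as follows; the station layout, the length scale `kappa` and the convergence parameter `gamma` are illustrative assumptions:

```python
import numpy as np

def barnes_analysis(obs_xy, obs_val, grid_xy, kappa, gamma=0.3, passes=2):
    """Barnes successive-correction analysis: Gaussian-weighted averaging of
    scattered observations onto grid points, followed by correction passes
    with a reduced length scale (gamma * kappa) that restore amplitude of the
    desired wavelengths. A standard sketch, not the modified scheme of the
    paper."""
    def weights(targets, scale):
        d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(axis=-1)
        w = np.exp(-d2 / scale)
        return w / w.sum(axis=1, keepdims=True)

    grid = weights(grid_xy, kappa) @ obs_val          # first-pass analysis
    at_obs = weights(obs_xy, kappa) @ obs_val         # analysis at the obs sites
    for _ in range(passes - 1):
        resid = obs_val - at_obs                      # what the analysis missed
        grid = grid + weights(grid_xy, gamma * kappa) @ resid
        at_obs = at_obs + weights(obs_xy, gamma * kappa) @ resid
    return grid

rng = np.random.default_rng(1)
obs_xy = rng.uniform(0, 10, size=(40, 2))             # scattered stations
obs_val = np.full(40, 5.0)                            # a constant test field
grid_xy = np.array([[2.0, 3.0], [7.5, 8.0]])
grid = barnes_analysis(obs_xy, obs_val, grid_xy, kappa=4.0)
```

Because the weights are normalized, a constant field is reproduced exactly; the filtering behavior the abstract discusses shows up in how wave amplitudes are passed as a function of wavelength, `kappa` and `gamma`.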
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
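The flavor of such an analysis can be shown with a crude Monte Carlo stand-in; NESSUS itself uses fast probability integration, which is far cheaper, and the cantilever response below is a toy model, not the SSME blade:

```python
import numpy as np

def response_statistics(model, sample_inputs, n=20000, seed=0):
    """Plain Monte Carlo sketch of a probabilistic structural analysis:
    sample the uncertain inputs, evaluate the structural response for each
    sample, and report response statistics."""
    rng = np.random.default_rng(seed)
    inputs = sample_inputs(rng, n)
    resp = model(*inputs)
    return resp.mean(), resp.std()

# toy response: cantilever tip deflection delta = P L^3 / (3 E I), with the
# modulus E and the section property I treated as normal random variables
P, L = 100.0, 0.5
model = lambda E, I: P * L**3 / (3.0 * E * I)
sample_inputs = lambda rng, n: (rng.normal(2.0e11, 1.0e10, n),    # E: 5% scatter
                                rng.normal(8.0e-9, 4.0e-10, n))   # I: 5% scatter
mean, std = response_statistics(model, sample_inputs)
```

Comparing `std` with each input varied alone is the kind of sensitivity ranking the abstract describes (modulus of elasticity important, Poisson's ratio not).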
NASA Astrophysics Data System (ADS)
Zhong, Jiaqi; Zeng, Cheng; Yuan, Yupeng; Zhang, Yuzhe; Zhang, Ye
2018-04-01
The aim of this paper is to present an explicit numerical algorithm based on an improved spectral Galerkin method for solving the unsteady diffusion-convection-reaction equation. The approach yields explicit eigenvalues and eigenvectors through the time-space separation method and an analysis of the boundary conditions. With the help of Fourier series and Galerkin truncation, we obtain the finite-dimensional ordinary differential equations that facilitate system analysis and controller design. The numerical solutions are demonstrated via two examples and compared with the finite element method. It is shown that the proposed method is effective.
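The idea that each Galerkin mode evolves with an explicit eigenvalue can be shown in the simplest setting; this sketch assumes periodic boundary conditions and constant coefficients, whereas the paper also treats more general boundary conditions:

```python
import numpy as np

def spectral_solve(u0, L, D, c, r, t):
    """Fourier-Galerkin solution of u_t = D*u_xx - c*u_x - r*u on a periodic
    domain of length L: after truncation every Fourier mode evolves
    independently with an explicit eigenvalue, so the solution at time t is a
    single update rather than a time-stepping loop."""
    u_hat = np.fft.fft(u0)
    ik = 2j * np.pi * np.fft.fftfreq(len(u0), d=L / len(u0))   # d/dx symbol
    lam = D * ik**2 - c * ik - r                               # mode eigenvalues
    return np.real(np.fft.ifft(u_hat * np.exp(lam * t)))

N, L, D = 64, 1.0, 0.1
x = np.arange(N) * L / N
u0 = np.sin(2 * np.pi * x / L)
u = spectral_solve(u0, L, D, c=0.0, r=0.0, t=0.5)
# pure diffusion of a single mode decays by exp(-D * (2*pi/L)**2 * t)
```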
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.
2015-08-07
Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis depends on efficient extraction of proteins from bacterial samples without introducing bias toward particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid extraction improved the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore-specific proteins, whereas bead beating yielded 49 unique proteins. Analysis of C. botulinum samples grown for 5 days, composed of vegetative biomass and spores, showed a similar trend, with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were processed equally effectively via TCA and bead beating, but TCA extraction remains the easiest and most cost-effective option. As with all assays, supplemental methods such as an alternative preparation method may provide additional insight into the protein biology of the bacteria being studied.
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
1992-01-01
An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMV), which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling effort for the CH-47D tandem-rotor helicopter. The DAMV program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.
Comparison of Methods for Evaluating Urban Transportation Alternatives
DOT National Transportation Integrated Search
1975-02-01
The objective of the report was to compare five alternative methods for evaluating urban transportation improvement options: unaided judgmental evaluation, cost-benefit analysis, cost-effectiveness analysis based on a single measure of effectiveness, ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, C. L., E-mail: wangc@ornl.gov; Riedel, R. A.
2016-01-15
A ⁶Li-glass scintillator (GS20) based neutron Anger camera was developed for time-of-flight single-crystal diffraction instruments at the Spallation Neutron Source. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in a neutron-gamma efficiency ratio (defined as the NGD ratio) on the order of 10⁴. The NGD ratios of Anger cameras need to be improved for broader applications, including neutron reflectometers. For this purpose, six digital signal analysis methods for individual waveforms acquired from photomultiplier tubes were proposed, using (i) charge integration, (ii) pulse-amplitude histograms, (iii) power spectrum analysis combined with the maximum pulse amplitude, (iv) two event parameters (a₁, b₀) obtained from a Wiener filter, (v) an effective amplitude (m) obtained from an adaptive least-mean-square filter, and (vi) a cross-correlation coefficient between individual and reference waveforms. The resulting NGD ratios are about 70 times those from the traditional PHA method. Our results indicate the NGD capabilities of neutron Anger cameras based on GS20 scintillators can be significantly improved with digital signal analysis methods.
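Method (vi), cross-correlation against a reference waveform, is the simplest to sketch; the exponential pulse shapes, decay constants and acceptance threshold below are illustrative assumptions, not GS20 calibration values:

```python
import numpy as np

def accept_as_neutron(waveform, reference, threshold=0.9):
    """Score an event by the Pearson correlation between its waveform and a
    reference neutron pulse, and accept events that correlate strongly.
    Correlation is amplitude-invariant, so gain variations do not matter."""
    r = np.corrcoef(waveform, reference)[0, 1]
    return r >= threshold, r

t = np.arange(200.0)                     # sample index along the digitized pulse
reference = np.exp(-t / 60.0)            # slow, neutron-like decay
neutron_like = 5.0 * np.exp(-t / 60.0)   # same shape, different gain
gamma_like = np.exp(-t / 15.0)           # faster, gamma-like decay

ok_n, r_n = accept_as_neutron(neutron_like, reference)
ok_g, r_g = accept_as_neutron(gamma_like, reference)
```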
Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.
Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N
2013-11-05
Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting.
Multidisciplinary Optimization Methods for Aircraft Preliminary Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian
1994-01-01
This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.
Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis
NASA Astrophysics Data System (ADS)
Kumar, Ranjan; Ghosh, Achyuta Krishna
2017-04-01
Mine systems such as the ventilation system, strata support system and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, the existing safety analysis methods do not handle accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. The approach couples ET and FT modeling with a redundancy allocation technique. A concept of top hazard probability is introduced to identify the system failure probability, and redundancy is allocated to the system at either the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
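The quantitative backbone of an integrated ET/FT analysis can be sketched with the basic gate and branch formulas, assuming independent events; the methane-explosion probabilities below are purely illustrative, not the case-study values:

```python
def or_gate(probs):
    """FT OR gate: the top event occurs if any independent basic event occurs."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """FT AND gate: the top event needs all independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def sequence_probability(p_initiating, barrier_success):
    """ET branch: the initiating event occurs and every safety barrier on
    this branch fails."""
    p = p_initiating
    for s in barrier_success:
        p *= (1.0 - s)
    return p

# hypothetical per-year numbers, for illustration only:
# FT: methane accumulates if ventilation fails OR gas drainage fails
p_accumulation = or_gate([1e-3, 5e-4])
# ET: accumulation occurs, then the gas detector (0.99 success) and the
# ignition-control barrier (0.95 success) must both fail
p_explosion = sequence_probability(p_accumulation, [0.99, 0.95])
```

Redundancy allocation then amounts to recomputing these probabilities with an added parallel component and checking the reduction in the top hazard probability.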
Process safety improvement--quality and target zero.
Van Scyoc, Karl
2008-11-15
Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods and explores how methods intended for product quality can additionally be applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on these methods, are given.
A Lean Six Sigma approach to the improvement of the selenium analysis method.
Cloete, Bronwyn C; Bester, André
2012-11-02
Reliable results represent the pinnacle of quality assessment for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features.
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.
Tactics, Methods and Techniques to Improve Special Forces In-Service Enlisted Recruiting
2002-06-01
An improved method for determining force balance calibration accuracy
NASA Technical Reports Server (NTRS)
Ferris, Alice T.
1993-01-01
The results of an improved statistical method used at Langley Research Center for determining and stating the accuracy of a force balance calibration are presented. The application of the method for initial loads, initial load determination, auxiliary loads, primary loads, and proof loads is described. The data analysis is briefly addressed.
An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers
Sun, Kewen; Jin, Tian; Yang, Dongkai
2015-01-01
In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner–Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration of the auto-terms of the signal components. The proposed interference detection method is evaluated by experiments on GPS L1 signals in interference scenarios and compared against state-of-the-art interference detection approaches. The analysis results show that the proposed technique effectively overcomes the cross-terms problem while preserving good TF localization, enhancing the interference detection performance of GNSS receivers, particularly in jamming environments. PMID:25905704
Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén
2006-04-21
A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
Kang, Jinbum; Lee, Jae Young; Yoo, Yangmo
2016-06-01
Effective speckle reduction in ultrasound B-mode imaging is important for enhancing the image quality and improving the accuracy in image analysis and interpretation. In this paper, a new feature-enhanced speckle reduction (FESR) method based on multiscale analysis and feature enhancement filtering is proposed for ultrasound B-mode imaging. In FESR, clinical features (e.g., boundaries and borders of lesions) are selectively emphasized by edge, coherence, and contrast enhancement filtering from fine to coarse scales while simultaneously suppressing speckle development via robust diffusion filtering. In the simulation study, the proposed FESR method showed statistically significant improvements in edge preservation, mean structure similarity, speckle signal-to-noise ratio, and contrast-to-noise ratio (CNR) compared with other speckle reduction methods, e.g., oriented speckle reducing anisotropic diffusion (OSRAD), nonlinear multiscale wavelet diffusion (NMWD), the Laplacian pyramid-based nonlinear diffusion and shock filter (LPNDSF), and the Bayesian nonlocal means filter (OBNLM). Similarly, the FESR method outperformed the OSRAD, NMWD, LPNDSF, and OBNLM methods in terms of CNR, i.e., 10.70 ± 0.06 versus 9.00 ± 0.06, 9.78 ± 0.06, 8.67 ± 0.04, and 9.22 ± 0.06 in the phantom study, respectively. Reconstructed B-mode images that were developed using the five speckle reduction methods were reviewed by three radiologists for evaluation based on each radiologist's diagnostic preferences. All three radiologists showed a significant preference for the abdominal liver images obtained using the FESR methods in terms of conspicuity, margin sharpness, artificiality, and contrast, p<0.0001. For the kidney and thyroid images, the FESR method showed similar improvement over other methods. However, the FESR method did not show statistically significant improvement compared with the OBNLM method in margin sharpness for the kidney and thyroid images. 
These results demonstrate that the proposed FESR method can improve the image quality of ultrasound B-mode imaging by enhancing the visualization of lesion features while effectively suppressing speckle noise.
Cao, Xu-Liang; Popovic, Svetlana
2018-01-01
Solid phase extraction (SPE) of large volumes of water and beverage products was investigated for the GC-MS analysis of bisphenol A (BPA), bisphenol AF (BPAF), bisphenol F (BPF), bisphenol E (BPE), and bisphenol B (BPB). While absolute recoveries of the method were improved for water and some beverage products (e.g. diet cola, iced tea), breakthrough may also have occurred during SPE of 200 mL of other beverages (e.g. BPF in cola). Improvements in method detection limits were observed with the analysis of large sample volumes for all bisphenols, down to ppt (pg/g) and sub-ppt levels. This improvement was proportional to sample volume for water and beverage products, with fewer interferences and lower noise levels around the analytes. Matrix effects and interferences were observed during SPE of larger volumes (100 and 200 mL) of the beverage products and affected the accurate analysis of BPF. The improved method was used to analyse bisphenols in various beverage samples; only BPA was detected, with levels ranging from 0.022 to 0.030 ng/g for products in PET bottles and 0.085 to 0.32 ng/g for products in cans.
Quad-Tree Visual-Calculus Analysis of Satellite Coverage
NASA Technical Reports Server (NTRS)
Lo, Martin W.; Hockney, George; Kwan, Bruce
2003-01-01
An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in the environmental impact assessment process, this objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
Harmonic analysis of electrified railway based on improved HHT
NASA Astrophysics Data System (ADS)
Wang, Feng
2018-04-01
In this paper, the causes and harms of harmonics in electric locomotive electrical systems are first studied and analyzed. Based on the characteristics of the harmonics in the electrical system, the Hilbert-Huang transform (HHT) method is introduced. Based on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform, the causes of, and solutions to, the endpoint effect and the mode-mixing (modal aliasing) problem in the HHT method are explored. For the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; to address the mode-mixing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for this auxiliary harmonic. Finally, combining the suppression of the HHT endpoint effect and the mode-mixing problem, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
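The point-symmetric endpoint extension mentioned in this abstract can be sketched as follows: each end of the sampled signal is mirrored with odd symmetry about the endpoint before envelope fitting, so the EMD splines are not distorted at the boundaries. The function name, signal values, and extension length below are illustrative assumptions, not code from the paper.

```python
# Sketch of point-symmetric (odd) extension of a signal about its
# endpoints, as used to suppress the HHT endpoint effect before EMD.
import numpy as np

def point_symmetric_extend(x, n):
    """Extend x by n samples at each end, odd-symmetric about the
    first and last samples: x[-k] = 2*x[0] - x[k]."""
    left = 2 * x[0] - x[n:0:-1]        # mirror about (0, x[0])
    right = 2 * x[-1] - x[-2:-n-2:-1]  # mirror about (N-1, x[-1])
    return np.concatenate([left, x, right])

x = np.array([0.0, 1.0, 0.5, -0.5])
print(point_symmetric_extend(x, 2))
# -> [-0.5 -1.   0.   1.   0.5 -0.5 -1.5 -2. ]
```

After sifting, the extended portions are discarded so only the original span contributes to the decomposition.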
An improved UHPLC-UV method for separation and quantification of carotenoids in vegetable crops.
Maurer, Megan M; Mein, Jonathan R; Chaudhuri, Swapan K; Constant, Howard L
2014-12-15
Carotenoid identification and quantitation is critical for the development of improved nutrition plant varieties. Industrial analysis of carotenoids is typically carried out on multiple crops with potentially thousands of samples per crop, placing critical needs on speed and broad utility of the analytical methods. Current chromatographic methods for carotenoid analysis have had limited industrial application due to their low throughput, requiring up to 60 min for complete separation of all compounds. We have developed an improved UHPLC-UV method that resolves all major carotenoids found in broccoli (Brassica oleracea L. var. italica), carrot (Daucus carota), corn (Zea mays), and tomato (Solanum lycopersicum). The chromatographic method is completed in 13.5 min allowing for the resolution of the 11 carotenoids of interest, including the structural isomers lutein/zeaxanthin and α-/β-carotene. Additional minor carotenoids have also been separated and identified with this method, demonstrating the utility of this method across major commercial food crops. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
He, A.; Quan, C.
2018-04-01
The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and the algorithm for conversion of orientation to direction in mask areas is computationally-heavy and non-optimized. We propose an improved PCA based region matching method for the fringe direction estimation, which includes an improved and robust mask construction scheme, and a fast and optimized orientation-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, filtered fringe pattern by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for Hilbert spiral transform (HST) to demodulate the phase. Subsequently, windowed Fourier ridge (WFR) method is used for the refinement of the phase. The robustness and effectiveness of proposed method are demonstrated by both simulated and experimental fringe patterns.
ERIC Educational Resources Information Center
Kennedy, Kate; Peters, Mary; Thomas, Mike
2012-01-01
Value-added analysis is the most robust, statistically significant method available for helping educators quantify student progress over time. This powerful tool also reveals tangible strategies for improving instruction. Built around the work of Battelle for Kids, this book provides a field-tested continuous improvement model for using…
ERIC Educational Resources Information Center
Huang, Shu Rong; Palmer, Peter T.
2017-01-01
This paper describes a method for determination of trihalomethanes (THMs) in drinking water via solid-phase microextraction (SPME) GC/MS as a means to develop and improve student understanding of the use of GC/MS for qualitative and quantitative analysis. In the classroom, students are introduced to SPME, GC/MS instrumentation, and the use of MS…
Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.
ERIC Educational Resources Information Center
Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.
2002-01-01
Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)
2013-02-11
calibration curves was ±5%. Ion chromatography (IC) was used for analysis of perchlorate and other ionic targets. Analysis was carried out on a...The methods utilize liquid or gas chromatography , techniques that do not lend themselves well to portable devices and methods. Portable methods are...
Improving lip wrinkles: lipstick-related image analysis.
Ryu, Jong-Seong; Park, Sun-Gyoo; Kwak, Taek-Jong; Chang, Min-Youl; Park, Moon-Eok; Choi, Khee-Hwan; Sung, Kyung-Hye; Shin, Hyun-Jong; Lee, Cheon-Koo; Kang, Yun-Seok; Yoon, Moung-Seok; Rang, Moon-Jeong; Kim, Seong-Jin
2005-08-01
The appearance of lip wrinkles is problematic if lipstick make-up is adversely affected, causing incomplete color tone, a spread phenomenon and pigment remnants. An objective method for assessing lip wrinkle status is therefore needed, by which the potential of wrinkle-improving lip products can be screened. The present study aimed to identify useful parameters from image analysis of lip wrinkles as affected by lipstick application. Digital photographic images of the lips before and after lipstick application were assessed in 20 female volunteers. Color tone was measured by hue, saturation and intensity parameters, and time-related pigment spread was calculated as the area beyond the vermilion border using image-analysis software (Image-Pro). The efficacy of a wrinkle-improving lipstick containing asiaticoside was evaluated in 50 women by subjective and objective methods, including image analysis, in a double-blind, placebo-controlled fashion. The color tone and spread phenomenon after lipstick make-up were markedly affected by lip wrinkles. The standard deviation of the saturation value from the image-analysis software proved to be a good parameter for lip wrinkles. After use of the lipstick containing asiaticoside for 8 weeks, changes in visual grading scores and replica analysis indicated a wrinkle-improving effect. As the depth and number of wrinkles were reduced, the lipstick make-up appearance measured by image analysis also improved significantly. The lip wrinkle pattern together with lipstick make-up can thus be evaluated by the image-analysis system in addition to traditional assessment methods, and this evaluation system is expected to be useful for testing the efficacy of wrinkle-reducing lipsticks, which has not been described in previous dermatologic clinical studies.
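The parameter singled out above, the standard deviation of saturation over the lip region, can be sketched with a few lines of standard-library Python. The four pixel values are invented for illustration; the study computed this over digital photographs of the lips.

```python
# Sketch: standard deviation of the HSV saturation channel over a set
# of (invented) lip-region pixels. A higher SD is taken to indicate a
# more pronounced wrinkle pattern under lipstick.
import colorsys
from statistics import pstdev

# Invented lip-region pixels as (R, G, B) values in the range 0..1
pixels = [(0.8, 0.4, 0.4), (0.7, 0.5, 0.5), (0.9, 0.3, 0.3), (0.6, 0.5, 0.5)]

saturations = [colorsys.rgb_to_hsv(r, g, b)[1] for r, g, b in pixels]
sat_sd = pstdev(saturations)
print(round(sat_sd, 3))
```

In practice the pixel set would come from a segmented lip region of the before/after photographs rather than a hand-written list.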
An improved adaptive weighting function method for State Estimation in Power Systems with VSC-MTDC
NASA Astrophysics Data System (ADS)
Zhao, Kun; Yang, Xiaonan; Lang, Yansheng; Song, Xuri; Wang, Minkun; Luo, Yadi; Wu, Lingyun; Liu, Peng
2017-04-01
This paper presents an effective approach for state estimation in power systems that include multi-terminal voltage-source-converter-based high-voltage direct current (VSC-MTDC) links, called the improved adaptive weighting function method. The proposed approach is simplified in that the VSC-MTDC system is solved first, followed by the AC system, because the new state estimation method only changes the weights and keeps the matrix dimensions unchanged. Accurate and fast convergence for the AC/DC system can be achieved by the adaptive weighting function method, which also provides technical support for simulation analysis and accurate regulation of AC/DC systems. Both theoretical analysis and numerical tests verify the practicability, validity and convergence of the new method.
Human Systems Integration (HSI) Associated Development Activities in Japan
2008-06-12
machine learning and data mining methods. The continuous effort (KAIZEN) to improve the analysis phases is illustrated in Figure 14. Although there… [Fig. 14 labels: extraction of a model, extraction of a workflow, extraction of a control rule, variation analysis and improvement, plant operation, KAIZEN]
Sigma metric analysis for performance of creatinine with fresh frozen serum.
Kang, Fengfeng; Zhang, Chuanbao; Wang, Wei; Wang, Zhiguo
2016-01-01
Six Sigma provides an objective and quantitative methodology for describing laboratory testing performance. In this study, we conducted a national trueness verification scheme with fresh frozen serum (FFS) for serum creatinine to evaluate its performance in China. Two different concentration levels of FFS, value-assigned with a reference method, were sent to 98 laboratories in China. The imprecision and bias of the measurement procedure were calculated for each participant to derive the sigma value. Quality goal index (QGI) analysis was used to investigate the reasons for unacceptable performance in laboratories with σ < 3. Our study indicated that the sample with the high concentration of creatinine had preferable sigma values. For the enzymatic method, 7.0% (5/71) to 45.1% (32/71) of the laboratories need to improve their measurement procedures (σ < 3); for the Jaffe method, the percentages ranged from 11.5% (3/26) to 73.1% (19/26). QGI analysis suggested that most of the laboratories (62.5% for the enzymatic method and 68.4% for the Jaffe method) should make an effort to improve trueness (QGI > 1.2); only 3.1-5.3% of the laboratories should improve both precision and trueness. Sigma metric analysis of the serum creatinine assays is disappointing, mainly because of unacceptable analytical bias according to the QGI analysis. Further effort is needed to enhance the trueness of creatinine measurement.
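The sigma metric and QGI used in studies of this kind follow standard definitions: sigma = (TEa − |bias|)/CV and QGI = |bias|/(1.5·CV), all in percent. The sketch below applies them to invented numbers; the total allowable error (TEa) value is an assumption for illustration, not the study's specification.

```python
# Sketch of sigma-metric and quality goal index (QGI) computation for a
# laboratory assay. TEa, bias, and CV values below are illustrative.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Six Sigma metric: (TEa - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV). QGI > 1.2 suggests trueness (bias) is
    the dominant problem; QGI < 0.8 suggests imprecision; otherwise both."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Invented creatinine example: 6% allowable error, 2.5% bias, 2% CV
tea, bias, cv = 6.0, 2.5, 2.0
sigma = sigma_metric(tea, bias, cv)   # (6 - 2.5) / 2 = 1.75, i.e. sigma < 3
qgi = quality_goal_index(bias, cv)    # 2.5 / 3.0 ~= 0.83, i.e. both factors
print(f"sigma = {sigma:.2f}, QGI = {qgi:.2f}")
```

With sigma below 3 this invented laboratory would be flagged for improvement, and the QGI near the middle band would point at both precision and trueness.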
Measuring Glial Metabolism in Repetitive Brain Trauma and Alzheimer’s Disease
2016-09-01
…a comparison of a range of different denoising methods for dynamic MRS. Six denoising methods were considered: singular value decomposition (SVD), wavelet, sliding window, sliding window with Gaussian weighting, spline, and spectral improvements… contributed to the project by improving the software required for the data analysis by developing the six denoising methods, and also assisted with the testing…
Vetter, Jeffrey S.
2005-02-01
The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must examine.
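The core idea, training a decision tree on labeled communication events, can be illustrated with a minimal self-contained sketch. A one-level tree (stump) trained by Gini impurity stands in for the full classifier; the per-event features (message size, fraction of time spent waiting) and the labels are invented assumptions, not the patent's actual feature set.

```python
# Minimal decision-tree (stump) sketch: learn a single split from
# microbenchmark-style labeled events, then classify new trace events.

def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(1 for l in labels if l == "inefficient") / n
    return 2 * p * (1 - p)

def train_stump(X, y):
    """Find the (feature, threshold) split minimizing weighted Gini."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                majority = lambda ls: max(set(ls), key=ls.count) if ls else y[0]
                best = (score, f, t, majority(left), majority(right))
    return best[1:]  # (feature, threshold, left_label, right_label)

def classify(stump, event):
    f, t, left, right = stump
    return left if event[f] <= t else right

# Invented training events: [message_size_bytes, wait_time_fraction]
X = [[1024, 0.05], [65536, 0.08], [1024, 0.70], [65536, 0.85]]
y = ["efficient", "efficient", "inefficient", "inefficient"]
stump = train_stump(X, y)
print(classify(stump, [2048, 0.07]), classify(stump, [8192, 0.90]))
# -> efficient inefficient
```

The real system would train on many microbenchmark runs and a richer feature vector per MPI event, but the train-on-labeled-events, classify-the-trace structure is the same.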
Distributed collaborative response surface method for mechanical dynamic assembly reliability design
NASA Astrophysics Data System (ADS)
Bai, Guangchen; Fei, Chengwei
2013-11-01
Because of the randomness of many impact factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to be accomplished from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, a mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Through comparison of the DCRSM, the traditional response surface method (RSM) and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish computational tasks that are impossible for the other methods when the number of simulations is more than 100,000, but also achieves computational precision basically consistent with the MCM and improved by 0.40~4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is about 188 times that of the MCM and 55 times that of the RSM under 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
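The building block shared by the RSM family of methods is fitting a quadratic response surface to sampled input/output pairs, then running cheap Monte Carlo on the surrogate instead of the expensive model. The one-variable toy model and its coefficients below are invented for illustration; they are not the BTRRC model from the paper.

```python
# Sketch: fit a quadratic response surface y ~ a0 + a1*x + a2*x^2 by
# least squares, then evaluate the cheap surrogate in a Monte Carlo loop.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)            # sampled random input
y = 0.3 + 0.8 * x + 1.5 * x**2                 # "expensive" model response
y = y + rng.normal(0.0, 0.01, size=50)         # with small noise

# Design matrix for the quadratic response surface
A = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 2))                       # close to [0.3, 0.8, 1.5]

# Monte Carlo on the surrogate is now cheap:
xs = rng.uniform(-1.0, 1.0, size=100_000)
ys = coef[0] + coef[1] * xs + coef[2] * xs**2
print(round(float(ys.mean()), 2))
```

The distributed collaborative aspect of the DCRSM layers many such surrogates over sub-models; the sketch shows only the single-surface fit.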
NASA Astrophysics Data System (ADS)
Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen
2018-01-01
This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.
Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly
2015-12-18
This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by gas chromatography with flame ionization detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in results. This could be due, in part, to differing gas chromatography (GC) conditions, other steps involved in the method, and soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 min to 8 min, with all carbon fractions eluting.
A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
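The factorial screening step described in this abstract can be sketched in a few lines: run the method at all two-level combinations of the factors and estimate each main effect as the difference of mean response between the high and low levels. For brevity only three of the six factors are shown, and the response values are invented, not measured data.

```python
# Sketch of two-level factorial screening with main-effect estimation.
# Factor names and the synthetic response function are illustrative.
import itertools

factors = ["flow_rate", "oven_ramp", "inj_volume"]
runs = list(itertools.product([-1, 1], repeat=3))

# Invented analysis-time response: flow rate and oven ramp matter,
# injection volume barely does.
response = [20 - 3 * f - 2 * o + 0.1 * v for f, o, v in runs]

effects = {}
for j, name in enumerate(factors):
    high = [r for run, r in zip(runs, response) if run[j] == 1]
    low = [r for run, r in zip(runs, response) if run[j] == -1]
    effects[name] = sum(high) / len(high) - sum(low) / len(low)
print(effects)  # flow_rate and oven_ramp dominate inj_volume
```

The dominant factors identified this way would then go into a central composite design for response-surface optimization, as the paper describes.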
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
[Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].
Zhou, Jinzhi; Tang, Xiaofang
2015-08-01
In order to improve the accuracy of classification with a small amount of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method that automatically selects characteristic parameters based on correlation coefficient analysis. Using the five subjects' sample data of dataset IVa from the 2005 BCI Competition, we applied the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram, then performed feature extraction based on common spatial patterns (CSP) and classification by linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis selected better parameters and improved classification accuracy.
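Correlation-coefficient feature selection of the kind described here can be sketched as: rank each candidate feature by the absolute Pearson correlation between its values and the class labels, then keep the top-k. The toy data below is invented; the study applied this to STFT-derived EEG features, not to synthetic columns.

```python
# Sketch: rank features by |Pearson correlation| with the class label
# and keep the top-k. Column 1 is built to be informative; columns 0
# and 2 are pure noise.
import numpy as np

rng = np.random.default_rng(1)
labels = np.repeat([0, 1], 50)                   # two motor-imagery classes
n = labels.size
informative = labels + rng.normal(0, 0.3, n)     # correlates with the class
noise1, noise2 = rng.normal(size=(2, n))         # uninformative channels
X = np.column_stack([noise1, informative, noise2])

r = [abs(np.corrcoef(X[:, j], labels)[0, 1]) for j in range(X.shape[1])]
top_k = np.argsort(r)[::-1][:1]                  # indices of best features
print(top_k)                                     # the informative column
```

The selected columns would then feed CSP and LDA in place of the full-dimensional data.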
Wang, Cai -Lin; Riedel, Richard A.
2016-01-14
A 6Li-glass scintillator (GS20) based neutron Anger camera was developed for time-of-flight single-crystal diffraction instruments at the SNS. Traditional pulse-height analysis (PHA) for neutron-gamma discrimination (NGD) resulted in a neutron-gamma efficiency ratio (defined as the NGD ratio) on the order of 10^4. The NGD ratios of Anger cameras need to be improved for broader applications, including neutron reflectometers. For this purpose, five digital signal analysis methods for individual PMT waveforms were proposed, using: i) a pulse-amplitude histogram; ii) power spectrum analysis combined with the maximum pulse amplitude; iii) two event parameters (a1, b0) obtained from a Wiener filter; iv) an effective amplitude (m) obtained from an adaptive least-mean-square (LMS) filter; and v) a cross-correlation (CC) coefficient between an individual waveform and a reference. The resulting NGD ratios can be 1-10^2 times those from the traditional PHA method. A brighter scintillator, GS2, has a better NGD ratio than GS20, but lower neutron detection efficiency. The ultimate NGD ratio is related to ambient high-energy background events. Moreover, our results indicate that the NGD capability of neutron Anger cameras can be improved using digital signal analysis methods and brighter neutron scintillators.
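Method (v) above, scoring each waveform by its correlation against a reference pulse shape, can be sketched as follows. The pulse shapes, noise levels, and decision threshold are invented illustrations, not the instrument's calibrated values.

```python
# Sketch: normalized cross-correlation coefficient between a digitized
# PMT waveform and a reference neutron pulse shape, thresholded to
# separate neutron-like from gamma-like events.
import numpy as np

t = np.arange(64)
reference = np.exp(-t / 10.0) * (1 - np.exp(-t / 2.0))  # invented template

def cc_coefficient(waveform, ref=reference):
    """Pearson correlation between a waveform and the reference shape."""
    return float(np.corrcoef(waveform, ref)[0, 1])

rng = np.random.default_rng(2)
neutron_like = reference + rng.normal(0, 0.02, t.size)
gamma_like = np.exp(-t / 2.0) + rng.normal(0, 0.02, t.size)  # faster decay

scores = [cc_coefficient(neutron_like), cc_coefficient(gamma_like)]
is_neutron = [s > 0.8 for s in scores]   # illustrative threshold
print([round(s, 2) for s in scores], is_neutron)
```

In the real system the reference would be measured from tagged neutron events, and the threshold tuned against the gamma background rate.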
Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging
Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.
2017-01-01
Background: Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s-1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. New method: Here we introduce FluoroSNNAP, the Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software package developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. Results: We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule-associated protein tau and wild-type tau. Comparison with existing method(s): We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. Conclusions: We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800
Daxini, S D; Prajapati, J M
2014-01-01
Meshfree methods are viewed as next-generation computational techniques. Given the evident limitations of conventional grid-based methods, such as FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed to date for analyzing complex problems in various fields of engineering. The present work reviews recent developments and some earlier applications of well-known meshfree methods such as EFG and MLPG to various structural mechanics and fracture mechanics applications: bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single- and mixed-mode crack problems, fatigue crack growth, and dynamic crack analysis, as well as some typical applications such as vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Owing to the complex nature of meshfree shape functions and the evaluation of domain integrals, meshless methods are computationally expensive compared with conventional mesh-based methods. Improved versions of the original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.
Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia
2014-03-01
Previous research [Appl. Opt. 52, A290 (2013)] revealed that Fourier analysis based on three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue that research and propose an improved full analytical polygon-based method built on this theory. Vertex vectors of the primitive and arbitrary triangles and the pseudo-inverse matrix are used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtain the spectrum of the arbitrary triangle. The algorithm discards low-level, angle-dependent computations. To add diffuse reflection to each arbitrary surface, we also propose a whole-matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate the shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed over the conventional full analytical approach. Optical experiments demonstrate that the proposed method can effectively reconstruct three-dimensional scenes.
NASA Astrophysics Data System (ADS)
Hao, Qiushi; Zhang, Xin; Wang, Yan; Shen, Yi; Makis, Viliam
2018-07-01
Acoustic emission (AE) technology is sensitive to subliminal rail defects; however, strong wheel-rail rolling-contact noise under high-speed conditions has gravely impeded the detection of rail defects using traditional denoising methods. In this context, the paper develops an adaptive detection method for rail cracks that combines multiresolution analysis with an improved adaptive line enhancer (ALE). To obtain elaborate multiresolution information on transient crack signals at low computational cost, a lifting-scheme-based undecimated wavelet packet transform is adopted. To capture the impulsive character of crack signals, a Shannon-entropy-improved ALE is proposed as a signal-enhancement approach, in which Shannon entropy is introduced to improve the cost function. A rail defect detection plan based on the proposed method for high-speed conditions is then put forward. Theoretical analysis and experimental verification demonstrate that the proposed method has superior performance in enhancing the rail defect AE signal and reducing the strong background noise, offering an effective multiresolution approach for rail defect detection under high-speed, strong-noise conditions.
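An adaptive line enhancer of the kind this abstract builds on can be sketched with the standard LMS update. This is the classical ALE, not the paper's Shannon-entropy-improved cost function; the delay, tap count, and step size below are illustrative choices.

```python
def adaptive_line_enhancer(x, delay=5, taps=16, mu=0.01):
    """Basic LMS adaptive line enhancer.

    An FIR filter predicts the current sample from a delayed window
    of past samples. The filter output y tracks narrowband (periodic)
    content such as rolling noise, while broadband impulses, e.g.
    crack transients, remain in the error signal e = x - y.
    """
    w = [0.0] * taps
    y_out, e_out = [], []
    for n in range(len(x)):
        # delayed reference window: x[n-delay], x[n-delay-1], ...
        u = [x[n - delay - k] if n - delay - k >= 0 else 0.0
             for k in range(taps)]
        y = sum(wi * ui for wi, ui in zip(w, u))
        e = x[n] - y
        # LMS weight update toward minimizing e**2
        w = [wi + 2 * mu * e * ui for wi, ui in zip(w, u)]
        y_out.append(y)
        e_out.append(e)
    return y_out, e_out
```

For a periodic input the error power should shrink as the filter converges, which is the property the denoising scheme exploits.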
Research on the raw data processing method of the hydropower construction project
NASA Astrophysics Data System (ADS)
Tian, Zhichao
2018-01-01
Based on the characteristics of fixed quota data, this paper compares various mathematical-statistical analysis methods and selects an improved Grubbs criterion to analyze the data; through this analysis, unsuitable data points are screened out during processing. It is shown that the method can be applied to the processing of fixed raw quota data. The paper provides a reference for reasonably determining effective quota analysis data.
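A Grubbs-style outlier screen of the sort this abstract selects can be sketched as follows. The critical value is illustrative (it depends on the sample size and significance level) and is not taken from the paper.

```python
import statistics

def grubbs_outlier(data, g_crit=2.29):
    """One-pass Grubbs-style screen.

    Flags the point farthest from the mean if its G statistic
    G = |x - mean| / s exceeds a critical value; returns the
    suspect value, or None if no point is rejected. g_crit = 2.29
    is roughly the alpha = 0.05 value for n = 10, used here only
    as an example.
    """
    mean = statistics.fmean(data)
    sd = statistics.stdev(data)
    candidate = max(data, key=lambda x: abs(x - mean))
    g = abs(candidate - mean) / sd
    return candidate if g > g_crit else None
```

In practice the screen is repeated (remove the flagged point, recompute, test again) until no point exceeds the criterion.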
An improved partial least-squares regression method for Raman spectroscopy
NASA Astrophysics Data System (ADS)
Momenpour Tehran Monfared, Ali; Anis, Hanan
2017-10-01
It is known that the performance of partial least-squares (PLS) regression analysis can be improved using the backward variable selection method (BVSPLS). In this paper, we further improve BVSPLS with a novel selection mechanism. The proposed method sorts the weighted regression coefficients, and the importance of each variable in the sorted list is then evaluated using the root mean square error of prediction (RMSEP) criterion at each iteration step. Our improved BVSPLS (IBVSPLS) method has been applied to leukemia and heparin data sets and led to an improvement in the limit of detection of Raman biosensing ranging from 10% to 43% compared to PLS. IBVSPLS was also compared to the simpler jack-knifing method and the more complex genetic algorithm. Our method was consistently better than jack-knifing and showed similar or better performance compared to the genetic algorithm.
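The RMSEP-guided backward elimination loop can be sketched as below. For brevity the sketch fits ordinary least squares instead of PLS; the selection logic (drop the smallest-coefficient variable while held-out RMSEP improves) mirrors the idea described, and all data and names are invented for the example.

```python
import math

def ols_fit(X, y):
    """Least-squares coefficients via normal equations (no intercept)."""
    p = len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(len(X))) for k in range(p)]
         for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(p)]
    # Gaussian elimination with partial pivoting
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            A[r] = [a - f * ac for a, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):
        beta[c] = (b[c] - sum(A[c][k] * beta[k]
                              for k in range(c + 1, p))) / A[c][c]
    return beta

def rmsep(X, y, beta):
    """Root mean square error of prediction on a held-out set."""
    return math.sqrt(sum((sum(b * x for b, x in zip(beta, row)) - yi) ** 2
                         for row, yi in zip(X, y)) / len(y))

def backward_select(X_train, y_train, X_test, y_test):
    """Drop the smallest-|coefficient| variable while RMSEP improves."""
    cols = list(range(len(X_train[0])))
    take = lambda X, cs: [[row[c] for c in cs] for row in X]
    beta = ols_fit(take(X_train, cols), y_train)
    best = rmsep(take(X_test, cols), y_test, beta)
    while len(cols) > 1:
        drop = cols[min(range(len(cols)), key=lambda i: abs(beta[i]))]
        trial = [c for c in cols if c != drop]
        beta_t = ols_fit(take(X_train, trial), y_train)
        err = rmsep(take(X_test, trial), y_test, beta_t)
        if err >= best:
            break  # dropping no longer helps prediction
        cols, beta, best = trial, beta_t, err
    return cols, best
```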
2012-06-01
Conducting metrology, surface analysis, and metallography/fractography interrogations of samples to correlate microstructure with friction…are examined using a variety of methods such as metallography, chemical analysis, fractography, and hardness measurements. These methods assist in
NASA Astrophysics Data System (ADS)
Wu, Jianing; Yan, Shaoze; Xie, Liyang
2011-12-01
To address the impact of solar array anomalies, it is important to analyze solar array reliability. This paper establishes fault tree analysis (FTA) and fuzzy reasoning Petri net (FRPN) models of a solar array mechanical system and analyzes their reliability to find the mechanisms of solar array faults. The indices final truth degree (FTD) and cosine matching function (CMF) are employed to evaluate the importance and influence of different faults, and an improved reliability analysis method is developed based on the sorting of FTD and CMF. An example is analyzed using the proposed method. The results show that the harsh thermal environment and impacts caused by particles in space are the most important causes of solar array faults. Other fault modes and the corresponding improvement methods are also discussed. The results reported in this paper could be useful for spacecraft designers, particularly in redesigning the solar array and scheduling its reliability growth plan.
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important for both compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, talk about its shortcomings and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem in hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to simply extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. 
The technique that produces better results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy ones.
NASA Technical Reports Server (NTRS)
Padavala, Satyasrinivas; Palazzolo, Alan B.; Vallely, Pat; Ryan, Steve
1994-01-01
An improved dynamic analysis for liquid annular seals with arbitrary profile, based on a method first proposed by Nelson and Nguyen, is presented. An improved first-order solution that incorporates a continuous interpolation of perturbed quantities in the circumferential direction is presented. The original method uses an approximation scheme for circumferential gradients based on Fast Fourier Transforms (FFT); a simpler scheme based on cubic splines is found to be computationally more efficient, with better convergence at higher eccentricities. A new approach for computing dynamic coefficients based on an externally specified load is introduced. The improved analysis is extended to account for seal profiles that vary arbitrarily in both the axial and circumferential directions. An example case of an elliptical seal with varying degrees of axial curvature is analyzed. A case study based on the actual operating clearances of an interstage seal of the Space Shuttle Main Engine High Pressure Oxygen Turbopump is presented.
A Novel Locally Linear KNN Method With Applications to Visual Recognition.
Liu, Qingfeng; Liu, Chengjun
2017-09-01
A locally linear K Nearest Neighbor (LLK) method is presented in this paper with applications to robust visual recognition. Specifically, the concept of an ideal representation is first presented, which improves upon the traditional sparse representation in many ways. An objective function based on a host of criteria for sparsity, locality, and reconstruction is then optimized to derive a novel representation that approximates the ideal representation. The novel representation is further processed by two classifiers, namely an LLK-based classifier and a locally linear nearest-mean-based classifier, for visual recognition. The proposed classifiers are shown to connect to the Bayes decision rule for minimum error. New theoretical analysis is presented, covering the nonnegative constraint, group regularization, and the computational efficiency of the proposed LLK method. New methods, such as a shifted power transformation for improving reliability, a coefficient truncation method for enhancing generalization, and an improved marginal Fisher analysis method for feature extraction, are proposed to further improve visual recognition performance. Extensive experiments evaluate the proposed LLK method for robust visual recognition. In particular, eight representative data sets are used to assess the performance of the LLK method in various visual recognition applications, such as action recognition, scene recognition, object recognition, and face recognition.
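The locally linear nearest-mean idea can be illustrated with a deliberately simplified classifier: for each class, average the k training points nearest the query and assign the class whose local mean is closest. This is a toy stand-in for intuition only, not the paper's optimized representation or classifiers.

```python
import math

def local_nearest_mean_classify(query, labeled_points, k=3):
    """Classify by distance to per-class local means.

    labeled_points is a list of (point, label) pairs, where each
    point is a tuple of coordinates. For each class we take the k
    training points nearest the query, form their mean, and return
    the label whose local mean is closest to the query.
    """
    by_class = {}
    for x, label in labeled_points:
        by_class.setdefault(label, []).append(x)
    best_label, best_d = None, float("inf")
    for label, pts in by_class.items():
        nearest = sorted(pts, key=lambda p: math.dist(p, query))[:k]
        mean = [sum(c) / len(nearest) for c in zip(*nearest)]
        d = math.dist(mean, query)
        if d < best_d:
            best_label, best_d = label, d
    return best_label
```

Using a local mean rather than a single nearest neighbor makes the decision less sensitive to isolated noisy training points.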
SPAR improved structure-fluid dynamic analysis capability, phase 2
NASA Technical Reports Server (NTRS)
Pearson, M. L.
1984-01-01
An efficient and general method of analyzing a coupled dynamic system of fluid flow and elastic structures is investigated. The improvements made to the Structural Performance Analysis and Redesign (SPAR) code are summarized. All error codes are documented, and the SPAR processor/subroutine cross-reference is included.
NASA Technical Reports Server (NTRS)
Waszak, M. R.; Schmidt, D. S.
1985-01-01
As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open-loop modal analysis technique, which considers the effects of modal residue magnitudes in determining vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 covers the development and application of these two analysis methods.
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility, and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility, and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011, held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images such as radiographic images, dual-energy CT images, MR images, diffusion tensor images, and microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools for improving medical diagnostics and patient treatment in the future.
Visual cluster analysis and pattern recognition methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
2001-01-01
A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.
Improving Cluster Analysis with Automatic Variable Selection Based on Trees
2014-12-01
regression trees; Daisy: DISsimilAritY; PAM: partitioning around medoids; PMA: penalized multivariate analysis; SPC: sparse principal components; UPGMA: unweighted…unweighted pair-group average method (UPGMA). This method measures dissimilarities between all objects in two clusters and takes the average value
An effective fuzzy kernel clustering analysis approach for gene expression data.
Sun, Lin; Xu, Jiucheng; Yin, Jiaojiao
2015-01-01
Fuzzy clustering is an important tool for analyzing microarray data. A major problem in applying fuzzy clustering methods to microarray gene expression data is the choice of parameters, namely the cluster number and cluster centers. This paper proposes a new approach to fuzzy kernel clustering analysis (FKCA) that identifies the desired cluster number and obtains more stable results for gene expression data. First, to optimize characteristic differences and estimate the optimal cluster number, a Gaussian kernel function is introduced to improve the spectrum analysis method (SAM). By combining subtractive clustering with the max-min distance mean, a maximum distance method (MDM) is proposed to determine the cluster centers. The corresponding steps of the improved SAM (ISAM) and MDM are then given, and their superiority and stability are illustrated through experimental comparisons on gene expression data. Finally, by introducing ISAM and MDM into FKCA, an effective improved FKCA algorithm is proposed. Experimental results on public gene expression data and the UCI database show that the proposed algorithms are feasible for cluster analysis, and their clustering accuracy is higher than that of other related clustering algorithms.
Fusion and quality analysis for remote sensing images using contourlet transform
NASA Astrophysics Data System (ADS)
Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram
2013-05-01
Recent developments in remote sensing technologies have provided various images with high spatial and spectral resolutions. However, multispectral images have low spatial resolution and panchromatic images have low spectral resolution. Therefore, image fusion techniques are necessary to improve the spatial resolution of spectral images by injecting the spatial details of high-resolution panchromatic images. The objective of image fusion is to provide useful information by improving the spatial resolution and the spectral information of the original images. The fusion results can be utilized in various applications, such as military, medical imaging, and remote sensing. This paper addresses two issues in image fusion: i) the image fusion method and ii) quality analysis of the fusion results. First, a new contourlet-based image fusion method is presented, which is an improvement over wavelet-based fusion. This fusion method is then applied to a case study to demonstrate its fusion performance; the fusion framework and scheme used in the study are discussed in detail. Second, quality analysis of the fusion results is discussed. We employed various quality metrics in order to analyze the fusion results both spatially and spectrally. Our results indicate that the proposed contourlet-based fusion method performs better than conventional wavelet-based fusion methods.
Introduction of statistical information in a syntactic analyzer for document image recognition
NASA Astrophysics Data System (ADS)
Maroneze, André O.; Coüasnon, Bertrand; Lemaitre, Aurélie
2011-01-01
This paper presents an improvement to document layout analysis systems, offering a possible solution to Sayre's paradox (which states that an element "must be recognized before it can be segmented; and it must be segmented before it can be recognized"). The improvement, based on stochastic parsing, allows the integration of statistical information obtained from recognizers during syntactic layout analysis. We show how this fusion of numeric and symbolic information in a feedback loop can be applied to syntactic methods to improve the expressiveness of document descriptions. To limit combinatorial explosion during the exploration of solutions, we devised an operator that allows optional activation of the stochastic parsing mechanism. Our evaluation on 1250 handwritten business letters shows that this method improves global recognition scores.
[Improvement of 2-mercaptoimidazoline analysis in rubber products containing chlorine].
Kaneko, Reiko; Haneishi, Nahoko; Kawamura, Yoko
2012-01-01
An improved analysis method for 2-mercaptoimidazoline in rubber products containing chlorine was developed. In the official method, 2-mercaptoimidazoline (20 µg/mL) is detected by TLC with two developing solvents, but this method is not quantitative. Instead, we employed HPLC using water-methanol (9 : 1) as the mobile phase. This procedure decreased interfering peaks, and the quantitation limit was 2 µg/mL of standard solution. 2-Mercaptoimidazoline was confirmed by GC/MS (5 µg/mL) and LC/MS (1 µg/mL) in scan mode. For preparation of the test solution, a soaking extraction method was used, in which 20 mL of methanol was added to the sample, which was then allowed to stand overnight at about 40°C. This gave values similar to those of the Soxhlet extraction method (the official method) and was more convenient. The results indicate that our procedure is suitable for the analysis of 2-mercaptoimidazoline. When 2-mercaptoimidazoline is detected, it is confirmed by either GC/MS or LC/MS.
Efficient alignment-free DNA barcode analytics.
Kuksa, Pavel; Pavlovic, Vladimir
2009-11-10
In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations that model sequences as collections of short sequence fragments (features). The methods use fixed-length (spectrum) representations of barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci that distinguish barcodes of different sample groups. The new alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species, with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running-time improvements over state-of-the-art methods. Our results show that the newly developed alignment-free methods for DNA barcoding can efficiently, and with high accuracy, identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding.
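The fixed-length spectrum representation at the heart of this approach is straightforward to sketch: count k-mers and compare the counts with a vector similarity. This is a generic illustration of alignment-free comparison, not the authors' exact feature set; k = 3 and cosine similarity are assumptions for the example.

```python
from collections import Counter
import math

def kmer_spectrum(seq, k=3):
    """Fixed-length k-mer count representation of a DNA barcode."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b):
    """Alignment-free similarity between two k-mer spectra."""
    dot = sum(a[key] * b[key] for key in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)
```

Because no alignment is computed, comparing two barcodes costs time proportional to the number of distinct k-mers, which is what yields the speed advantage the abstract reports.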
A method for data base management and analysis for wind tunnel data
NASA Technical Reports Server (NTRS)
Biser, Aileen O.
1987-01-01
To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.
High-pressure liquid chromatography analysis of antibiotic susceptibility disks.
Hagel, R B; Waysek, E H; Cort, W M
1979-01-01
The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations with recoveries of greater than 95%. Relative standard deviations of lower than 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793
The estimation of the measurement results with using statistical methods
NASA Astrophysics Data System (ADS)
Velychko, O.; Gordiyenko, T.
2015-02-01
A series of international standards and guides describes various statistical methods that can be applied to the management, control, and improvement of processes for the analysis of technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is presented. To support this analysis, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.
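A basic statistical estimation of a measurement result of the kind these standards describe is the Type A evaluation: the mean of repeated readings, their experimental standard deviation, and the standard uncertainty of the mean. A minimal sketch:

```python
import math
import statistics

def type_a_uncertainty(readings):
    """Type A evaluation of measurement uncertainty.

    Returns (mean, experimental standard deviation s, standard
    uncertainty of the mean u = s / sqrt(n)) for n repeated readings.
    """
    n = len(readings)
    mean = statistics.fmean(readings)
    s = statistics.stdev(readings)
    return mean, s, s / math.sqrt(n)
```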
Applications of cluster analysis to satellite soundings
NASA Technical Reports Server (NTRS)
Munteanu, M. J.; Jakubowicz, O.; Kalnay, E.; Piraino, P.
1984-01-01
The advantages of using cluster analysis to improve satellite temperature retrievals were evaluated, since natural clusters, which are associated with atmospheric temperature soundings characteristic of different types of air masses, have the potential to improve stratified regression schemes compared with currently used methods, which stratify soundings based on latitude, season, and land/ocean. The method of discriminant analysis was used. The correct cluster of temperature profiles was located from satellite measurements in 85% of the cases. Considerable improvement was observed at all mandatory levels using regression retrievals derived in the clusters of temperature (weighted and nonweighted), in comparison with the control experiment and with regression retrievals derived in the clusters of brightness temperatures of 3 MSU and 5 IR channels.
A symmetrical method to obtain shear moduli from microrheology.
Nishi, Kengo; Kilfoil, Maria L; Schmidt, Christoph F; MacKintosh, F C
2018-05-16
Passive microrheology typically deduces shear elastic loss and storage moduli from displacement time series or mean-squared displacements (MSD) of thermally fluctuating probe particles in equilibrium materials. Common data analysis methods use either Kramers-Kronig (KK) transformation or functional fitting to calculate frequency-dependent loss and storage moduli. We propose a new analysis method for passive microrheology that avoids the limitations of both of these approaches. In this method, we determine both real and imaginary components of the complex, frequency-dependent response function χ(ω) = χ'(ω) + iχ''(ω) as direct integral transforms of the MSD of thermal particle motion. This procedure significantly improves the high-frequency fidelity of χ(ω) relative to the use of KK transformation, which has been shown to lead to artifacts in χ'(ω). We test our method on both model and experimental data. Experiments were performed on solutions of worm-like micelles and dilute collagen solutions. While the present method agrees well with established KK-based methods at low frequencies, we demonstrate significant improvement at high frequencies using our symmetric analysis method, up to almost the fundamental Nyquist limit.
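The starting point of this analysis, the mean-squared displacement of a probe trajectory, can be sketched directly; the integral transforms that yield χ(ω) are beyond this snippet. A minimal time-averaged MSD for a one-dimensional track:

```python
def mean_squared_displacement(x, max_lag):
    """Time-averaged MSD of a 1-D probe trajectory x(t).

    For each lag tau = 1..max_lag, averages (x[i+tau] - x[i])**2
    over all valid start times i.
    """
    msd = []
    for lag in range(1, max_lag + 1):
        disp = [(x[i + lag] - x[i]) ** 2 for i in range(len(x) - lag)]
        msd.append(sum(disp) / len(disp))
    return msd
```

For a ballistic track the MSD grows as lag squared, and for pure diffusion it grows linearly in lag, which is the standard sanity check for this computation.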
Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR
ERIC Educational Resources Information Center
Baglin, James
2014-01-01
Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…
NASA Astrophysics Data System (ADS)
Zhang, Jingxia; Guo, Yinghai; Shen, Yulin; Zhao, Difei; Li, Mi
2018-06-01
The use of geophysical logging data to identify lithology is important groundwork in logging interpretation. Inevitably, noise is mixed in during data collection due to the equipment and other external factors, and this affects subsequent lithological identification and other logging interpretation. Therefore, to obtain a more accurate lithological identification it is necessary to adopt de-noising methods. In this study, a new de-noising method, namely improved complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN)-wavelet transform, is proposed, which combines the strengths of improved CEEMDAN and the wavelet transform. Improved CEEMDAN, an effective self-adaptive multi-scale analysis method, is used to decompose non-stationary signals such as logging data into intrinsic mode functions (IMFs) at N different scales and one residual. A self-adaptive scale selection method is used to determine the reconstruction scale k. Given the possible frequency-aliasing problem between adjacent IMFs, a wavelet-transform threshold de-noising method is used to reduce the noise of the (k-1)th IMF. The de-noised logging data are then reconstructed from the de-noised (k-1)th IMF, the remaining low-frequency IMFs, and the residual. Finally, empirical mode decomposition, improved CEEMDAN, the wavelet transform, and the proposed method are applied to analyze both simulated and actual data. The results show that these de-noising methods differ in accuracy for lithological identification; compared with the other methods, the proposed method has the best self-adaptability and accuracy.
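The wavelet-threshold step of such a pipeline can be illustrated with a one-level Haar transform and soft thresholding. This toy sketch stands in for the paper's improved CEEMDAN plus undecimated wavelet packet transform; the threshold value is arbitrary.

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar transform: averages and
    half-differences of adjacent sample pairs."""
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return a, d

def soft_threshold(coeffs, t):
    """Soft-threshold rule: shrink each coefficient toward zero by t."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def haar_denoise(signal, t):
    """One-level Haar soft-threshold denoising of an even-length signal:
    threshold the detail coefficients, then invert the transform."""
    a, d = haar_step(signal)
    d = soft_threshold(d, t)
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out
```

Small detail coefficients, which mostly carry noise, are zeroed; large ones, which carry signal structure, survive (shrunk by t).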
Summary of Technical Operations, 1991
1992-01-01
exploit commonality. The project is using the Feature-Oriented Domain Analysis (FODA) method, developed by the project in 1990, to perform this…the development of new movement control software. The analysis will also serve as a means of improving the FODA method. The results of this analysis…STARS environment. The NASA Program Office has officially decided to expand the use of Rate Monotonic Analysis (RMA), which was originally isolated to
Price Analysis on Commercial Item Purchases Within the Department of Defense
2013-10-02
Department of Defense Introduction Background This research builds upon the work conducted in collaboration with the authors' thesis students…market research and price analysis methods. Most contract pricing of acquisitions was conducted using cost analysis before these reforms were added to…analysis methods are being used? b) Do market research reports refer to market information that improves the buyers' understanding of pricing in the
Visual cluster analysis and pattern recognition template and methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
1999-01-01
A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.
Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources
NASA Astrophysics Data System (ADS)
Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi
2017-01-01
Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, the correlation among different random injection powers (such as wind power) makes the PPF calculation difficult. Monte Carlo simulation (MCS) and analytical methods are two commonly used approaches to solve PPF. MCS has high accuracy but is very time consuming. Analytical methods such as the cumulant method (CM) have high computing efficiency, but calculating the cumulants is inconvenient when the wind power output does not follow any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo simulation method (IMCS) is proposed. The joint empirical distribution is applied to model the different wind power outputs. This method combines the advantages of both MCS and the analytical methods: it not only has high computing efficiency, but also provides solutions with sufficient accuracy, which makes it very suitable for on-line analysis.
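One common way to draw correlated samples from empirical wind-power distributions is a Gaussian copula: sample correlated normals via a Cholesky factor, map them to uniforms, then invert each farm's empirical CDF. This is a generic sketch; the farm histories, correlation value, and sample sizes below are hypothetical, and the paper's IMCS details may differ.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)
n = 5000

# hypothetical historical wind-power records for two nearby farms (MW)
hist_a = rng.weibull(2.0, 2000) * 40
hist_b = rng.weibull(2.2, 2000) * 55

# 1) correlated standard normals via the Cholesky factor of a target correlation
rho = 0.7
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T

# 2) map to uniforms with the standard normal CDF (Gaussian copula)
phi = np.vectorize(lambda v: 0.5 * (1 + erf(v / np.sqrt(2))))
u = phi(z)

# 3) invert each farm's empirical CDF (the joint empirical distribution idea)
wind_a = np.quantile(hist_a, u[:, 0])
wind_b = np.quantile(hist_b, u[:, 1])

r = np.corrcoef(wind_a, wind_b)[0, 1]   # sample correlation near the target
```

The resulting `wind_a`, `wind_b` pairs would then feed deterministic power-flow solves inside the Monte Carlo loop.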
Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly
2016-01-01
This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time saving in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties and is calculated in the same manner as the previous method. Generally, a smaller number of arithmetic operations, and hence a shorter simulation time, is desired. The alternating direction implicit technique can be considered a significant step forward in improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.
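For orientation, the explicit leapfrog update that ADI/LOD schemes improve upon can be sketched in one dimension. This is a generic Yee-style sketch in normalized units with an arbitrary Gaussian source; the paper's ADI/LOD formulations replace this explicit step with implicit sub-steps to lift the Courant stability limit.

```python
import numpy as np

# minimal 1-D explicit FDTD sketch: leapfrog E/H updates, normalized units
n = 200          # grid cells
steps = 150
S = 0.5          # Courant number c*dt/dx (<= 1 for explicit stability)

E = np.zeros(n)
H = np.zeros(n)

for t in range(steps):
    H[:-1] += S * (E[1:] - E[:-1])            # H update from the curl of E
    E[1:] += S * (H[1:] - H[:-1])             # E update from the curl of H
    E[100] += np.exp(-((t - 30) / 8) ** 2)    # soft Gaussian source at midpoint
```

An implicit (ADI/LOD) scheme would solve tridiagonal systems per sub-step instead, remaining stable for S > 1.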
Computer Graphics-aided systems analysis: application to well completion design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Detamore, J.E.; Sarma, M.P.
1985-03-01
The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related to oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of "Systems Analysis" with the techniques of "Computer Graphics". The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis work. The method can be extended to other design and analysis aspects of oil and gas wells.
A Kalman Filter for SINS Self-Alignment Based on Vector Observation.
Xu, Xiang; Xu, Xiaosu; Zhang, Tao; Li, Yao; Tong, Jinwu
2017-01-29
In this paper, a self-alignment method for strapdown inertial navigation systems based on the q-method is studied. In addition, an improved method based on integrating gravitational apparent motion to form apparent velocity is designed, which can reduce the random noise in the observation vectors. For further analysis, a novel self-alignment method using a Kalman filter based on adaptive filter technology is proposed, which transforms the self-alignment procedure into an attitude estimation using the observation vectors. In the proposed method, a linear pseudo-measurement equation is adopted by employing the transformation between the quaternion and the observation vectors. Analysis and simulation indicate that the accuracy of the self-alignment is improved. Meanwhile, to improve the convergence rate of the proposed method, a new method based on parameter recognition and a reconstruction algorithm for apparent gravitation is devised, which can reduce the influence of the random noise in the observation vectors. Simulations and turntable tests are carried out, and the results indicate that the proposed method can acquire sound alignment results with lower standard variances, and can obtain higher alignment accuracy and a faster convergence rate.
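The q-method at the core of such alignment schemes is Davenport's eigenvalue solution of Wahba's problem: the optimal quaternion is the eigenvector of the 4x4 Davenport matrix with the largest eigenvalue. The sketch below is a generic illustration with made-up vectors, not the authors' full self-alignment procedure.

```python
import numpy as np

def davenport_q(b_list, r_list, weights):
    """Davenport's q-method: quaternion (vector part first, scalar last)
    minimizing Wahba's loss for body observations b of reference vectors r."""
    B = sum(w * np.outer(b, r) for w, b, r in zip(weights, b_list, r_list))
    S = B + B.T
    sigma = np.trace(B)
    z = np.array([B[1, 2] - B[2, 1], B[2, 0] - B[0, 2], B[0, 1] - B[1, 0]])
    K = np.zeros((4, 4))
    K[:3, :3] = S - sigma * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = sigma
    vals, vecs = np.linalg.eigh(K)
    return vecs[:, np.argmax(vals)]   # eigenvector of the largest eigenvalue

def attitude_matrix(q):
    """Attitude matrix A(q) such that b = A @ r (scalar-last quaternion)."""
    v, w = q[:3], q[3]
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return (w**2 - v @ v) * np.eye(3) + 2 * np.outer(v, v) - 2 * w * vx

# frame rotated 90 degrees about z; two reference vectors seen in the body frame
A_true = np.array([[0.0, 1.0, 0.0], [-1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
r = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
b = [A_true @ ri for ri in r]
q = davenport_q(b, r, [1.0, 1.0])
A_est = attitude_matrix(q)
```

With noise-free observations the largest eigenvalue equals the sum of the weights, and the recovered attitude matrix matches the true one (up to the usual quaternion sign ambiguity).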
PMID:28146059
Improved spectrophotometric analysis of fullerenes C60 and C70 in high-solubility organic solvents.
Törpe, Alexander; Belton, Daniel J
2015-01-01
Fullerenes are among a number of recently discovered carbon allotropes that exhibit unique and versatile properties. The analysis of these materials is of great importance and interest. We present previously unreported spectroscopic data for C60 and C70 fullerenes in high-solubility solvents, including error bounds, so as to allow reliable colorimetric analysis of these materials. The Beer-Lambert-Bouguer law is found to be valid at all wavelengths. The measured data were highly reproducible, and yielded high-precision molar absorbance coefficients for C60 and C70 in o-xylene and o-dichlorobenzene, which both exhibit a high solubility for these fullerenes, and offer the prospect of improved extraction efficiency. A photometric method for a C60/C70 mixture analysis was validated with standard mixtures, and subsequently improved for real samples by correcting for light scattering, using a power-law fit. The method was successfully applied to the analysis of C60/C70 mixtures extracted from fullerene soot.
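The two-component photometric analysis rests on the Beer-Lambert-Bouguer law: at each wavelength, absorbance is a weighted sum of the two concentrations, so two wavelengths give a 2x2 linear system. The coefficients and concentrations below are placeholders for illustration, not the measured values reported in the paper, and the scattering correction is omitted.

```python
import numpy as np

# Beer-Lambert two-component analysis: A(lambda) = eps60(lambda)*c60 + eps70(lambda)*c70
# molar absorbance coefficients (L mol^-1 cm^-1) here are assumed placeholders
eps = np.array([[60000.0, 4000.0],    # wavelength 1: eps(C60), eps(C70)
                [16000.0, 30000.0]])  # wavelength 2
c_true = np.array([1.2e-5, 0.5e-5])  # mol/L, 1 cm path length
absorbance = eps @ c_true            # simulated mixture absorbances

# recover the mixture composition by solving the 2x2 linear system
c_est = np.linalg.solve(eps, absorbance)
```

In practice the real samples would additionally need the power-law scattering correction described above before solving the system.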
An improved method for bivariate meta-analysis when within-study correlations are unknown.
Hong, Chuan; D Riley, Richard; Chen, Yong
2018-03-01
Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m≥50). When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
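The univariate random-effects building block that the Riley working model reuses per outcome can be sketched with the DerSimonian-Laird estimator. The effect estimates and variances below are invented for illustration, and the proposed robust (sandwich) variance estimator itself is not reproduced here.

```python
import numpy as np

# toy univariate random-effects meta-analysis (DerSimonian-Laird)
y = np.array([0.30, 0.10, 0.45, 0.25, 0.18])   # study effect estimates (hypothetical)
v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])   # within-study variances (hypothetical)

w = 1.0 / v
mu_fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled estimate
Q = np.sum(w * (y - mu_fixed) ** 2)            # Cochran's Q heterogeneity statistic
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)        # between-study variance (truncated at 0)

w_star = 1.0 / (v + tau2)
mu_re = np.sum(w_star * y) / np.sum(w_star)    # random-effects pooled estimate
var_mu = 1.0 / np.sum(w_star)                  # model-based variance of the pool
```

The robust method replaces `var_mu` with a sandwich-type estimator so that inference remains valid under model misspecification.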
NASA Astrophysics Data System (ADS)
Liu, Bilan; Qiu, Xing; Zhu, Tong; Tian, Wei; Hu, Rui; Ekholm, Sven; Schifitto, Giovanni; Zhong, Jianhui
2016-03-01
Subject-specific longitudinal DTI study is vital for investigating pathological changes in lesions and disease evolution. Spatial Regression Analysis of Diffusion tensor imaging (SPREAD) is a non-parametric, permutation-based statistical framework that combines spatial regression and resampling techniques to achieve effective detection of localized longitudinal diffusion changes within the whole brain at the individual level without a priori hypotheses. However, boundary blurring and dislocation limit its sensitivity, especially towards detecting lesions of irregular shapes. In the present study, we propose an improved SPREAD method (iSPREAD) by incorporating a three-dimensional (3D) nonlinear anisotropic diffusion filtering method, which provides edge-preserving image smoothing through a nonlinear scale-space approach. The statistical inference based on iSPREAD was evaluated and compared with the original SPREAD method using both simulated and in vivo human brain data. Results demonstrated that the sensitivity and accuracy of the SPREAD method are improved substantially by incorporating nonlinear anisotropic filtering. iSPREAD identifies subject-specific longitudinal changes in the brain with improved sensitivity, accuracy, and enhanced statistical power, especially when the spatial correlation is heterogeneous among neighboring image pixels in DTI.
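The edge-preserving smoothing that iSPREAD adds can be illustrated with a minimal 2-D Perona-Malik anisotropic diffusion filter: diffusion is attenuated across strong gradients, so edges survive while flat regions are denoised. This is a generic sketch with assumed parameters and a synthetic test image; the paper's 3-D implementation differs.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.3, dt=0.2):
    """2-D anisotropic diffusion: smooth within regions, not across edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (wrap-around borders for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping conductance: near zero across strong gradients
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(2)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                                 # step edge (a crisp "lesion" boundary)
noisy = img + 0.1 * rng.standard_normal(img.shape)
smooth = perona_malik(noisy)

flat_std_before = noisy[:, 5:27].std()            # noise level in a flat region
flat_std_after = smooth[:, 5:27].std()
edge_contrast = smooth[:, 40:60].mean() - smooth[:, 5:27].mean()
```

A Gaussian blur of comparable strength would also reduce `flat_std_after` but would erode `edge_contrast`, which is exactly the boundary-blurring problem iSPREAD avoids.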
Improvement of the cost-benefit analysis algorithm for high-rise construction projects
NASA Astrophysics Data System (ADS)
Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir
2018-03-01
The specific nature of high-rise investment projects, which entail long-term construction, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved cost-benefit analysis algorithm for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the cost-benefit analysis algorithm for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, which improves the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.
Li, Xueqi; Woodman, Michael; Wang, Selina C
2015-08-01
Pheophytins and pyropheophytin are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage conditions and on whether the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method in which the multi-step sample preparation was replaced by a simple isopropanol dilution before injection into the high-performance liquid chromatograph. A quaternary solvent gradient method was used to include a fourth, strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues left on the column from previous analyses. This new method not only reduces analysis cost and time but also shows reliability, repeatability, and improved sensitivity, which is especially important for low-level samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-07-01
In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 % and 98.1 % of the data points respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient dataset, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5.
We suggest that this is likely due to errors arising from misattribution, caused by poor centroid definition and failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
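The normalisation-plus-agglomerative-clustering idea can be sketched as follows. For brevity this uses a naive average linkage on two synthetic particle classes with mismatched channel scales; the study itself used the Ward linkage on WIBS-4 size, asymmetry, and fluorescence channels, and the data here are invented.

```python
import numpy as np

def zscore(x):
    """z-score normalisation: zero mean, unit variance per measurement channel."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

def agglomerative(x, k):
    """Naive average-linkage agglomerative clustering down to k clusters."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    clusters = [[i] for i in range(len(x))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # average pairwise distance between the two candidate clusters
                avg = d[np.ix_(clusters[i], clusters[j])].mean()
                if best is None or avg < best[0]:
                    best = (avg, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]   # merge the closest pair
        del clusters[j]
    return clusters

rng = np.random.default_rng(3)
# two synthetic particle classes; "size" and "fluorescence" on very different scales,
# which is why per-channel normalisation matters before computing distances
a = rng.normal([1.0, 100.0], [0.1, 5.0], size=(20, 2))
b = rng.normal([3.0, 300.0], [0.1, 5.0], size=(20, 2))
x = np.vstack([a, b])
clusters = agglomerative(zscore(x), k=2)
```

Without the `zscore` step the large-scale channel would dominate the Euclidean distances; with it, both channels contribute comparably, mirroring the z-score/range normalisation comparison in the paper.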
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-01-01
Background: Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors, but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD), whose complex care pathway spans multiple sectors. Methods: Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted, along with data visualisation designed to inform service improvement within the context of limited resources. Results: A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions: When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration.
PMID:28062603
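The greedy splitting step underlying a classification and regression tree (CART) can be sketched as an exhaustive search for the threshold that most reduces Gini impurity. The one-dimensional toy data below are invented; the study applied CART to national audit datasets with many risk factors.

```python
import numpy as np

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Find the single threshold on x that minimizes the weighted child
    Gini impurity -- the greedy step that CART applies recursively."""
    best = (None, gini(y))            # (threshold, impurity) to beat
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        w = len(left) / len(y)
        score = w * gini(left) + (1 - w) * gini(right)
        if score < best[1]:
            best = (t, score)
    return best

# toy feature (e.g. a hypothetical risk score) and binary outcome labels
x = np.array([1, 2, 3, 10, 11, 12])
y = np.array([0, 0, 0, 1, 1, 1])
threshold, impurity = best_split(x, y)
```

A full CART implementation repeats this split on each child node and prunes the resulting tree; the graphical summary in the paper corresponds to the final tree's risk groups.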
A Shot Number Based Approach to Performance Analysis in Table Tennis
Yoshida, Kazuto; Yamada, Koshi
2017-01-01
The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, at each shot number. The improvements over the conventional method are as follows: better accuracy in evaluating the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. The performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of the shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334
Methods for improved forewarning of critical events across multiple data channels
Hively, Lee M [Philadelphia, TN]
2007-04-24
This disclosed invention concerns improvements in forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor-quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C_i) to provide a forewarning of failure and an indicator of failure onset.
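The phase-space reconstruction step can be sketched with a time-delay embedding and a simple chi-squared-style dissimilarity between the occupation histograms of two reconstructed trajectories. This is a generic illustration of the concept; the patent's channel selection, filtering, and exact PSDM construction are not reproduced, and the signals and parameters below are assumptions.

```python
import numpy as np

def time_delay_embed(x, dim=2, tau=10):
    """Takens time-delay embedding: rows are [x_t, x_{t+tau}, ...]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def dissimilarity(a, b, bins=10):
    """Chi-squared-style distance between occupation histograms of two
    embedded trajectories (one simple phase-space dissimilarity measure)."""
    lo = min(a.min(), b.min())
    hi = max(a.max(), b.max())
    ha, _, _ = np.histogram2d(a[:, 0], a[:, 1], bins=bins, range=[[lo, hi]] * 2)
    hb, _, _ = np.histogram2d(b[:, 0], b[:, 1], bins=bins, range=[[lo, hi]] * 2)
    pa, pb = ha / ha.sum(), hb / hb.sum()
    mask = (pa + pb) > 0
    return np.sum((pa[mask] - pb[mask]) ** 2 / (pa[mask] + pb[mask]))

t = np.arange(2000)
baseline = np.sin(2 * np.pi * t / 100)   # nominal operating regime
same = np.sin(2 * np.pi * t / 100)       # unchanged dynamics
changed = np.sin(2 * np.pi * t / 40)     # shifted dynamics (mimicking failure onset)

d_same = dissimilarity(time_delay_embed(baseline), time_delay_embed(same))
d_changed = dissimilarity(time_delay_embed(baseline), time_delay_embed(changed))
```

A rising dissimilarity between the baseline attractor and the current window is the kind of signal the composite measure aggregates into a forewarning indicator.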
A case study of the sensitivity of forecast skill to data and data analysis techniques
NASA Technical Reports Server (NTRS)
Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.
1983-01-01
A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.
A study on characteristics of retrospective optimal interpolation with WRF testbed
NASA Astrophysics Data System (ADS)
Kim, S.; Noh, N.; Lim, G.
2012-12-01
This study presents the application of retrospective optimal interpolation (ROI) with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) scheme that gradually assimilates observations over the analysis window to obtain a minimum-variance estimate of the atmospheric state at the initial time of the window. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The ROI method assimilates data after the analysis time using a perturbation method (Errico and Raeder, 1999), without an adjoint model. In this study, the ROI method is applied to the WRF model to validate the algorithm and to investigate its capability. The computational costs of ROI can be reduced thanks to the eigen-decomposition of the background error covariance. Using the background error covariance in eigen-space, a single-profile assimilation experiment is performed. The difference between forecast errors with and without assimilation clearly grows over time, indicating that assimilation improves the forecast. The characteristics and strengths/weaknesses of the ROI method are investigated by comparison experiments with other data assimilation methods.
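The OI analysis step at the heart of ROI can be sketched in its standard form, x_a = x_b + K(y - Hx_b) with K = BH^T(HBH^T + R)^-1. The toy three-variable example below uses assumed covariances, not the eigen-decomposed WRF-scale implementation.

```python
import numpy as np

# toy optimal interpolation update: x_a = x_b + K (y - H x_b)
x_true = np.array([1.0, 2.0, 3.0])        # true state (for checking only)
x_b = np.array([1.5, 1.5, 2.5])           # background (first guess)
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])           # observe state components 0 and 2
y = H @ x_true                            # (noise-free) observations
B = np.array([[1.0, 0.5, 0.0],
              [0.5, 1.0, 0.5],
              [0.0, 0.5, 1.0]])           # background error covariance
R = 0.01 * np.eye(2)                      # observation error covariance

# gain K = B H^T (H B H^T + R)^{-1}, then the analysis update
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
```

Note how the off-diagonal terms of B spread the corrections from the two observed components into the unobserved middle component; the eigen-decomposition in ROI makes this gain computation affordable at model scale.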
ERIC Educational Resources Information Center
Taylor, Purcell; El-Sabawi, Taleed; Cangin, Causenge
2016-01-01
Objective: To improve the CAGE (Cut down, Annoyed, Guilty, Eye opener) questionnaire's predictive accuracy in screening college students. Participants: The sample consisted of 219 midwestern university students who self-administered a confidential survey. Methods: Exploratory factor analysis, confirmatory factor analysis, receiver operating…
Three Dimensional Optical Coherence Tomography Imaging: Advantages and Advances
Gabriele, Michelle L; Wollstein, Gadi; Ishikawa, Hiroshi; Xu, Juan; Kim, Jongsick; Kagemann, Larry; Folio, Lindsey S; Schuman, Joel S.
2010-01-01
Three dimensional (3D) ophthalmic imaging using optical coherence tomography (OCT) has revolutionized assessment of the eye, the retina in particular. Recent technological improvements have made the acquisition of 3D-OCT datasets feasible. However, while volumetric data can improve disease diagnosis and follow-up, novel image analysis techniques are now necessary in order to process the dense 3D-OCT dataset. Fundamental software improvements include methods for correcting subject eye motion, segmenting structures or volumes of interest, extracting relevant data post hoc and signal averaging to improve delineation of retinal layers. In addition, innovative methods for image display, such as C-mode sectioning, provide a unique viewing perspective and may improve interpretation of OCT images of pathologic structures. While all of these methods are being developed, most remain in an immature state. This review describes the current status of 3D-OCT scanning and interpretation, and discusses the need for standardization of clinical protocols as well as the potential benefits of 3D-OCT scanning that could come when software methods for fully exploiting these rich data sets are available clinically. The implications of new image analysis approaches include improved reproducibility of measurements garnered from 3D-OCT, which may then help improve disease discrimination and progression detection. In addition, 3D-OCT offers the potential for preoperative surgical planning and intraoperative surgical guidance. PMID:20542136
NASA Astrophysics Data System (ADS)
Hou, Zhenlong; Huang, Danian
2017-09-01
In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth-weighting matrix, and other methods. To address the problems posed by large data volumes in exploration, we present a parallel algorithm and performance analysis that combine Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) for Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and real data from Vinton Dome, we obtain improved results, and the improved inversion algorithm is shown to be effective and feasible. The performance of the parallel algorithm we designed is better than that of other CUDA implementations; the maximum speedup exceeds 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are used to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
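The scalability metrics used in such performance analyses are conventionally defined as speedup S_p = T_1 / T_p and parallel efficiency E_p = S_p / p for p devices. The runtimes below are made-up placeholders, not the paper's measurements.

```python
# hypothetical multi-GPU runtimes (seconds) for the same inversion job
runtimes = {1: 120.0, 2: 63.0, 4: 34.0}
t1 = runtimes[1]

# speedup S_p = T_1 / T_p and parallel efficiency E_p = S_p / p
metrics = {p: {"speedup": t1 / t, "efficiency": t1 / t / p}
           for p, t in runtimes.items()}
```

Efficiency below 1.0 at higher device counts reflects communication and load-imbalance overheads, which is what the scalability analysis quantifies.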
Material nonlinear analysis via mixed-iterative finite element method
NASA Technical Reports Server (NTRS)
Sutjahjo, Edhi; Chamis, Christos C.
1992-01-01
The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.
Reliability Validation and Improvement Framework
2012-11-01
systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon...embedded software • cyber-physical systems (CPSs) to indicate that the embedded software interacts with, manages, and controls a physical system [Lee...the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results
ERIC Educational Resources Information Center
Abrams, Neal M.
2012-01-01
A cloud network system is combined with standard computing applications and a course management system to provide a robust method for sharing data among students. This system provides a unique method to improve data analysis by easily increasing the amount of sampled data available for analysis. The data can be shared within one course as well as…
Analysis of Free Modeling Predictions by RBO Aleph in CASP11
Mabrouk, Mahmoud; Werner, Tim; Schneider, Michael; Putz, Ines; Brock, Oliver
2015-01-01
The CASP experiment is a biannual benchmark for assessing protein structure prediction methods. In CASP11, RBO Aleph ranked as one of the top-performing automated servers in the free modeling category. This category consists of targets for which structural templates are not easily retrievable. We analyze the performance of RBO Aleph and show that its success in CASP was a result of its ab initio structure prediction protocol. A detailed analysis of this protocol demonstrates that two components unique to our method greatly contributed to prediction quality: residue-residue contact prediction by EPC-map and contact-guided conformational space search by model-based search (MBS). Interestingly, our analysis also points to a possible fundamental problem in evaluating the performance of protein structure prediction methods: improvements in components of the method do not necessarily lead to improvements of the entire method. This points to the fact that these components interact in ways that are poorly understood. This problem, if indeed true, represents a significant obstacle to community-wide progress. PMID:26492194
An improved high-throughput lipid extraction method for the analysis of human brain lipids.
Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett
2013-03-01
We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.
Improving Management of Green Retrofits from a Stakeholder Perspective: A Case Study in China.
Liang, Xin; Shen, Geoffrey Qiping; Guo, Li
2015-10-28
Green retrofits, which improve the environment and energy efficiency of buildings, are considered a potential solution for reducing energy consumption as well as improving human health and productivity. They represent some of the riskiest, most complex, and most uncertain projects to manage. As the foundation of project management, critical success factors (CSFs) have been emphasized by previous research. However, most studies identified and prioritized CSFs independently of stakeholders. This differs from the reality, where the success of green retrofits is tightly interrelated with the stakeholders of projects. To improve the analysis from a stakeholder perspective, the present study proposed an innovative method based on a two-mode social network analysis to integrate CSF analysis with stakeholders. The results of this method can provide further understanding of the interactions between stakeholders and CSFs, and of the underlying relationships among CSFs through stakeholders. A pilot study was conducted to apply the proposed method and assess the CSFs for green retrofits in China. The five most significant CSFs in the management of green retrofits are identified. Furthermore, the interrelations between stakeholders and CSFs, as well as the coefficients and clusters of CSFs, are discussed.
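The two-mode (stakeholder x CSF) network idea can be sketched with a bipartite incidence matrix and its one-mode projection, which links CSFs through the stakeholders they share. The stakeholders, CSFs, and links below are invented for illustration, not the pilot study's actual data.

```python
import numpy as np

stakeholders = ["owner", "contractor", "government", "tenant"]
csfs = ["funding", "technology", "regulation"]

# hypothetical incidence matrix: A[i, j] = 1 if stakeholder i is linked to CSF j
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 0, 1],
              [1, 0, 1]])

# degree of each CSF in the two-mode network (how many stakeholders touch it)
csf_degree = A.sum(axis=0)

# one-mode projection onto CSFs: P[j, k] = number of stakeholders linking CSFs j and k
P = A.T @ A
```

Off-diagonal entries of `P` reveal which CSFs are coupled through shared stakeholders even though no direct CSF-to-CSF link was recorded, which is the "underlying relationships among CSFs through stakeholders" the method exposes.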
2015-05-01
Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE), May 13-14, 2015. Dr. Ying... Methods: the Lexical Link Analysis (LLA) core, with LLA reports and visualizations, and Collaborative Learning Agents (CLA).
Visual cluster analysis and pattern recognition template and methods
Osbourn, G.C.; Martinez, R.F.
1999-05-04
A method of clustering using a novel template to define a region of influence is disclosed. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques. 30 figs.
Nannemann, David P; Birmingham, William R; Scism, Robert A; Bachmann, Brian O
2011-01-01
To address the synthesis of increasingly structurally diverse small-molecule drugs, methods for the generation of efficient and selective biological catalysts are becoming increasingly important. ‘Directed evolution’ is an umbrella term referring to a variety of methods for improving or altering the function of enzymes using a nature-inspired twofold strategy of mutagenesis followed by selection. This article provides an objective assessment of the effectiveness of directed evolution campaigns from the last decade in generating enzymes with improved catalytic parameters for new substrates, excluding studies that aimed to select only for improved physical properties and those that lack kinetic characterization. An analysis of the trends of methodologies and their success rates from 81 qualifying examples in the literature reveals the average fold improvements for kcat (or Vmax), Km and kcat/Km to be 366-, 12- and 2548-fold, respectively, whereas the median fold improvements are 5.4, 3 and 15.6. Further analysis by enzyme class, library-generation methodology and screening methodology explores relationships between successful campaigns and the methodologies employed. PMID:21644826
An analysis of temperature-induced errors for an ultrasound distance measuring system. M. S. Thesis
NASA Technical Reports Server (NTRS)
Wenger, David Paul
1991-01-01
The presentation of research is provided in the following five chapters. Chapter 2 presents the necessary background information and definitions for general work with ultrasound and acoustics. It also discusses the basis for errors in the slant range measurements. Chapter 3 presents a method of problem solution and an analysis of the sensitivity of the equations to slant range measurement errors. It also presents various methods by which the error in the slant range measurements can be reduced to improve overall measurement accuracy. Chapter 4 provides a description of a type of experiment used to test the analytical solution and provides a discussion of its results. Chapter 5 discusses the setup of a prototype collision avoidance system, discusses its accuracy, and demonstrates various methods of improving the accuracy along with the improvements' ramifications. Finally, Chapter 6 provides a summary of the work and a discussion of conclusions drawn from it. Additionally, suggestions for further research are made to improve upon what has been presented here.
Efficient alignment-free DNA barcode analytics
Kuksa, Pavel; Pavlovic, Vladimir
2009-01-01
Background In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations which model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectrum) for barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows for accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci distinguishing barcodes of different sample groups. Results New alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running time improvements over the state-of-the-art methods. Conclusion Our results show that the newly developed alignment-free methods for DNA barcoding can efficiently and with high accuracy identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding. PMID:19900305
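The fixed-length spectrum representation described above can be sketched in a few lines: each sequence becomes a vector of k-mer counts, and similarity is computed directly on those vectors with no alignment step. The k-mer size and toy sequences here are illustrative assumptions, not parameters from the study.

```python
from collections import Counter
from math import sqrt

def spectrum(seq, k=3):
    """Fixed-length k-mer spectrum: counts of every length-k fragment."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b):
    """Cosine similarity between two spectra; 1.0 = identical composition."""
    dot = sum(a[kmer] * b[kmer] for kmer in a)  # Counter returns 0 if absent
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

s1 = spectrum("ACGTACGTAC")
s2 = spectrum("ACGTACGTAC")
s3 = spectrum("TTTTTTTTTT")
print(cosine_similarity(s1, s2))  # identical sequences -> 1.0
print(cosine_similarity(s1, s3))  # no shared 3-mers -> 0.0
```

Classification then reduces to nearest-neighbour search in this vector space, which is what makes the approach fast and scalable.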
Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G
2015-01-01
The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethyl silyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall, the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) of the aqueous extract was 0.010 mg kg(-1) (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and an RSD of 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, Christopher Hysjulien
This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA-generated particles in an ICP is studied using a high-speed, high-frame-rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work on ICP torch developments specifically tailored to improve LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr+) is observed during ICP-MS analysis. Evidence shows that MAr+ ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.
Design and Test of an Improved Crashworthiness Small Composite Airframe
NASA Technical Reports Server (NTRS)
Terry, James E.; Hooper, Steven J.; Nicholson, Mark
2002-01-01
The purpose of this small business innovative research (SBIR) program was to evaluate the feasibility of developing small composite airplanes with improved crashworthiness. A combination of analysis and half scale component tests were used to develop an energy absorbing airframe. Four full scale crash tests were conducted at the NASA Impact Dynamics Research Facility, two on a hard surface and two onto soft soil, replicating earlier NASA tests of production general aviation airplanes. Several seat designs and restraint systems including both an air bag and load limiting shoulder harnesses were tested. Tests showed that occupant loads were within survivable limits with the improved structural design and the proper combination of seats and restraint systems. There was no loss of cabin volume during the events. The analysis method developed provided design guidance but time did not allow extending the analysis to soft soil impact. This project demonstrated that survivability improvements are possible with modest weight penalties. The design methods can be readily applied by airplane designers using the examples in this report.
NASA Astrophysics Data System (ADS)
Mustak, S.
2013-09-01
The correction of atmospheric effects is essential because visible bands of shorter wavelength are strongly affected by atmospheric scattering, especially Rayleigh scattering. The objectives of this paper are to find the haze values present in all spectral bands and to correct them for urban analysis. In this paper, the Improved Dark Object Subtraction method of P. Chavez (1988) is applied to correct atmospheric haze in a Resourcesat-1 LISS-4 multispectral satellite image. Dark Object Subtraction is a very simple image-based method of haze correction which assumes that at least a few pixels within an image should be essentially black (near-zero reflectance); such dark objects, typically clear water bodies and shadows, have DN values of zero or close to zero in the image. Simple Dark Object Subtraction is a first-order atmospheric correction, whereas Improved Dark Object Subtraction corrects the haze in terms of atmospheric scattering and path radiance based on a power law for the relative scattering effect of the atmosphere. The haze values extracted using Simple Dark Object Subtraction for the green band (band 2), red band (band 3) and NIR band (band 4) are 40, 34 and 18, while the haze values extracted using Improved Dark Object Subtraction are 40, 18.02 and 11.80 for the same bands. It is concluded that the haze values extracted by the Improved Dark Object Subtraction method provide more realistic results than the Simple Dark Object Subtraction method.
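The power-law idea behind the improved method can be sketched as follows: the haze measured in one reference band is scaled to the other bands by the relative-scattering law (scattering proportional to wavelength**-n). The exponent n and the band centres below are illustrative assumptions, not the values or calibration used in the paper.

```python
def improved_dos_haze(ref_haze, ref_wavelength, wavelengths, n=4.0):
    """Predict per-band haze DN values from one reference band using the
    relative-scattering power law (scattering ~ wavelength**-n), in the
    spirit of Chavez's improved dark object subtraction. The exponent n
    (n=4 approximates a very clear Rayleigh atmosphere) is an assumption."""
    return [ref_haze * (w / ref_wavelength) ** -n for w in wavelengths]

# Approximate green/red/NIR band centres in micrometres (assumed values).
haze = improved_dos_haze(ref_haze=40, ref_wavelength=0.56,
                         wavelengths=[0.56, 0.65, 0.82])
print([round(h, 2) for h in haze])  # haze falls steeply with wavelength
```

The correction itself is then a per-band subtraction: corrected DN = DN - predicted haze, clipped at zero.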
Jackowetz, J N; Mira de Orduña, R
2013-08-15
Sulphur dioxide (SO2) is essential for the preservation of wines. The presence of SO2 binding compounds in musts and wines may limit sulphite efficacy leading to higher total SO2 additions, which may exceed SO2 limits permitted by law and pose health risks for sensitive individuals. An improved method for the quantification of significant wine SO2 binding compounds is presented that applies a novel sample treatment approach and rapid UHPLC separation. Glucose, galacturonic acid, alpha-ketoglutarate, pyruvate, acetoin and acetaldehyde were derivatised with 2,4-dinitrophenylhydrazine and separated using a solid core C18 phase by ultra high performance liquid chromatography. Addition of EDTA to samples prevented de novo acetaldehyde formation from ethanol oxidation. Optimised derivatisation duration enhanced reproducibility and allowed for glucose and galacturonic acid quantification. High glucose residues were found to interfere with the recovery of other SO2 binders, but practical SO2 concentrations and red wine pigments did not affect derivatisation efficiency. The calibration range, method accuracy, precision and limits of detection were found to be satisfactory for routine analysis of SO2 binders in wines. The current method represents a significant improvement in the comprehensive analysis of SO2 binding wine carbonyls. It allows for the quantification of major SO2 binders at practical analyte concentrations, and uses a simple sample treatment method that prevents treatment artifacts. Equipment utilisation could be reduced by rapid LC separation while maintaining analytical performance parameters. The improved method will be a valuable addition for the analysis of total SO2 binder pools in oenological samples. Published by Elsevier Ltd.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
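The p-chart analysis described above can be sketched numerically: the centre line is the pooled adverse-event proportion, and each period gets three-sigma limits that depend on its own case count. The monthly counts below are illustrative, not data from the study.

```python
from math import sqrt

def p_chart_limits(events, totals):
    """Three-sigma control limits for a p-chart of adverse-event rates;
    events[i] / totals[i] is the observed proportion in period i."""
    pbar = sum(events) / sum(totals)              # centre line
    limits = []
    for n in totals:
        sigma = sqrt(pbar * (1 - pbar) / n)       # binomial std. deviation
        limits.append((max(0.0, pbar - 3 * sigma), pbar + 3 * sigma))
    return pbar, limits

# Illustrative monthly adverse-event counts (not data from the study).
pbar, limits = p_chart_limits([18, 25, 30, 22], [100, 120, 140, 110])
print(round(pbar, 3))  # pooled rate ~0.202
```

A period whose observed proportion falls outside its limits signals special-cause variation, i.e. an unstable process in the sense used in the abstract.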
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L
2016-02-01
Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by standardizing radiologists to a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to that produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
Zhang, Peng; Li, Houqiang; Wang, Honghui; Wong, Stephen T C; Zhou, Xiaobo
2011-01-01
Peak detection is one of the most important steps in mass spectrometry (MS) analysis. However, the detection result is greatly affected by severe spectrum variations. Unfortunately, most current peak detection methods are neither flexible enough to revise false detection results nor robust enough to resist spectrum variations. To improve flexibility, we introduce the peak tree to represent the peak information in MS spectra. Each tree node is a peak judgment on a range of scales, and each tree decomposition, as a set of nodes, is a candidate peak detection result. To improve robustness, we combine peak detection and common peak alignment into a closed-loop framework, which finds the optimal decomposition via both peak intensity and common peak information. The common peak information is derived, and iteratively refined, from the density clustering of the latest peak detection result. Finally, we present an improved ant colony optimization biomarker selection method to build a whole MS analysis system. Experiments show that our peak detection method can better resist spectrum variations and provide higher sensitivity and lower false detection rates than conventional methods. The benefits from our peak-tree-based system for MS disease analysis are also demonstrated on real SELDI data.
Zhi, Ruicong; Zhao, Lei; Shi, Jingye
2016-07-01
Developing innovative products that satisfy various groups of consumers helps a company maintain a leading market share. The hedonic scale and just-about-right (JAR) scale are 2 popular methods for hedonic assessment and product diagnostics. In this paper, we chose to study flavored liquid milk because it is one of the most necessary nutrient sources in China. The hedonic scale and JAR scale methods were combined to provide directional information for flavored liquid milk optimization. Two methods of analysis (penalty analysis and partial least squares regression on dummy variables) were used and the results were compared. This paper had 2 aims: (1) to investigate consumer preferences of basic flavor attributes of milk from various cities in China; and (2) to determine the improvement direction for specific products and the ideal overall liking for consumers in various cities. The results showed that consumers in China have local-specific requirements for characteristics of flavored liquid milk. Furthermore, we provide a consumer-oriented product design method to improve sensory quality according to the preference of particular consumers. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
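Penalty analysis, one of the two diagnostic methods named above, can be sketched simply: compute the mean drop in overall liking for consumers who rated an attribute "too little" or "too much" relative to those who rated it "just about right". The liking scores and JAR codes below are toy values, not data from the milk study.

```python
def penalty_analysis(liking, jar):
    """Mean-drop penalty analysis. liking: hedonic scores; jar: codes
    per consumer (-1 too little, 0 just about right, +1 too much)."""
    def mean(xs):
        return sum(xs) / len(xs)
    jar_mean = mean([l for l, j in zip(liking, jar) if j == 0])
    penalties = {}
    for code, label in ((-1, "too little"), (1, "too much")):
        scores = [l for l, j in zip(liking, jar) if j == code]
        if scores:  # penalty = liking lost relative to the JAR group
            penalties[label] = jar_mean - mean(scores)
    return penalties

# Toy 9-point liking scores and JAR codes (illustrative only).
pen = penalty_analysis([8, 7, 4, 5, 8, 3], [0, 0, -1, -1, 0, 1])
print(pen)
```

Attributes with large penalties and a large share of non-JAR consumers are the usual candidates for reformulation.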
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods used to analyse the problems (indicators) are: peer review, quantitative methods and risk-management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit, with conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk-management techniques include: a) critical incident analysis, which establishes an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis which, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, checks all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, constitute the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes occurring before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step, closing the evaluation cycle.
Analysis of Non Local Image Denoising Methods
NASA Astrophysics Data System (ADS)
Pardo, Álvaro
Image denoising is probably one of the most studied problems in the image processing community. Recently a new paradigm of non local denoising was introduced. The Non Local Means method proposed by Buades, Morel and Coll attracted the attention of other researchers, who proposed improvements and modifications to their proposal. In this work we analyze those methods, trying to understand their properties while connecting them to segmentation based on spectral graph properties. We also propose some improvements to automatically estimate the parameters used in these methods.
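The Non Local Means idea discussed above can be sketched for a single pixel: its denoised value is a weighted average of nearby pixels, where the weight of each candidate decays with the squared distance between the patch around it and the patch around the target pixel. This is a minimal sketch on plain Python lists; patch size, search window, and the filtering parameter h are illustrative choices.

```python
import math

def nlm_pixel(image, i, j, patch=1, search=2, h=10.0):
    """Non Local Means estimate for pixel (i, j): a weighted average of
    nearby pixels, weighted by patch similarity. Assumes (i, j) is far
    enough from the border for the patch and search windows to fit."""
    def patch_at(r, c):
        return [image[r + dr][c + dc]
                for dr in range(-patch, patch + 1)
                for dc in range(-patch, patch + 1)]
    ref = patch_at(i, j)
    num = den = 0.0
    for r in range(i - search, i + search + 1):
        for c in range(j - search, j + search + 1):
            d2 = sum((a - b) ** 2
                     for a, b in zip(ref, patch_at(r, c))) / len(ref)
            w = math.exp(-d2 / (h * h))   # similar patches get weight ~1
            num += w * image[r][c]
            den += w
    return num / den

flat = [[5.0] * 7 for _ in range(7)]      # constant image: nothing to denoise
print(nlm_pixel(flat, 3, 3))              # all weights equal -> 5.0
```

The improvements surveyed in the paper largely concern how these weights and the parameter h are chosen, which is why automatic parameter estimation matters.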
NASA Technical Reports Server (NTRS)
Mickey, F. E.; Mcewan, A. J.; Ewing, E. G.; Huyler, W. C., Jr.; Khajeh-Nouri, B.
1970-01-01
An analysis was conducted with the objective of upgrading and improving the loads, stress, and performance prediction methods for Apollo spacecraft parachutes. The subjects considered were: (1) methods for a new theoretical approach to the parachute opening process, (2) new experimental-analytical techniques to improve the measurement of pressures, stresses, and strains in inflight parachutes, and (3) a numerical method for analyzing the dynamical behavior of rapidly loaded pilot chute risers.
Human body region enhancement method based on Kinect infrared imaging
NASA Astrophysics Data System (ADS)
Yang, Lei; Fan, Yubo; Song, Xiaowei; Cai, Wenjing
2016-10-01
To effectively improve the low contrast of the human body region in infrared images, a combination of several enhancement methods is used. Firstly, for infrared images acquired by Kinect, in order to improve the overall contrast, an Optimal Contrast-Tone Mapping (OCTM) method with multiple iterations is applied to balance the contrast of low-luminosity infrared images. Secondly, to enhance the human body region further, a Level Set algorithm is employed to improve the contour edges of the human body region. Finally, to further improve the human body region, Laplacian Pyramid decomposition is adopted to enhance the contour-improved human body region. Meanwhile, the background area without the human body region is processed by bilateral filtering to improve the overall effect. Theoretical analysis and experimental verification show that the proposed method can effectively enhance the human body region of such infrared images.
USDA-ARS?s Scientific Manuscript database
Headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography–mass spectrometry (GC-MS) is commonly used in analyzing insect volatiles. In order to improve the detection of volatiles in insects, a freeze-thaw method was applied to insect samples before the HS-SPME-GC-MS analysis. ...
RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2
The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...
An automatic step adjustment method for average power analysis technique used in fiber amplifiers
NASA Astrophysics Data System (ADS)
Liu, Xue-Ming
2006-04-01
An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed more than a hundredfold at the same error level. By computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
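The general idea of automatic step adjustment can be sketched with a generic step-doubling integrator for a power-evolution equation dP/dz = gain(P) * P: one full step is compared against two half steps, and the step size shrinks or grows to hold the local error under a tolerance. This is a generic adaptive-step sketch under that assumption, not the paper's exact ASA scheme or amplifier model.

```python
import math

def adaptive_propagate(p0, length, gain, tol=1e-9):
    """Integrate dP/dz = gain(P) * P over [0, length] with an
    automatically adjusted step (step-doubling error control)."""
    def step(p, h):                      # one midpoint (2nd-order) step
        pm = p + 0.5 * h * gain(p) * p
        return p + h * gain(pm) * pm
    z, h, p = 0.0, length / 100.0, p0
    while length - z > 1e-12:
        h = min(h, length - z)
        full = step(p, h)
        half = step(step(p, h / 2), h / 2)
        err = abs(half - full)
        if err > tol:
            h *= 0.5                     # too inaccurate: retry smaller
            continue
        p, z = half, z + h
        if err < tol / 10:
            h *= 2.0                     # comfortably accurate: grow step
    return p

# Constant gain g: analytic solution is P(L) = P0 * exp(g * L).
p_out = adaptive_propagate(1e-3, 1.0, lambda p: 2.0)
print(p_out)
```

The payoff mirrors the abstract's claim: long steps where the solution is smooth, short steps only where needed, so accuracy is kept without a fixed fine grid.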
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which makes visual analysis and detail extraction a difficult task. Non Local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
Padula, William V; Lee, Ken K H; Pronovost, Peter J
2017-08-03
To scale and sustain successful quality improvement (QI) interventions, it is recommended for health system leaders to calculate the economic and financial sustainability of the intervention. Many methods of economic evaluation exist, and the type of method depends on the audience: providers, researchers, and hospital executives. This is a primer to introduce cost-effectiveness analysis, budget impact analysis, and return on investment calculation as 3 distinct methods for each stakeholder needing a measurement of the value of QI at the health system level. Using cases for the QI of hospital-acquired condition rates (e.g., pressure injuries), this primer proceeds stepwise through each method beginning from the same starting point of constructing a model so that the repetition of steps is minimized and thereby capturing the attention of all intended audiences.
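The three valuation methods named above reduce to short formulas, which can be sketched as follows; the dollar figures are illustrative placeholders, not numbers from the article.

```python
def roi(savings, cost):
    """Return on investment: net gain per dollar spent on the QI program."""
    return (savings - cost) / cost

def budget_impact(baseline_cost, intervention_cost, horizon_years):
    """Budget impact: incremental cost to the payer over the time horizon."""
    return (intervention_cost - baseline_cost) * horizon_years

def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g. hospital-acquired pressure injuries averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative figures only (not taken from the article).
print(roi(150_000, 100_000))               # 0.5 -> 50 cents gained per dollar
print(budget_impact(1_000_000, 1_100_000, 5))
print(icer(120_000, 100_000, 10, 6))       # cost per additional injury averted
```

As the primer notes, the audience picks the measure: ICER for researchers, budget impact for executives, ROI for providers funding the intervention.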
Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue
NASA Technical Reports Server (NTRS)
Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.
1989-01-01
The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved using this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.
ERIC Educational Resources Information Center
Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Ruppar, Todd M.; Mehr, David R.; Russell, Cynthia L.
2009-01-01
Purpose: This study investigated the effectiveness of interventions to improve medication adherence (MA) in older adults. Design and Methods: Meta-analysis was used to synthesize results of 33 published and unpublished randomized controlled trials. Random-effects models were used to estimate overall mean effect sizes (ESs) for MA, knowledge,…
ERIC Educational Resources Information Center
Golden, Thomas P.; Karpur, Arun
2012-01-01
This study is a comparative analysis of the impact of traditional face-to-face training contrasted with a blended learning approach, as it relates to improving skills, knowledge and attitudes for enhancing practices for achieving improved employment outcomes for individuals with disabilities. The study included two intervention groups: one…
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
2014-01-01
Background To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis.
It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration PROSPERO registration no. CRD42013005165 PMID:25115289
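The random-effects machinery underlying such syntheses can be sketched briefly. The following is a minimal illustration of the DerSimonian-Laird estimator commonly used for pooling trial effects; the effect sizes, variances, and the `dersimonian_laird` helper are invented for illustration and are not taken from the review:

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.

    effects: per-trial effect sizes; variances: their within-trial variances.
    Returns (pooled_effect, tau2), where tau2 is the between-trial variance.
    """
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q statistic measures excess heterogeneity across trials
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)               # method-of-moments estimate
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Hypothetical effect sizes (e.g., standardized improvements) from five trials
effects = [0.30, 0.10, 0.45, 0.25, 0.60]
variances = [0.01, 0.02, 0.015, 0.03, 0.025]
pooled, tau2 = dersimonian_laird(effects, variances)
```

Exploratory meta-regression then models such pooled effects as a function of trial-level covariates, which is where the context variables collected by the author survey would enter.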
ERIC Educational Resources Information Center
Kapanadze, Dilek Ünveren
2018-01-01
The aim of this study is to identify the effect of using discourse analysis method on the skills of reading comprehension, textual analysis, creating discourse and use of language. In this study, the authentic test model with pre-test and post-test control group was used in order to determine the difference of academic achievement between…
NASA Technical Reports Server (NTRS)
Anstead, R. J. (Editor); Goldberg, E. (Editor)
1975-01-01
Failure analysis test methods are presented for use in analyzing candidate electronic parts and in improving future design reliability. Each test is classified as nondestructive, semidestructive, or destructive. The effects upon applicable part types (e.g., integrated circuits, transistors) are discussed. Methodology is given for performing the following: immersion tests, radiographic tests, dewpoint tests, gas ambient analysis, cross sectioning, and ultraviolet examination.
puma: a Bioconductor package for propagating uncertainty in microarray analysis.
Pearson, Richard D; Liu, Xuejun; Sanguinetti, Guido; Milo, Marta; Lawrence, Neil D; Rattray, Magnus
2009-07-09
Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses, it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also suffered from being very costly to compute and, in the case of differential expression detection, have been limited in the experimental designs to which they can be applied. puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself and in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised, leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods.
puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.
DAMBE7: New and Improved Tools for Data Analysis in Molecular Biology and Evolution.
Xia, Xuhua
2018-06-01
DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux, and Macintosh computers. New functions include imputing missing distances and phylogeny simultaneously (paving the way to build large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrects multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca (last accessed, April 17, 2018).
Laser-induced fluorescence spectroscopy for improved chemical analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelbwachs, J.A.
1983-09-01
This report summarizes the progress achieved over the past five years in the laser-induced fluorescence spectroscopy (LIFS) for improved chemical analysis program. Our initial efforts yielded significantly lower detection limits for trace elemental analysis by the use of both cw and pulsed laser excitations. New methods of LIFS were developed that were shown to overcome many of the traditional limitations to LIFS techniques. LIFS methods have been applied to yield fundamental scientific data that further the understanding of forces between atoms and other atoms and molecules. In recent work, two-photon ionization was combined with LIFS and applied, for the first time, to the study of energy transfer in ions.
Understanding and benchmarking health service achievement of policy goals for chronic disease
2012-01-01
Background Key challenges in benchmarking health service achievement of policy goals in areas such as chronic disease are: 1) developing indicators and understanding how policy goals might work as indicators of service performance; 2) developing methods for economically collecting and reporting stakeholder perceptions; 3) combining and sharing data about the performance of organizations; 4) interpreting outcome measures; 5) obtaining actionable benchmarking information. This study aimed to explore how a new Boolean-based small-N method from the social sciences—Qualitative Comparative Analysis or QCA—could contribute to meeting these internationally shared challenges. Methods A ‘multi-value QCA’ (MVQCA) analysis was conducted of data from 24 senior staff at 17 randomly selected services for chronic disease, who provided perceptions of 1) whether government health services were improving their achievement of a set of statewide policy goals for chronic disease and 2) the efficacy of state health office actions in influencing this improvement. The analysis produced summaries of configurations of perceived service improvements. Results Most respondents observed improvements in most areas but uniformly good improvements across services were not perceived as happening (regardless of whether respondents identified a state health office contribution to that improvement). The sentinel policy goal of using evidence to develop service practice was not achieved at all in four services and appears to be reliant on other kinds of service improvements happening. Conclusions The QCA method suggested theoretically plausible findings and an approach that with further development could help meet the five benchmarking challenges. In particular, it suggests that achievement of one policy goal may be reliant on achievement of another goal in complex ways that the literature has not yet fully accommodated but which could help prioritize policy goals. 
The weaknesses of QCA lie wherever traditional large-N statistical methods are needed and feasible, and in its more complex findings, which are therefore harder to validate empirically. It should be considered a potentially valuable adjunct method for benchmarking complex health policy goals such as those for chronic disease. PMID:23020943
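The core truth-table step of QCA can be sketched minimally: cases are grouped by their configuration of condition values, and each configuration is scored by how consistently it leads to the outcome. The condition names, value codings, and cases below are invented, and this toy `truth_table` helper is not the MVQCA software used in the study:

```python
from collections import defaultdict

def truth_table(cases):
    """Group cases by their configuration of condition values and report,
    for each configuration, the fraction of cases with a positive outcome
    (its 'consistency' in QCA terms).

    cases: list of (conditions_dict, outcome_bool) pairs.
    """
    groups = defaultdict(list)
    for conditions, outcome in cases:
        key = tuple(sorted(conditions.items()))   # configuration identifier
        groups[key].append(outcome)
    return {key: sum(outs) / len(outs) for key, outs in groups.items()}

# Invented example: two multi-value conditions observed per health service
cases = [
    ({"evidence_use": 2, "state_support": 1}, True),
    ({"evidence_use": 2, "state_support": 1}, True),
    ({"evidence_use": 0, "state_support": 1}, False),
    ({"evidence_use": 0, "state_support": 0}, False),
    ({"evidence_use": 2, "state_support": 0}, True),
]
table = truth_table(cases)
```

Configurations with consistency 1.0 are candidates for Boolean minimization into the summary expressions that QCA reports.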
Jeong, Yeong Ran; Kim, Sun Young; Park, Young Sam; Lee, Gyun Min
2018-03-21
N-glycans of therapeutic glycoproteins are critical quality attributes that should be monitored throughout all stages of biopharmaceutical development. To reduce both the time for sample preparation and the variations in analytical results, we have developed an N-glycan analysis method that includes improved 2-aminobenzoic acid (2-AA) labeling to easily remove deglycosylated proteins. Using this analytical method, 15 major 2-AA-labeled N-glycans of Enbrel® were separated into single peaks in hydrophilic interaction chromatography mode and therefore could be quantitated. 2-AA-labeled N-glycans were also highly compatible with in-line quadrupole time-of-flight mass spectrometry (MS) for structural identification. The structures of 15 major and 18 minor N-glycans were identified from their mass values determined by quadrupole time-of-flight MS. Furthermore, the structures of 14 major N-glycans were confirmed by interpreting the MS/MS data of each N-glycan. This analytical method was also successfully applied to neutral N-glycans of Humira® and highly sialylated N-glycans of NESP®. Furthermore, the analysis data of Enbrel® that were accumulated for 2.5 years demonstrated the high-level consistency of this analytical method. Taken together, the results show that a wide repertoire of N-glycans of therapeutic glycoproteins can be analyzed with high efficiency and consistency using the improved 2-AA labeling-based N-glycan analysis method. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg Equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
But Is It Nutritious? Computer Analysis Creates Healthier Meals.
ERIC Educational Resources Information Center
Corrigan, Kathleen A.; Aumann, Margaret B.
1993-01-01
A computerized menu-planning method, "Nutrient Standard Menu Planning" (NSMP), uses today's technology to create healthier menus. Field tested in 20 California school districts, the advantages of NSMP are cost effectiveness, increased flexibility, greater productivity, improved public relations, improved finances, and improved student…
Automated Student Model Improvement
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.
2012-01-01
Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
[Conversation analysis for improving nursing communication].
Yi, Myungsun
2007-08-01
Nursing communication has become more important than ever before because the quality of nursing services largely depends on the quality of communication in a very competitive health care environment. The purpose of this article was to introduce ways to improve nursing communication using conversation analysis. This was a review study on conversation analysis, critically examining previous studies in nursing communication and interpersonal relationships. The study provides the theoretical background and basic assumptions of conversation analysis, which was influenced by ethnomethodology, phenomenology, and sociolinguistics. In addition, the characteristics and analysis methods of conversation analysis are illustrated in detail. Lastly, how conversation analysis can help improve communication is shown by examining research that uses conversation analysis not only for ordinary conversations but also for extraordinary or difficult conversations, such as conversations between patients with dementia and their professional nurses. Conversation analysis can help improve nursing communication by providing various structures, patterns, and prototypes of conversation, and by suggesting specific problems and problem-solving strategies in communication.
Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
Compliance and stress sensitivity of spur gear teeth
NASA Technical Reports Server (NTRS)
Cornell, R. W.
1983-01-01
The magnitude and variation of tooth pair compliance with load position affects the dynamics and loading significantly, and the tooth root stressing per load varies significantly with load position. Therefore, the recently developed time history, interactive, closed form solution for the dynamic tooth loads for both low and high contact ratio spur gears was expanded to include improved and simplified methods for calculating the compliance and stress sensitivity for three involute tooth forms as a function of load position. The compliance analysis incorporates an improved fillet/foundation flexibility model. The stress sensitivity analysis is a modified version of the Heywood method, with an improvement in the magnitude and location of the peak stress in the fillet. These improved compliance and stress sensitivity analyses are presented along with their evaluation using test, finite element, and analytic transformation results, which showed good agreement.
Measurements and analysis in imaging for biomedical applications
NASA Astrophysics Data System (ADS)
Hoeller, Timothy L.
2009-02-01
A Total Quality Management (TQM) approach can be used to analyze data from biomedical optical and imaging platforms of tissues. A shift from individuals to teams, partnerships, and total participation is necessary for health care groups to improve prognostics through measurement analysis. Proprietary measurement analysis software is available for calibrated, pixel-to-pixel measurements of angles and distances in digital images. Feature size, count, and color are determinable on an absolute and comparative basis. Although changes in histomic images arise from numerous complex factors, correlations of imaging-analysis changes with the time, extent, and progression of illness can be derived. Statistical methods are preferred. Applications of the proprietary measurement software are available for any imaging platform. Quantification of results provides improved categorization of illness towards better health. As health care practitioners try to use quantified measurement data for patient diagnosis, the techniques reported can be used to track and isolate causes better. Comparisons, norms, and trends are available from processing of measurement data, which is obtained easily and quickly from Scientific Software and methods. Example results for the class actions of Preventative and Corrective Care in Ophthalmology and Dermatology, respectively, are provided. Improved and quantified diagnosis can lead to better health and lower costs associated with health care. Systems support improvements towards Lean and Six Sigma affecting all branches of biology and medicine. As an example of the use of statistics, the major types of variation in a study of Bone Mineral Density (BMD) are examined. Typically, special causes in medicine relate to illness and activities, whereas common causes are known to be associated with gender, race, size, and genetic make-up.
Such a strategy of Continuous Process Improvement (CPI) involves comparison of patient results to baseline data using F-statistics. Self-pairings over time are also useful. Special and common causes are identified apart from aging in applying the statistical methods. In the future, implementation of imaging measurement methods by research staff, doctors, and concerned patient partners will result in improved health diagnosis, reporting, and cause determination. The long-term prospect for quantified measurements is better quality in imaging analysis, with applications of higher utility for health care providers.
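The F-statistic comparison mentioned above reduces to a ratio of sample variances between a patient's results and baseline data. A minimal sketch, with invented BMD-like readings rather than patient data from the study:

```python
def sample_variance(xs):
    """Unbiased (n-1 denominator) sample variance."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / (n - 1)

def f_statistic(sample_a, sample_b):
    """Ratio of sample variances; values far from 1 (judged against the
    F distribution for the two sample sizes) suggest a special cause."""
    return sample_variance(sample_a) / sample_variance(sample_b)

# Invented BMD-like readings: a stable baseline vs. a more scattered follow-up
baseline = [0.92, 0.95, 0.91, 0.94, 0.93]
follow_up = [0.88, 0.97, 0.84, 0.99, 0.90]
f = f_statistic(follow_up, baseline)
```

In practice the ratio would be compared to a critical value of the F distribution with the appropriate degrees of freedom before declaring a special cause.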
Finite-element grid improvement by minimization of stiffness matrix trace
NASA Technical Reports Server (NTRS)
Kittur, Madan G.; Huston, Ronald L.; Oswald, Fred B.
1989-01-01
A new and simple method of finite-element grid improvement is presented. The objective is to improve the accuracy of the analysis. The procedure is based on a minimization of the trace of the stiffness matrix. For a broad class of problems this minimization is seen to be equivalent to minimizing the potential energy. The method is illustrated with the classical tapered bar problem examined earlier by Prager and Masur. Identical results are obtained.
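As a rough illustration of the idea (not the authors' formulation), the trace of the assembled stiffness matrix for a two-element bar can be minimized over the interior node position by a simple search. The quadratic taper, the mean-area element stiffness rule, and the grid search below are invented assumptions for this sketch:

```python
def taper_area(x):
    """Cross-sectional area of a hypothetical quadratically tapered bar."""
    return (1.0 - 0.5 * x) ** 2

def element_stiffness(a, b, E=1.0):
    """Axial stiffness of a 2-node bar element from a to b (mean-area rule)."""
    area = 0.5 * (taper_area(a) + taper_area(b))
    return E * area / (b - a)

def stiffness_trace(s):
    """Trace of the assembled stiffness matrix for nodes at 0, s, 1,
    with the end x=0 fixed (free dofs at s and 1).

    K = [[k1 + k2, -k2], [-k2, k2]]  =>  trace = k1 + 2*k2
    """
    k1 = element_stiffness(0.0, s)
    k2 = element_stiffness(s, 1.0)
    return k1 + 2.0 * k2

# Grid search for the interior node position minimizing the trace
candidates = [i / 1000.0 for i in range(1, 1000)]
best_s = min(candidates, key=stiffness_trace)
```

For this taper the minimizing node shifts toward the slender end of the bar, illustrating how the criterion adapts the grid to the stiffness distribution.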
Improved methods for the determination of drying conditions and fraction insoluble solids (FIS) in biomass
Re-refinement from deposited X-ray data can deliver improved models for most PDB entries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joosten, Robbie P.; Womack, Thomas; Vriend, Gert, E-mail: vriend@cmbi.ru.nl
2009-02-01
An evaluation of validation and real-space intervention possibilities for improving existing automated (re-)refinement methods. The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data.
NASA Astrophysics Data System (ADS)
Edyani, E. A.; Supriatna, A.; Kurnia; Komalasari, L.
2017-02-01
This research investigates how lesson analysis, used as a teacher's self-reflection, changes the teacher's lesson design on the topic of chemical equations. Lesson analysis has been used as part of teacher training programs to improve teachers' ability to analyze their own lessons. The method used in this research is qualitative. The research proceeds from building a lesson design, to implementing the design with senior high school students, to using lesson analysis to gather information about the lesson, and finally to revising the lesson design. The lesson design revised after the first implementation was applied in the second implementation, resulting in a better design. This research uses the Hendayana & Hidayat lesson analysis framework. Each lesson was videotaped and transcribed. After the first implementation, the lesson analysis showed that teacher-centered instruction still dominated the learning because students were not very active in discussion, so part of the lesson design had to be revised. After the second implementation, the lesson analysis showed that the learning was already student-centered: students were very active in discussion, but some parts of the learning design still had to be revised. In general, lesson analysis was effective for teachers to reflect on their lessons. Teachers can use lesson analysis at any time to improve the next lesson design.
ERIC Educational Resources Information Center
Torreblanca, Maximo
1988-01-01
Discusses the validity of studies of Spanish pronunciation in terms of research methods employed. Topics include data collection in the laboratory vs. in a natural setting; recorded vs. non-recorded data; quality of the recording; aural analysis vs. spectrographic analysis; and transcriber reliability. Suggestions for improving data collection are…
Improving resolution of crosswell seismic section based on time-frequency analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, H.; Li, Y.
1994-12-31
According to signal theory, improving the resolution of a seismic section means extending the high-frequency band of the seismic signal. In a cross-well section, a sonic log can be regarded as a reliable source of high-frequency information for the trace near the borehole. In that case, the task is to introduce this high-frequency information into the whole section. However, neither traditional deconvolution algorithms nor newer inversion methods such as BCI (Broad Constraint Inversion) are satisfactory, because of high-frequency noise and the nonuniqueness of inversion results, respectively. To overcome their disadvantages, this paper presents a new algorithm based on Time-Frequency Analysis (TFA), a technology which has received increasing attention as a useful signal-analysis tool. Practical applications show that the new method is a stable scheme that greatly improves the resolution of cross-well seismic sections without decreasing the signal-to-noise ratio (SNR).
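The basic operation behind such time-frequency analysis is a spectrum computed frame by frame over a sliding window, so that frequency content can be localized in time. A naive sketch (a per-frame DFT on an invented two-tone trace, not the paper's algorithm):

```python
import cmath
import math

def stft_magnitudes(signal, window, hop):
    """Magnitude spectra of successive windowed frames (a naive DFT per
    frame), the basic operation of time-frequency analysis."""
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        seg = signal[start:start + window]
        mags = []
        for k in range(window // 2 + 1):      # non-negative frequencies only
            s = sum(seg[n] * cmath.exp(-2j * cmath.pi * k * n / window)
                    for n in range(window))
            mags.append(abs(s))
        frames.append(mags)
    return frames

# Invented trace: a low-frequency tone early, a high-frequency tone late
sig = [math.sin(2 * math.pi * 2 * t / 64) for t in range(64)] + \
      [math.sin(2 * math.pi * 12 * t / 64) for t in range(64)]
frames = stft_magnitudes(sig, window=64, hop=64)
```

The dominant bin moves from the low-frequency index in the first frame to the high-frequency index in the second, which is exactly the time-localized frequency picture that TFA exploits when splicing high-frequency log information into a section.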
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
[Improved Euler algorithm for trend forecast model and its application to oil spectrum analysis].
Zheng, Chang-song; Ma, Biao
2009-04-01
Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, and the gray method is well suited to trend forecasting. Using oil atomic spectrometric analysis results and the gray forecast theory, the present paper establishes a gray forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. To address a shortcoming of the gray method in trend forecasting, the improved Euler algorithm is put forward for the first time to resolve the problem of the gray model and to avoid the imprecision caused by the old gray model's forecast values depending on the first test value. As shown in the example, this new method makes the forecast values more precise. Combined with the threshold value of the oil atomic spectrometric analysis, the new method was applied to the Fe/Cu concentration forecast, and early warning of fault information was obtained. Steps can thus be taken to prevent the fault, and this algorithm can be generalized to state monitoring in industry.
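The improved Euler (Heun) scheme referred to here is a predictor-corrector step. Applied to the GM(1,1) whitening equation dx/dt = b - a·x, with invented coefficients a, b standing in for the paper's estimated values, it can be sketched as:

```python
import math

def heun_step(x, h, a, b):
    """One improved-Euler (Heun) step for the GM(1,1) whitening equation
    dx/dt = b - a*x: an Euler predictor followed by a trapezoidal corrector."""
    f = lambda x_: b - a * x_
    predictor = x + h * f(x)                     # plain Euler estimate
    return x + 0.5 * h * (f(x) + f(predictor))   # averaged-slope correction

def forecast(x0, a, b, h, steps):
    """Integrate the whitening equation from x0 over `steps` Heun steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(heun_step(xs[-1], h, a, b))
    return xs

# Invented gray-model coefficients for an Fe-concentration-like trend
a, b, x0 = 0.12, 2.0, 5.0
xs = forecast(x0, a, b, h=0.1, steps=50)

# Exact solution for comparison: x(t) = (x0 - b/a)*exp(-a*t) + b/a
exact = (x0 - b / a) * math.exp(-a * 5.0) + b / a
```

The second-order corrector keeps the numerical trajectory close to the exact whitening-equation solution, and crossing a spectrometric threshold along the forecast trajectory is what triggers the early fault warning.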
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series and intelligent models is put forward. First, through the process of photosynthesis, the main factors that affect the reproduction of algae are analyzed. A compensation prediction method for multivariate time series analysis based on a neural network and a Support Vector Machine is put forward, combined with Kernel Principal Component Analysis to reduce the dimension of the factors influencing blooms. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the prediction model of multivariate time series analysis and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
NASA Astrophysics Data System (ADS)
Ebrahimian, Ali; Wilson, Bruce N.; Gulliver, John S.
2016-05-01
Impervious surfaces are useful indicators of the urbanization impacts on water resources. Effective impervious area (EIA), which is the portion of total impervious area (TIA) that is hydraulically connected to the drainage system, is a better catchment parameter in the determination of actual urban runoff. Development of reliable methods for quantifying EIA rather than TIA is currently one of the knowledge gaps in the rainfall-runoff modeling context. The objective of this study is to improve the rainfall-runoff data analysis method for estimating EIA fraction in urban catchments by eliminating the subjective part of the existing method and by reducing the uncertainty of EIA estimates. First, the theoretical framework is generalized using a general linear least squares model and a general criterion for categorizing runoff events. Issues with the existing method that reduce the precision of the EIA fraction estimates are then identified and discussed. Two improved methods, based on ordinary least squares (OLS) and weighted least squares (WLS) estimates, are proposed to address these issues. The proposed weighted least squares method is then applied to eleven urban catchments in Europe, Canada, and Australia. The results are compared to directly connected impervious area (DCIA) measured from maps and are shown to be consistent with the DCIA values. In addition, both of the improved methods are applied to nine urban catchments in Minnesota, USA. Both methods were successful in removing the subjective component inherent in the current method's analysis of rainfall-runoff data. The WLS method is more robust than the OLS method and generates results that are different from and more precise than those of the OLS method in the presence of heteroscedastic residuals in our rainfall-runoff data.
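The OLS/WLS distinction at the core of the improved methods can be illustrated with a generic weighted least squares fit (the event data, initial abstraction and weighting scheme below are hypothetical stand-ins, not the study's):

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: minimise sum_i w_i * (y_i - X_i @ beta)^2
    by rescaling rows with sqrt(w) and solving ordinary least squares."""
    s = np.sqrt(w)
    return np.linalg.lstsq(X * s[:, None], y * s, rcond=None)[0]

# Hypothetical event data: runoff depth = f * (rainfall - initial abstraction)
rng = np.random.default_rng(0)
rain = rng.uniform(5, 50, 200)
f_true, ia = 0.35, 2.0
noise = rng.normal(0, 0.05 * rain)           # heteroscedastic: grows with rainfall
runoff = f_true * (rain - ia) + noise
X = np.column_stack([rain, np.ones_like(rain)])
beta_ols = wls(X, runoff, np.ones_like(rain))   # uniform weights = OLS
beta_wls = wls(X, runoff, 1.0 / rain ** 2)      # down-weight noisy large events
```

With heteroscedastic residuals, the weighted fit gives a more precise estimate of the slope (the EIA fraction analogue) than the unweighted one.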
NASA Astrophysics Data System (ADS)
Cenglin, Yao
An important way for car sales enterprises (4S enterprises) to continuously boost sales and expand their customer base is to enhance customer satisfaction. The customer satisfaction of car sales enterprises depends on many factors. By using the grey relational analysis method, the various factors affecting customer satisfaction can be combined and weighed. Through vertical comparison, car sales enterprises can identify the specific factors that will improve customer satisfaction and thereby increase sales volume and benefits. Grey relational analysis has thus become a good method and means for analyzing and evaluating such enterprises.
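The grey relational analysis the abstract refers to can be sketched as follows (a standard formulation with distinguishing coefficient rho = 0.5; the min-max normalisation is a common choice, assumed here rather than taken from the paper):

```python
import numpy as np

def grey_relational_grade(reference, series, rho=0.5):
    """Grey relational grade of each comparison series against the reference."""
    ref = np.asarray(reference, float)
    cmp_ = np.atleast_2d(np.asarray(series, float))
    # normalise each column to [0, 1] to remove dimension effects
    allx = np.vstack([ref, cmp_])
    lo, hi = allx.min(axis=0), allx.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    refn = (ref - lo) / span
    cmpn = (cmp_ - lo) / span
    delta = np.abs(cmpn - refn)             # absolute differences
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)               # one grade per comparison series
```

A factor series identical to the reference (e.g. overall satisfaction) gets grade 1; less related factors get lower grades.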
NASA Astrophysics Data System (ADS)
Ignatyev, A. V.; Ignatyev, V. A.; Onischenko, E. V.
2017-11-01
This article continues the authors' work on the development of algorithms that implement the finite element method in the form of a classical mixed method for the analysis of geometrically nonlinear bar systems [1-3]. The paper describes an improved algorithm for forming the nonlinear governing equation system for flexible plane frames and bars with large displacements of nodes, based on the finite element method in a mixed classical form and the use of a step-by-step loading procedure. An example of the analysis is given.
Munabi, Ian Guyton; Buwembo, William; Joseph, Ruberwa; Peter, Kawungezi; Bajunirwe, Francis; Mwaka, Erisa Sabakaki
2016-01-01
In this study we used a model of adult learning to explore undergraduate students' views on how to improve the teaching of research methods and biostatistics. This was a secondary analysis of survey data from 600 undergraduate students at three medical schools in Uganda. The analysis looked at students' responses to an open-ended section of a questionnaire on their views on undergraduate teaching of research methods and biostatistics. Qualitative phenomenological data analysis was done with a bias towards principles of adult learning. Students appreciated the importance of learning research methods and biostatistics as a way of understanding research problems; appropriately interpreting statistical concepts during their training and post-qualification practice; and translating the knowledge acquired. A stressful teaching environment and inadequate educational resource materials were identified as impediments to effective learning. Suggestions for improved learning included: early and continuous exposure to the course; a more active and practical approach to teaching; and a need for mentorship. The current methods of teaching research methods and biostatistics leave most students in the dissonance phase of learning, resulting in poor or no student engagement and a failure to comprehend and/or appreciate the principles governing the use of different research methods.
ERIC Educational Resources Information Center
Quach, Hao T.; Steeper, Robert L.; Griffin, William G.
2004-01-01
A simple and fast method is shown that resolves chlorophylls a and b from spinach leaves on analytical plates while minimizing the appearance of chlorophyll degradation products. An improved mobile phase for the thin-layer chromatographic analysis of spinach extract allows for the complete resolution of the common plant pigments found in…
NASA Astrophysics Data System (ADS)
Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.
2005-11-01
Simulation is useful in the validation of functional image analysis methods, particularly given the number of analysis techniques currently available that lack thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results, making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF), and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows for the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to system resolution over the slightly larger kernel used routinely. Significant correlation was found between the effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross-validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequence via quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where normal and attack activities are intermingled.
Improvement of automatic hemorrhage detection methods using brightness correction on fundus images
NASA Astrophysics Data System (ADS)
Hatanaka, Yuji; Nakagawa, Toshiaki; Hayashi, Yoshinori; Kakogawa, Masakatsu; Sawada, Akira; Kawase, Kazuhide; Hara, Takeshi; Fujita, Hiroshi
2008-03-01
We have been developing several automated methods for detecting abnormalities in fundus images. The purpose of this study is to improve our automated hemorrhage detection method to help diagnose diabetic retinopathy. We propose a new method for preprocessing and false-positive elimination in the present study. The brightness of the fundus image was changed by a nonlinear curve with brightness values of the hue saturation value (HSV) space. In order to emphasize brown regions, gamma correction was performed on each red, green, and blue-bit image. Subsequently, the histograms of each red, green, and blue-bit image were extended. After that, the hemorrhage candidates were detected. The brown regions indicated hemorrhages and blood vessels, and their candidates were detected using density analysis. We removed large candidates such as blood vessels. Finally, false positives were removed by using a 45-feature analysis. To evaluate the new method for the detection of hemorrhages, we examined 125 fundus images, including 35 images with hemorrhages and 90 normal images. The sensitivity and specificity for the detection of abnormal cases were 80% and 88%, respectively. These results indicate that the new method may effectively improve the performance of our computer-aided diagnosis system for hemorrhages.
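The per-channel gamma correction and histogram extension described in the preprocessing step can be sketched as simple lookup-table operations on 8-bit channels (the gamma value in the test is illustrative, not the study's tuned setting):

```python
import numpy as np

def gamma_correct(channel, gamma):
    """Apply gamma correction to one 8-bit colour channel via a lookup table."""
    lut = (255.0 * (np.arange(256) / 255.0) ** gamma).astype(np.uint8)
    return lut[channel]

def stretch_histogram(channel):
    """Linearly extend the channel histogram to the full 0-255 range."""
    c = channel.astype(float)
    lo, hi = c.min(), c.max()
    if hi == lo:
        return channel
    return ((c - lo) / (hi - lo) * 255.0).astype(np.uint8)
```

With gamma > 1 mid-tones are darkened, which is one way such a correction can emphasize dark (brown) regions relative to the background.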
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network of regression (called DCFRM) is proposed, integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that the DCFRM reshapes probabilistic analysis for multi-failure structures and improves computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and method of mechanical reliability design.
Van Driel, Robin; Trask, Catherine; Johnson, Peter W; Callaghan, Jack P; Koehoorn, Mieke; Teschke, Kay
2013-01-01
Measuring trunk posture in the workplace commonly involves subjective observation or self-report methods or the use of costly and time-consuming motion analysis systems (current gold standard). This work compared trunk inclination measurements using a simple data-logging inclinometer with trunk flexion measurements using a motion analysis system, and evaluated adding measures of subject anthropometry to exposure prediction models to improve the agreement between the two methods. Simulated lifting tasks (n=36) were performed by eight participants, and trunk postures were simultaneously measured with each method. There were significant differences between the two methods, with the inclinometer initially explaining 47% of the variance in the motion analysis measurements. However, adding one key anthropometric parameter (lower arm length) to the inclinometer-based trunk flexion prediction model reduced the differences between the two systems and accounted for 79% of the motion analysis method's variance. Although caution must be applied when generalizing lower-arm length as a correction factor, the overall strategy of anthropometric modeling is a novel contribution. In this lifting-based study, by accounting for subject anthropometry, a single, simple data-logging inclinometer shows promise for trunk posture measurement and may have utility in larger-scale field studies where similar types of tasks are performed.
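The core statistical idea above, that adding an anthropometric predictor to an inclinometer-only regression raises the explained variance, can be illustrated with synthetic data (all coefficients, ranges and the "lower-arm length" stand-in below are hypothetical, not the study's measurements):

```python
import numpy as np

def r_squared(X, y):
    """Fit OLS (with intercept) and return the coefficient of determination."""
    A = np.column_stack([X, np.ones(len(X))])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Hypothetical data: "true" trunk flexion depends on inclination plus an
# anthropometric term (stand-in for lower-arm length), with noise.
rng = np.random.default_rng(1)
incl = rng.uniform(0, 60, 120)
arm = rng.uniform(20, 35, 120)
flexion = 0.8 * incl + 1.5 * arm + rng.normal(0, 5, 120)
r2_single = r_squared(incl[:, None], flexion)
r2_augmented = r_squared(np.column_stack([incl, arm]), flexion)
```

The augmented model explains more of the variance, mirroring the study's jump from 47% to 79% when an anthropometric parameter was added.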
Shape design sensitivity analysis using domain information
NASA Technical Reports Server (NTRS)
Seong, Hwal-Gyeong; Choi, Kyung K.
1985-01-01
A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.
A sensitive continuum analysis method for gamma ray spectra
NASA Technical Reports Server (NTRS)
Thakur, Alakh N.; Arnold, James R.
1993-01-01
In this work we examine ways to improve the sensitivity of the analysis procedure for gamma ray spectra with respect to small differences in the continuum (Compton) spectra. The method developed is applied to analyze gamma ray spectra obtained from planetary mapping by the Mars Observer spacecraft launched in September 1992. Calculated Mars simulation spectra and actual thick target bombardment spectra have been taken as test cases. The principle of the method rests on the extraction of continuum information from Fourier transforms of the spectra. We study how a better estimate of the spectrum from larger regions of the Mars surface will improve the analysis for smaller regions with poorer statistics. Estimation of signal within the continuum is done in the frequency domain which enables efficient and sensitive discrimination of subtle differences between two spectra. The process is compared to other methods for the extraction of information from the continuum. Finally we explore briefly the possible uses of this technique in other applications of continuum spectra.
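Extracting continuum information in the frequency domain, as described above, can be sketched as a low-pass truncation of the Fourier transform (the cutoff, the band-limited test continuum and the narrow line feature are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np

def fft_continuum(spectrum, keep=8):
    """Estimate the smooth continuum by keeping only the lowest `keep`
    Fourier frequencies and zeroing the rest."""
    F = np.fft.rfft(spectrum)
    F[keep:] = 0.0
    return np.fft.irfft(F, n=len(spectrum))
```

Subtracting the estimated continuum from the observed spectrum leaves a residual dominated by narrow features, which is one way subtle line contributions can be isolated from the Compton continuum.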
NASA Astrophysics Data System (ADS)
Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.
2015-11-01
In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 106 points on a desktop computer, allowing for each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4) where the optical size, asymmetry factor and fluorescent measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. 
The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition, and to failure to assign particles to a cluster, both consequences of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent particle population to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
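The best-performing combination reported above, Ward linkage with z-score normalisation, can be sketched with SciPy's hierarchical clustering routines (assuming SciPy is available; the data below are synthetic stand-ins for WIBS particle measurements):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_particles(X, n_clusters):
    """z-score normalise each measurement column, then apply Ward-linkage
    agglomerative clustering and cut the tree into n_clusters groups."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    tree = linkage(Z, method='ward')
    return fcluster(tree, t=n_clusters, criterion='maxclust')
```

Clustering every particle explicitly, as here, avoids the subsampling and comparative attribution step the study identifies as a source of misattribution.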
Experiences in Eliciting Security Requirements
2006-12-01
FODA is a domain analysis and engineering method that focuses on developing reusable assets [9]. By examining related software systems and… describe a trade-off analysis that we used to select a suitable requirements elicitation method and present detailed results from a case study of one… disaster planning, and how to improve Medicare. Eventually, technology-oriented problems may emerge from these soft problems, but much more analysis is
Lee, Jack; Zee, Benny Chung Ying; Li, Qing
2013-01-01
Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complications and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We propose a novel new-vessel detection method including statistical texture analysis (STA), high-order spectrum analysis (HOS) and fractal analysis (FA), and, most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained. They are 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter
2017-06-28
High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
The p-version of the finite element method in incremental elasto-plastic analysis
NASA Technical Reports Server (NTRS)
Holzer, Stefan M.; Yosibash, Zohar
1993-01-01
Whereas the higher-order versions of the finite element method (the p- and hp-versions) are fairly well established as highly efficient methods for monitoring and controlling the discretization error in linear problems, little has been done to exploit their benefits in elasto-plastic structural analysis. Aspects of incremental elasto-plastic finite element analysis which are particularly amenable to improvement by the p-version are discussed. These theoretical considerations are supported by several numerical experiments. First, an example for which an analytical solution is available is studied. It is demonstrated that the p-version performs very well even in cycles of elasto-plastic loading and unloading, not only compared to the traditional h-version but also with respect to the exact solution. Finally, an example of considerable practical importance, the analysis of a cold-worked lug, is presented which demonstrates how the modeling tools offered by higher-order finite element techniques can contribute to an improved approximation of practical problems.
Bala, S; Uniyal, G C; Dubey, T; Singh, S P
2001-01-01
A reversed-phase column liquid chromatographic method for the analysis of sennosides A and B present in leaf and pod extracts of Cassia angustifolia has been developed using a Symmetry C18 column and a linear binary gradient profile. The method can be utilised for the quantitative determination of other sennosides as a baseline resolution for most of the constituents was achieved. The method is economical in terms of the time taken and the amount of solvent used (25 mL) for each analysis. The validity of the method with respect to analysis was confirmed by comparing the UV spectra of each peak with those of reference compounds using a photodiode array detector.
Analysis of de-noising methods to improve the precision of the ILSF BPM electronic readout system
NASA Astrophysics Data System (ADS)
Shafiee, M.; Feghhi, S. A. H.; Rahighi, J.
2016-12-01
In order to achieve optimum operation and a precise control system at particle accelerators, the beam position must be measured with sub-μm precision. We developed a BPM electronic readout system at the Iranian Light Source Facility, and it has been experimentally tested at the ALBA accelerator facility. The results show a precision of 0.54 μm in beam position measurements. To improve the precision of this beam position monitoring system to the sub-μm level, we have studied different de-noising methods such as principal component analysis, wavelet transforms, FIR filtering, and direct averaging. An evaluation of the noise reduction was given to testify to the ability of these methods. The results show that noise reduction based on the Daubechies wavelet transform performs better than the other algorithms, and the method is suitable for signal noise reduction in a beam position monitoring system.
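Wavelet soft-threshold de-noising of the kind compared in the study can be sketched with a one-level Haar transform (a minimal stand-in for the Daubechies transform the authors favour; the universal threshold used here is a common default, not necessarily theirs):

```python
import numpy as np

def haar_denoise(signal, sigma):
    """One-level Haar wavelet soft-threshold de-noising for an even-length signal."""
    s = np.asarray(signal, float)
    a = (s[0::2] + s[1::2]) / np.sqrt(2)        # approximation coefficients
    d = (s[0::2] - s[1::2]) / np.sqrt(2)        # detail coefficients
    thr = sigma * np.sqrt(2 * np.log(len(s)))   # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft shrinkage
    out = np.empty_like(s)
    out[0::2] = (a + d) / np.sqrt(2)            # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

Because noise spreads evenly across detail coefficients while a smooth beam-position signal does not, shrinking the details lowers the mean squared error of the reconstruction.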
Exploiting salient semantic analysis for information retrieval
NASA Astrophysics Data System (ADS)
Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui
2016-11-01
Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are mostly unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard text retrieval conference (TREC) collections. Experimental results show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
Towards accurate modeling of noncovalent interactions for protein rigidity analysis.
Fox, Naomi; Streinu, Ileana
2013-01-01
Protein rigidity analysis is an efficient computational method for extracting flexibility information from static, X-ray crystallography protein data. Atoms and bonds are modeled as a mechanical structure and analyzed with a fast graph-based algorithm, producing a decomposition of the flexible molecule into interconnected rigid clusters. The result depends critically on noncovalent atomic interactions, primarily on how hydrogen bonds and hydrophobic interactions are computed and modeled. Ongoing research points to the stringent need for benchmarking rigidity analysis software systems, towards the goal of increasing their accuracy and validating their results, both against each other and against biologically relevant (functional) parameters. We propose two new methods for modeling hydrogen bonds and hydrophobic interactions that more accurately reflect a mechanical model, without being computationally more intensive. We evaluate them using a novel scoring method, based on the B-cubed score from the information retrieval literature, which measures how well two cluster decompositions match. To evaluate the modeling accuracy of KINARI, our pebble-game rigidity analysis system, we use a benchmark data set of 20 proteins, each with multiple distinct conformations deposited in the Protein Data Bank. Cluster decompositions for them were previously determined with the RigidFinder method from Gerstein's lab and validated against experimental data. When KINARI's default tuning parameters are used, an improvement of the B-cubed score over a crude baseline is observed in 30% of this data. With our new modeling options, improvements were observed in over 70% of the proteins in this data set. We investigate the sensitivity of the cluster decomposition score with case studies on pyruvate phosphate dikinase and calmodulin.
To substantially improve the accuracy of protein rigidity analysis systems, thorough benchmarking must be performed on all current systems and future extensions. We have measured the gain in performance by comparing different modeling methods for noncovalent interactions. We showed that new criteria for modeling hydrogen bonds and hydrophobic interactions can significantly improve the results. The two new methods proposed here have been implemented and made publicly available in the current version of KINARI (v1.3), together with the benchmarking tools, which can be downloaded from our software's website, http://kinari.cs.umass.edu.
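The B-cubed score used above to compare cluster decompositions can be computed per item as follows (a direct, unoptimised formulation of B-cubed precision, recall and F1 over per-item cluster labels):

```python
def b_cubed_f1(gold, pred):
    """B-cubed F1 between two cluster decompositions given as per-item labels."""
    n = len(gold)
    precision = recall = 0.0
    for i in range(n):
        # items sharing a predicted cluster with item i
        same_pred = [j for j in range(n) if pred[j] == pred[i]]
        correct = sum(1 for j in same_pred if gold[j] == gold[i])
        precision += correct / len(same_pred)
        # items sharing a gold cluster with item i
        same_gold = [j for j in range(n) if gold[j] == gold[i]]
        recall += sum(1 for j in same_gold if pred[j] == pred[i]) / len(same_gold)
    precision /= n
    recall /= n
    return 2 * precision * recall / (precision + recall)
```

Two decompositions that induce the same partition score 1.0 regardless of label names; merging distinct clusters lowers precision.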
Optoelectronic scanning system upgrade by energy center localization methods
NASA Astrophysics Data System (ADS)
Flores-Fuentes, W.; Sergiyenko, O.; Rodriguez-Quiñonez, J. C.; Rivas-López, M.; Hernández-Balbuena, D.; Básaca-Preciado, L. C.; Lindner, L.; González-Navarro, F. F.
2016-11-01
A problem of upgrading an optoelectronic scanning system with digital post-processing of the signal based on adequate methods of energy center localization is considered. An improved dynamic triangulation analysis technique is proposed by an example of industrial infrastructure damage detection. A modification of our previously published method aimed at searching for the energy center of an optoelectronic signal is described. Application of the artificial intelligence algorithm of compensation for the error of determining the angular coordinate in calculating the spatial coordinate through dynamic triangulation is demonstrated. Five energy center localization methods are developed and tested to select the best method. After implementation of these methods, digital compensation for the measurement error, and statistical data analysis, a non-parametric behavior of the data is identified. The Wilcoxon signed rank test is applied to improve the result further. For optical scanning systems, it is necessary to detect a light emitter mounted on the infrastructure being investigated to calculate its spatial coordinate by the energy center localization method.
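One simple energy-center localization method is the power-weighted centroid of the sampled optical signal (one of several possible definitions; the baseline removal and squared weighting here are illustrative assumptions, not the paper's five tested methods):

```python
import numpy as np

def energy_center(signal):
    """Energy-centre (power-weighted centroid) of a sampled optical signal."""
    s = np.asarray(signal, float)
    s = s - s.min()                 # remove the baseline offset
    w = s ** 2                      # weight by energy rather than amplitude
    return float(np.sum(np.arange(len(s)) * w) / np.sum(w))
```

For a symmetric peak, the returned sub-sample position coincides with the peak center, which is the quantity fed into the dynamic triangulation step.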
van Mourik, Louise M; Leonards, Pim E G; Gaus, Caroline; de Boer, Jacob
2015-10-01
Concerns about the high production volumes, persistency, bioaccumulation potential and toxicity of chlorinated paraffin (CP) mixtures, especially short-chain CPs (SCCPs), are rising. However, information on their levels and fate in the environment is still insufficient, impeding international classifications and regulations. This knowledge gap is mainly due to the difficulties that arise with CP analysis, in particular the chromatographic separation within CPs and between CPs and other compounds. No fully validated routine analytical method is available yet and only semi-quantitative analysis is possible, although the number of studies reporting new and improved methods has rapidly increased since 2010. Better cleanup procedures that remove interfering compounds, and new instrumental techniques, which distinguish between medium-chain CPs (MCCPs) and SCCPs, have been developed. While gas chromatography coupled to electron capture negative ionisation mass spectrometry (GC/ECNI-MS) remains the most commonly applied technique, novel and promising use of high resolution time-of-flight MS (TOF-MS) has also been reported. We expect that recent developments in high resolution TOF-MS and Orbitrap technologies will further improve the detection of CPs, including long-chain CPs (LCCPs), and the group separation and quantification of CP homologues. Also, new CP quantification methods have emerged, including the use of mathematical algorithms, multiple linear regression and principal component analysis. These quantification advancements are also reflected in considerably improved interlaboratory agreements since 2010. Analysis of lower chlorinated paraffins (
NASA Astrophysics Data System (ADS)
Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei
2015-04-01
Edible blend oil is a mixture of vegetable oils. Eligible blend oil can meet the daily need of two essential fatty acids for humans to achieve balanced nutrition. Each vegetable oil has its own composition, so the vegetable oil contents in edible blend oil determine the blend's nutritional components. A high-precision quantitative analysis method to detect the vegetable oil contents in blend oil is necessary to ensure balanced nutrition for human beings. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity, and high efficiency. Efficient extraction and full use of the information in three-dimensional fluorescence spectra will improve the accuracy of the measurement. A novel quantitative analysis method is proposed based on Quasi-Monte Carlo integration to improve the measurement sensitivity and reduce random error. The partial least squares method is used to solve the nonlinear equations to avoid the effect of multicollinearity. The recovery rates of blend oil mixed from peanut oil, soybean oil and sunflower oil are calculated to verify the accuracy of the method; they are higher than those of the linear method commonly used for component concentration measurement.
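The quasi-Monte Carlo idea behind the proposed method can be illustrated with a minimal sketch. This is not the authors' implementation: the van der Corput sequence, the point count, and the test integrand are illustrative assumptions.

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    pts = []
    for i in range(1, n + 1):
        x, denom, k = 0.0, 1.0, i
        while k > 0:
            denom *= base
            x += (k % base) / denom
            k //= base
        pts.append(x)
    return np.array(pts)

def qmc_integrate(f, n=1024):
    """Quasi-Monte Carlo estimate of the integral of f over [0, 1]."""
    return f(van_der_corput(n)).mean()

estimate = qmc_integrate(lambda x: x ** 2)   # exact value is 1/3
```

With 1024 low-discrepancy points the estimate of the integral of x² on [0, 1] lands within about 10⁻³ of the exact 1/3, illustrating the advantage over plain random sampling at the same sample size.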
Analysis of free modeling predictions by RBO aleph in CASP11.
Mabrouk, Mahmoud; Werner, Tim; Schneider, Michael; Putz, Ines; Brock, Oliver
2016-09-01
The CASP experiment is a biannual benchmark for assessing protein structure prediction methods. In CASP11, RBO Aleph ranked as one of the top-performing automated servers in the free modeling category. This category consists of targets for which structural templates are not easily retrievable. We analyze the performance of RBO Aleph and show that its success in CASP was a result of its ab initio structure prediction protocol. A detailed analysis of this protocol demonstrates that two components unique to our method greatly contributed to prediction quality: residue-residue contact prediction by EPC-map and contact-guided conformational space search by model-based search (MBS). Interestingly, our analysis also points to a possible fundamental problem in evaluating the performance of protein structure prediction methods: Improvements in components of the method do not necessarily lead to improvements of the entire method. This points to the fact that these components interact in ways that are poorly understood. This problem, if indeed true, represents a significant obstacle to community-wide progress. Proteins 2016; 84(Suppl 1):87-104. © 2015 Wiley Periodicals, Inc.
Just Culture: A Foundation for Balanced Accountability and Patient Safety
Boysen, Philip G.
2013-01-01
Background The framework of a just culture ensures balanced accountability for both individuals and the organization responsible for designing and improving systems in the workplace. Engineering principles and human factors analysis influence the design of these systems so they are safe and reliable. Methods Approaches for improving patient safety introduced here are (1) analysis of error, (2) specific tools to enhance safety, and (3) outcome engineering. Conclusion The just culture is a learning culture that is constantly improving and oriented toward patient safety. PMID:24052772
Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.
Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko
2015-01-01
Here, we set out to improve our previously developed methylmercury analytical method, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. The modification enabled derivatization at optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The present analytical method was validated as suitable for determination of compliance with the provisional regulation value for methylmercury in fish, set in the Food Sanitation Law.
Intelligent process mapping through systematic improvement of heuristics
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Aizawa, Akiko N.; Schwartz, Steven R.; Wah, Benjamin W.; Yan, Jerry C.
1992-01-01
The system described here automatically learns and evaluates novel heuristic methods for mapping sets of communicating processes onto a computer network; it is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new postgame analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.
Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory
NASA Astrophysics Data System (ADS)
Pei, Di; Yue, Jianhai; Jiao, Jing
2017-10-01
This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to acquire bearing vibration data as diagnostic evidence. The Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data to improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
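Dempster's rule of combination, the fusion step the abstract relies on, can be sketched in a few lines. The fault hypotheses and mass values below are invented for illustration; a real diagnosis would derive the masses from vibration features.

```python
def dempster_combine(m1, m2):
    """Fuse two basic mass assignments (dicts: frozenset -> mass) with
    Dempster's rule, renormalizing away the conflicting mass."""
    fused, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {h: v / (1.0 - conflict) for h, v in fused.items()}

# Invented example: two accelerometers voting on a bearing fault type.
IR, OR_ = frozenset({"inner_race"}), frozenset({"outer_race"})
THETA = IR | OR_                        # frame of discernment (ignorance)
s1 = {IR: 0.6, OR_: 0.1, THETA: 0.3}   # evidence from sensor 1
s2 = {IR: 0.7, OR_: 0.2, THETA: 0.1}   # evidence from sensor 2
fused = dempster_combine(s1, s2)
```

Note how fusion raises the belief in the inner-race fault above either sensor's individual assignment while the conflicting mass (0.19 here) is normalized away.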
Soft X-ray astronomy using grazing incidence optics
NASA Technical Reports Server (NTRS)
Davis, John M.
1989-01-01
The instrumental background of X-ray astronomy, with an emphasis on high resolution imagery, is outlined. Optical and system performance, in terms of resolution, are compared, and methods for improving the latter in finite-length instruments are described. The method of analysis of broadband images to obtain diagnostic information is described and is applied to the analysis of coronal structures.
Bayesian data analysis in population ecology: motivations, methods, and benefits
Dorazio, Robert
2016-01-01
During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
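As a minimal concrete example of the Bayesian machinery discussed above, a conjugate Gamma-Poisson update for a count (abundance) rate takes a few lines. The prior parameters and survey counts are illustrative assumptions, not data from the article.

```python
# Conjugate Gamma-Poisson update for a detection-rate parameter.
counts = [4, 6, 3, 5, 7]          # animals detected on five survey occasions
alpha0, beta0 = 2.0, 1.0          # weakly informative Gamma prior, mean 2.0
alpha = alpha0 + sum(counts)      # posterior shape
beta = beta0 + len(counts)        # posterior rate
posterior_mean = alpha / beta     # Bayesian point estimate
mle = sum(counts) / len(counts)   # frequentist (maximum-likelihood) estimate
```

The posterior mean (4.5) is pulled from the maximum-likelihood estimate (5.0) toward the prior mean (2.0): the shrinkage behavior characteristic of the hierarchical Bayesian analyses the article discusses.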
Classical least squares multivariate spectral analysis
Haaland, David M.
2002-01-01
An improved classical least squares multivariate spectral analysis method is presented that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of this prediction-augmented CLS (PACLS) method is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
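The PACLS idea, augmenting the prediction step with a known spectral shape for an unmodeled effect, can be sketched on synthetic data. The Gaussian component spectra and the linear drift shape below are invented for illustration.

```python
import numpy as np

grid = np.arange(50)
# Invented pure-component spectra for two calibrated analytes.
K = np.vstack([np.exp(-0.5 * ((grid - c) / 4.0) ** 2) for c in (15, 35)])
true_conc = np.array([0.7, 0.3])
drift = 0.002 * grid                        # unmodeled baseline drift shape
spectrum = true_conc @ K + 5.0 * drift      # measured mixture spectrum

# Plain CLS: the drift is absent from the model, so it biases the estimates.
cls_hat, *_ = np.linalg.lstsq(K.T, spectrum, rcond=None)

# PACLS-style prediction: augment the model with the known drift shape.
K_aug = np.vstack([K, drift])
pacls_hat, *_ = np.linalg.lstsq(K_aug.T, spectrum, rcond=None)
```

Ordinary CLS absorbs part of the drift into the analyte concentrations and is biased, while the augmented fit recovers the true concentrations essentially exactly because the drift shape is modeled.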
Hybrid Tracking Algorithm Improvements and Cluster Analysis Methods.
1982-02-26
UPGMA ), and Ward’s method. Ling’s papers describe a (k,r) clustering method. Each of these methods has individual characteristics which make them...Reference 7), UPGMA is probably the most frequently used clustering strategy. UPGMA tries to group new points into an existing cluster by using an
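For reference, UPGMA's average-linkage rule is simple enough to sketch directly; the toy distance matrix below is invented for illustration.

```python
def avg_dist(ca, cb, dist):
    """Average pairwise distance between two clusters (the UPGMA criterion)."""
    return sum(dist[i][j] for i in ca for j in cb) / (len(ca) * len(cb))

def upgma(dist):
    """Naive UPGMA: repeatedly merge the pair of clusters with the smallest
    average pairwise distance. Returns (cluster_a, cluster_b, distance) merges."""
    clusters = {i: (i,) for i in range(len(dist))}
    merges = []
    while len(clusters) > 1:
        a, b = min(((x, y) for x in clusters for y in clusters if x < y),
                   key=lambda p: avg_dist(clusters[p[0]], clusters[p[1]], dist))
        d = avg_dist(clusters[a], clusters[b], dist)
        merges.append((clusters[a], clusters[b], d))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges

# Invented toy data: points 0 and 1 are close; point 2 is an outlier.
D = [[0, 1, 8],
     [1, 0, 9],
     [8, 9, 0]]
history = upgma(D)
```

The two nearby points merge first at distance 1; the outlier joins last at the average distance (8 + 9) / 2 = 8.5, which is exactly the UPGMA criterion for grouping a point into an existing cluster.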
Laser-induced-fluorescence spectroscopy for improved chemical analysis. Progress report, 1978-1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelbwachs, J.A.
1983-09-01
This report summarizes the progress achieved over the past five years in the laser-induced fluorescence spectroscopy (LIFS) for improved chemical analysis program. Our initial efforts yielded significantly lower detection limits for trace elemental analysis by the use of both cw and pulsed laser excitations. New methods of LIFS were developed that were shown to overcome many of the traditional limitations to LIFS techniques. LIFS methods have been applied to yield fundamental scientific data that further the understanding of forces between atoms and other atoms and molecules. In recent work, two-photon ionization was combined with LIFS and applied, for the first time, to the study of energy transfer in ions.
Statistical analysis of RHIC beam position monitors performance
NASA Astrophysics Data System (ADS)
Calaga, R.; Tomás, R.
2004-04-01
A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
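A minimal sketch of the SVD-based screening described above: turn-by-turn readings from healthy BPMs share a low-rank betatron signal, so a malfunctioning BPM stands out in the residual after a rank-2 reconstruction. The tune, noise levels, and faulty-channel index below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
turns, bpms = 200, 12
t = np.arange(turns)
phase = np.linspace(0.0, 2.0 * np.pi, bpms, endpoint=False)

# Ideal turn-by-turn betatron motion: one tune, a phase advance per BPM.
data = np.sin(2.0 * np.pi * 0.31 * t[:, None] + phase[None, :])
data += 0.01 * rng.standard_normal((turns, bpms))   # electronics noise
data[:, 5] = rng.standard_normal(turns)             # BPM 5 returns junk

U, s, Vt = np.linalg.svd(data, full_matrices=False)
rank2 = (U[:, :2] * s[:2]) @ Vt[:2]                 # betatron subspace
residual = np.linalg.norm(data - rank2, axis=0)     # per-BPM misfit
suspect = int(np.argmax(residual))
```

The faulty channel's residual is roughly two orders of magnitude above the others, so the identification is robust to the choice of threshold.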
Improved Method for Prediction of Attainable Wing Leading-Edge Thrust
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; McElroy, Marcus O.; Lessard, Wendy B.; McCullers, L. Arnold
1996-01-01
Prediction of the loss of wing leading-edge thrust and the accompanying increase in drag due to lift, when flow is not completely attached, presents a difficult but commonly encountered problem. A method (called the previous method) for the prediction of attainable leading-edge thrust and the resultant effect on airplane aerodynamic performance has been in use for more than a decade. Recently, the method has been revised to enhance its applicability to current airplane design and evaluation problems. The improved method (called the present method) provides for a greater range of airfoil shapes from very sharp to very blunt leading edges. It is also based on a wider range of Reynolds numbers than was available for the previous method. The present method, when employed in computer codes for aerodynamic analysis, generally results in improved correlation with experimental wing-body axial-force data and provides reasonable estimates of the measured drag.
Wang, Xinyu; Gao, Jing-Lin; Du, Chaohui; An, Jing; Li, MengJiao; Ma, Haiyan; Zhang, Lina; Jiang, Ye
2017-01-01
Interest in the biosafety risks of clinical bioanalysis is growing, and a safe, simple, effective sample preparation method is urgently needed. To improve the biosafety of clinical analysis, we used the antiviral drugs adefovir and tenofovir as model drugs and developed a safe pretreatment method combining a sealing technique with a direct injection technique. The inter- and intraday precision (RSD%) of the method were <4%, and the extraction recoveries ranged from 99.4 to 100.7%. Meanwhile, the results showed that a standard solution could be used to prepare the calibration curve instead of spiked plasma, yielding more accurate results. Compared with traditional methods, the novel method not only significantly improved the biosecurity of the pretreatment step, but also achieved several advantages including higher precision, favorable sensitivity and satisfactory recovery. With these highly practical and desirable characteristics, the novel method may become a feasible platform in bioanalysis.
Lin, Zhichao; Wu, Zhongyu
2009-05-01
A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10% at the 95% confidence level.
Improved neutron-gamma discrimination for a 3He neutron detector using subspace learning methods
Wang, C. L.; Funk, L. L.; Riedel, R. A.; ...
2017-02-10
3He gas based neutron linear-position-sensitive detectors (LPSDs) have been applied in many neutron scattering instruments. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in neutron-gamma efficiency ratios on the order of 10^5-10^6. The NGD ratios of 3He detectors need to be improved for even better scientific results from neutron scattering. Digital Signal Processing (DSP) analyses of waveforms were proposed for obtaining better NGD ratios, based on features extracted from rise-time, pulse amplitude, charge integration, a simplified Wiener filter, and the cross-correlation between individual and template waveforms of neutron and gamma events. Fisher linear discriminant analysis (FLDA) and three multivariate analyses (MVAs) of the features were performed. The NGD ratios are improved by about 10^2-10^3 times compared with the traditional PHA method. Finally, our results indicate the NGD capabilities of 3He tube detectors can be significantly improved with subspace-learning based methods, which may result in a reduced data-collection time and better data quality for further data reduction.
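Fisher linear discriminant analysis, one of the subspace-learning methods named above, can be sketched on synthetic two-feature pulse data. The feature means and spreads below are invented and far cruder than real 3He detector waveform features.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic [rise time (ns), amplitude (a.u.)] features per event.
neutrons = rng.normal([60.0, 1.0], [5.0, 0.2], size=(500, 2))
gammas = rng.normal([30.0, 1.2], [5.0, 0.2], size=(500, 2))

mu_n, mu_g = neutrons.mean(axis=0), gammas.mean(axis=0)
# Within-class scatter; Fisher direction w maximizes class separation.
Sw = np.cov(neutrons, rowvar=False) + np.cov(gammas, rowvar=False)
w = np.linalg.solve(Sw, mu_n - mu_g)
threshold = w @ (mu_n + mu_g) / 2.0         # midpoint decision boundary

correct = (neutrons @ w > threshold).sum() + (gammas @ w <= threshold).sum()
accuracy = correct / 1000.0
```

With these well-separated synthetic classes the projected threshold classifies well over 95% of events correctly; the gains reported in the abstract come from combining many waveform features in the learned subspace.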
Combined Use of Integral Experiments and Covariance Data
NASA Astrophysics Data System (ADS)
Palmiotti, G.; Salvatores, M.; Aliberti, G.; Herman, M.; Hoblit, S. D.; McKnight, R. D.; Obložinský, P.; Talou, P.; Hale, G. M.; Hiruta, H.; Kawano, T.; Mattoon, C. M.; Nobre, G. P. A.; Palumbo, A.; Pigni, M.; Rising, M. E.; Yang, W.-S.; Kahler, A. C.
2014-04-01
In the frame of a US-DOE sponsored project, ANL, BNL, INL and LANL have performed a joint multidisciplinary research activity to explore the combined use of integral experiments and covariance data, with the objective of both giving quantitative indications of possible improvements to the ENDF evaluated data files and of reducing crucial reactor design parameter uncertainties. Methods developed over the last four decades for these purposes have been improved by new developments that also benefited from continuous exchanges with international groups working in similar areas. The major new developments that allowed significant progress are found in several specific domains: a) new science-based covariance data; b) integral experiment covariance data assessment and improved experiment analysis, e.g., of sample irradiation experiments; c) sensitivity analysis, where several improvements were necessary despite the generally good understanding of these techniques, e.g., to account for fission spectrum sensitivity; d) a critical approach to the analysis of statistical adjustment performance, both a priori and a posteriori; e) generalization of the assimilation method, now applied for the first time not only to multigroup cross section data but also to nuclear model parameters (the "consistent" method). This article describes the major results obtained in each of these areas; a large-scale nuclear data adjustment, based on the use of approximately one hundred high-accuracy integral experiments, is reported along with a significant example of the application of the new "consistent" method of data assimilation.
Dearing, Chey G; Kilburn, Sally; Lindsay, Kevin S
2014-03-01
Sperm counts have been linked to several fertility outcomes, making them an essential parameter of semen analysis. It has become increasingly recognised that Computer-Assisted Semen Analysis (CASA) provides improved precision over manual methods, but systems are seldom validated robustly for use. The objective of this study was to gather the evidence to validate or reject the Sperm Class Analyser (SCA) as a tool for routine sperm counting in a busy laboratory setting. The criteria examined were comparison with the Improved Neubauer and Leja 20-μm chambers, within- and between-field precision, sperm concentration linearity from a stock diluted in semen and media, accuracy against internal and external quality material, assessment of uneven flow effects and a receiver operating characteristic (ROC) analysis to predict fertility in comparison with the Neubauer method. This work demonstrates that SCA CASA technology is not a standalone 'black box', but rather a tool for well-trained staff that allows rapid, high-number sperm counting, provided errors are identified and corrected. The system will produce accurate, linear, precise results, with less analytical variance than manual methods, that correlate well against the Improved Neubauer chamber. The system provides superior predictive potential for diagnosing fertility problems.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need for a study that assesses the process improvement, quality management, and analytical techniques taught to students in U.S. college and university undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
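A toy version of such a Monte Carlo baseline model fits in a few lines. The driver weights and score distributions below are invented placeholders, not the ACSI specification or the dissertation's model.

```python
import random
import statistics

random.seed(42)

def simulate_index(n=10000):
    """Toy Monte Carlo for a satisfaction index: a weighted sum of three
    uncertain driver scores on a 0-100 scale. Weights and distributions
    are assumed values for illustration only."""
    weights = (0.5, 0.3, 0.2)
    draws = []
    for _ in range(n):
        quality = random.gauss(80.0, 5.0)
        expectations = random.gauss(75.0, 8.0)
        value = random.gauss(70.0, 6.0)
        draws.append(weights[0] * quality
                     + weights[1] * expectations
                     + weights[2] * value)
    return draws

draws = simulate_index()
baseline = statistics.mean(draws)                 # expected index score
cuts = statistics.quantiles(draws, n=20)
low, high = cuts[0], cuts[-1]                     # 5th-95th percentile band
```

The simulated distribution yields both a baseline expectation (about 76.5 with these assumed inputs) and an interval for future index scores; a sensitivity analysis would perturb the weights and distributions to see which driver moves the index most.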
Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.
2014-04-05
In the past several years, several international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. This data provides a valuable opportunity to improve human reliability analysis (HRA), but these improvements will not be realized without the implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but they have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
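The core of such a Bayesian HEP refinement is a conjugate Beta-Binomial calculation. The prior strength and simulator counts below are invented for illustration; they are not the Halden or SPAR-H numbers.

```python
def update_hep(prior_hep, prior_strength, errors, trials):
    """Beta-Binomial update of a human error probability (HEP).
    prior_strength is the pseudo-count weight given to the HRA method's
    prior estimate; it is an assumed tuning value, not from SPAR-H."""
    alpha = prior_hep * prior_strength + errors
    beta = (1.0 - prior_hep) * prior_strength + trials - errors
    return alpha / (alpha + beta)          # posterior mean HEP

# Invented numbers: prior HEP 0.01 (strength 50), 3 errors in 60 trials.
posterior = update_hep(0.01, 50.0, 3, 60)
```

The posterior mean (about 0.032) sits between the HRA method's prior (0.01) and the raw simulator rate (0.05), each weighted by its effective sample size, which is exactly how sparse simulator data can refine, rather than replace, an existing assignment.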
NASA Astrophysics Data System (ADS)
Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.
2017-11-01
In order to select effective samples from the large volume of multi-year PV power generation data and improve the accuracy of the PV power generation forecasting model, this paper studies the application of clustering analysis in this field and establishes a forecasting model based on neural networks. Based on three different types of weather (sunny, cloudy and rainy days), this research screens samples of historical data by the clustering analysis method. After screening, BP neural network prediction models are established using the screened data as training data. The six types of photovoltaic power generation prediction models before and after data screening are then compared. Results show that a prediction model combining clustering analysis with BP neural networks is an effective method to improve the precision of photovoltaic power generation forecasting.
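The screening step can be sketched with a tiny 1-D k-means over a daily weather feature. The irradiance values and the deterministic quantile initialization are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def kmeans_1d(x, k=3, iters=20):
    """Tiny k-means for a 1-D feature, with deterministic quantile
    initialization so this toy example is reproducible."""
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels, centers

# Invented daily mean irradiance (kWh/m^2) for 9 historical days:
# three rainy, three cloudy, three sunny.
irradiance = np.array([1.1, 0.9, 1.3, 3.2, 3.0, 3.4, 6.1, 5.8, 6.3])
labels, centers = kmeans_1d(irradiance)

# Keep only historical days whose cluster matches the forecast day's
# weather type (here a sunny day with expected irradiance near 6.0).
target_cluster = int(np.argmin(np.abs(centers - 6.0)))
training_days = np.where(labels == target_cluster)[0]
```

Only the sunny-cluster days (indices 6-8 here) would be passed to the BP network as training data for a sunny forecast day, which is the screening effect the paper credits for the accuracy gain.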
McCaw, Travis J; Micka, John A; Dewerd, Larry A
2011-10-01
Gafchromic(®) EBT2 film has a yellow marker dye incorporated into the active layer of the film that can be used to correct the film response for small variations in thickness. This work characterizes the effect of the marker-dye correction on the uniformity and uncertainty of dose measurements with EBT2 film. The effect of variations in time postexposure on the uniformity of EBT2 is also investigated. EBT2 films were used to measure the flatness of a (60)Co field to provide a high-spatial resolution evaluation of the film uniformity. As a reference, the flatness of the (60)Co field was also measured with Kodak EDR2 films. The EBT2 films were digitized with a flatbed document scanner 24, 48, and 72 h postexposure, and the images were analyzed using three methods: (1) the manufacturer-recommended marker-dye correction, (2) an in-house marker-dye correction, and (3) a net optical density (OD) measurement in the red color channel. The field flatness was calculated from orthogonal profiles through the center of the field using each analysis method, and the results were compared with the EDR2 measurements. Uncertainty was propagated through a dose calculation for each analysis method. The change in the measured field flatness for increasing times postexposure was also determined. Both marker-dye correction methods improved the field flatness measured with EBT2 film relative to the net OD method, with a maximum improvement of 1% using the manufacturer-recommended correction. However, the manufacturer-recommended correction also resulted in a dose uncertainty an order of magnitude greater than the other two methods. The in-house marker-dye correction lowered the dose uncertainty relative to the net OD method. The measured field flatness did not exhibit any unidirectional change with increasing time postexposure and showed a maximum change of 0.3%. The marker dye in EBT2 can be used to improve the response uniformity of the film. 
Depending on the film analysis method used, however, application of a marker-dye correction can improve or degrade the dose uncertainty relative to the net OD method. The uniformity of EBT2 was found to be independent of the time postexposure.
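The net-optical-density baseline against which the marker-dye corrections are compared is a simple calculation; the pixel values below are invented 16-bit scanner counts.

```python
import math

def net_od(exposed, unexposed, i_max=65535):
    """Net optical density from red-channel scanner pixel values:
    OD = log10(I_max / I); netOD subtracts the unexposed film's OD."""
    return math.log10(i_max / exposed) - math.log10(i_max / unexposed)

# Invented example: film darkens from 60000 to 30000 counts after irradiation.
nod = net_od(30000, 60000)
```

A halving of transmitted signal corresponds to a netOD of log10(2) ≈ 0.301; a marker-dye correction would additionally rescale the response using the dye channel to compensate for local thickness variations.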
Quantitative photogrammetric analysis of the Klapp method for treating idiopathic scoliosis.
Iunes, Denise H; Cecílio, Maria B B; Dozza, Marina A; Almeida, Polyanna R
2010-01-01
Few studies have proved that physical therapy techniques are efficient in the treatment of scoliosis. To analyze the efficiency of the Klapp method for the treatment of scoliosis, through a quantitative analysis using computerized biophotogrammetry. Sixteen participants with a mean age of 15 ± 2.61 yrs. with idiopathic scoliosis were treated using the Klapp method. To analyze the results of the treatment, they were all photographed before and after the treatments, following a standardized photographic method. All of the photographs were analyzed quantitatively by the same examiner using the ALCimagem 2000 software. The statistical analyses were performed using the paired t-test with a significance level of 5%. The treatments showed improvements in the angles which evaluated the symmetry of the shoulders, i.e. the acromioclavicular joint angle (AJ; p=0.00) and sternoclavicular joint angle (SJ; p=0.01). There were also improvements in the angle that evaluated the left Thales triangle (DeltaT; p=0.02). Regarding flexibility, there were improvements in the tibiotarsal angle (TTA; p=0.01) and in the hip joint angles (HJA; p=0.00). There were no changes in the vertebral curvatures, nor improvements in head positioning. Only the lumbar curvature, evaluated by the lumbar lordosis angle (LL; p=0.00), changed after the treatments. The Klapp method was an efficient therapeutic technique for treating asymmetries of the trunk and improving its flexibility. However, it was not efficient for pelvic asymmetry or for modifying head positioning, cervical lordosis or thoracic kyphosis.
Behavioral Training as New Treatment for Adult Amblyopia: A Meta-Analysis and Systematic Review.
Tsirlin, Inna; Colpa, Linda; Goltz, Herbert C; Wong, Agnes M F
2015-06-01
New behavioral treatment methods, including dichoptic training, perceptual learning, and video gaming, have been proposed to improve visual function in adult amblyopia. Here, we conducted a meta-analysis of these methods to investigate the factors involved in amblyopia recovery and their clinical significance. Mean and individual participant data meta-analyses were performed on 24 studies using the new behavioral methods in adults. Studies were identified using PubMed, Google Scholar, and published reviews. The new methods yielded a mean improvement in visual acuity of 0.17 logMAR, with 32% of participants achieving gains ≥0.2 logMAR, and a mean improvement in stereo sensitivity of 0.01 arcsec⁻¹, with 42% of participants improving ≥2 octaves. The most significant predictor of treatment outcome was visual acuity at the onset of treatment. Participants with more severe amblyopia improved more on visual acuity and less on stereo sensitivity than those with milder amblyopia. Better initial stereo sensitivity was a predictor of greater gains in stereo sensitivity following treatment. Treatment type, amblyopia type, age, and training duration did not have any significant influence on visual and stereo acuity outcomes. Our analyses showed that some participants may benefit from the new treatments; however, clinical trials are required to confirm these findings. Despite the diverse nature of the new behavioral methods, the lack of significant differences in visual and stereo sensitivity outcomes among them suggests that visual attention, a common element among the varied treatment methods, may play an important role in amblyopia recovery.
Optimization for Peptide Sample Preparation for Urine Peptidomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigdel, Tara K.; Nicora, Carrie D.; Hsieh, Szu-Chuan
2014-02-25
Analysis of native or endogenous peptides in biofluids can provide valuable insights into disease mechanisms. Furthermore, the detected peptides may also have utility as potential biomarkers for non-invasive monitoring of human diseases. The non-invasive nature of urine collection and the abundance of peptides in the urine makes analysis by high-throughput ‘peptidomics’ methods an attractive approach for investigating the pathogenesis of renal disease. However, urine peptidomics methodologies can be problematic with regards to difficulties associated with sample preparation. The urine matrix can provide significant background interference in the analytical measurements, hampering both the identification of peptides and the depth of the peptidomics read when utilizing LC-MS based peptidome analysis. We report on a novel adaptation of the standard solid phase extraction (SPE) method to a modified SPE (mSPE) approach for improved peptide yield and analysis sensitivity with LC-MS based peptidomics; the approaches were compared in terms of time, cost, clogging of the LC-MS column, peptide yield, peptide quality, and number of peptides identified by each method. Expense and time requirements were comparable for both SPE and mSPE, but more interfering contaminants from the urine matrix were evident in the SPE preparations (e.g., clogging of the LC-MS columns, yellowish background coloration of prepared samples due to retained urobilin, lower peptide yields) when compared to the mSPE method. When we compared data from technical replicates of 4 runs, the mSPE method provided significantly improved efficiencies for the preparation of samples from urine (e.g., mSPE peptide identification 82% versus 18% with SPE; p = 8.92E-05).
Additionally, peptide identifications, when applying the mSPE method, highlighted the biology of differential activation of urine peptidases during acute renal transplant rejection with distinct laddering of specific peptides, which was obscured for most proteins when utilizing the conventional SPE method. In conclusion, the mSPE method was found to be superior to the conventional, standard SPE method for urine peptide sample preparation when applying LC-MS peptidomics analysis due to the optimized sample clean up that provided improved experimental inference from the confidently identified peptides.
Operations planning and analysis handbook for NASA/MSFC phase B development projects
NASA Technical Reports Server (NTRS)
Batson, Robert C.
1986-01-01
Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) operational personnel support Program Development (PD) task teams. The intimate relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate and allocate such criteria as reliability, maintainability, and operations and support cost.
Down-weighting overlapping genes improves gene set analysis
2012-01-01
Background The identification of gene sets that are significantly impacted in a given condition based on microarray data is a crucial step in current life science research. Most gene set analysis methods treat genes equally, regardless how specific they are to a given gene set. Results In this work we propose a new gene set analysis method that computes a gene set score as the mean of absolute values of weighted moderated gene t-scores. The gene weights are designed to emphasize the genes appearing in few gene sets, versus genes that appear in many gene sets. We demonstrate the usefulness of the method when analyzing gene sets that correspond to the KEGG pathways, and hence we called our method Pathway Analysis with Down-weighting of Overlapping Genes (PADOG). Unlike most gene set analysis methods which are validated through the analysis of 2-3 data sets followed by a human interpretation of the results, the validation employed here uses 24 different data sets and a completely objective assessment scheme that makes minimal assumptions and eliminates the need for possibly biased human assessments of the analysis results. Conclusions PADOG significantly improves gene set ranking and boosts sensitivity of analysis using information already available in the gene expression profiles and the collection of gene sets to be analyzed. The advantages of PADOG over other existing approaches are shown to be stable to changes in the database of gene sets to be analyzed. PADOG was implemented as an R package available at: http://bioinformaticsprb.med.wayne.edu/PADOG/or http://www.bioconductor.org. PMID:22713124
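The down-weighting idea behind PADOG can be illustrated with a small sketch. This is not the published PADOG weighting formula (that is defined in the paper and its R package); it uses a hypothetical inverse-membership-count weight purely to show how genes shared across many gene sets contribute less to a set's score.

```python
import numpy as np

def gene_set_score(t_scores, gene_sets, target_set):
    """Score one gene set as the mean of absolute weighted t-scores.

    Genes appearing in many sets are down-weighted. The weight here
    (1 / membership count) is illustrative only; PADOG's actual
    weighting scheme differs in detail.
    """
    # Count how many gene sets each gene belongs to
    counts = {}
    for genes in gene_sets.values():
        for g in genes:
            counts[g] = counts.get(g, 0) + 1
    genes = gene_sets[target_set]
    weights = np.array([1.0 / counts[g] for g in genes])
    t_abs = np.array([abs(t_scores[g]) for g in genes])
    return float(np.mean(weights * t_abs))

# Toy example: gene "g1" appears in both sets, so it is down-weighted
sets = {"A": ["g1", "g2"], "B": ["g1", "g3"]}
scores = {"g1": 4.0, "g2": 2.0, "g3": 1.0}
print(gene_set_score(scores, sets, "A"))  # (0.5*4 + 1.0*2) / 2 = 2.0
```

A specific gene (here "g2", in only one set) keeps its full evidence weight, while the promiscuous "g1" is halved.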
Automated Analysis of Renewable Energy Datasets ('EE/RE Data Mining')
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian; Elmore, Ryan; Getman, Dan
This poster illustrates methods to substantially improve the understanding of renewable energy data sets and the depth and efficiency of their analysis through the application of statistical learning methods ('data mining') in the intelligent processing of these often large and messy information sources. The six examples apply methods for anomaly detection, data cleansing, and pattern mining to time-series data (measurements from metering points in buildings) and spatiotemporal data (renewable energy resource datasets).
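The poster names anomaly detection on metered time-series data without specifying the algorithms used; the following is a minimal hedged sketch of one common approach, a trailing-window z-score detector, applied to hypothetical hourly meter readings.

```python
import numpy as np

def rolling_zscore_anomalies(series, window=24, threshold=3.0):
    """Flag points whose deviation from the trailing-window mean
    exceeds `threshold` standard deviations. A simple illustrative
    detector; the poster's actual methods are not specified."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        win = series[i - window:i]
        mu, sigma = win.mean(), win.std()
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

# Hypothetical hourly meter readings with one injected spike
readings = [10.0 + (i % 5) * 0.1 for i in range(48)]
readings[30] = 100.0
print(np.where(rolling_zscore_anomalies(readings))[0])  # [30]
```

Real metering data would also need the data-cleansing steps the poster mentions (gap filling, unit checks) before a detector like this is meaningful.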
Particle Identification in the NIMROD-ISiS Detector Array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wuenschel, S.; Hagel, K.; May, L. W.
Interest in the influence of the neutron-to-proton (N/Z) ratio on multifragmenting nuclei has demanded an improvement in the capabilities of multi-detector arrays as well as the companion analysis methods. The particle identification method used in the NIMROD-ISiS 4{pi} array is described. Performance of the detectors and the analysis method are presented for the reaction of {sup 86}Kr+{sup 64}Ni at 35 MeV/u.
A Pilot Study on the Potential Use of Tomatis Method to Improve L2 Reading Fluency
ERIC Educational Resources Information Center
Chou, Peter Tze-Ming
2012-01-01
This was a pilot study that used the Tomatis Method to see the effects it had on L2 reading fluency in a group of Taiwanese learners. Eight volunteers participated in this study undertaking 40-hours of before-and-after-experimental treatments. The results from the analysis showed that the participants had significant improvements in the areas of…
RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.; Culligan, B.
2008-08-27
The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute of Standards and Technology (NIST) in May 2008. A new rapid column separation method was used for analysis of actinides and {sup 90}Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved: less than 3 hours for determination of {sup 90}Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times over NRIP 2007 and a {approx}100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation, and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and {sup 90}Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples, and rugged methods are essential. Extremely high levels of {sup 210}Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced {sup 210}Po removal step, which will be described.
Nishi, Kengo
2018-01-01
Passive microrheology typically deduces shear elastic loss and storage moduli from displacement time series or mean-squared displacements (MSD) of thermally fluctuating probe particles in equilibrium materials. Common data analysis methods use either Kramers–Kronig (KK) transformation or functional fitting to calculate frequency-dependent loss and storage moduli. We propose a new analysis method for passive microrheology that avoids the limitations of both of these approaches. In this method, we determine both real and imaginary components of the complex, frequency-dependent response function χ(ω) = χ′(ω) + iχ′′(ω) as direct integral transforms of the MSD of thermal particle motion. This procedure significantly improves the high-frequency fidelity of χ(ω) relative to the use of KK transformation, which has been shown to lead to artifacts in χ′(ω). We test our method on both model and experimental data. Experiments were performed on solutions of worm-like micelles and dilute collagen solutions. While the present method agrees well with established KK-based methods at low frequencies, we demonstrate significant improvement at high frequencies using our symmetric analysis method, up to almost the fundamental Nyquist limit. PMID:29611576
Ha Dinh, Thi Thuy; Bonner, Ann; Clark, Robyn; Ramsbotham, Joanne; Hines, Sonia
2016-01-01
Chronic diseases are increasing worldwide and have become a significant burden to those affected by them. Disease-specific education programs have demonstrated improved outcomes, although people forget information quickly or memorize it incorrectly. The teach-back method was introduced in an attempt to reinforce education to patients. To date, the evidence regarding the effectiveness of health education employing the teach-back method has not been reviewed systematically. This systematic review examined the evidence on using the teach-back method in health education programs for improving adherence and self-management of people with chronic disease. Participants were adults aged 18 years and over with one or more chronic diseases. All types of interventions which included the teach-back method in an education program for people with chronic diseases were considered; the comparator was chronic disease education programs that did not involve the teach-back method. Randomized and non-randomized controlled trials, cohort studies, before-after studies and case-control studies were included. The outcomes of interest were adherence, self-management, disease-specific knowledge, readmission, knowledge retention, self-efficacy and quality of life. Searches were conducted in CINAHL, MEDLINE, EMBASE, Cochrane CENTRAL, Web of Science, ProQuest Nursing and Allied Health Source, and Google Scholar databases. Search terms were combined by AND or OR in search strings. Reference lists of included articles were also searched for further potential references. Two reviewers conducted quality appraisal of papers using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument. Data were extracted using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument data extraction instruments. There was significant heterogeneity in the selected studies, hence a meta-analysis was not possible and the results are presented in narrative form.
Of the 21 articles retrieved in full, 12 on the use of the teach-back method met the inclusion criteria and were selected for analysis. Four studies confirmed improved disease-specific knowledge in intervention participants. One study showed a statistically significant improvement in adherence to medication and diet among patients with type 2 diabetes in the intervention group compared to the control group (p < 0.001). Two studies found statistically significant improvements in self-efficacy (p = 0.0026 and p < 0.001) in the intervention groups. One study examined quality of life in heart failure patients, but it did not improve with the intervention (p = 0.59). Five studies found a reduction in readmission rates and hospitalization, but these were not always statistically significant. Two studies showed improvement in daily weighing among heart failure participants, and in adherence to diet, exercise and foot care among those with type 2 diabetes. Overall, the teach-back method showed positive effects on a wide range of health care outcomes, although these were not always statistically significant. Studies in this systematic review revealed improved outcomes in disease-specific knowledge, adherence, self-efficacy and inhaler technique. A positive but inconsistent trend was also seen in improved self-care and reduction of hospital readmission rates. There was limited evidence on improvement in quality of life or disease-related knowledge retention. Evidence from the systematic review supports the use of the teach-back method in educating people with chronic disease to maximize their disease understanding and promote knowledge, adherence, self-efficacy and self-care skills. Future studies are required to strengthen the evidence on effects of the teach-back method. Larger randomized controlled trials will be needed to determine the effectiveness of the teach-back method on quality of life, reduction of readmissions, and hospitalizations.
2011-01-01
Background Segmentation is the most crucial part of computer-aided bone age assessment. A well-known type of segmentation performed in such systems is adaptive segmentation. While providing better results than global thresholding, adaptive segmentation produces a lot of unwanted noise that can affect the later process of epiphysis extraction. Methods A method with anisotropic diffusion as pre-processing and a novel Bounded Area Elimination (BAE) post-processing algorithm is proposed to improve the ossification site localization technique, with the intent of improving the adaptive segmentation result and the region-of-interest (ROI) localization accuracy. Results The results were evaluated by quantitative analysis and qualitative analysis using texture feature evaluation. Image homogeneity after anisotropic diffusion improved by an average of 17.59% across age groups. Experiments showed that smoothness improved by an average of 35% after the BAE algorithm, and ROI localization improved by an average of 8.19%. The MSSIM improved by an average of 10.49% after performing the BAE algorithm on the adaptively segmented hand radiographs. Conclusions The results indicate that hand radiographs which have undergone anisotropic diffusion show greatly reduced noise in the segmented image, and that the proposed BAE algorithm is capable of removing the artifacts generated in adaptive segmentation. PMID:21952080
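The anisotropic-diffusion pre-processing step can be sketched with the classic Perona-Malik scheme, which smooths homogeneous regions while preserving strong edges. The abstract does not state which diffusion variant or parameters the authors used, so this is a generic illustration, not their implementation.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, gamma=0.2):
    """Perona-Malik anisotropic diffusion (generic sketch): diffuse
    strongly where gradients are small, weakly across edges."""
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four neighbours (wrap-around edges)
        dn = np.roll(img, -1, axis=0) - img
        ds = np.roll(img, 1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # Edge-stopping conduction coefficients: near 1 in flat areas,
        # near 0 across strong edges
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        img += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return img

# Noisy step edge: diffusion reduces noise but keeps the edge
rng = np.random.default_rng(0)
img = np.hstack([np.zeros((8, 8)), np.ones((8, 8)) * 100])
noisy = img + rng.normal(0, 5, img.shape)
smoothed = anisotropic_diffusion(noisy)
print(smoothed[:, :8].std() < noisy[:, :8].std())  # True: flat-region noise reduced
```

Because the conduction coefficient collapses at large gradients, the 0-to-100 step survives while the flat regions are denoised, which is exactly the property that benefits the later epiphysis extraction.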
Big Data is a powerful tool for environmental improvements in the construction business
NASA Astrophysics Data System (ADS)
Konikov, Aleksandr; Konikov, Gregory
2017-10-01
The work investigates the possibility of applying the Big Data method as a tool to implement environmental improvements in the construction business. The method is recognized as effective in analyzing big volumes of heterogeneous data. It is noted that all preconditions exist for this method to be successfully used for resolution of environmental issues in the construction business. It is proven that the principal Big Data techniques (cluster analysis, crowd sourcing, data mixing and integration) can be applied in the sphere in question. It is concluded that Big Data is a truly powerful tool to implement environmental improvements in the construction business.
Improved profiling of estrogen metabolites by orbitrap LC/MS
Li, Xingnan; Franke, Adrian A.
2015-01-01
Estrogen metabolites are important biomarkers for evaluating cancer risks and metabolic diseases. Due to their low physiological levels, a sensitive and accurate method is required, especially for the quantitation of unconjugated forms of endogenous steroids and their metabolites in humans. Here, we evaluated various derivatives of estrogens for improved analysis by orbitrap LC/MS in human serum samples. A new chemical derivatization reagent was applied to modify phenolic steroids to form 1-methylimidazole-2-sulfonyl adducts. The method significantly improves sensitivity 2- to 100-fold in full scan MS and targeted selected ion monitoring MS over other derivatization methods, including dansyl, picolinoyl, and pyridine-3-sulfonyl products. PMID:25543003
An improved design method for EPC middleware
NASA Astrophysics Data System (ADS)
Lou, Guohuan; Xu, Ran; Yang, Chunming
2014-04-01
To address the problems and difficulties that small and medium enterprises currently face when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, an improved design method for EPC middleware is presented, based on an analysis of EPC middleware principles. This method leverages the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application program interface, to achieve middleware with general functionality. The structure is simple and easy to implement and maintain. Under this structure, different types of reader-writers can be added and configured conveniently, and the expandability of the system is improved.
Remote sensing image ship target detection method based on visual attention model
NASA Astrophysics Data System (ADS)
Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong
2017-11-01
Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. Considering this, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, and improves the detection efficiency of ship targets in remote sensing images.
Warren, Jamie M; Pawliszyn, Janusz
2011-12-16
For air/headspace analysis, needle trap devices (NTDs) are applicable for sampling a wide range of volatiles such as benzene and alkanes, as well as semi-volatile particulate-bound compounds such as pyrene. This paper describes a new NTD that is simpler to produce and improves performance relative to previous NTD designs. An NTD utilizing a side-hole needle with a modified tip removed the need to use epoxy glue to hold sorbent particles inside the NTD. This design also improved the seal between the NTD and the narrow-neck liner of the GC injector, thereby improving the desorption efficiency. A new packing method using solvent to pack the device has been developed and evaluated, and is compared to NTDs prepared using the previous vacuum aspiration method. The slurry packing method reduced preparation time and improved reproducibility between NTDs. To evaluate the NTDs, automated headspace extraction was completed using benzene, toluene, ethylbenzene, p-xylene (BTEX), anthracene, and pyrene (PAH). NTD geometries evaluated include: blunt tip with side-hole needle, tapered tip with side-hole needle, slider tip with side-hole, dome tapered tip with side-hole, and blunt with no side-hole needle (expanded desorptive flow). Results demonstrate that the tapered and slider tip NTDs performed with improved desorption efficiency. Copyright © 2011 Elsevier B.V. All rights reserved.
Effects of eye artifact removal methods on single trial P300 detection, a comparative study.
Ghaderi, Foad; Kim, Su Kyoung; Kirchner, Elsa Andrea
2014-01-15
Electroencephalographic signals are commonly contaminated by eye artifacts, even if recorded under controlled conditions. The objective of this work was to quantitatively compare standard artifact removal methods (regression, filtered regression, Infomax, and second order blind identification (SOBI)) and two artifact identification approaches for independent component analysis (ICA) methods, i.e. ADJUST and correlation. To this end, eye artifacts were removed and the cleaned datasets were used for single trial classification of P300 (a type of event-related potential elicited using the oddball paradigm). Statistical analysis of the results confirms that the combination of Infomax and ADJUST provides a relatively better performance (0.6% improvement on average over all subjects), while the combination of SOBI and correlation performs the worst. Low-pass filtering the data at lower cutoffs (here 4 Hz) can also improve the classification accuracy. Without requiring any artifact reference channel, the combination of Infomax and ADJUST improves the classification performance more than the other methods for both examined filtering cutoffs, i.e., 4 Hz and 25 Hz. Copyright © 2013 Elsevier B.V. All rights reserved.
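The correlation-based artifact identification compared above can be sketched as follows. This uses scikit-learn's FastICA as a stand-in for Infomax (which scikit-learn does not provide), and a synthetic blink signal as the EOG reference; channel names, thresholds, and data are illustrative only.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_eog_components(eeg, eog, threshold=0.6, seed=0):
    """Run ICA on multichannel EEG, zero out independent components
    that correlate strongly with the EOG reference, and reconstruct.
    FastICA here stands in for the Infomax algorithm of the paper.

    eeg : array (n_samples, n_channels); eog : array (n_samples,)
    """
    ica = FastICA(n_components=eeg.shape[1], random_state=seed)
    sources = ica.fit_transform(eeg)            # (n_samples, n_components)
    for k in range(sources.shape[1]):
        r = np.corrcoef(sources[:, k], eog)[0, 1]
        if abs(r) > threshold:
            sources[:, k] = 0.0                 # drop the artifact component
    return ica.inverse_transform(sources)

# Toy data: two oscillatory sources plus a shared blink artifact
t = np.linspace(0, 1, 1000)
blink = (np.abs(t - 0.5) < 0.05).astype(float)
s1, s2 = np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 17 * t)
eeg = np.column_stack([s1 + 3 * blink, s2 + 2 * blink, s1 - s2 + blink])
cleaned = remove_eog_components(eeg, blink)
print(cleaned.shape)  # (1000, 3)
```

In the paper's setting the reference would be a recorded EOG channel (or, for ADJUST, spatial/temporal component features instead of a reference), and the classifier would then be trained on the reconstructed signals.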
Guo, Yanjie; Chen, Xuefeng; Wang, Shibin; Sun, Ruobin; Zhao, Zhibin
2017-05-18
The gearbox is one of the key components in wind turbines. Gearbox fault signals are usually nonstationary and highly contaminated with noise. The presence of amplitude-modulated and frequency-modulated (AM-FM) characteristics compounds the difficulty of precise fault diagnosis of wind turbines; therefore, it is crucial to develop an effective fault diagnosis method for such equipment. This paper presents an improved diagnosis method for wind turbines via the combination of the synchrosqueezing transform and local mean decomposition. Compared to conventional time-frequency analysis techniques, the improved method, which is performed in non-real-time, can effectively reduce the noise pollution of the signals and preserve the signal characteristics, and hence is suitable for the analysis of nonstationary signals with high noise. The method is further validated by simulated signals and practical vibration data measured from a 1.5 MW wind turbine. The results confirm that the proposed method can simultaneously suppress the noise and increase the accuracy of the time-frequency representation.
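The kind of AM-FM signal described above can be simulated and inspected with a plain short-time Fourier transform. Note this is only a baseline illustration of the problem, not the paper's method: the paper combines synchrosqueezing with local mean decomposition precisely to sharpen the smeared representation a plain STFT produces. All signal parameters here are invented.

```python
import numpy as np
from scipy.signal import stft

# Simulated AM-FM "fault" signal: a frequency ramp (50 -> 90 Hz over 2 s)
# with 5 Hz amplitude modulation, buried in noise
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
carrier = np.sin(2 * np.pi * (50 * t + 10 * t ** 2))   # instantaneous f = 50 + 20 t
am = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)             # amplitude modulation
signal = am * carrier + 0.3 * np.random.default_rng(2).normal(size=t.size)

# Baseline time-frequency view via STFT
f, tau, Z = stft(signal, fs=fs, nperseg=256)
ridge = f[np.abs(Z).argmax(axis=0)]   # dominant frequency per time slice
print(ridge[1] < ridge[-2])           # True: the ridge tracks the rising frequency
```

Synchrosqueezing would reassign the STFT energy toward the instantaneous-frequency ridge, and local mean decomposition would separate the AM and FM parts; the sketch above only shows the raw representation those techniques improve on.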
Improving Incremental Balance in the GSI 3DVAR Analysis System
NASA Technical Reports Server (NTRS)
Errico, Ronald M.; Yang, Runhua; Kleist, Daryl T.; Parrish, David F.; Derber, John C.; Treadon, Russ
2008-01-01
The Gridpoint Statistical Interpolation (GSI) analysis system is a unified global/regional 3DVAR analysis code that has been under development for several years at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center. It has recently been implemented into operations at NCEP in both the global and North American data assimilation systems (GDAS and NDAS). An important aspect of this development has been improving the balance of the analysis produced by GSI. The improved balance between variables has been achieved through the inclusion of a Tangent Linear Normal Mode Constraint (TLNMC). The TLNMC method has proven to be very robust and effective. The TLNMC as part of the global GSI system has resulted in substantial improvement in data assimilation both at NCEP and at the NASA Global Modeling and Assimilation Office (GMAO).
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam
2015-01-01
Given the importance of learning disabilities and the necessity of interventions to improve these disorders in order to prevent future problems, this study used a meta-analysis research model to examine the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Using the meta-analysis method to integrate the results of various studies, the effect of psychological and educational interventions was estimated. In total, 57 studies whose methodology was acceptable were selected, and a meta-analysis was performed on them. The research instrument was a meta-analysis checklist. The effect sizes of psychological-educational interventions on improving the academic performance of students with mathematics disorder (0.57), impaired writing (0.50) and dyslexia (0.55) were reported. The results of the meta-analysis showed that, according to Cohen's table, the effect sizes are above average, indicating that educational and psychological interventions improve the academic performance of students with learning disabilities.
Teaching Improvement Model Designed with DEA Method and Management Matrix
ERIC Educational Resources Information Center
Montoneri, Bernard
2014-01-01
This study uses student evaluation of teachers to design a teaching improvement matrix based on teaching efficiency and performance by combining management matrix and data envelopment analysis. This matrix is designed to formulate suggestions to improve teaching. The research sample consists of 42 classes of freshmen following a course of English…
Analysis method for Thomson scattering diagnostics in GAMMA 10/PDX.
Ohta, K; Yoshikawa, M; Yasuhara, R; Chikatsu, M; Shima, Y; Kohagura, J; Sakamoto, M; Nakasima, Y; Imai, T; Ichimura, M; Yamada, I; Funaba, H; Minami, T
2016-11-01
We have developed an analysis method to improve the accuracy of electron temperature measurements by employing a fitting technique on the raw Thomson scattering (TS) signals. Least-squares fitting of the raw TS signals enabled a reduction of the error in the electron temperature measurement. We applied the analysis method to a multi-pass (MP) TS system. Because the interval between the MPTS signals is very short, it is difficult to analyze each Thomson scattering signal intensity separately using the raw signals. We used the fitting method to recover the original TS scattering signals from the measured raw MPTS signals and thereby obtain the electron temperatures on each pass.
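The least-squares fitting of a raw signal can be sketched as below. The pulse model, parameters, and noise level are all hypothetical stand-ins (the paper does not give its detector response function); the point is only to show how fitting a parametric shape to a noisy raw trace recovers the signal intensity more robustly than reading raw samples.

```python
import numpy as np
from scipy.optimize import curve_fit

def pulse(t, a, t0, w, base):
    """Hypothetical Gaussian pulse model for a raw scattering signal;
    the actual detector response in the paper may differ."""
    return a * np.exp(-((t - t0) / w) ** 2) + base

# Synthetic noisy raw signal (illustrative units)
rng = np.random.default_rng(1)
t = np.linspace(0, 100, 500)                 # time axis, e.g. ns
true = pulse(t, 5.0, 50.0, 8.0, 0.2)
raw = true + rng.normal(0, 0.3, t.size)

# Least-squares fit of the pulse model to the noisy trace
popt, pcov = curve_fit(pulse, t, raw, p0=[4.0, 45.0, 10.0, 0.0])
print(f"fitted amplitude: {popt[0]:.2f}")    # close to the true value 5.0
```

For the multi-pass case, one would fit a sum of such pulses with closely spaced centers, which is what lets overlapping passes be separated even when the raw signals run together.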
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
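The classical-least-squares step with an added spectral shape can be sketched numerically. The spectra below are synthetic Gaussians and a linear baseline drift standing in for "a non-calibrated source of spectral variation"; the full hybrid method would follow this with an inverse multivariate calibration step, which is omitted here.

```python
import numpy as np

# Illustrative pure-component spectra (calibrated analytes) and an
# uncalibrated interferent shape (baseline drift). All synthetic.
wavelengths = np.linspace(0, 1, 200)
comp_a = np.exp(-((wavelengths - 0.3) / 0.05) ** 2)   # calibrated analyte A
comp_b = np.exp(-((wavelengths - 0.7) / 0.05) ** 2)   # calibrated analyte B
drift = wavelengths                                    # un-modelled drift shape

# Mixture spectrum containing the un-modelled drift
mixture = 2.0 * comp_a + 0.5 * comp_b + 0.8 * drift

# CLS without the drift shape misestimates the concentrations ...
K = np.column_stack([comp_a, comp_b])
c_plain, *_ = np.linalg.lstsq(K, mixture, rcond=None)

# ... while augmenting K with the drift's spectral shape restores them
K_aug = np.column_stack([comp_a, comp_b, drift])
c_hybrid, *_ = np.linalg.lstsq(K_aug, mixture, rcond=None)
print(np.round(c_hybrid[:2], 2))   # recovers [2.0, 0.5]
```

Adding the extra shape column is exactly the "spectral shapes of components or effects not present in the original calibration" idea: the least-squares projection can then assign the drift's variance to its own column instead of biasing the analyte estimates.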
Improving Quality of Seal Leak Test Product using Six Sigma
NASA Astrophysics Data System (ADS)
Luthfi Malik, Abdullah; Akbar, Muhammad; Irianto, Dradjad
2016-02-01
The seal leak test part is a polyurethane-based product. Based on past data, the defect level of this product was 8%, higher than the target of 5%. A quality improvement effort was carried out using the six sigma method, comprising the phases define, measure, analyze, improve, and control. In the define phase, a Delphi method was used to identify factors that were critical to quality. In the measure phase, stability and process capability were measured. Fault tree analysis (FTA) and failure mode and effect analysis (FMEA) were used in the next phase to analyze the root causes and to prioritize issues. The improve phase consisted of compiling, selecting, and designing repair alternatives. Several improvement efforts were identified, i.e. (i) making a checklist for maintenance schedules, (ii) making a written reminder form, (iii) making the SOP more detailed, and (iv) performing a major service on the vacuum machine. To ensure the continuity of the improvement efforts, several control activities were executed, i.e. (i) controlling, monitoring, documenting, and setting targets frequently, (ii) implementing a reward and punishment system, (iii) adding cleaning tools, and (iv) building a six sigma organizational structure.
Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
Hybrid PV/diesel solar power system design using multi-level factor analysis optimization
NASA Astrophysics Data System (ADS)
Drake, Joshua P.
Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that there are several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.
On-line/on-site analysis of heavy metals in water and soils by laser induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Meng, Deshuo; Zhao, Nanjing; Wang, Yuanyuan; Ma, Mingjun; Fang, Li; Gu, Yanhong; Jia, Yao; Liu, Jianguo
2017-11-01
Methods for enriching heavy metals from water on graphite and on an aluminum electrode were studied and combined with a plasma confinement device to improve detection sensitivity and reduce the limits of detection (LODs) of the elements. With aluminum electrode enrichment, the LODs of Cd, Pb and Ni can be as low as a few ppb; with graphite enrichment, the measurement time can be less than 3 min. The results showed that both the graphite and aluminum electrode enrichment methods can effectively improve LIBS detection capability. The graphite enrichment method combined with plasma spatial confinement is more suitable for on-line monitoring of industrial wastewater, while the aluminum electrode enrichment method can be used for trace heavy metal detection in water. A LIBS method and device for soil heavy metals analysis was also developed, and a mobile LIBS system was tested in the field. The measurement results from LIBS and ICP-MS showed good consistency. These results provide important application support for rapid, on-site monitoring of heavy metals in soil. (Left: the mobile LIBS system for analysis of heavy metals in soils. Top right: the spatial confinement device. Bottom right: the automatic graphite enrichment device for on-line analysis of heavy metals in water.)
Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Addleman, Raymond S.; Naes, Benjamin E.; McNamara, Bruce K.
The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared performance to the traditional IAEA Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks; a summary of progress, accomplishments, and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below.
• Completed an initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for evaluation).
• Explored improvements to the carbonate-peroxide rapid uranium extraction chemistry.
• Evaluated new sampling materials and methods (in collaboration with ORNL).
• Demonstrated successful ES extractions from standard and novel swipes for a wide range of uranium compounds of interest, including UO2F2, UO2(NO3)2, U3O8 and uranium ore concentrate.
• Completed initial discussions with commercial suppliers of PTFE swipe materials.
• Submitted one manuscript for publication. Two additional drafts are being prepared.
Principal progress and accomplishments on Task 2, Optimize Materials and Methods for Direct SIMS Environmental Sample Analysis, are listed below.
• Designed a SIMS swipe sample holder that retrofits into existing equipment and provides simple, effective, and rapid mounting of ES samples for direct assay while enabling automation and laboratory integration.
• Identified preferred conductive sampling materials with better performance characteristics.
• Ran samples on the new PNNL NWAL-equivalent Cameca 1280 SIMS system.
• Obtained excellent agreement between isotopic ratios for certified materials and direct SIMS assay of very low levels of LEU and HEU UO2F2 particles on carbon fiber sampling material. Sample activities ranged from 1 to 500 CPM (the uranium mass on a sample depends on the specific isotope ratio but is frequently in the subnanogram range).
• Found that the presence of UF molecular ions, as measured by SIMS, provides chemical information about a particle that is separate from its uranium isotopics and strongly suggests that the particle originated from a UF6 enrichment activity.
• Submitted one manuscript for publication. Another manuscript is in preparation.
A Review of Multivariate Methods for Multimodal Fusion of Brain Imaging Data
Adali, Tülay; Yu, Qingbao; Calhoun, Vince D.
2011-01-01
The development of various neuroimaging techniques is rapidly improving the measurement of brain function and structure. However, despite improvements in individual modalities, it is becoming increasingly clear that the most effective research approaches will utilize multi-modal fusion, which takes advantage of the fact that each modality provides a limited view of the brain. The goal of multimodal fusion is to capitalize on the strength of each modality in a joint analysis, rather than a separate analysis of each. This is a more complicated endeavor that must be approached carefully, and efficient methods need to be developed to draw generalized and valid conclusions from high-dimensional data with a limited number of subjects. Numerous research efforts have been reported in the field based on various statistical approaches, e.g., independent component analysis (ICA), canonical correlation analysis (CCA), and partial least squares (PLS). In this review paper, we survey a number of multivariate methods appearing in previous reports, which are performed with or without prior information and may have utility for identifying potential brain illness biomarkers. We also discuss the possible strengths and limitations of each method, and review their applications to brain imaging data. PMID:22108139
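The multivariate fusion methods surveyed here (ICA, CCA, PLS) share a common pattern: find paired projections of two data blocks that maximize a joint statistic. As a minimal sketch, the code below (synthetic data, NumPy only; not the review's own analysis) computes the first canonical correlation between two simulated "modalities" that share one latent signal, using the SVD formulation of CCA.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic "modalities" observed on 200 subjects, sharing one latent signal.
latent = rng.normal(size=200)
X = np.column_stack([latent + 0.1 * rng.normal(size=200) for _ in range(5)])
Y = np.column_stack([latent + 0.1 * rng.normal(size=200) for _ in range(4)])

def cca_first_correlation(X, Y):
    # Center each block, take orthonormal bases of their column spaces,
    # then read the canonical correlations off the SVD of the cross-product.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Ux = np.linalg.svd(Xc, full_matrices=False)[0]
    Uy = np.linalg.svd(Yc, full_matrices=False)[0]
    s = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return s[0]  # first canonical correlation

r = cca_first_correlation(X, Y)
```

Because both blocks are driven by the same latent variable with little noise, the first canonical correlation comes out close to 1, which is exactly the "shared view" CCA is designed to expose.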
Large scale analysis of the mutational landscape in HT-SELEX improves aptamer discovery
Hoinka, Jan; Berezhnoy, Alexey; Dao, Phuong; Sauna, Zuben E.; Gilboa, Eli; Przytycka, Teresa M.
2015-01-01
High-Throughput (HT) SELEX combines SELEX (Systematic Evolution of Ligands by EXponential Enrichment), a method for aptamer discovery, with massively parallel sequencing technologies. This emerging technology provides data for a global analysis of the selection process and for simultaneous discovery of a large number of candidates, but it currently lacks dedicated computational approaches for their analysis. To close this gap, we developed novel in-silico methods to analyze HT-SELEX data and utilized them to study the emergence of polymerase errors during HT-SELEX. Rather than considering these errors as a nuisance, we demonstrated their utility for guiding aptamer discovery. Our approach builds on two main advancements in aptamer analysis: AptaMut, a novel technique allowing for the identification of polymerase errors conferring an improved binding affinity relative to the 'parent' sequence, and AptaCluster, an aptamer clustering algorithm which is, to the best of our knowledge, the only currently available tool capable of efficiently clustering entire aptamer pools. We applied these methods to an HT-SELEX experiment developing aptamers against Interleukin 10 receptor alpha chain (IL-10RA) and experimentally confirmed our predictions, thus validating our computational methods. PMID:25870409
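The AptaMut idea described above — flagging polymerase errors whose enrichment between selection rounds exceeds that of their "parent" sequence — can be sketched in a few lines. The sequences, counts, and one-mismatch rule below are illustrative assumptions, not the published algorithm.

```python
from collections import Counter

def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical read counts for the same aptamer pool in two SELEX rounds.
round1 = Counter({"ACGTAC": 100, "ACGTAA": 2})
round2 = Counter({"ACGTAC": 300, "ACGTAA": 30})

parent = max(round1, key=round1.get)  # most abundant "parent" sequence
flagged = None
for seq in round2:
    if seq != parent and hamming(seq, parent) == 1:
        parent_fold = round2[parent] / round1[parent]
        mutant_fold = round2[seq] / round1[seq]
        # A single-base variant enriching faster than its parent suggests
        # the polymerase error improved binding affinity.
        if mutant_fold > parent_fold:
            flagged = seq
```

Here the mutant's fold-enrichment (15x) outpaces the parent's (3x), so it is flagged as a candidate with potentially improved affinity rather than discarded as sequencing noise.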
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, W; Wang, J; Zhang, H
Purpose: To review the literature on computerized PET/CT image analysis for the evaluation of tumor response to therapy. Methods: We reviewed and summarized more than 100 papers that used computerized image analysis techniques for the evaluation of tumor response with PET/CT. This review mainly covered four aspects: image registration, tumor segmentation, image feature extraction, and response evaluation. Results: Although rigid image registration is straightforward, it has been shown to achieve good alignment between baseline and evaluation scans. Deformable image registration has been shown to improve the alignment when complex deformable distortions occur due to tumor shrinkage, weight loss or gain, and motion. Many semi-automatic tumor segmentation methods have been developed on PET. A comparative study revealed benefits of high levels of user interaction with simultaneous visualization of CT images and PET gradients. On CT, semi-automatic methods have been developed only for tumors that show a marked difference in CT attenuation between the tumor and the surrounding normal tissues. Quite a few multi-modality segmentation methods have been shown to improve accuracy compared to single-modality algorithms. Advanced PET image features considering spatial information, such as tumor volume, tumor shape, total glycolytic volume, histogram distance, and texture features, have been found more informative than the traditional SUVmax for the prediction of tumor response. Advanced CT features, including volumetric, attenuation, morphologic, structure, and texture descriptors, have also been found advantageous over the traditional RECIST and WHO criteria in certain tumor types. Predictive models based on machine learning techniques have been constructed to correlate selected image features with response. These models showed improved performance compared to current methods that use a cutoff value of a single measurement for tumor response.
Conclusion: This review showed that computerized PET/CT image analysis holds great potential to improve the accuracy of tumor response evaluation. This work was supported in part by National Cancer Institute Grant R01CA172638.
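To make the "advanced PET features" contrasted with SUVmax above concrete, the sketch below computes a few volumetric and histogram descriptors from a synthetic tumor ROI. The intensity distribution, voxel volume, and bin count are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic SUV values inside a segmented tumor ROI (hypothetical data).
roi = rng.gamma(shape=2.0, scale=1.5, size=500)

suv_max = roi.max()                 # traditional single-value measure
suv_mean = roi.mean()
mtv = roi.size * 0.008              # metabolic tumor volume: voxels x voxel volume (mL)
tlg = suv_mean * mtv                # total lesion glycolysis

# A simple histogram-based texture feature: intensity entropy.
hist, _ = np.histogram(roi, bins=16)
p = hist / hist.sum()
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
```

Features such as TLG and entropy summarize the whole intensity distribution rather than a single hottest voxel, which is why the reviewed studies found them more informative than SUVmax alone.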
Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho
2015-10-28
Recently, rapid improvements in technology and decreases in sequencing costs have made RNA-Seq a widely used technique to quantify gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is required to generate suitable guidelines for selecting the most appropriate approach for future experiments. In this paper, we compared eight non-abundance (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq of 35- and 76-nucleotide sequences produced in the MAQC project and on simulated reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that of the eight non-abundance estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq of a 35-nucleotide sequence, RPKM showed the highest correlation, but for RNA-Seq of a 76-nucleotide sequence it showed the lowest correlation among the methods. ERPKM did not improve on RPKM. Of the two abundance estimation normalization methods, for RNA-Seq of a 35-nucleotide sequence Sailfish gave a higher correlation than RSEM, and both were better than not using abundance estimation. However, for RNA-Seq of a 76-nucleotide sequence, RSEM's results were similar to those obtained without abundance estimation and much better than Sailfish's. Furthermore, we found that adding a poly-A tail increased alignment numbers but did not improve normalization results.
Spearman correlation analysis revealed that RC, UQ, Med, TMM, DESeq, and Q did not noticeably improve gene expression normalization, regardless of read length. Other normalization methods were more efficient when alignment accuracy was low; Sailfish with RPKM gave the best normalization results. When alignment accuracy was high, RC was sufficient for gene expression calculation. We also suggest ignoring the poly-A tail during differential gene expression analysis.
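Two of the simpler normalization methods compared above can be stated in a few lines. The sketch below computes RPKM (reads per kilobase of transcript per million mapped reads) and upper-quartile (UQ) scaling for a toy three-gene example; the counts and lengths are invented for illustration.

```python
import numpy as np

counts = np.array([500.0, 200.0, 1000.0])     # mapped reads per gene (toy data)
lengths = np.array([2000.0, 1000.0, 5000.0])  # gene lengths in bp
total = counts.sum()

# RPKM: correct for library size (per million reads) and gene length (per kb).
rpkm = counts * 1e9 / (total * lengths)

# Upper-quartile (UQ): scale counts by the 75th percentile of nonzero counts.
uq = counts / np.percentile(counts[counts > 0], 75)
```

Note the qualitative difference the study exploits: UQ (like RC, Med, TMM, DESeq, Q) rescales counts by a library-wide factor only, while RPKM additionally divides by gene length, so the two families can rank genes differently.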
Improved method and apparatus for chromatographic quantitative analysis
Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.
An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.
[Analysis of thickening polysaccharides by the improved diethyldithioacetal derivatization method].
Akiyama, Takumi; Yamazaki, Takeshi; Tanamoto, Kenichi
2011-01-01
The identification test for thickening polysaccharides containing neutral saccharides and uronic acids was investigated by GC analysis of constituent monosaccharides. The reported method, in which monosaccharides were converted to diethyldithioacetal derivatives with ethanethiol followed by trimethylsilylation, was improved in terms of operability and reproducibility of GC/MS analysis. The suitability of the improved diethyldithioacetal derivatization method was determined for seven thickening polysaccharides, i.e., carob bean gum, guar gum, karaya gum, gum arabic, gum ghatti, tragacanth gum and peach gum. The samples were acid-hydrolyzed to form monosaccharides. The hydrolysates were derivatized and analyzed with GC/FID. Each sugar derivative was detected as a single peak and was well separated from others on the chromatograms. The amounts of constituent monosaccharides in thickening polysaccharides were successfully estimated. Seven polysaccharides were distinguished from each other on the basis of constituent monosaccharides. Further examination of the time period of hydrolysis of polysaccharides using peach gum showed that the optimal times were not the same for all monosaccharides. A longer time was needed to hydrolyze glucuronic acid than neutral saccharides. The findings suggest that hydrolysis time may sometimes affect the analytical results on composition of constituent monosaccharides in polysaccharides.
Method for improving instrument response
Hahn, David W.; Hencken, Kenneth R.; Johnsen, Howard A.; Flower, William L.
2000-01-01
This invention pertains generally to a method for improving the accuracy of particle analysis under conditions of discrete particle loading, and particularly to a method for improving the signal-to-noise ratio and instrument response in laser spark spectroscopic analysis of particulate emissions. Under conditions of low particle density loading (particles/m³) resulting from low overall metal concentrations and/or large particle size, uniform sampling cannot be guaranteed. The present invention discloses a technique for separating laser sparks that arise from sample particles from those that do not; that is, a process for systematically "gating" the instrument responses arising from sampled particles from those that are not is disclosed as a solution to this problem. The disclosed approach is based on random sampling combined with a conditional analysis of each pulse. A threshold value is determined for the ratio of the intensity of a spectral line for a given element to a baseline region. If the threshold value is exceeded, the pulse is classified as a "hit"; its data are collected and an average spectrum is generated from an arithmetic average of "hits". The true metal concentration is determined from the averaged spectrum.
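The conditional "gating" described here reduces to a per-pulse threshold test on a line-to-baseline intensity ratio, followed by averaging only the hits. A minimal sketch with synthetic spectra; the channel indices, noise levels, and threshold value are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def gate_spectra(spectra, line_idx, base_idx, threshold):
    """Average only the 'hit' pulses whose line/baseline ratio exceeds threshold."""
    ratios = spectra[:, line_idx] / spectra[:, base_idx]
    hits = spectra[ratios > threshold]
    avg = hits.mean(axis=0) if len(hits) else None
    return avg, len(hits)

# 100 laser pulses x 8 spectral channels of background noise; the first 10
# pulses actually sampled a particle, adding signal on the emission-line channel.
spectra = rng.normal(10.0, 1.0, size=(100, 8))
spectra[:10, 3] += 50.0

avg, n_hits = gate_spectra(spectra, line_idx=3, base_idx=0, threshold=3.0)
```

The gated average retains the strong line signal of the 10 sampled pulses instead of diluting it across all 100 pulses, which is the patent's stated route to better signal-to-noise under sparse particle loading.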
1975-12-01
is determined by combustion in an induction furnace with iron as a flux. The methods for moisture, loss on ignition, water-soluble matter, acid... determination of talc in nitrocellulose-base propellants. The first method (which is the method recommended for the usual nitrocellulose-base... In the present report an improved scheme is proposed for the analysis of talc. The silica and magnesium oxide are determined by fusion with sodium
Development of new methodologies for evaluating the energy performance of new commercial buildings
NASA Astrophysics Data System (ADS)
Song, Suwon
The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. 
Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen
2013-10-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.
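The core idea above — marginalizing over nuisance model parameters with MCMC rather than fixing them — can be sketched on a toy Gaussian model: a Metropolis chain samples both the mean of interest and the unknown noise scale, and the posterior mean of interest is averaged over the chain, integrating out the nuisance scale. The model, priors, and step size are illustrative assumptions, not the hippocampal segmentation model of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(2.0, 1.0, size=50)  # synthetic observations

def log_post(mu, log_sigma):
    # Gaussian likelihood with flat priors on mu and log_sigma.
    sigma = np.exp(log_sigma)
    return -y.size * log_sigma - 0.5 * np.sum((y - mu) ** 2) / sigma**2

# Metropolis sampling jointly over (mu, log_sigma): the nuisance scale is
# marginalized by averaging over its samples instead of fixing it.
theta = np.array([0.0, 0.0])
lp = log_post(*theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.2, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta[0])

mu_hat = np.mean(samples[1000:])  # posterior mean of mu after burn-in
```

The spread of the retained samples also yields the kind of "error bars" on estimates that the abstract mentions as a side benefit of the Bayesian treatment.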
Cost-benefit analysis of sequential warning lights in nighttime work zone tapers.
DOT National Transportation Integrated Search
2011-06-01
Improving safety at nighttime work zones is important because of the extra visibility concerns. The deployment of sequential lights is an innovative method for improving driver recognition of lane closures and work zone tapers. Sequential lights are ...
Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce
2012-08-28
Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominantly by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work with more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration: ClinicalTrials.gov NCT01425502 (Data-driven Quality Improvement in Primary Care).
CORSSA: The Community Online Resource for Statistical Seismicity Analysis
Michael, Andrew J.; Wiemer, Stefan
2010-01-01
Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazard assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that readers can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.
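A canonical example of the seismicity-analysis methods a resource like CORSSA covers is maximum-likelihood estimation of the Gutenberg-Richter b-value from an earthquake catalog (Aki's estimator). The sketch below applies it to a synthetic catalog; the completeness magnitude and true b-value are assumptions of the simulation, not values from any real catalog.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic catalog: magnitudes above completeness Mc follow Gutenberg-Richter,
# i.e. M - Mc is exponential with rate beta = b * ln(10).
mc, b_true = 2.0, 1.0
beta = b_true * np.log(10)
mags = mc + rng.exponential(1.0 / beta, size=2000)

# Aki (1965) maximum-likelihood estimator of the b-value.
b_hat = np.log10(np.e) / (mags.mean() - mc)
```

With 2000 events the estimate lands close to the simulated b = 1.0; on real catalogs the same estimator is sensitive to the choice of Mc, which is exactly the kind of pitfall such educational resources document.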
An improved clustering algorithm based on reverse learning in intelligent transportation
NASA Astrophysics Data System (ADS)
Qiu, Guoqing; Kou, Qianqian; Niu, Ting
2017-05-01
With the development of artificial intelligence and data mining technology, big data has come into wide use, and clustering is an important method for processing large data sets. By introducing reverse (opposition-based) learning into the clustering process of the PAM algorithm, the limitations of single-pass clustering in unsupervised learning are reduced and the diversity of the resulting clusters is increased, thereby improving clustering quality. Algorithm analysis and experimental results show that the algorithm is feasible.
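A minimal sketch of the approach described: PAM (k-medoids) with an opposition-based ("reverse learning") twist at initialization. The toy data and the specific opposition rule used here (also evaluating the point farthest from each initial medoid and keeping the better start) are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
# Two well-separated toy clusters in the plane.
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
D = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # pairwise distance matrix

def cost(medoids):
    # Total distance of every point to its nearest medoid.
    return D[:, medoids].min(axis=1).sum()

def pam(init):
    # Classic PAM swap phase: replace a medoid with a non-medoid while it helps.
    medoids, improved = list(init), True
    while improved:
        improved = False
        for i in range(len(medoids)):
            for cand in range(len(X)):
                if cand in medoids:
                    continue
                trial = medoids.copy()
                trial[i] = cand
                if cost(trial) < cost(medoids):
                    medoids, improved = trial, True
    return medoids

# "Reverse learning" at initialization: also evaluate the opposite choice
# (the point farthest from each initial medoid) and keep the better start.
init = [0, 1]
opposite = [int(D[m].argmax()) for m in init]
start = init if cost(init) <= cost(opposite) else opposite

medoids = pam(start)
labels = D[:, medoids].argmin(axis=1)
```

Considering the opposite of a random start is the opposition-based learning idea: it cheaply diversifies the candidate initializations that a single clustering pass would otherwise depend on.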
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing models of the dose-effect relationship (RDE) for the effect of radon exposure on human health has been performed. It is concluded that these models can and should be improved, and a new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other implementing a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAMs), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
Spectrophotometric analyses of hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in water.
Shi, Cong; Xu, Zhonghou; Smolinski, Benjamin L; Arienti, Per M; O'Connor, Gregory; Meng, Xiaoguang
2015-07-01
A simple and accurate spectrophotometric method for on-site analysis of royal demolition explosive (RDX) in water samples was developed based on the Berthelot reaction. The sensitivity and accuracy of an existing spectrophotometric method was improved by: replacing toxic chemicals with more stable and safer reagents; optimizing the reagent dose and reaction time; improving color stability; and eliminating the interference from inorganic nitrogen compounds in water samples. Cation and anion exchange resin cartridges were developed and used for sample pretreatment to eliminate the effect of ammonia and nitrate on RDX analyses. The detection limit of the method was determined to be 100 μg/L. The method was used successfully for analysis of RDX in untreated industrial wastewater samples. It can be used for on-site monitoring of RDX in wastewater for early detection of chemical spills and failure of wastewater treatment systems. Copyright © 2015. Published by Elsevier B.V.
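The detection limit quoted above is the kind of figure that falls out of a linear calibration curve. A sketch of a standard calibration-based LOD estimate (3.3·σ/slope, with σ the residual standard deviation of the fit); the absorbance readings below are invented for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical calibration: absorbance vs. RDX concentration (ug/L).
conc = np.array([0.0, 100.0, 200.0, 400.0, 800.0])
absb = np.array([0.012, 0.051, 0.089, 0.172, 0.335])

slope, intercept = np.polyfit(conc, absb, 1)      # linear calibration fit
resid = absb - (slope * conc + intercept)
sigma = resid.std(ddof=2)                          # residual std (2 fit params)
lod = 3.3 * sigma / slope                          # IUPAC-style detection limit

# An unknown sample's concentration is then read off the calibration line:
unknown = (0.095 - intercept) / slope
```

For on-site screening work like the wastewater monitoring described, this kind of quick calibration-plus-LOD check is what establishes whether a reading such as the method's 100 μg/L limit is meaningful.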
The Use of Propensity Scores in Mediation Analysis
ERIC Educational Resources Information Center
Jo, Booil; Stuart, Elizabeth A.; MacKinnon, David P.; Vinokur, Amiram D.
2011-01-01
Mediation analysis uses measures of hypothesized mediating variables to test theory for how a treatment achieves effects on outcomes and to improve subsequent treatments by identifying the most efficient treatment components. Most current mediation analysis methods rely on untested distributional and functional form assumptions for valid…
An algorithmic approach to crustal deformation analysis
NASA Technical Reports Server (NTRS)
Iz, Huseyin Baki
1987-01-01
In recent years the analysis of crustal deformation measurements has become important as a result of current improvements in geodetic methods and an increasing amount of theoretical and observational data provided by several earth sciences. A first-generation data analysis algorithm which combines a priori information with current geodetic measurements was proposed. Relevant methods which can be used in the algorithm were discussed. Prior information is the unifying feature of this algorithm. Some of the problems which may arise through the use of a priori information in the analysis were indicated and preventive measures were demonstrated. The first step in the algorithm is the optimal design of deformation networks. The second step in the algorithm identifies the descriptive model of the deformation field. The final step in the algorithm is the improved estimation of deformation parameters. Although deformation parameters are estimated in the process of model discrimination, they can further be improved by the use of a priori information about them. According to the proposed algorithm this information must first be tested against the estimates calculated using the sample data only. Null-hypothesis testing procedures were developed for this purpose. Six different estimators which employ a priori information were examined. Emphasis was put on the case when the prior information is wrong and analytical expressions for possible improvements under incompatible prior information were derived.
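The final step of the algorithm — improving parameter estimates with a priori information — is commonly implemented as a weighted "mixed" estimator that blends prior parameter values with the observations. A toy sketch below; the design matrix, noise levels, and prior weights are simulated assumptions, not the proposal's actual deformation model.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated linear deformation model: y = A x + noise, with few observations.
A = rng.normal(size=(8, 3))
x_true = np.array([1.0, -0.5, 2.0])
y = A @ x_true + rng.normal(scale=0.5, size=8)

# Ordinary least squares: uses the sample data only.
x_ols = np.linalg.lstsq(A, y, rcond=None)[0]

# Mixed (Bayes-type) estimator: combine a priori values x0, weighted by P0,
# with the observations weighted by P — the prior acts as pseudo-observations.
x0 = np.array([1.0, -0.5, 2.0])      # a priori parameter values
P0 = np.eye(3) / 0.1**2              # prior weight (std 0.1)
P = np.eye(8) / 0.5**2               # observation weight (std 0.5)
x_mixed = np.linalg.solve(A.T @ P @ A + P0, A.T @ P @ y + P0 @ x0)
```

When the prior is compatible with the truth, as here, the mixed estimate shrinks toward it and beats OLS; the algorithm's null-hypothesis tests exist precisely to guard against the incompatible-prior case, where this shrinkage would bias the result.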
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watanabe, Kousuke; Emoto, Noriko; Sunohara, Mitsuhiro
2010-08-27
Research highlights: Incubating PCR products at a high temperature causes smears in gel electrophoresis. Smears interfere with the interpretation of methylation analysis using COBRA. Treatment with exonuclease I and heat-labile alkaline phosphatase eliminates smears. The elimination of smears improves the visibility of COBRA. -- Abstract: DNA methylation plays a vital role in the regulation of gene expression. Abnormal promoter hypermethylation is an important mechanism of inactivating tumor suppressor genes in human cancers. Combined bisulfite restriction analysis (COBRA) is a widely used method for identifying the DNA methylation of specific CpG sites. Here, we report that exonuclease I and heat-labile alkaline phosphatase can be used for PCR purification for COBRA, improving the visibility of gel electrophoresis after restriction digestion. This improvement is observed when restriction digestion is performed at a high temperature, such as 60 °C or 65 °C with BstUI and TaqI, respectively. This simple method can be applied instead of DNA purification using spin columns or phenol/chloroform extraction. It can also be applied to other situations in which PCR products are digested by thermophile-derived restriction enzymes, such as PCR-restriction fragment length polymorphism (RFLP) analysis.
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochiai, Yoshihiro
Steady-state heat-conduction analysis without heat generation can easily be treated by the boundary element method. The case with heat generation, however, can be solved approximately without a domain integral by an improved multiple-reciprocity boundary element method. The conventional multiple-reciprocity boundary element method is not suitable for complicated heat generation. In the improved multiple-reciprocity boundary element method, on the other hand, the domain integral in each step is divided into point, line, and area integrals. To solve the problem, contour lines of heat generation that approximate the actual heat generation are used.
Search automation of the generalized method of device operational characteristics improvement
NASA Astrophysics Data System (ADS)
Petrova, I. Yu; Puchkova, A. A.; Zaripova, V. M.
2017-01-01
The article briefly presents the results of an analysis of existing methods for finding the closest patents, which can be applied to determine generalized methods of improving device operational characteristics. The most widespread clustering algorithms and metrics for measuring the degree of proximity between two documents are surveyed. The article proposes a technique for determining generalized methods; it has two implementation variants and consists of seven steps. This technique has been implemented in the “Patents search” subsystem of the “Intellect” system. The article also gives an example of the use of the proposed technique.
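One of the simplest proximity metrics of the kind surveyed here is cosine similarity over term-frequency vectors. The sketch below uses invented example texts; the actual pipeline of the "Intellect" system is not shown.

```python
# Minimal sketch of one document-proximity metric: cosine similarity
# over simple term-frequency (bag-of-words) vectors.
from collections import Counter
from math import sqrt

def cosine_similarity(doc_a, doc_b):
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical patent-abstract snippets; 1.0 means identical term profiles.
sim = cosine_similarity("sensor improves device accuracy",
                        "device sensor accuracy improved by design")
```

In practice TF-IDF weighting and stemming would be applied before computing the metric, but the clustering algorithms mentioned above only require some such pairwise proximity function.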
An exploratory survey of methods used to develop measures of performance
NASA Astrophysics Data System (ADS)
Hamner, Kenneth L.; Lafleur, Charles A.
1993-09-01
Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that result in continuous process improvement.
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling is representative at the national and state levels. Simple and composite indicators (index of satisfaction and rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlation between indicators and association with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process for complying with regulations and identifying strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for improvement.
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-08-01
Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. A 'Rich Picture' was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. 
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
NASA Astrophysics Data System (ADS)
Shao, Haidong; Jiang, Hongkai; Zhang, Haizhou; Duan, Wenjing; Liang, Tianchen; Wu, Shuaipeng
2018-02-01
The vibration signals collected from rolling bearing are usually complex and non-stationary with heavy background noise. Therefore, it is a great challenge to efficiently learn the representative fault features of the collected vibration signals. In this paper, a novel method called improved convolutional deep belief network (CDBN) with compressed sensing (CS) is developed for feature learning and fault diagnosis of rolling bearing. Firstly, CS is adopted for reducing the vibration data amount to improve analysis efficiency. Secondly, a new CDBN model is constructed with Gaussian visible units to enhance the feature learning ability for the compressed data. Finally, exponential moving average (EMA) technique is employed to improve the generalization performance of the constructed deep model. The developed method is applied to analyze the experimental rolling bearing vibration signals. The results confirm that the developed method is more effective than the traditional methods.
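The exponential moving average (EMA) step applied to the deep model's parameters can be sketched in a few lines. The decay value below is an assumption for illustration, not the paper's setting.

```python
# Sketch of the EMA technique used to improve generalization: maintain a
# "shadow" copy of the model parameters that is smoothed over training steps.
def ema_update(shadow, params, decay=0.99):
    """Blend the current parameters into the running shadow copy."""
    return [decay * s + (1.0 - decay) * p for s, p in zip(shadow, params)]

shadow = [0.0, 0.0]
for step_params in ([1.0, 2.0], [1.0, 2.0], [1.0, 2.0]):
    shadow = ema_update(shadow, step_params, decay=0.5)
# With constant parameters, the shadow copy converges toward them,
# while transient fluctuations during training are damped out.
```

At evaluation time the shadow parameters, rather than the raw final parameters, are used, which is why the smoothing tends to improve generalization.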
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. 
The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
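The two quantities involved, net benefit at a threshold probability and its weighted area over a threshold range, can be sketched as below. The data and the weight distribution are invented; the paper's contribution is precisely a principled estimate of that distribution, which this sketch does not reproduce.

```python
# Hedged sketch: net benefit NB(t) = TP/n - (FP/n) * t/(1-t) at a threshold t,
# and a weighted average of NB over a set of thresholds.
def net_benefit(y_true, y_prob, t):
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= t and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= t and y == 0)
    return tp / n - (fp / n) * t / (1.0 - t)

y_true = [1, 1, 0, 0]                      # invented outcomes
y_prob = [0.9, 0.6, 0.4, 0.2]              # invented model risks
thresholds = [0.3, 0.5]
weights = [0.5, 0.5]                       # assumed threshold distribution
wanb = sum(w * net_benefit(y_true, y_prob, t)
           for w, t in zip(weights, thresholds))
```

Replacing the uniform `weights` with an estimated distribution of threshold probabilities is what distinguishes the weighted area from the plain area under the net benefit curve.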
Self-consistent analysis of high drift velocity measurements with the STARE system
NASA Technical Reports Server (NTRS)
Reinleitner, L. A.; Nielsen, E.
1985-01-01
The use of the STARE and SABRE coherent radar systems as valuable tools for geophysical research has been enhanced by a new technique called the Superimposed-Grid-Point method. This method permits an analysis of E-layer plasma irregularity phase velocity versus flow angle utilizing only STARE or SABRE data. As previous work with STARE has indicated, this analysis has clearly shown that the cosine law assumption breaks down for velocities near and exceeding the local ion acoustic velocities. Use of this method is improving understanding of naturally-occurring plasma irregularities in the E-layer.
Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed
2015-01-01
Measuring farm sustainability performance is a crucial component for improving agricultural sustainability. While extensive assessments and indicators exist that reflect the different facets of agricultural sustainability, because of the relatively large number of measures and interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, to remove correlation among variables, and to transform categorical variables to continuous variables. Next, the method applies common-weight data envelopment analysis to these principal components to individually score each farm. The method determines weights endogenously and allows the identification of important practices in the sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
Takeda, Kayoko; Takahashi, Kiyoshi; Masukawa, Hiroyuki; Shimamori, Yoshimitsu
2017-01-01
Recently, the practice of active learning has spread, increasingly recognized as an essential component of academic studies. Classes incorporating small group discussion (SGD) are conducted at many universities. At present, assessments of the effectiveness of SGD have mostly involved evaluation by questionnaires conducted by teachers, by peer assessment, and by self-evaluation of students. However, qualitative data, such as open-ended descriptions by students, have not been widely evaluated. As a result, we have been unable to analyze the processes and methods involved in how students acquire knowledge in SGD. In recent years, due to advances in information and communication technology (ICT), text mining has enabled the analysis of qualitative data. We therefore investigated whether the introduction of a learning system comprising the jigsaw method and problem-based learning (PBL) would improve student attitudes toward learning; we did this by text mining analysis of the content of student reports. We found that by applying the jigsaw method before PBL, we were able to improve student attitudes toward learning and increase the depth of their understanding of the area of study as a result of working with others. The use of text mining to analyze qualitative data also allowed us to understand the processes and methods by which students acquired knowledge in SGD and also changes in students' understanding and performance based on improvements to the class. This finding suggests that the use of text mining to analyze qualitative data could enable teachers to evaluate the effectiveness of various methods employed to improve learning.
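A minimal text-mining step of the kind described, counting term frequencies in free-text reports to surface recurring themes, can be sketched as follows. The report strings and stop-word list are invented; real analyses would add stemming and a full stop-word lexicon.

```python
# Sketch: term-frequency analysis of qualitative free-text student reports.
from collections import Counter
import re

reports = [
    "Working with others deepened my understanding of the topic.",
    "Group discussion improved my understanding and motivation.",
]
stop = {"my", "the", "of", "with", "and"}   # tiny illustrative stop-word set

tokens = []
for text in reports:
    tokens += re.findall(r"[a-z]+", text.lower())
freq = Counter(t for t in tokens if t not in stop)
top_term, top_count = freq.most_common(1)[0]
```

Even this crude frequency count hints at how recurring terms across reports can point to shared learning themes, which is the kind of signal the jigsaw/PBL evaluation extracts at scale.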
Profitability analysis of KINGLONG nearly 5 years
NASA Astrophysics Data System (ADS)
Zhang, Mei; Wen, Jinghua
2017-08-01
Profitability analysis plays an important role in measuring business performance and forecasting its prospects. Taking King Long Motor as a research instance and building on the basic theory of financial management, this paper combines theory with data analysis methods and uses relevant profitability indicators to carry out a specific analysis of the company's profitability, identifying the factors constraining it and the drivers for improving it. On this basis, recommendations are made to improve King Long Motor's profitability and promote the company's better and faster future development.
NASA Astrophysics Data System (ADS)
Wang, Hongliang; Liu, Baohua; Ding, Zhongjun; Wang, Xiangxin
2017-02-01
Absorption-based optical sensors have been developed for the determination of water pH. In this paper, based on the preparation of a transparent sol-gel thin film with a phenol red (PR) indicator, several calculation methods, including simple linear regression analysis, quadratic regression analysis and dual-wavelength absorbance ratio analysis, were used to calculate water pH. Results of MSSRR show that dual-wavelength absorbance ratio analysis can improve the calculation accuracy of water pH in long-term measurement.
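The simplest of the calculation methods compared, linear regression of pH against a measured optical quantity, can be sketched as below; for the ratio method, the regressor is the ratio of absorbances at two wavelengths. All calibration values are invented.

```python
# Sketch: calibrate pH against the dual-wavelength absorbance ratio
# A(wl1)/A(wl2) by ordinary least squares, then predict an unknown sample.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

ratios = [0.50, 0.75, 1.00, 1.25]        # assumed calibration ratios
ph_vals = [6.0, 6.5, 7.0, 7.5]           # assumed reference pH values
slope, intercept = fit_line(ratios, ph_vals)
ph_unknown = slope * 0.90 + intercept    # predict pH from a measured ratio
```

Using the ratio of two wavelengths rather than a single absorbance cancels common-mode drift (e.g. indicator leaching or lamp intensity changes), which is why the ratio method holds up better in long-term measurement.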
Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes
NASA Astrophysics Data System (ADS)
Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.
2015-12-01
Large earthquakes show semi-periodic behavior as a result of critically self-organized processes of stress accumulation and release in a seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from periodicity. Nava et al. (2013) and Quinteros et al. (2013) realized that not all earthquakes in a given region need to belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on the above-mentioned method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties to use in forecasts; and the use of Bayesian analysis to evaluate forecast performance. The improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
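The core spectral idea, searching event occurrence times for a dominant period while weighting events by size (a labeled point process), can be sketched as below. This is an illustration of the idea with invented data, not the authors' exact procedure or uncertainty estimation.

```python
# Hedged sketch: weighted spectral search for semi-periodicity in event times.
import cmath

def periodicity_power(times, weights, period):
    """Normalized magnitude of the weighted phasor sum; 1.0 = perfectly periodic."""
    s = sum(w * cmath.exp(2j * cmath.pi * t / period)
            for t, w in zip(times, weights))
    return abs(s) / sum(weights)

times = [0.0, 10.0, 20.1, 29.9, 40.0]   # invented: events roughly every 10 yr
weights = [7.1, 7.4, 7.0, 7.3, 7.2]     # event sizes (magnitudes) as labels
best = max((periodicity_power(times, weights, p), p)
           for p in [5.0, 8.0, 10.0, 12.0])
# best picks out the trial period at which the weighted phasors align.
```

Weighting by event size means a large event off the rhythm degrades the detected periodicity more than a small one, which is the point of treating the catalog as a labeled rather than a plain point process.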
How does modifying a DEM to reflect known hydrology affect subsequent terrain analysis?
NASA Astrophysics Data System (ADS)
Callow, John Nikolaus; Van Niel, Kimberly P.; Boggs, Guy S.
2007-01-01
Summary: Many digital elevation models (DEMs) have difficulty replicating hydrological patterns in flat landscapes. Efforts to improve DEM performance in replicating known hydrology have included a variety of soft (i.e. algorithm-based) and hard techniques, such as "stream burning" or "surface reconditioning" (e.g. Agree or ANUDEM). Using a representation of the known stream network, these methods trench or mathematically warp the original DEM to improve how accurately stream position, stream length and catchment boundaries replicate known hydrological conditions. However, these techniques permanently alter the DEM and may affect further analyses (e.g. slope). This paper explores the impact that commonly used hydrological correction methods (stream burning, Agree.aml, ANUDEM v4.6.3 and ANUDEM v5.1) have on the overall nature of a DEM, finding that different methods produce non-convergent outcomes for catchment parameters (such as catchment boundaries, stream position and length) and differentially compromise secondary terrain analysis. All hydrological correction methods improved the calculation of catchment area, stream position and length compared with the unmodified DEM, but all increased catchment slope, and no single method performed best across all categories. Different hydrological correction methods changed elevation and slope in different spatial patterns and magnitudes, compromising the ability to derive catchment parameters and conduct secondary terrain analysis from a single DEM. Modifying a DEM to better reflect known hydrology can be useful; however, knowledge of the magnitude and spatial pattern of the changes is required before using the DEM for subsequent analyses.
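The hard-correction idea in its crudest form, lowering DEM cells along a mapped channel, can be sketched as below. The grid, channel cells and burn depth are invented, and real tools (Agree, ANUDEM) apply much more elaborate reconditioning than a flat trench.

```python
# Minimal sketch of "stream burning": lower DEM cells along a known stream
# network so derived flow paths follow the mapped channel.
dem = [
    [5.0, 5.0, 5.0],
    [5.0, 5.1, 5.0],    # slight bump that would misroute derived flow
    [5.0, 5.0, 5.0],
]
stream = {(0, 1), (1, 1), (2, 1)}        # mapped channel cells (row, col)
BURN_DEPTH = 2.0

burned = [[dem[r][c] - BURN_DEPTH if (r, c) in stream else dem[r][c]
           for c in range(3)] for r in range(3)]
# Side effect the paper warns about: cells adjacent to the trench now have
# artificially steep slopes, biasing secondary terrain analysis.
```

The sketch makes the paper's central caveat concrete: the burned surface routes water correctly but no longer represents the true terrain, so slope-derived products are compromised.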
Mihic, Marko M; Todorovic, Marija Lj; Obradovic, Vladimir Lj; Mitrovic, Zorica M
2016-01-01
Background Social services aimed at the elderly are facing great challenges caused by progressive aging of the global population, but also by constant pressure to spend funds in a rational manner. Purpose This paper focuses on analyzing investments in human resources aimed at enhancing home care for the elderly, since many countries have recorded progress in this area over the past years. The goal of this paper is to stress the significance of performing an economic analysis of the investment. Methods This paper combines statistical analysis methods such as correlation and regression analysis, methods of economic analysis, and the scenario method. Results The economic analysis of investing in human resources for home care services in Serbia showed that both scenarios, investing in additional home care hours or in more beneficiaries, are cost-efficient. However, the optimal solution, with the positive (and highest) value of the economic net present value criterion, is to invest in human resources to increase home care from 6 to 8 hours per week and raise the number of beneficiaries by 33%. Conclusion This paper shows how statistical and economic analysis results can be used to evaluate different scenarios and enable quality decision-making based on exact data, in order to improve the health and quality of life of the elderly and spend funds in a rational manner. PMID:26869778
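The economic net present value criterion used to rank the investment scenarios can be sketched in a few lines. The cash flows and discount rate below are invented figures, not the study's data.

```python
# Sketch of scenario comparison by net present value (NPV):
# discount each year's net cash flow and sum; pick the scenario with highest NPV.
def npv(rate, cashflows):
    """cashflows[0] is at time 0 (e.g. the initial investment, negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

scenario_hours = [-100.0, 40.0, 40.0, 40.0]   # invest in more care hours
scenario_users = [-100.0, 35.0, 35.0, 35.0]   # invest in more beneficiaries
best = max([scenario_hours, scenario_users], key=lambda cfs: npv(0.05, cfs))
```

A scenario is cost-efficient when its NPV is positive at the chosen discount rate; comparing NPVs across scenarios is what allows the kind of ranking the paper reports.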
Methods for Improving Fine-Scale Applications of the WRF-CMAQ Modeling System
Presentation on the work in AMAD to improve fine-scale (e.g. 4km and 1km) WRF-CMAQ simulations. Includes iterative analysis, updated sea surface temperature and snow cover fields, and inclusion of impervious surface information (urban parameterization).
Re-refinement from deposited X-ray data can deliver improved models for most PDB entries.
Joosten, Robbie P; Womack, Thomas; Vriend, Gert; Bricogne, Gérard
2009-02-01
The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data.
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
NASA Astrophysics Data System (ADS)
Zhang, Fan; Liu, Pinkuan
2018-04-01
In order to improve the inspection precision of the H-drive air-bearing stage for wafer inspection, in this paper the geometric error of the stage is analyzed and compensated. The relationship between the positioning errors and error sources are initially modeled, and seven error components are identified that are closely related to the inspection accuracy. The most effective factor that affects the geometric error is identified by error sensitivity analysis. Then, the Spearman rank correlation method is applied to find the correlation between different error components, aiming at guiding the accuracy design and error compensation of the stage. Finally, different compensation methods, including the three-error curve interpolation method, the polynomial interpolation method, the Chebyshev polynomial interpolation method, and the B-spline interpolation method, are employed within the full range of the stage, and their results are compared. Simulation and experiment show that the B-spline interpolation method based on the error model has better compensation results. In addition, the research result is valuable for promoting wafer inspection accuracy and will greatly benefit the semiconductor industry.
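The model-based compensation scheme, interpolating a measured error map and subtracting the predicted error from the commanded position, can be sketched as follows. Plain linear interpolation stands in for the B-spline fit the paper recommends, and all calibration values are invented.

```python
# Sketch of geometric-error compensation: interpolate the positioning error
# measured at calibration points, then correct the commanded target.
def interp_error(x, xs, errs):
    """Piecewise-linear interpolation of the measured error curve."""
    for (x0, e0), (x1, e1) in zip(zip(xs, errs), zip(xs[1:], errs[1:])):
        if x0 <= x <= x1:
            return e0 + (e1 - e0) * (x - x0) / (x1 - x0)
    raise ValueError("position outside calibrated range")

xs = [0.0, 10.0, 20.0]          # stage positions (mm), invented calibration
errs = [0.0, 0.004, 0.002]      # measured geometric error (mm)
target = 15.0
compensated = target - interp_error(target, xs, errs)
```

Swapping `interp_error` for a B-spline (or Chebyshev) fit changes only the interpolant; the compensation logic, command minus predicted error, is the same across the methods the paper compares.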
Incorporation of MRI-AIF Information For Improved Kinetic Modelling of Dynamic PET Data
NASA Astrophysics Data System (ADS)
Sari, Hasan; Erlandsson, Kjell; Thielemans, Kris; Atkinson, David; Ourselin, Sebastien; Arridge, Simon; Hutton, Brian F.
2015-06-01
In the analysis of dynamic PET data, compartmental kinetic analysis methods require accurate knowledge of the arterial input function (AIF). Although arterial blood sampling is the gold standard for measuring the AIF, it is usually not preferred as it is invasive. An alternative is the simultaneous estimation method (SIME), in which the physiological parameters and the AIF are estimated together using information from different anatomical regions. Due to the large number of parameters in its optimisation, SIME is computationally complex and may sometimes fail to give accurate estimates. In this work, we try to improve SIME by utilising an input function derived from a simultaneously acquired DSC-MRI scan. Under the assumption that the true value of one of the six parameters of the PET-AIF model can be derived from an MRI-AIF, the method is tested using simulated data. The results indicate that SIME can yield more robust results when the MRI information is included, with a significant reduction in the absolute bias of Ki estimates.
Phung, Viet‐Hai; Essam, Nadya; Asghar, Zahid; Spaight, Anne
2015-01-01
Abstract Rationale, aims and objectives Clinical leadership and organizational culture are important contextual factors for quality improvement (QI) but the relationship between these and with organizational change is complex and poorly understood. We aimed to explore the relationship between clinical leadership, culture of innovation and clinical engagement in QI within a national ambulance QI Collaborative (QIC). Methods We used a self‐administered online questionnaire survey sent to front‐line clinicians in all 12 English ambulance services. We conducted a cross‐sectional analysis of quantitative data and qualitative analysis of free‐text responses. Results There were 2743 (12% of 22 117) responses from 11 of the 12 participating ambulance services. In the 3% of responders that were directly involved with the QIC, leadership behaviour was significantly higher than for those not directly involved. QIC involvement made no significant difference to responders' perceptions of the culture of innovation in their organization, which was generally considered poor. Although uptake of QI methods was low overall, QIC members were significantly more likely to use QI methods, which were also significantly associated with leadership behaviour. Conclusions Despite a limited organizational culture of innovation, clinical leadership and use of QI methods in ambulance services generally, the QIC achieved its aims to significantly improve pre‐hospital care for acute myocardial infarction and stroke. We postulate that this was mediated through an improvement subculture, linked to the QIC, which facilitated large‐scale improvement by stimulating leadership and QI methods. Further research is needed to understand success factors for QI in complex health care environments. PMID:26303398
Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.
2018-02-13
Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides that have been reported to date are primarily focused on environmental applications. The benefits of applying flow methods both in the monitoring of nuclear wastes and in the process analysis of the primary circuit coolants of light water nuclear reactors are also discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) with β-radiometric detection shortens the analysis time and improves the precision of determination owing to the mechanization of certain time-consuming sample-processing operations. Compared to radiometric detection, mass spectrometry (MS) detection enables multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.
Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data
2014-01-01
Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high-dimensional experimental data pose significant challenges for transcriptomics and metabolomics data analysis methods, which may produce spurious instead of biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge into the analysis, which is used to reduce the solution space and/or to focus the analysis on biologically meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We group the reviewed methods into three classes based on the underlying mathematical model: exploratory methods, supervised methods, and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated, and how it modifies the mathematical properties of the underlying methods. PMID:25033193
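As an illustrative sketch (not taken from the review), one simple way a supervised method can encode prior knowledge is a ridge regression whose per-feature penalties are relaxed for genes or metabolites with prior biological support, biasing the fit toward biologically plausible solutions. The penalty values and synthetic data below are assumptions for demonstration:

```python
import numpy as np

def prior_weighted_ridge(X, y, penalties):
    """Ridge regression with per-feature penalties: features backed by prior
    biological knowledge (e.g. known pathway membership) get a smaller penalty.
    Solves beta = argmin ||y - X b||^2 + sum_j penalties[j] * b_j^2."""
    XtX = X.T @ X + np.diag(penalties)
    return np.linalg.solve(XtX, X.T @ y)

# Synthetic example: only feature 0 carries signal, and prior knowledge
# (low penalty) happens to point at it.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 0].copy()
penalties = np.array([0.1, 100.0, 100.0, 100.0, 100.0])
beta = prior_weighted_ridge(X, y, penalties)
```

With the prior-favored feature lightly penalized, its coefficient stays near the true value while the heavily penalized features shrink toward zero.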
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines has been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
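For the analysis step (guideline 6), one standard slug-test analysis is the Hvorslev method: the normalized head decline is fit on a log scale and the slope gives the basic time lag. This is a generic sketch, not the KGS procedure; the formula variant and the well geometry in the example are illustrative assumptions:

```python
import numpy as np

def hvorslev_k(times, heads, h0, r_c, r_w, l_e):
    """Estimate hydraulic conductivity K from slug-test recovery data using
    the Hvorslev method (fully screened interval, l_e / r_w >> 1).

    times: s; heads: displacement from static (m); h0: initial displacement (m)
    r_c: casing radius (m); r_w: well radius (m); l_e: screen length (m)
    """
    # Hvorslev predicts ln(H/H0) declines linearly with time; fit the slope.
    y = np.log(np.asarray(heads) / h0)
    slope, _ = np.polyfit(np.asarray(times), y, 1)
    t37 = -1.0 / slope  # basic time lag: time for H/H0 to reach 0.37
    return r_c**2 * np.log(l_e / r_w) / (2.0 * l_e * t37)

# Synthetic test with a known conductivity (illustrative geometry).
K_true, r_c, r_w, l_e = 1e-5, 0.05, 0.05, 2.0
t37 = r_c**2 * np.log(l_e / r_w) / (2.0 * l_e * K_true)
t = np.linspace(0.0, 300.0, 50)
h = 0.5 * np.exp(-t / t37)
K_est = hvorslev_k(t, h, 0.5, r_c, r_w, l_e)
```

Guideline (7)'s pre- and post-analysis plots would correspond here to inspecting the linearity of `y` against `t` before trusting the fitted slope.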
The role of finite-difference methods in design and analysis for supersonic cruise
NASA Technical Reports Server (NTRS)
Townsend, J. C.
1976-01-01
Finite-difference methods for analysis of steady, inviscid supersonic flows are described, and their present state of development is assessed with particular attention to their applicability to vehicles designed for efficient cruise flight. Current work is described which will allow greater geometric latitude, improve treatment of embedded shock waves, and relax the requirement that the axial velocity must be supersonic.
Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions
NASA Technical Reports Server (NTRS)
Dobyns, Alan; Saff, Charles; Johns, Robert
1992-01-01
An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the method through which design, manufacturing, and engineering communicate.
Borgese, L; Salmistraro, M; Gianoncelli, A; Zacco, A; Lucchini, R; Zimmerman, N; Pisani, L; Siviero, G; Depero, L E; Bontempi, E
2012-01-30
This work is presented as an improvement of a recently introduced method for analysis of airborne particulate matter (PM) filters [1]. X-ray standing wave (XSW) and total reflection X-ray fluorescence (TXRF) measurements were performed with new dedicated laboratory instrumentation. The main advantage of performing both XSW and TXRF is the ability to distinguish the nature of the sample: a small droplet dry residue, a thin-film-like sample, or a bulk sample. Another advantage is the possibility of selecting the angle of total reflection for the TXRF measurements. Finally, the possibility of switching the X-ray source (for example, changing the anode from Mo to Cu) allows lighter and heavier elements to be measured with greater accuracy. The aim of the present study is to lay the theoretical foundation of the proposed method for quantitative analysis of airborne PM filters, improving the accuracy and efficiency of quantification by means of an external standard. The theoretical model presented and discussed demonstrates that airborne PM filters can be treated as thin layers. A set of reference samples was prepared in the laboratory and used to obtain a calibration curve. Our results demonstrate that the proposed method for quantitative analysis of airborne PM filters is affordable and reliable, without the need to digest filters to obtain quantitative chemical analysis, and that the use of XSW improves the accuracy of TXRF analysis.
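The external-standard quantification step amounts to an ordinary linear calibration: fit measured fluorescence intensity against the known concentrations of the laboratory reference samples, then invert the line for unknown filters. This is a generic illustration of a calibration curve, not the authors' exact procedure; the concentrations and intensities below are made up:

```python
import numpy as np

def calibrate(conc_std, intensity_std):
    """Least-squares calibration line I = a*c + b from reference samples."""
    a, b = np.polyfit(conc_std, intensity_std, 1)
    return a, b

def quantify(intensity, a, b):
    """Invert the calibration line to recover concentration from intensity."""
    return (np.asarray(intensity) - b) / a

# Illustrative reference series (arbitrary units).
a, b = calibrate([0.0, 1.0, 2.0, 4.0], [0.5, 3.5, 6.5, 12.5])
conc_unknown = quantify(6.5, a, b)
```

In practice each element gets its own calibration line, and the thin-layer assumption justified in the paper is what makes a single linear response valid across filters.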
Improving the Discipline of Cost Estimation and Analysis
NASA Technical Reports Server (NTRS)
Piland, William M.; Pine, David J.; Wilson, Delano M.
2000-01-01
The need to improve the quality and accuracy of cost estimates of proposed new aerospace systems has been widely recognized. Industry has done the best job of maintaining the related capability, with improvements in estimation methods and appropriate priority given to the hiring and training of qualified analysts. Some parts of Government, and the National Aeronautics and Space Administration (NASA) in particular, continue to need major improvements in this area. Recently, NASA recognized that its cost estimation and analysis capabilities had eroded to the point that the ability to provide timely, reliable estimates was affecting confidence in the planning of many program activities. As a result, this year the Agency established a lead role for cost estimation and analysis. The Independent Program Assessment Office, located at the Langley Research Center, was given this responsibility.
The use of cognitive task analysis to improve instructional descriptions of procedures.
Clark, Richard E; Pugh, Carla M; Yates, Kenneth A; Inaba, Kenji; Green, Donald J; Sullivan, Maura E
2012-03-01
Surgical training relies heavily on the ability of expert surgeons to provide complete and accurate descriptions of a complex procedure. However, research from a variety of domains suggests that experts often omit critical information about the judgments, analysis, and decisions they make when solving a difficult problem or performing a complex task. In this study, we compared three methods for capturing surgeons' descriptions of how to perform the procedure for inserting a femoral artery shunt (unaided free-recall, unaided free-recall with simulation, and cognitive task analysis methods) to determine which method produced more accurate and complete results. Cognitive task analysis was approximately 70% more complete and accurate than free-recall alone or free-recall during a simulation of the procedure. Ten expert trauma surgeons at a major urban trauma center were interviewed separately and asked to describe how to perform an emergency shunt procedure. Four surgeons provided an unaided free-recall description of the shunt procedure, five surgeons provided an unaided free-recall description of the procedure using visual aids and surgical instruments (simulation), and one (chosen randomly) was interviewed using cognitive task analysis (CTA) methods. An 11th vascular surgeon approved the final CTA protocol. The CTA interview with only one expert surgeon resulted in significantly greater accuracy and completeness of the descriptions compared with the unaided free-recall interviews with multiple expert surgeons. Surgeons in the unaided group omitted nearly 70% of necessary decision steps. In the free-recall group, heavy use of simulation improved surgeons' completeness when describing the steps of the procedure. CTA significantly increases the completeness and accuracy of surgeons' instructional descriptions of surgical procedures. In addition, simulation during unaided free-recall interviews may improve the completeness of interview data.
Examining Benefits of Dedicated Funding and Process Improvement for Depot Level Technology Insertion
2010-06-17
analysis of tabulated data. This method is also supported by Miles and Huberman (1994), as they list six approaches to case study data analysis, two... Miles, Matthew B., Huberman, Michael. (1994). Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks: Sage Publications. 12. Mohr, Jakki
Ravanfar, Seyed Ali; Orbovic, Vladimir; Moradpour, Mahdi; Abdul Aziz, Maheran; Karan, Ratna; Wallace, Simon; Parajuli, Saroj
2017-04-01
The development of in vitro plant regeneration methods from Brassica explants via organogenesis and somatic embryogenesis is influenced by many factors, such as culture environment, culture medium composition, explant source, and genotype, which are reviewed in this study. An efficient in vitro regeneration system that allows genetic transformation of Brassica is a crucial tool for improving its economic value. Methods to optimize transformation protocols for the efficient introduction of desirable traits, and a comparative analysis of these methods, are also reviewed. Hence, binary vectors, selectable marker genes, minimum inhibitory concentrations of selection agents, reporter marker genes, preculture media, Agrobacterium concentration, and the regeneration ability of putative transformants for the improvement of Agrobacterium-mediated transformation of Brassica are discussed.
A new design of groundwater sampling device and its application.
Tsai, Yih-jin; Kuo, Ming-ching T
2005-01-01
Compounds in the atmosphere can contaminate samples of groundwater. An inexpensive and simple method for collecting groundwater samples was developed to prevent such contamination when the background concentration of contaminants is high. The new groundwater sampling device consists of a glass sampling bottle with a Teflon-lined valve at each end. A cleaned and dried sampling bottle was connected to a low-flow-rate peristaltic pump with Teflon tubing and was filled with water, leaving no headspace volume in the sampling bottle. The sampling bottle was then packed in a PVC bag to prevent the target component from infiltrating into the water sample through the valves. In this study, groundwater was sampled at six wells using both the conventional method and the improved method. The analysis of trichlorofluoromethane (CFC-11) concentrations at these six wells indicates that all the groundwater samples obtained by the conventional sampling method were contaminated by CFC-11 from the atmosphere. The improved sampling method largely eliminated the problems of contamination, preservation, and quantitative analysis of natural water.
He, Tian; Xiao, Denghong; Pan, Qiang; Liu, Xiandong; Shan, Yingchun
2014-01-01
This paper introduces an improved acoustic emission (AE) beamforming method to localize rotor-stator rubbing faults in rotating machinery. To investigate the propagation characteristics of acoustic emission signals in the casing shell plate of rotating machinery, plate wave theory is applied to a thin plate. A simulation is conducted, and its results show that the localization accuracy of beamforming depends on multi-mode propagation, dispersion, velocity, and array dimension. To reduce the effect of these propagation characteristics on source localization, an AE signal preprocessing method is introduced that combines plate wave theory and the wavelet packet transform, and a revised localization velocity is presented to reduce the effect of array size. The accuracy of rubbing localization based on standard beamforming and on the improved method of the present paper are compared in a rubbing test carried out on a rotating machinery test table. The results indicate that the improved method can localize the rub fault effectively.
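A minimal one-dimensional delay-and-sum beamformer illustrates the localization principle the paper builds on: for each candidate source position, undo each sensor's propagation delay and sum; the position with maximum summed energy is the estimate. The sensor layout, sampling rate, and wave speed below are assumed values, and the paper's actual method additionally corrects for dispersion and multi-mode propagation (the "revised localization velocity" corresponds to choosing the wave speed `c` well):

```python
import numpy as np

def delay_and_sum(signals, fs, sensor_x, grid_x, c):
    """1-D delay-and-sum beamforming.
    signals: (n_sensors, n_samples); sensor_x: sensor positions (m);
    grid_x: candidate source positions (m); c: wave speed (m/s)."""
    powers = []
    for x in grid_x:
        delays = np.abs(np.asarray(sensor_x) - x) / c   # seconds to each sensor
        shifts = np.round(delays * fs).astype(int)      # whole samples
        total = np.zeros(signals.shape[1])
        for sig, s in zip(signals, shifts):
            total += np.roll(sig, -s)                   # advance to undo delay
        powers.append(np.sum(total**2))                 # energy of aligned sum
    return grid_x[int(np.argmax(powers))]

# Synthetic AE burst from a source at x = 0.3 m (illustrative values).
fs, c = 100000.0, 1000.0
sensor_x = [0.0, 0.5, 1.0]
samples = np.arange(2048)
signals = np.vstack([
    np.exp(-((samples - 200 - abs(x - 0.3) / c * fs) / 5.0) ** 2)
    for x in sensor_x])
grid = np.linspace(0.0, 1.0, 11)
estimate = delay_and_sum(signals, fs, sensor_x, grid, c)
```

With dispersive plate waves, different frequency components travel at different speeds, which is why the paper's wavelet-packet preprocessing step matters before this alignment can work.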
Chen, Jianjun; Cui, Jicheng; Yao, Xuefeng; Liu, Jianan; Sun, Ci
2018-04-01
To solve the problem where the actual grating aperture decreases with an increasing scanning angle during the scanning of a three-grating monochromator, we propose an off-axis assembly method for the worm gear turntable that makes it possible to suppress this aperture reduction. We simulated and compared the traditional assembly method with the off-axis assembly method in the three-grating monochromator. Results show that the actual grating aperture can be improved by the off-axis assembly method. In fact, for any one of the three gratings, when the monochromator outputs the longest wavelength in the corresponding wavelength band, the actual grating aperture increases by 45.93%. Over the entire monochromator output band, the actual grating aperture increased by an average of 32.56% and can thus improve the monochromator's output energy. Improvement of the actual grating aperture can also reduce the stray light intensity in the monochromator and improve its output signal-to-noise ratio.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Yanmei; Li, Xinli; Bai, Yan
The measurement of multiphase flow parameters is of great importance in a wide range of industries. In multiphase flow measurement, the signals from the sensors are extremely weak and often buried in strong background noise. It is thus desirable to develop effective signal processing techniques that can detect the weak signal in the sensor outputs. In this paper, two methods, the lock-in amplifier (LIA) and an improved Duffing chaotic oscillator, are compared for detecting and processing the weak signal. For a sinusoidal signal buried in noise, correlation detection with a sinusoidal reference signal is simulated using the LIA. The improved Duffing chaotic oscillator method, which is based on the Wigner transformation, can restore the signal waveform and detect its frequency. The two methods are combined to detect and extract the weak signal. Simulation results show the effectiveness and accuracy of the proposed improved method. The comparative analysis shows that the improved Duffing chaotic oscillator method suppresses noise strongly because it is sensitive to initial conditions.
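The LIA correlation detection can be sketched in a few lines: multiply the noisy input by in-phase and quadrature references at the known frequency and low-pass by averaging. This is a minimal digital sketch, not the authors' implementation; the signal amplitude and noise level are illustrative assumptions:

```python
import numpy as np

def lock_in(signal, fs, f_ref):
    """Digital lock-in amplifier: correlate with quadrature references at the
    known frequency f_ref and average (a crude low-pass); returns the
    amplitude of the f_ref component, largely insensitive to broadband noise."""
    t = np.arange(len(signal)) / fs
    x = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # in-phase component
    y = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # quadrature component
    return 2.0 * np.hypot(x, y)

# A 0.05-amplitude tone at 50 Hz buried in unit-variance noise (SNR << 1).
rng = np.random.default_rng(1)
fs, f = 1000.0, 50.0
n = 200000
t = np.arange(n) / fs
noisy = 0.05 * np.sin(2 * np.pi * f * t) + rng.normal(0.0, 1.0, n)
amplitude = lock_in(noisy, fs, f)
```

The averaging window sets the equivalent noise bandwidth: longer records reject more noise but respond more slowly, the usual lock-in trade-off.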
Rigourd, V; Barnier, J P; Ferroni, A; Nicloux, M; Hachem, T; Magny, J F; Lapillonne, A; Frange, P; Nassif, X; Bille, E
2018-05-03
Three cases of Bacillus cereus infection or colonization occurred in the same region in France, and milk from the milk bank was suspected as a possible common source of contamination. All batches delivered to the three cases complied with the requirements of the bacteriological reference method recommended by good practice guidelines. Still, a retrospective analysis with a more sensitive method showed one batch to contain B. cereus; however, strain comparison revealed no epidemiological link between isolates from the patients and those from the milk. Consequently, in accordance with the precautionary principle, we developed a new sensitive method for screening pasteurized milk for pathogenic bacteria. From January 1 to August 31, 2017, 2526 samples of pasteurized milk were prospectively included in the study. We showed that a 20 mL sample of pasteurized milk incubated for 18 h at 37 °C under aerobic conditions favored the detection of B. cereus. The nonconformity rate was 6.3% for the reference method and 12.6% for the improved method (p < 0.0001). Nonconformity was due to the presence of B. cereus in 88.5% of cases for the improved method and 53% of cases for the reference method (p < 0.0001). Thus our new method improves the microbiological safety of the distributed product and only moderately increases the rate of bacteriological nonconformity.
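For intuition about the reported p-values, a pooled two-proportion z-test on counts approximating the abstract's rates (6.3% vs. 12.6% of 2526 samples) is sketched below. Note this unpaired test is only illustrative: the study compares the same samples under two methods, so a paired analysis (e.g. McNemar's test) would be more appropriate, and the counts are reconstructed, not the authors' raw data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled variance estimate.
    Returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided tail probability from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Counts reconstructed from the reported rates (6.3% and 12.6% of 2526).
z, p = two_proportion_z(159, 2526, 318, 2526)
```

Even this conservative unpaired comparison gives p far below 0.0001, consistent with the abstract's claim that the improved method flags significantly more nonconforming batches.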
Root Gravitropism: Quantification, Challenges, and Solutions.
Muller, Lukas; Bennett, Malcolm J; French, Andy; Wells, Darren M; Swarup, Ranjan
2018-01-01
Better understanding of root traits such as root angle and root gravitropism will be crucial for development of crops with improved resource use efficiency. This chapter describes a high-throughput, automated image analysis method to trace Arabidopsis (Arabidopsis thaliana) seedling roots grown on agar plates. The method combines a "particle-filtering algorithm with a graph-based method" to trace the center line of a root and can be adopted for the analysis of several root parameters such as length, curvature, and stimulus from original root traces.
Symetrica Measurements at PNNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kouzes, Richard T.; Mace, Emily K.; Redding, Rebecca L.
2009-01-26
Symetrica is a small company based in Southampton, England, that has developed an algorithm for processing gamma ray spectra obtained from a variety of scintillation detectors. Their analysis method applied to NaI(Tl), BGO, and LaBr spectra results in deconvoluted spectra with the “resolution” improved by about a factor of three to four. This method has also been applied by Symetrica to plastic scintillator with the result that full energy peaks are produced. If this method is valid and operationally viable, it could lead to a significantly improved plastic scintillator based radiation portal monitor system.
Exergy analysis on industrial boiler energy conservation and emission evaluation applications
NASA Astrophysics Data System (ADS)
Li, Henan
2017-06-01
Industrial boilers are among the most energy-consuming equipment in China; their annual energy consumption accounts for about one-third of national energy consumption. Industrial boilers currently in service suffer from several severe problems, such as small capacity, low efficiency, high energy consumption, and severe environmental pollution. The serious, large-scale, and long-lasting smog episodes China has experienced in recent years are closely related to the high-intensity, low-altitude emissions of coal-fired industrial boilers [1]. Improving the energy efficiency and reducing the emissions of industrial boilers is therefore of great significance for China's energy usage efficiency and environmental protection. Whereas boiler design is commonly based on heat-balance theory, the exergy analysis method is established on the basis of the first and second laws of thermodynamics: by studying the effectiveness of energy conversion and utilization over the cycle, it analyzes the influencing factors, reveals the location, distribution, and size of exergy losses, identifies the weak links, and thereby uncovers the energy-saving potential of the boiler system. In this paper, the exergy analysis method is used to analyze and evaluate the efficiency and pollutant emission characteristics of a layer (grate) combustion boiler. It can assess the energy-saving potential of the boiler system more objectively and accurately, locate the weak links in energy consumption, and guide improvements in equipment performance and in the environmental friendliness of industrial boilers.
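The core quantity of the exergy method is easy to state: the specific flow exergy of a stream relative to a dead state (T0, p0), ex = (h - h0) - T0(s - s0). A minimal sketch with illustrative steam-like numbers (not from the paper) follows:

```python
def flow_exergy(h, s, h0, s0, t0):
    """Specific flow exergy (kJ/kg) of a stream relative to the dead state
    (T0, p0): ex = (h - h0) - T0*(s - s0).

    h, h0: specific enthalpy (kJ/kg); s, s0: specific entropy (kJ/(kg*K));
    t0: dead-state temperature (K). Exergy destroyed in a component is
    T0 * s_generated (Gouy-Stodola), which pinpoints the weak links a
    first-law heat balance hides."""
    return (h - h0) - t0 * (s - s0)

# Illustrative superheated-steam stream vs. a 25 degC dead state
# (enthalpy/entropy values are round numbers, not table look-ups).
ex = flow_exergy(h=3000.0, s=6.5, h0=100.0, s0=0.5, t0=298.15)
```

The point of the second-law view is visible in the numbers: of the 2900 kJ/kg of enthalpy above the dead state, only the exergy fraction is actually convertible to work, and component-by-component exergy balances show where the rest is destroyed.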
Text Mining Improves Prediction of Protein Functional Sites
Cohn, Judith D.; Ravikumar, Komandur E.
2012-01-01
We present an approach that integrates protein structure analysis and text mining for protein functional site prediction, called LEAP-FS (Literature Enhanced Automated Prediction of Functional Sites). The structure analysis was carried out using Dynamics Perturbation Analysis (DPA), which predicts functional sites at control points where interactions greatly perturb protein vibrations. The text mining extracts mentions of residues in the literature, and predicts that residues mentioned are functionally important. We assessed the significance of each of these methods by analyzing their performance in finding known functional sites (specifically, small-molecule binding sites and catalytic sites) in about 100,000 publicly available protein structures. The DPA predictions recapitulated many of the functional site annotations and preferentially recovered binding sites annotated as biologically relevant vs. those annotated as potentially spurious. The text-based predictions were also substantially supported by the functional site annotations: compared to other residues, residues mentioned in text were roughly six times more likely to be found in a functional site. The overlap of predictions with annotations improved when the text-based and structure-based methods agreed. Our analysis also yielded new high-quality predictions of many functional site residues that were not catalogued in the curated data sources we inspected. We conclude that both DPA and text mining independently provide valuable high-throughput protein functional site predictions, and that integrating the two methods using LEAP-FS further improves the quality of these predictions. PMID:22393388
NASA Astrophysics Data System (ADS)
Lerotic, Mirna
Soft x-ray spectromicroscopy provides spectral data on the chemical speciation of light elements at sub-100 nanometer spatial resolution. The high resolution imaging places a strong demand on the microscope stability and on the reproducibility of the scanned image field, and the volume of data necessitates the need for improved data analysis methods. This dissertation concerns two developments in extending the capability of soft x-ray transmission microscopes to carry out studies of chemical speciation at high spatial resolution. One development involves an improvement in x-ray microscope instrumentation: a new Stony Brook scanning transmission x-ray microscope which incorporates laser interferometer feedback in scanning stage positions. The interferometer is used to control the position between the sample and focusing optics, and thus improve the stability of the system. A second development concerns new analysis methods for the study of chemical speciation of complex specimens, such as those in biological and environmental science studies. When all chemical species in a specimen are known and separately characterized, existing approaches can be used to measure the concentration of each component at each pixel. In other cases (such as often occur in biology or environmental science), where the specimen may be too complicated or provide at least some unknown spectral signatures, other approaches must be used. We describe here an approach that uses principal component analysis (similar to factor analysis) to orthogonalize and noise-filter spectromicroscopy data. We then use cluster analysis (a form of unsupervised pattern matching) to classify pixels according to spectral similarity, to extract representative, cluster-averaged spectra with good signal-to-noise ratio, and to obtain gradations of concentration of these representative spectra at each pixel. 
The method is illustrated with a simulated data set of organic compounds, and with a mixture of lutetium in hematite used to understand the colloidal transport properties of radionuclides. We also describe an extension of that work employing an angle distance measure; this measure provides better classification based on spectral signatures alone in specimens with significant thickness variations. The extension is illustrated using simulated data and is also applied to examine sporulation in the bacterium Clostridium sp.
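The analysis pipeline described above (principal component analysis for orthogonalization and noise filtering, then unsupervised clustering of pixels by spectral similarity) can be sketched as follows. This is a toy reimplementation on synthetic spectra, not the dissertation's code, and it uses Euclidean rather than angle distance:

```python
import numpy as np

def pca_cluster(spectra, n_comp=2, k=2, n_iter=20):
    """Noise-filter spectromicroscopy data with PCA (via SVD), then classify
    pixels by spectral similarity with a minimal k-means in the reduced
    space. Returns per-pixel labels and cluster-averaged spectra.
    Assumes no cluster empties during iteration (fine for well-separated
    data; production code should guard against it)."""
    X = spectra - spectra.mean(axis=0)
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_comp] * S[:n_comp]          # pixels in component space
    # deterministic init: k points spread evenly through the pixel list
    centers = scores[np.linspace(0, len(scores) - 1, k).astype(int)]
    for _ in range(n_iter):
        dist = np.linalg.norm(scores[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)             # nearest-center assignment
        centers = np.array([scores[labels == j].mean(axis=0) for j in range(k)])
    cluster_spectra = np.array([spectra[labels == j].mean(axis=0)
                                for j in range(k)])
    return labels, cluster_spectra

# Two synthetic spectral species: peaks at channels 10 and 30, plus noise.
rng = np.random.default_rng(2)
a = np.zeros(50); a[10] = 1.0
b = np.zeros(50); b[30] = 1.0
spectra = np.vstack([a + 0.01 * rng.normal(size=50) for _ in range(20)]
                    + [b + 0.01 * rng.normal(size=50) for _ in range(20)])
labels, cluster_spectra = pca_cluster(spectra)
```

Averaging the spectra within each cluster is what recovers the good signal-to-noise representative spectra the text describes.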
Cui, Jiwen; Zhao, Shiyuan; Yang, Di; Ding, Zhenyang
2018-02-20
We use a spectrum interpolation technique to improve the distributed strain measurement accuracy in a Rayleigh-scatter-based optical frequency domain reflectometry sensing system. We demonstrate that strain accuracy is not limited by the "uncertainty principle" that exists in the time-frequency analysis. Different interpolation methods are investigated and used to improve the accuracy of peak position of the cross-correlation and, therefore, improve the accuracy of the strain. Interpolation implemented by padding zeros on one side of the windowed data in the spatial domain, before the inverse fast Fourier transform, is found to have the best accuracy. Using this method, the strain accuracy and resolution are both improved without decreasing the spatial resolution. The strain of 3 μϵ within the spatial resolution of 1 cm at the position of 21.4 m is distinguished, and the measurement uncertainty is 3.3 μϵ.
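The zero-padding idea can be illustrated on a generic cross-correlation shift estimate: padding the spectrum before the inverse FFT interpolates the correlation, so the peak can be located with sub-sample resolution. This is a simplified sketch of spectrum interpolation, not the paper's OFDR processing chain; the signal shapes and upsampling factor are assumptions for demonstration:

```python
import numpy as np

def xcorr_peak(a, b, upsample=16):
    """Estimate the (possibly fractional) shift of signal a relative to b
    from the cross-correlation peak, refined by zero-padding the spectrum
    before the inverse FFT (equivalent to interpolating the correlation)."""
    n = len(a)
    cross = np.fft.fft(a) * np.conj(np.fft.fft(b))
    # zero-pad the spectrum, keeping positive/negative frequencies in place
    m = n * upsample
    padded = np.zeros(m, dtype=complex)
    padded[: n // 2] = cross[: n // 2]
    padded[-(n - n // 2):] = cross[n // 2:]
    corr = np.fft.ifft(padded).real
    shift = corr.argmax() / upsample
    return shift if shift <= n / 2 else shift - n   # wrap to signed shift

# A smooth pulse and a copy delayed by 2.25 samples (Fourier-domain shift).
n = 128
t = np.arange(n)
ref = np.exp(-((t - 64) / 6.0) ** 2)
delay = 2.25
shifted = np.fft.ifft(np.fft.fft(ref)
                      * np.exp(-2j * np.pi * np.fft.fftfreq(n) * delay)).real
est = xcorr_peak(shifted, ref)
```

Without interpolation the estimate would be quantized to whole samples (here, 2.0), which is the "uncertainty" the paper's interpolation step removes.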
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME), and the separation, identification, and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate, and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased the accessibility of the technique, and it has been widely applied to water, sediment, soil, and aerosol samples. Whole cell fatty acid analysis, a related but not identical technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis (first cluster analysis and then principal component analysis) further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks.
Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
NASA Astrophysics Data System (ADS)
Beach, Daniel G.
2017-08-01
Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependent. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.
NASA Technical Reports Server (NTRS)
Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje Srinvas
2009-01-01
This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The bounded linear stability analysis method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. By applying the bounded linear stability analysis method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Analysis of metrics-driven adaptive control is evaluated for a linear damaged twin-engine generic transport model of aircraft. The analysis shows that the system with the adjusted adaptive gain becomes more robust to unmodeled dynamics or time delay.
Comparison of histomorphometrical data obtained with two different image analysis methods.
Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B
2007-08-01
A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The "time and money consuming" methods and techniques used are often "in house standards". We address light microscopic investigations of bone tissue reactions on un-decalcified cut and ground sections of threaded implants. In order to screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements done by the naked eye) with a custom made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods but the results of the bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values in the visual method corresponded to low values with the automatic method. With similar resolution images and further improvements of the automatic method this difference should become insignificant. A great advantage using the new automatic image analysis method is that it is time saving--analysis time can be significantly reduced.
Infrared face recognition based on LBP histogram and KW feature selection
NASA Astrophysics Data System (ADS)
Xie, Zhihua
2014-07-01
Conventional LBP-based features, as represented by the local binary pattern (LBP) histogram, still leave room for performance improvements. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract robust local features in infrared face images, LBP is chosen to obtain the composition of micro-patterns of sub-blocks. Based on statistical test theory, the Kruskal-Wallis (KW) feature selection method is proposed to select the LBP patterns that are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
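The selection step described above can be sketched compactly. This is a minimal illustration, not the authors' code: each LBP histogram bin is scored by the Kruskal-Wallis H statistic computed across face classes, and the highest-scoring bins are kept (the function names are mine).

```python
from itertools import chain

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for one feature across classes.

    groups: one list of feature values per class; ties get mid-ranks."""
    values = sorted(chain.from_iterable(groups))
    n = len(values)
    rank = {}
    i = 0
    while i < n:                      # assign mid-ranks to tied runs
        j = i
        while j < n and values[j] == values[i]:
            j += 1
        rank[values[i]] = (i + 1 + j) / 2.0   # average of ranks i+1 .. j
        i = j
    r_sums = [sum(rank[v] for v in g) for g in groups]
    return (12.0 / (n * (n + 1))
            * sum(r * r / len(g) for r, g in zip(r_sums, groups))
            - 3 * (n + 1))

def select_features(class_histograms, k):
    """Rank histogram bins by H across classes and keep the top-k bin indices.

    class_histograms: for each class, a list of per-sample histograms."""
    n_bins = len(class_histograms[0][0])
    scores = []
    for b in range(n_bins):
        groups = [[h[b] for h in hists] for hists in class_histograms]
        scores.append((kruskal_wallis_h(groups), b))
    return [b for _, b in sorted(scores, reverse=True)[:k]]
```

In this sketch a bin whose values separate the classes (high H) is retained, while a bin with identical distributions across classes (H near zero) is discarded.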
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
[Isolation and identification methods of enterobacteria group and its technological advancement].
Furuta, Itaru
2007-08-01
In the last half-century, isolation and identification methods for enterobacteria groups have markedly improved through technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original approach to the identification of enterobacteria groups and remain essential for understanding bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as carbohydrate utilization and the indole, methyl red, citrate, and urease tests. Commercial identification kits and automated instruments with computer-based analysis are also discussed as current methods; these provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be further developed.
DOT National Transportation Integrated Search
1980-06-01
The purpose of this report is to provide the tunneling profession with improved practical tools in the technical or design area, which provide more accurate representations of the ground-structure interaction in tunneling. The design methods range fr...
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls, and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
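For readers new to the two methods named above, the core of principal component analysis can be illustrated very compactly. The sketch below is my own stdlib-only illustration (not from the review): it extracts the leading principal component of a small data matrix by power iteration on the sample covariance matrix.

```python
def first_principal_component(rows, iters=200):
    """Leading PCA direction via power iteration on the covariance matrix.

    rows: list of observations, each a list of variable values."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]   # mean-center
    # sample covariance matrix (p x p)
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):            # power iteration: v <- cov @ v, normalized
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```

For data lying along the diagonal of a two-variable space, the method recovers the direction (1, 1)/sqrt(2); subsequent components would be found by deflation, which is omitted here for brevity.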
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
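A minimal numerical illustration of the idea, under the assumption of a classical-least-squares setting (this is my sketch, not the patented implementation): the spectrum is modeled as a linear combination of known component shapes, and appending a shape for a non-calibrated interferent to the fit removes the bias it would otherwise introduce into the calibrated components.

```python
def solve(a, b):
    """Solve the square system a @ x = b by Gauss-Jordan elimination with pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]   # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))   # partial pivot
        m[c], m[p] = m[p], m[c]
        for r in range(n):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [mr - f * mc for mr, mc in zip(m[r], m[c])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit_amounts(shapes, spectrum):
    """Least-squares amounts of each spectral shape in 'spectrum' (normal equations)."""
    k, n = len(shapes), len(spectrum)
    ata = [[sum(shapes[i][t] * shapes[j][t] for t in range(n)) for j in range(k)]
           for i in range(k)]
    atb = [sum(shapes[i][t] * spectrum[t] for t in range(n)) for i in range(k)]
    return solve(ata, atb)
```

Fitting a measured spectrum with only the calibrated shapes leaves any interferent signal to be absorbed by (and bias) the calibrated amounts; refitting with the interferent's shape appended recovers the calibrated amounts correctly, which is the essence of the hybrid step.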
Dujardin, Pierre-Philippe; Reverdy, Thomas; Valette, Annick; François, Patrice
2016-06-01
Introduction: project management is one of the expected proficiencies of head nurses. Context: the work on organizational improvement carried out by head nurses is rarely covered in the literature. Objectives: to follow the implementation of actions from projects led by head nurses and to analyze the parameters of success. Method: for a year, an intervention study followed 17 projects initiating improvement measures. Semi-structured interviews were conducted with health-care teams and managers, who assessed whether the implementation of each measure resulted in an operational improvement. A mixed analysis including a logistic regression investigated associations between the result of each action and various contextual characteristics. Results: this study involved 111 actions, 71% of which resulted in an operational improvement. Organizational and supporting actions had a high success rate, which decreased when hazards were not managed by healthcare managers. Discussion: this study highlights the role of strategy in the implementation methods and the chosen actions. Recommendations are made to promote collective assessment. Conclusion: scientific approaches are proposed to discuss the organizational work.
Kittell, David E; Mares, Jesus O; Son, Steven F
2015-04-01
Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter which allows for a highly unbiased analysis method to extract velocity information. In contrast, the phase unwrapping technique introduces user based variability while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
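The peak-tracking core of the STFT approach can be sketched as follows. This is an illustrative stand-in, not the authors' processing chain: each windowed frame is searched for its dominant DFT bin, giving a frequency-versus-time track; in MI practice the beat frequency is then mapped to velocity, commonly via v = f·λ/2, with λ the microwave wavelength in the material.

```python
import cmath
import math

def stft_peak_freqs(signal, fs, win, hop):
    """Dominant frequency (Hz) per frame via a Hann-windowed DFT.

    A brute-force DFT is used for clarity; an FFT would be used in practice."""
    freqs = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = [signal[start + i] * (0.5 - 0.5 * math.cos(2 * math.pi * i / (win - 1)))
               for i in range(win)]
        best_k, best_mag = 0, 0.0
        for k in range(1, win // 2):          # skip DC, search positive bins
            z = sum(seg[i] * cmath.exp(-2j * math.pi * k * i / win)
                    for i in range(win))
            if abs(z) > best_mag:
                best_k, best_mag = k, abs(z)
        freqs.append(best_k * fs / win)
    return freqs
```

The single tunable parameter is the window length `win`, which trades time resolution against frequency resolution, consistent with the abstract's point that control of the method reduces to essentially one parameter.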
A simple method for processing data with least square method
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning
2017-08-01
The least square method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis, and experimental data fitting, and a standard tool for statistical inference. In measurement data analysis, estimates are usually obtained from the least squares principle by setting up and solving a matrix system to obtain the final estimate and improve its accuracy. In this paper, a new method for solving the least squares problem is presented which is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of this method is illustrated by a concrete example.
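For the common special case of fitting a straight line, the least squares solution reduces to closed-form algebraic expressions, which can be sketched as follows (this illustrates the standard formulas, not the paper's particular derivation).

```python
def least_squares_line(xs, ys):
    """Closed-form least-squares fit of y = a + b*x.

    Uses the standard algebraic solution of the normal equations:
      b = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),  a = (Sy - b*Sx) / n
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b
```

For noiseless data on the line y = 1 + 2x, the fit returns intercept 1 and slope 2 exactly; with noisy data it returns the estimates that minimize the sum of squared residuals.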
Advances in intelligent diagnosis methods for pulmonary ground-glass opacity nodules.
Yang, Jing; Wang, Hailin; Geng, Chen; Dai, Yakang; Ji, Jiansong
2018-02-07
Pulmonary nodules are among the important lesions of lung cancer and are mainly divided into two categories: solid nodules and ground-glass nodules. Improving the diagnosis of lung cancer has major clinical significance and can be supported by machine learning techniques. At present, much research has focused on solid nodules, whereas research on ground-glass nodules started later and results are still scarce. This paper summarizes the research progress in intelligent diagnosis methods for pulmonary nodules since 2014. It is described in detail from four aspects: nodular signs, data analysis methods, prediction models, and system evaluation. This paper aims to provide research material for researchers in the clinical diagnosis and intelligent analysis of lung cancer, and to further improve the precision of pulmonary ground-glass nodule diagnosis.
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy to use, easy to learn, and error-resistant EHR systems to the users. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task into Mental (Internal) or Physical (External) operators. This analysis was performed by two analysts independently and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability analysis performed for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals and identifies the following finding related to the performance of AHLTA: (1) large number of average total steps to complete common tasks, (2) high average execution time and (3) large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort, required for the tasks. 
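The KLM estimation used in the study can be illustrated with the standard nominal operator times from the KLM literature (actual per-user times vary; the operator string and helper names here are my own, and the numbers are the commonly cited defaults, not the study's calibrated values).

```python
# Nominal KLM operator times in seconds (commonly cited defaults):
# K = keystroke, P = point with mouse, H = home hands between devices,
# M = mental preparation, B = button press.
KLM_TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35, "B": 0.1}

def klm_estimate(ops):
    """Estimated execution time for a sequence of KLM operators, e.g. 'MPBK'."""
    return sum(KLM_TIMES[o] for o in ops)

def mental_fraction(ops):
    """Share of the total estimated time spent on mental (M) operators."""
    total = klm_estimate(ops)
    return sum(KLM_TIMES[o] for o in ops if o == "M") / total
```

Summing operator times over every step of a task gives the per-task execution time, and the M-operator share corresponds to the "percentage of mental effort" metric reported in the abstract.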
2010 Elsevier Ireland Ltd. All rights reserved.
Developments in mycotoxin analysis: an update for 2009 - 2010
USDA-ARS?s Scientific Manuscript database
This review highlights developments in mycotoxin analysis and sampling over a period between mid-2009 and mid-2010. It covers the major mycotoxins aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. New and improved methods for mycotoxins...
Yang, Jianhong; Li, Xiaomeng; Xu, Jinwu; Ma, Xianghong
2018-01-01
The quantitative analysis accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is severely affected by the self-absorption effect and estimation of plasma temperature. Herein, a CF-LIBS quantitative analysis method based on the auto-selection of internal reference line and the optimized estimation of plasma temperature is proposed. The internal reference line of each species is automatically selected from analytical lines by a programmable procedure through easily accessible parameters. Furthermore, the self-absorption effect of the internal reference line is considered during the correction procedure. To improve the analysis accuracy of CF-LIBS, the particle swarm optimization (PSO) algorithm is introduced to estimate the plasma temperature based on the calculation results from the Boltzmann plot. Thereafter, the species concentrations of a sample can be calculated according to the classical CF-LIBS method. A total of 15 certified alloy steel standard samples of known compositions and elemental weight percentages were used in the experiment. Using the proposed method, the average relative errors of Cr, Ni, and Fe calculated concentrations were 4.40%, 6.81%, and 2.29%, respectively. The quantitative results demonstrated an improvement compared with the classical CF-LIBS method and the promising potential of in situ and real-time application.
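The Boltzmann-plot step that the PSO refines can be sketched as follows. This is a minimal version assuming optically thin lines, with my own function names; the paper's internal-reference self-absorption correction and PSO refinement are not reproduced.

```python
import math

K_B_EV = 8.617333262e-5   # Boltzmann constant in eV/K

def boltzmann_temperature(lines):
    """Plasma temperature (K) from a Boltzmann plot.

    lines: (intensity, wavelength, g_k, A_ki, E_k_eV) per emission line of one
    species. Plots ln(I*lambda/(g*A)) against the upper-level energy E_k and
    fits a straight line; the slope equals -1/(k_B * T)."""
    pts = [(e, math.log(i * w / (g * a))) for i, w, g, a, e in lines]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -1.0 / (K_B_EV * slope)
```

Given synthetic line intensities generated at a known temperature, the fit recovers that temperature; in CF-LIBS the species concentrations then follow from the plot intercepts once T is fixed.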
Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio
2013-01-01
Biodiesel is a promising alternative energy resource and a current focus in chemical engineering, but its sustainability is debated. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based production options, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario and should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options; the biodiesel production systems based on soybean, sunflower, and palm are found to be DEA-efficient, whereas the rapeseed-based and jatropha-based scenarios need improvement, and the corresponding improvements are specified. PMID:23766723
Guild, Georgia E.; Stangoulis, James C. R.
2016-01-01
Within the HarvestPlus program there are many collaborators currently using X-Ray Fluorescence (XRF) spectroscopy to measure Fe and Zn in their target crops. In India, five HarvestPlus wheat collaborators have laboratories that conduct this analysis and their throughput has increased significantly. The benefits of using XRF are its ease of use, minimal sample preparation and high throughput analysis. The lack of commercially available calibration standards has led to a need for alternative calibration arrangements for many of the instruments. Consequently, the majority of instruments have either been installed with an electronic transfer of an original grain calibration set developed by a preferred lab, or a locally supplied calibration. Unfortunately, neither of these methods has been entirely successful. The electronic transfer is unable to account for small variations between the instruments, whereas the use of a locally provided calibration set is heavily reliant on the accuracy of the reference analysis method, which is particularly difficult to achieve when analyzing low levels of micronutrient. Consequently, we have developed a calibration method that uses non-matrix matched glass disks. Here we present the validation of this method and show this calibration approach can improve the reproducibility and accuracy of whole grain wheat analysis on 5 different XRF instruments across the HarvestPlus breeding program. PMID:27375644
Improving Video Based Heart Rate Monitoring.
Lin, Jian; Rozado, David; Duenser, Andreas
2015-01-01
Non-contact measurements of cardiac pulse can provide robust measurement of heart rate (HR) without the annoyance of attaching electrodes to the body. In this paper we explore a novel and reliable method to carry out video-based HR estimation and propose various performance improvement over existing approaches. The investigated method uses Independent Component Analysis (ICA) to detect the underlying HR signal from a mixed source signal present in the RGB channels of the image. The original ICA algorithm was implemented and several modifications were explored in order to determine which one could be optimal for accurate HR estimation. Using statistical analysis, we compared the cardiac pulse rate estimation from the different methods under comparison on the extracted videos to a commercially available oximeter. We found that some of these methods are quite effective and efficient in terms of improving accuracy and latency of the system. We have made the code of our algorithms openly available to the scientific community so that other researchers can explore how to integrate video-based HR monitoring in novel health technology applications. We conclude by noting that recent advances in video-based HR monitoring permit computers to be aware of a user's psychophysiological status in real time.
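The final spectral-peak stage of video-based HR estimation can be sketched as below. This illustration deliberately skips the ICA un-mixing step the paper investigates and simply takes the strongest DFT component of a detrended channel trace inside a plausible cardiac band (0.75-4 Hz, i.e., 45-240 bpm); function names and the band limits are my assumptions.

```python
import cmath
import math

def heart_rate_bpm(signal, fs):
    """Heart rate (bpm) from a single detrended channel trace.

    Picks the strongest DFT bin whose frequency lies in the cardiac band.
    A brute-force DFT is used for clarity; an FFT would be used in practice."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]            # remove the DC offset
    best_f, best_mag = 0.0, 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if 0.75 <= f <= 4.0:                  # plausible cardiac frequencies
            z = sum(x[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n))
            if abs(z) > best_mag:
                best_f, best_mag = f, abs(z)
    return best_f * 60.0
```

In the full pipeline, the input trace would be one of the ICA-recovered source signals from the RGB channels rather than a raw channel, which is what gives the method its robustness to motion and illumination artifacts.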
A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.
Kim, Minchan; Seo, Jiwon; Lee, Jiyun
2014-08-14
Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
[To improve the quality of requisitions for radiologic examinations].
Roussel, P; Lelièvre, N
2002-05-01
This article presents the different steps implemented in order to improve the quality of requisitions for radiologic examinations in a hospital. The radiology requests sent from clinical units are periodically analyzed using criteria covering tracking, prescription, and the safety information required for a good examination. Results are discussed with the clinical units in order to achieve improvements. The periodical analysis of nonconformities shows a gradual improvement of practices. This action contributes to the creation of a single document for every request for an examination or analysis in the hospital. The action described here fits in the context of French regulations, first regarding the practice of radiology, and second regarding the quality improvement obligations that health care facilities must now implement for their accreditation.
Zhou, Xu; Wang, Qilin; Jiang, Guangming; Zhang, Xiwang; Yuan, Zhiguo
2014-12-01
Improvement of sludge dewaterability is crucial for reducing the costs of sludge disposal in wastewater treatment plants. This study presents a novel method based on combined conditioning with zero-valent iron (ZVI) and hydrogen peroxide (HP) at pH 2.0 to improve dewaterability of a full-scale waste activated sludge (WAS). The combination of ZVI (0-750 mg/L) and HP (0-750 mg/L) at pH 2.0 substantially improved the WAS dewaterability due to Fenton-like reactions. The highest improvement in WAS dewaterability was attained at 500 mg ZVI/L and 250 mg HP/L, when the capillary suction time of the WAS was reduced by approximately 50%. Particle size distribution indicated that the sludge flocs were decomposed after conditioning. Economic analysis showed that combined conditioning with ZVI and HP was a more economically favorable method for improving WAS dewaterability than the classical Fenton reaction based method initiated by ferrous salts and HP. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Huang, Robert; Masanet, Eric
This chapter focuses on IT measures in the data center and examines the techniques and analysis methods used to verify savings that result from improving the efficiency of two specific pieces of IT equipment: servers and data storage.
Towards a Probabilistic Preliminary Design Criterion for Buckling Critical Composite Shells
NASA Technical Reports Server (NTRS)
Arbocz, Johann; Hilburger, Mark W.
2003-01-01
A probability-based analysis method for predicting buckling loads of compression-loaded laminated-composite shells is presented, and its potential as a basis for a new shell-stability design criterion is demonstrated and discussed. In particular, a database containing information about specimen geometry, material properties, and measured initial geometric imperfections for a selected group of laminated-composite cylindrical shells is used to calculate new buckling-load "knockdown factors". These knockdown factors are shown to be substantially improved, and hence much less conservative than the corresponding deterministic knockdown factors that are presently used by industry. The probability integral associated with the analysis is evaluated by using two methods; that is, by using the exact Monte Carlo method and by using an approximate First-Order Second-Moment method. A comparison of the results from these two methods indicates that the First-Order Second-Moment method yields results that are conservative for the shells considered. Furthermore, the results show that the improved, reliability-based knockdown factor presented always yields a safe estimate of the buckling load for the shells examined.
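The two probability evaluations compared above can be contrasted on a toy model. The knockdown curve below is hypothetical (in the paper the load-imperfection relation comes from analysis of the measured shell database); the sketch only illustrates the mechanics of a Monte Carlo quantile versus a First-Order Second-Moment (mean minus a multiple of the standard deviation) estimate.

```python
import random
import statistics

def buckling_load(amp):
    """Hypothetical knockdown curve: load ratio falls with imperfection amplitude."""
    return 1.0 / (1.0 + 2.0 * amp)

def mc_knockdown(mu, sigma, n=20000, quantile=0.01, seed=1):
    """Monte Carlo knockdown: a low quantile of the buckling-load distribution
    for normally distributed imperfection amplitudes (clipped at zero)."""
    rng = random.Random(seed)
    loads = sorted(buckling_load(max(0.0, rng.gauss(mu, sigma)))
                   for _ in range(n))
    return loads[int(quantile * n)]

def fosm_knockdown(mu, sigma, n=20000, beta=2.326, seed=1):
    """First-Order Second-Moment style estimate: mean load minus beta
    standard deviations (beta = 2.326 targets roughly the 1% level)."""
    rng = random.Random(seed)
    loads = [buckling_load(max(0.0, rng.gauss(mu, sigma))) for _ in range(n)]
    return statistics.mean(loads) - beta * statistics.stdev(loads)
```

On this toy curve the moment-based estimate comes out below the Monte Carlo quantile, mirroring the paper's observation that the First-Order Second-Moment method is conservative for the shells considered.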
NASA Astrophysics Data System (ADS)
Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md
2017-08-01
Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast the high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the listed factors from the diurnal variations investigation and the sensitivity analysis from past studies. In conclusion, the chaotic approach successfully forecasts and analyzes the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment in achieving better air pollution management.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study for the Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
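The probabilistic OWA aggregation can be sketched as follows. This is a simplified, non-spatial version with my own weight-perturbation scheme and threshold; the study applies OWA with Monte Carlo sampling across whole GIS criterion layers rather than to a single cell.

```python
import random

def owa(scores, order_weights):
    """Ordered weighted averaging: weights apply to ranked scores, not to
    specific criteria, which is how OWA encodes the analyst's risk attitude."""
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(order_weights, ranked))

def susceptibility_probability(cell_scores, base_weights, n=2000, noise=0.2,
                               threshold=0.6, seed=7):
    """Monte Carlo over perturbed order weights: the fraction of runs in which
    a cell's OWA score exceeds the susceptibility threshold."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        w = [max(1e-9, b * (1 + rng.uniform(-noise, noise)))
             for b in base_weights]
        total = sum(w)
        w = [x / total for x in w]            # renormalize to sum to one
        if owa(cell_scores, w) >= threshold:
            hits += 1
    return hits / n
```

Cells whose classification is stable under weight perturbation receive probabilities near 0 or 1, while intermediate probabilities flag the locations whose outcome is sensitive to the criteria weights, which is the robustness information the ensemble analysis provides.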
On the analysis of very small samples of Gaussian repeated measurements: an alternative approach.
Westgate, Philip M; Burchett, Woodrow W
2017-03-15
The analysis of very small samples of Gaussian repeated measurements can be challenging. First, due to a very small number of independent subjects contributing outcomes over time, statistical power can be quite small. Second, nuisance covariance parameters must be appropriately accounted for in the analysis in order to maintain the nominal test size. However, available statistical strategies that ensure valid statistical inference may lack power, whereas more powerful methods may have the potential for inflated test sizes. Therefore, we explore an alternative approach to the analysis of very small samples of Gaussian repeated measurements, with the goal of maintaining valid inference while also improving statistical power relative to other valid methods. This approach uses generalized estimating equations with a bias-corrected empirical covariance matrix that accounts for all small-sample aspects of nuisance correlation parameter estimation in order to maintain valid inference. Furthermore, the approach utilizes correlation selection strategies with the goal of choosing the working structure that will result in the greatest power. In our study, we show that when accurate modeling of the nuisance correlation structure impacts the efficiency of regression parameter estimation, this method can improve power relative to existing methods that yield valid inference. Copyright © 2017 John Wiley & Sons, Ltd.
Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2012-01-01
This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.
Optimized iterative decoding method for TPC coded CPM
NASA Astrophysics Data System (ADS)
Ma, Yanmin; Lai, Penghui; Wang, Shilian; Xie, Shunqin; Zhang, Wei
2018-05-01
Turbo Product Code (TPC) coded Continuous Phase Modulation (CPM) systems (TPC-CPM) have been widely used in aeronautical telemetry and satellite communication. This paper mainly investigates the improvement and optimization of the TPC-CPM system. We first add an interleaver and deinterleaver to the TPC-CPM system, and then establish an iterative decoding scheme. However, the improved system has poor convergence. To overcome this issue, we use Extrinsic Information Transfer (EXIT) analysis to find the optimal factors for the system. Experiments show that our method is effective in improving the convergence performance.
Improving designer productivity
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting those challenges.
ASTM clustering for improving coal analysis by near-infrared spectroscopy.
Andrés, J M; Bona, M T
2006-11-15
Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, a whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference classification method, prior to the application of calibration methods to each coal set. The results obtained showed a considerable improvement in the determination error compared with the calibration for the whole sample set. For some groups, the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. To predict property values for a new coal sample, it is necessary to assign that sample to its respective group. Thus, the ability to discriminate and classify coal samples by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range was also studied by applying Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) techniques. Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. On the other hand, the application of Linear Discriminant Analysis improved the classification of the samples, but not enough to be satisfactory for every group considered.
Savage, Trevor Nicholas; McIntosh, Andrew Stuart
2017-03-01
It is important to understand factors contributing to and directly causing sports injuries to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis, biomechanical theoretical models, and epidemiological and coaching literature. An expert panel comprising subject matter experts provided feedback, and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel, and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable with other studies reporting inter-rater reliability of qualitative analysis of human movement (52-90% agreement and ICC between 0.6 and 0.9).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burian, Cosmin; Llobet, Eduard; Vilanova, Xavier
We have designed a challenging experimental sample set in the form of 20 solutions with a high degree of similarity in order to study whether the addition of chromatographic separation information improves the performance of regular MS-based electronic noses. In order to make an initial study of the approach, two different chromatographic methods were used. By processing the data of these experiments with 2- and 3-way algorithms, we have shown that the addition of chromatographic separation information improves the results compared to the 2-way analysis of mass spectra or total ion chromatograms treated separately. Our findings show that when the chromatographic peaks are resolved (longer measurement times), 2-way methods work better than 3-way methods, whereas in the case of a more challenging measurement (more coeluted chromatograms, much faster GC-MS measurements) 3-way methods work better.
Demodulation circuit for AC motor current spectral analysis
Hendrix, Donald E.; Smith, Stephen F.
1990-12-18
A motor current analysis method for the remote, noninvasive inspection of electric motor-operated systems. Synchronous amplitude demodulation and phase demodulation circuits are used singly and in combination along with a frequency analyzer to produce improved spectral analysis of load-induced frequencies present in the electric current flowing in a motor-driven system.
NASA Astrophysics Data System (ADS)
Lu, Lei; Yan, Jihong; Chen, Wanqun; An, Shi
2018-03-01
This paper proposes a novel spatial frequency analysis method for the investigation of potassium dihydrogen phosphate (KDP) crystal surfaces based on an improved bidimensional empirical mode decomposition (BEMD) method. To eliminate end effects of the BEMD method and improve the intrinsic mode functions (IMFs) for efficient identification of texture features, a denoising process was embedded in the sifting iteration of the BEMD method. By removing redundant information in the decomposed sub-components of the KDP crystal surface, the middle spatial frequencies of the cutting and feeding processes were identified. A comparative study with the power spectral density method, the two-dimensional wavelet transform (2D-WT), and the traditional BEMD method demonstrated that the method developed in this paper can efficiently extract texture features and reveal the gradient development of the KDP crystal surface. Furthermore, the proposed method is a self-adaptive, data-driven technique requiring no prior knowledge, which overcomes shortcomings of the 2D-WT model such as parameter selection. Additionally, the proposed method is a promising tool for online monitoring and optimal control of precision machining processes.
A phase match based frequency estimation method for sinusoidal signals
NASA Astrophysics Data System (ADS)
Shen, Yan-Lin; Tu, Ya-Qing; Chen, Lin-Jun; Shen, Ting-Ao
2015-04-01
Accurate frequency estimation affects the ranging precision of linear frequency modulated continuous wave (LFMCW) radars significantly. To improve the ranging precision of LFMCW radars, a phase match based frequency estimation method is proposed. To obtain the frequency estimate, the linear prediction property, autocorrelation, and cross correlation of sinusoidal signals are utilized. The analysis of computational complexity shows that the computational load of the proposed method is smaller than those of two-stage autocorrelation (TSA) and maximum likelihood. Simulations and field experiments are performed to validate the proposed method, and the results demonstrate that the proposed method has better frequency estimation precision than the Pisarenko harmonic decomposition, modified covariance, and TSA methods, contributing to an effective improvement in the precision of LFMCW radars.
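The linear prediction property mentioned in this abstract can be used on its own as a minimal frequency estimator. The sketch below illustrates that property (x[n] = 2cos(w)x[n-1] - x[n-2], solved in the least-squares sense); it is not the paper's phase-match method:

```python
import numpy as np

def lp_frequency(x, fs):
    """Estimate the frequency of a sinusoid from its linear-prediction property.

    For x[n] = A*sin(2*pi*f*n/fs + phi), x[n+1] + x[n-1] = 2*cos(w)*x[n],
    so cos(w) follows from a one-parameter least-squares fit."""
    x = np.asarray(x, dtype=float)
    y = x[2:] + x[:-2]
    c, *_ = np.linalg.lstsq(x[1:-1, None], y, rcond=None)
    w = np.arccos(np.clip(c[0] / 2.0, -1.0, 1.0))
    return w * fs / (2 * np.pi)
```

In noise this simple estimator is biased, which is the kind of weakness that motivates the correlation-based refinements the paper studies.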
NASA Astrophysics Data System (ADS)
Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho
2014-12-01
Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content on the samples. The same limitations were found in the thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in obtaining enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.
Hao, Chunyan; Nguyen, Bick; Zhiao, Xiaoming; Chen, Ernie; Yang, Paul
2010-01-01
Methods using SPE followed by HPLC/MS/MS analysis were developed and validated for the determination of 39 pesticides in different aquatic environmental matrixes. The target pesticides included 12 carbamates, 15 organophosphates, and 12 phenyl ureas, of which 16 are regulated in North America. Method detection limits were in the low ng/L range using the U.S. Environmental Protection Agency's protocol and multiple reaction monitoring (MRM) data acquisition, meeting the regulatory needs in the United States, Canada, and the European Union. Isotope-labeled compounds were used as injection internal standards, as well as method surrogates, to improve the data quality. QC/QA data (e.g., method recovery and within-run and between-run method precision) derived from multiyear monitoring activities were used to demonstrate method ruggedness. The same QC/QA data also showed that the method exerted no obvious matrix effect on the target analytes. Parameters that affect method performance, such as preservatives, pH values, sample storage time, and sample extract storage time, were also studied in detail. Accredited by the Canadian Association for Laboratory Accreditation and licensed by the Ontario government for drinking water analysis, these methods have been applied to the analysis of drinking water, ground water, and surface water samples collected in the province of Ontario, Canada, to ensure the pristine nature of Ontario's aquatic environment. Using the scheduled MRM (sMRM) data acquisition algorithm, it was demonstrated that sMRM improved the S/N of extracted ion chromatograms by at least two- to six-fold and therefore enhanced the short- and long-term instrument precision, offered high-throughput multiresidue analysis, and allowed the use of two MRM transitions for each compound to achieve higher confidence in compound identification.
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. Computational models are improved by an improved Brown's method that applies fractal dimension to forecast short time series received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using correlation analysis methods.
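Classical Brown's double exponential smoothing, which the technique above builds on, can be sketched as follows. The fixed smoothing constant here stands in for the paper's fractal-dimension-driven adaptation, so this is only the baseline method:

```python
def brown_forecast(x, alpha=0.3, horizon=1):
    """Brown's double (linear) exponential smoothing forecast.

    Two cascaded exponential smoothers give a level and trend estimate;
    the h-step forecast extrapolates the trend linearly."""
    s1 = s2 = float(x[0])
    for v in x:
        s1 = alpha * v + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend
```

On a series with a steady linear trend the level-plus-trend correction cancels the smoothing lag, so the forecast converges to the true next value.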
Land use/cover classification in the Brazilian Amazon using satellite images.
Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant'anna, Sidnei João Siqueira
2012-09-01
Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon over a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and the use of segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time for parameter optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
Land use/cover classification in the Brazilian Amazon using satellite images
Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant’Anna, Sidnei João Siqueira
2013-01-01
Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon over a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and the use of segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time for parameter optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data. PMID:24353353
Mihic, Marko M; Todorovic, Marija Lj; Obradovic, Vladimir Lj; Mitrovic, Zorica M
2016-01-01
Social services aimed at the elderly are facing great challenges caused by the progressive aging of the global population, but also by the constant pressure to spend funds in a rational manner. This paper focuses on analyzing investments into human resources aimed at enhancing home care for the elderly, since many countries have recorded progress in this area over the past years. The goal of this paper is to stress the significance of performing an economic analysis of the investment. This paper combines statistical analysis methods, such as correlation and regression analysis, with methods of economic analysis and the scenario method. The economic analysis of investing in human resources for the home care service in Serbia showed that both investment scenarios, adding home care hours or adding beneficiaries, are cost-efficient. However, the optimal solution, with the positive (and highest) value of the economic net present value criterion, is to invest in human resources to increase home care from 6 to 8 hours per week and increase the number of beneficiaries by 33%. This paper shows how the statistical and economic analysis results can be used to evaluate different scenarios and enable quality decision-making based on exact data, in order to improve the health and quality of life of the elderly and spend funds in a rational manner.
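A net-present-value comparison of investment scenarios, as used in this abstract, can be sketched minimally. The cash flows and discount rate below are invented for illustration and do not come from the study:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow stream; cashflows[0] is the initial outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Two hypothetical scenarios: invest in more care hours vs. more beneficiaries.
scenarios = {
    "more_hours": [-100.0, 60.0, 60.0],
    "more_beneficiaries": [-100.0, 40.0, 45.0],
}
# Pick the scenario with the highest NPV at a 5% discount rate.
best = max(scenarios, key=lambda s: npv(0.05, scenarios[s]))
```

A positive NPV marks a scenario as cost-efficient; ranking NPVs selects the optimal one, mirroring the paper's economic net present value criterion.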
Mixed-Methods Research in the Discipline of Nursing.
Beck, Cheryl Tatano; Harrison, Lisa
2016-01-01
In this review article, we examined the prevalence and characteristics of 294 mixed-methods studies in the discipline of nursing. Creswell and Plano Clark's typology was most frequently used along with concurrent timing. Bivariate statistics was most often the highest level of statistics reported in the results. As for qualitative data analysis, content analysis was most frequently used. The majority of nurse researchers did not specifically address the purpose, paradigm, typology, priority, timing, interaction, or integration of their mixed-methods studies. Strategies are suggested for improving the design, conduct, and reporting of mixed-methods studies in the discipline of nursing.
NASA Astrophysics Data System (ADS)
Brewick, Patrick T.; Smyth, Andrew W.
2016-12-01
The authors have previously shown that many traditional approaches to operational modal analysis (OMA) struggle to properly identify the modal damping ratios for bridges under traffic loading due to the interference caused by the driving frequencies of the traffic loads. This paper presents a novel methodology for modal parameter estimation in OMA that overcomes the problems presented by driving frequencies and significantly improves the damping estimates. This methodology is based on finding the power spectral density (PSD) of a given modal coordinate, and then dividing the modal PSD into separate regions, left- and right-side spectra. The modal coordinates were found using a blind source separation (BSS) algorithm and a curve-fitting technique was developed that uses optimization to find the modal parameters that best fit each side spectra of the PSD. Specifically, a pattern-search optimization method was combined with a clustering analysis algorithm and together they were employed in a series of stages in order to improve the estimates of the modal damping ratios. This method was used to estimate the damping ratios from a simulated bridge model subjected to moving traffic loads. The results of this method were compared to other established OMA methods, such as Frequency Domain Decomposition (FDD) and BSS methods, and they were found to be more accurate and more reliable, even for modes that had their PSDs distorted or altered by driving frequencies.
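For context, the simplest textbook alternative to the curve-fitting scheme described above is the half-power (-3 dB) bandwidth estimate of a modal damping ratio from a PSD peak. This sketch is a generic baseline, not the authors' method, and assumes a well-separated single mode:

```python
import numpy as np

def half_power_damping(freqs, psd):
    """Half-power bandwidth estimate of the damping ratio from a single-mode
    PSD peak: zeta ~ (f2 - f1) / (2 * fn), where f1 and f2 are the frequencies
    at which the PSD drops to half its peak value."""
    i = int(np.argmax(psd))
    half = psd[i] / 2.0
    left = np.where(psd[:i] < half)[0][-1]            # last bin below half before peak
    right = i + np.where(psd[i:] < half)[0][0]        # first bin below half after peak
    return (freqs[right] - freqs[left]) / (2 * freqs[i])
```

This estimator breaks down exactly in the situation the paper targets: when a traffic driving frequency distorts the peak, the apparent bandwidth, and hence the damping estimate, is biased.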
[Analysis of scatterer microstructure feature based on Chirp-Z transform cepstrum].
Guo, Jianzhong; Lin, Shuyu
2007-12-01
The characterization of tissue scatterers has long been a fundamental research field in medical ultrasound, in which signal processing methods are widely used. A new Chirp-Z transform cepstrum method for estimating the mean spacing of tissue scatterers from ultrasonic scattered signals has been developed. Using this method together with the conventional AR cepstrum method, we processed the backscattered signals of tissue-mimicking material and pig liver in vitro. The results illustrate that the Chirp-Z transform cepstrum method is effective for the signal analysis of ultrasonic scattering and the characterization of tissue scatterers, and that it can improve the resolution of mean spacing estimation.
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
Exploratory factor analysis in Rehabilitation Psychology: a content analysis.
Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N
2014-11-01
Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
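Parallel analysis, one of the factor retention methods counted in this review, can be sketched as follows (a generic Horn-style illustration, not the article's protocol):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: retain factors whose observed correlation-matrix
    eigenvalues exceed the mean eigenvalues of same-shaped random normal data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        sim = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
        rand += np.sort(np.linalg.eigvalsh(sim))[::-1]
    rand /= n_iter
    return int(np.sum(obs > rand))
```

Unlike the Kaiser rule, this compares each eigenvalue against what pure noise of the same sample size would produce, which is why the methodological literature favors it.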
Improved dynamic analysis method using load-dependent Ritz vectors
NASA Technical Reports Server (NTRS)
Escobedo-Torres, J.; Ricles, J. M.
1993-01-01
The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by having a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load-dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load-dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
ERIC Educational Resources Information Center
Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan
2011-01-01
This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…
Conversation analysis as a method for investigating interaction in care home environments.
Chatwin, John
2014-11-01
This article gives an outline of how the socio-linguistic approach of conversation analysis can be applied to the analysis of carer-patient interaction in care homes. A single case study from a routine encounter in a residential care home is presented. This is used to show how the conversation analysis method works, the kinds of interactional and communication features it can expose, and what specific contribution this kind of micro-interactional approach may make to improving quality of care in these environments. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Factoring local sequence composition in motif significance analysis.
Ng, Patrick; Keich, Uri
2008-01-01
We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.
Slope Stability Analysis of Waste Dump in Sandstone Open Pit Osielec
NASA Astrophysics Data System (ADS)
Adamczyk, Justyna; Cała, Marek; Flisiak, Jerzy; Kolano, Malwina; Kowalski, Michał
2013-03-01
This paper presents the slope stability analysis for the current as well as the projected (final) geometry of the waste dump at the Sandstone Open Pit "Osielec". Six sections were selected for the stability analysis. Then, the final geometry of the waste dump was designed and the stability analysis was conducted. On the basis of the analysis results, opportunities to improve the stability of the object were identified. The next issue addressed in the paper was determining the proportion of the mixture of mining and processing wastes for which the waste dump remains stable. Stability calculations were carried out using the Janbu method, which belongs to the limit equilibrium methods.
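Janbu's method of slices is too involved for a short sketch, but its simplest limit-equilibrium relative, the infinite-slope model, shows how a factor of safety is formed as resisting strength over driving shear. The parameter values in the usage below are invented for illustration:

```python
import math

def infinite_slope_fos(c, phi_deg, gamma, z, beta_deg):
    """Factor of safety for a dry infinite-slope failure surface.

    c        cohesion (kPa), phi_deg friction angle (deg),
    gamma    unit weight (kN/m^3), z slip depth (m), beta_deg slope angle (deg).
    FoS = (c + gamma*z*cos^2(beta)*tan(phi)) / (gamma*z*sin(beta)*cos(beta))."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    driving = gamma * z * math.sin(beta) * math.cos(beta)          # shear stress on the plane
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)  # Mohr-Coulomb strength
    return resisting / driving
```

For a cohesionless dry slope this reduces to tan(phi)/tan(beta), a handy sanity check; methods of slices like Janbu's generalize the same strength-over-demand ratio to curved slip surfaces.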
High-resolution DNA melting analysis in plant research
USDA-ARS?s Scientific Manuscript database
Genetic and genomic studies provide valuable insight into the inheritance, structure, organization, and function of genes. The knowledge gained from the analysis of plant genes is beneficial to all aspects of plant research, including crop improvement. New methods and tools are continually developed...
Combining Correlation Matrices: Simulation Analysis of Improved Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2007-01-01
The originally proposed multivariate meta-analysis approach for correlation matrices--analyze Pearson correlations, with each study's observed correlations replacing their population counterparts in its conditional-covariance matrix--performs poorly. Two refinements are considered: Analyze Fisher Z-transformed correlations, and substitute better…
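The Fisher-z refinement mentioned above can be sketched for the simple univariate case of pooling one correlation across studies (fixed-effects weighting; an illustration, not the article's full multivariate matrix method):

```python
import math

def pool_correlations(rs, ns):
    """Pool Pearson correlations on the Fisher-z scale, z = atanh(r).

    Each study is weighted by n - 3, the inverse of z's approximate sampling
    variance; the pooled z is transformed back with tanh."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z = sum(w * zi for w, zi in zip(ws, zs)) / sum(ws)
    return math.tanh(z)
```

Working on the z scale stabilizes the variance and removes the boundedness of r, which is the motivation behind the refinement the abstract describes.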
Revealing gene regulation and association through biological networks
USDA-ARS's Scientific Manuscript database
This review had first summarized traditional methods used by plant breeders for genetic improvement, such as QTL analysis and transcriptomic analysis. With accumulating data, we can draw a network that comprises all possible links between members of a community, including protein–protein interaction...
Spencer-Smith, Megan; Klingberg, Torkel
2015-01-01
Many common disorders across the lifespan feature impaired working memory (WM). Reported benefits of a WM training program include improving inattention in daily life, but this has not been evaluated in a meta-analysis. This study aimed to evaluate whether one WM training method has benefits for inattention in daily life by conducting a systematic review and meta-analysis. We searched Medline and PsycINFO, relevant journals and contacted authors for studies with an intervention and control group reporting post-training estimates of inattention in daily life. To reduce the influence of different WM training methods on the findings, the review was restricted to trials evaluating the Cogmed method. A meta-analysis calculated the pooled standardised difference in means (SMD) between intervention and control groups. A total of 622 studies were identified and 12 studies with 13 group comparisons met inclusion criteria. The meta-analysis showed a significant training effect on inattention in daily life, SMD=-0.47, 95% CI -0.65, -0.29, p<.00001. Subgroup analyses showed this significant effect was observed in groups of children and adults as well as users with and without ADHD, and in studies using control groups that were active and non-adaptive, wait-list and passive as well as studies using specific or general measures. Seven of the studies reported follow-up assessment and a meta-analysis showed persisting training benefits for inattention in daily life, SMD=-0.33, 95% CI -0.57, -0.09, p=.006. Additional meta-analyses confirmed improvements after training on visuospatial WM, SMD=0.66, 95% CI 0.43, 0.89, p<.00001, and verbal WM tasks, SMD=0.40, 95% CI 0.18, 0.62, p=.0004. Benefits of a WM training program generalise to improvements in everyday functioning. Initial evidence shows that the Cogmed method has significant benefits for inattention in daily life with a clinically relevant effect size.
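Pooled SMD figures like those above come from inverse-variance weighting of per-study effects. A minimal sketch of that pooling step, using hypothetical study effects rather than the review's actual data:

```python
import math

def pooled_smd(smds, variances):
    """Fixed-effect inverse-variance pooling of standardised mean differences.
    Each study is weighted by the reciprocal of its effect variance."""
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    pooled = sum(w * d for w, d in zip(weights, smds)) / total_w
    se = math.sqrt(1.0 / total_w)                    # SE of the pooled estimate
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)    # 95% confidence interval
    return pooled, ci

# Hypothetical per-study effects (not the review's actual data)
smds = [-0.52, -0.41, -0.48]
variances = [0.020, 0.015, 0.025]
est, (lo, hi) = pooled_smd(smds, variances)
print(round(est, 3), round(lo, 3), round(hi, 3))
```

A random-effects model (e.g. DerSimonian-Laird) would additionally inflate each variance by a between-study component before weighting.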
Computational Fluid Dynamics Simulation Study of Active Power Control in Wind Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleming, Paul; Aho, Jake; Gebraad, Pieter
2016-08-01
This paper presents an analysis performed on a wind plant's ability to provide active power control services using a high-fidelity computational fluid dynamics-based wind plant simulator. This approach allows examination of the impact of wind turbine wake interactions within a wind plant on performance of the wind plant controller. The paper investigates several control methods for improving performance in waked conditions. One method uses wind plant wake controls, an active field of research in which wind turbine control systems are coordinated to account for their wakes, to improve the overall performance. Results demonstrate the challenge of providing active power control in waked conditions but also the potential methods for improving this performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertz, P.R.
Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.
NASA Astrophysics Data System (ADS)
Luo, D.; Cai, F.
2017-12-01
Small-scale, high-resolution marine sparker multi-channel seismic surveys using large-energy sparkers are characterized by a high dominant source frequency, wide bandwidth, and high resolution. This high-resolution, high-precision technology was designed to improve the imaging quality of shallow sediments. In this study, a 20 kJ sparker and a 24-channel streamer cable with a 6.25 m group interval were used as the seismic source and receiver system, respectively. The key factors for seismic imaging of gas hydrate are enhancement of the S/N ratio, amplitude compensation and detailed velocity analysis. However, the data in this study have the following characteristics: 1. Small maximum offsets, which are adverse to velocity analysis and multiple attenuation. 2. Lack of low-frequency information; frequencies below 100 Hz are absent. 3. Low S/N ratio, owing to the low fold (only 12). These characteristics make it difficult to reach the targets of seismic imaging. In this study, targeted processing methods are used to improve the seismic imaging quality of gas hydrate. First, several noise-suppression technologies are applied in combination to the pre-stack seismic data to suppress seismic noise and improve the S/N ratio, including a spectrum-sharing noise elimination method, median filtering and an exogenous interference suppression method. Second, a combination of three technologies, SRME, τ-p deconvolution and high-precision Radon transformation, is used to remove multiples. Third, an accurate velocity field is used in amplitude energy compensation to highlight the Bottom Simulating Reflector (BSR, the indicator of gas hydrates) and gas migration pathways (such as gas chimneys and hot spots). Fourth, fine velocity analysis is used to improve the accuracy of the velocity field. Fifth, pre-stack deconvolution is used to compensate for low-frequency energy and suppress the ghost, highlighting the formation reflection characteristics. The results show that small-scale, high-resolution marine sparker multi-channel seismic surveys are much more effective in improving the resolution and quality of gas hydrate imaging than conventional seismic acquisition technology.
Integrating SAS and GIS software to improve habitat-use estimates from radiotelemetry data
Kenow, K.P.; Wright, R.G.; Samuel, M.D.; Rasmussen, P.W.
2001-01-01
Radiotelemetry has been used commonly to remotely determine habitat use by a variety of wildlife species. However, habitat misclassification can occur because the true location of a radiomarked animal can only be estimated. Analytical methods that provide improved estimates of habitat use from radiotelemetry location data using a subsampling approach have been proposed previously. We developed software, based on these methods, to conduct improved habitat-use analyses. A Statistical Analysis System (SAS)-executable file generates a random subsample of points from the error distribution of an estimated animal location and formats the output into ARC/INFO-compatible coordinate and attribute files. An associated ARC/INFO Arc Macro Language (AML) creates a coverage of the random points, determines the habitat type at each random point from an existing habitat coverage, sums the number of subsample points by habitat type for each location, and outputs the results in ASCII format. The proportion and precision of habitat types used is calculated from the subsample of points generated for each radiotelemetry location. We illustrate the method and software by analysis of radiotelemetry data for a female wild turkey (Meleagris gallopavo).
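The subsampling idea described above can be sketched directly: draw random points from each fix's location-error distribution, classify each point against a habitat layer, and tally the proportions. The landscape and classifier below are toy stand-ins for the ARC/INFO habitat coverage:

```python
import numpy as np

def habitat_use(locations, error_sd, classify, n_sub=500, seed=0):
    """Estimate habitat-use proportions from telemetry fixes by subsampling
    each fix's bivariate-normal error distribution. `classify` maps an
    (x, y) point to a habitat label; it stands in for a GIS habitat layer."""
    rng = np.random.default_rng(seed)
    counts = {}
    total = 0
    for (x, y) in locations:
        # Random points drawn from the location-error distribution of this fix
        pts = rng.normal(loc=(x, y), scale=error_sd, size=(n_sub, 2))
        for px, py in pts:
            h = classify(px, py)
            counts[h] = counts.get(h, 0) + 1
            total += 1
    return {h: c / total for h, c in counts.items()}

# Toy landscape: habitat is "forest" left of x=0, "field" to the right
classify = lambda x, y: "forest" if x < 0 else "field"
props = habitat_use([(-1.0, 0.0), (1.0, 0.0)], error_sd=0.5, classify=classify)
print(props)
```

Because each fix contributes a whole cloud of points rather than a single (possibly misclassified) point, the resulting proportions carry the location uncertainty through to the habitat-use estimate.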
NASA Astrophysics Data System (ADS)
Nahar, Jannatun; Johnson, Fiona; Sharma, Ashish
2018-02-01
Conventional bias correction is usually applied on a grid-by-grid basis, meaning that the resulting corrections cannot address biases in the spatial distribution of climate variables. To solve this problem, a two-step bias correction method is proposed here to correct time series at multiple locations conjointly. The first step transforms the data to a set of statistically independent univariate time series, using a technique known as independent component analysis (ICA). The mutually independent signals can then be bias corrected as univariate time series and back-transformed to improve the representation of spatial dependence in the data. The spatially corrected data are then bias corrected at the grid scale in the second step. The method has been applied to two CMIP5 General Circulation Model simulations for six different climate regions of Australia for two climate variables—temperature and precipitation. The results demonstrate that the ICA-based technique leads to considerable improvements in temperature simulations with more modest improvements in precipitation. Overall, the method results in current climate simulations that have greater equivalency in space and time with observational data.
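The univariate correction applied to each signal in such a two-step scheme is commonly an empirical quantile mapping. A minimal numpy sketch of that step alone (the ICA transform and back-transform are omitted, and the data are synthetic):

```python
import numpy as np

def quantile_map(model, obs, future=None):
    """Empirical quantile-mapping bias correction: map each model value to
    the observed value at the same quantile. `future` defaults to the
    calibration series itself. This is only the univariate step; the ICA
    decorrelation that precedes it in the two-step method is omitted."""
    if future is None:
        future = model
    # Rank of each value to correct within the model climatology (0..1)
    ranks = np.searchsorted(np.sort(model), future, side="right") / len(model)
    ranks = np.clip(ranks, 0.0, 1.0)
    # Read off the observed distribution at those quantiles
    return np.quantile(obs, ranks)

rng = np.random.default_rng(42)
obs = rng.normal(20.0, 2.0, 5000)      # synthetic "observed" temperatures
model = rng.normal(22.5, 3.0, 5000)    # warm-biased, over-dispersed model
corrected = quantile_map(model, obs)
print(round(corrected.mean(), 1))      # close to the observed mean of 20
```

In the two-step method this correction would be applied first to each statistically independent ICA signal (fixing spatial dependence after back-transformation) and then again at each grid cell.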
Computer image analysis in caryopses quality evaluation as exemplified by malting barley
NASA Astrophysics Data System (ADS)
Koszela, K.; Raba, B.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Przybył, J.
2015-07-01
One of the purposes of employing modern technologies in the agricultural and food industry is to increase the efficiency and automation of production processes, which helps improve the productive effectiveness of business enterprises, thus making them more competitive. Nowadays, a challenge presents itself for this branch of the economy: to produce agricultural and food products characterized by the best quality parameters while maintaining optimum production and distribution costs of the processed biological material. Thus, several scientific centers seek to devise new and improved methods and technologies in this field which will make it possible to meet these expectations. A new solution, under constant development, is to employ so-called machine vision to replace human work in both quality and quantity evaluation processes. An indisputable advantage of employing this method is that it keeps the evaluation unbiased while improving its speed and, importantly, eliminating the fatigue factor of the expert. This paper elaborates on the topic of quality evaluation by marking the contamination in malting barley grains using computer image analysis and selected methods of artificial intelligence [4-5].
An Alternate Method to Springback Compensation for Sheet Metal Forming
Omar, Badrul; Jusoff, Kamaruzaman
2014-01-01
The aim of this work is to improve the accuracy of cold stamping products by accommodating springback. A numerical approach is used to improve the accuracy of springback analysis and the die compensation process by combining the displacement adjustment (DA) method and the spring forward (SF) algorithm. This alternate hybrid method (HM) first employs the DA method, followed by the SF method, instead of either method individually. The springback shape and the target part are used to optimize the die surfaces compensating for springback. The hybrid method (HM) algorithm has been coded in Fortran and tested on two- and three-dimensional models. By implementing the HM, the springback error can be decreased and the dimensional deviation falls within the predefined tolerance range. PMID:25165738
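The displacement adjustment (DA) step of such a scheme has a simple structure: iteratively move the die surface opposite to the measured springback error. A toy sketch with a linear stand-in for the forming simulation (the real method drives a finite-element solver, and the SF step is omitted):

```python
import numpy as np

def compensate_die(die, target, simulate, alpha=1.0, n_iter=10, tol=1e-4):
    """Displacement-adjustment (DA) die compensation sketch: shift each die
    node opposite to the springback deviation of the formed part. `simulate`
    stands in for a forming + springback simulation; `alpha` is the
    adjustment factor."""
    for _ in range(n_iter):
        formed = simulate(die)
        error = formed - target            # springback deviation
        if np.max(np.abs(error)) < tol:
            break
        die = die - alpha * error          # DA update: move die by -error
    return die

# Toy springback model: the formed shape overshoots the die by 10%
simulate = lambda d: 1.1 * d
target = np.array([1.0, 2.0, 3.0])
die = compensate_die(target.copy(), target, simulate, alpha=0.5)
print(np.allclose(simulate(die), target, atol=1e-3))
```

The hybrid method in the abstract would follow this DA pass with a spring-forward correction, which computes the compensated die from the internal stress state rather than from the geometric deviation alone.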
[Review of research design and statistical methods in Chinese Journal of Cardiology].
Zhang, Li-jun; Yu, Jin-ming
2009-07-01
To evaluate the research design and the use of statistical methods in Chinese Journal of Cardiology, we reviewed the research design and statistical methods in all of the original papers published in Chinese Journal of Cardiology from December 2007 to November 2008. The most frequently used research designs were cross-sectional design (34%), prospective design (21%) and experimental design (25%). Of all the articles, 49 (25%) used wrong statistical methods, 29 (15%) lacked some needed statistical analysis, and 23 (12%) had inconsistencies in the description of methods. There were significant differences between different statistical methods (P < 0.001). The rates of correct use of multifactor analysis were low, and repeated-measures data were not analysed with repeated-measures methods. Many problems exist in Chinese Journal of Cardiology. Better research design and correct use of statistical methods are still needed. More strict review by statisticians and epidemiologists is also required to improve the quality of the literature.
Development of Predictive Energy Management Strategies for Hybrid Electric Vehicles
NASA Astrophysics Data System (ADS)
Baker, David
Studies have shown that obtaining and utilizing information about the future state of vehicles can improve vehicle fuel economy (FE). However, there has been a lack of research into the impact of real-world prediction error on FE improvements, and whether near-term technologies can be utilized to improve FE. This study seeks to research the effect of prediction error on FE. First, a speed prediction method is developed, and trained with real-world driving data gathered only from the subject vehicle (a local data collection method). This speed prediction method informs a predictive powertrain controller to determine the optimal engine operation for various prediction durations. The optimal engine operation is input into a high-fidelity model of the FE of a Toyota Prius. A tradeoff analysis between prediction duration and prediction fidelity was completed to determine what duration of prediction resulted in the largest FE improvement. Results demonstrate that 60-90 second predictions resulted in the highest FE improvement over the baseline, achieving up to a 4.8% FE increase. A second speed prediction method utilizing simulated vehicle-to-vehicle (V2V) communication was developed to understand if incorporating near-term technologies could be utilized to further improve prediction fidelity. This prediction method produced lower variation in speed prediction error, and was able to realize a larger FE improvement over the local prediction method for longer prediction durations, achieving up to 6% FE improvement. This study concludes that speed prediction and prediction-informed optimal vehicle energy management can produce FE improvements with real-world prediction error and drive cycle variability, as up to 85% of the FE benefit of perfect speed prediction was achieved with the proposed prediction methods.
Innovative Use of Quality Management Methods for Product Improvement
NASA Astrophysics Data System (ADS)
Midor, Katarzyna; Žarnovský, Jozef
2016-12-01
Organisations constantly look for new, innovative solutions and methods which could be used to improve their efficiency and increase the quality of their products. Identifying the causes for returns is an important issue for modern companies, as returns are the cause for the increase in production costs and, most importantly, the loss of credibility in the eyes of the client. Therefore, for the company to be able to sustain or strengthen its position on the market, it has to follow the rules of quality management. Especially important is the rule of constant improvement. This rule is primarily connected with preventing errors and defects from occurring at all the stages of the production process. To achieve that, one must, among other things, use quality management tools. The article presents an analysis of causes for returns of a vibrating screen produced by a company which manufactures machinery and equipment for the extractive industry, using quality management tools such as the Ishikawa diagram and Pareto analysis. The analysis allowed for the identification of the causes of client returns which could not be previously identified, and proposing solutions for them.
The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...
Rapid economic analysis of northern hardwood stand improvement options
William B. Leak
1980-01-01
Data and methodology are provided for projecting basal area, diameter, volumes, and values by product for northern hardwood stands, and for determining the rate of return on stand improvement investments. The method is rapid, requires a minimum amount of information, and should prove useful for on-the-ground economic analyses.
Educational Information Quantization for Improving Content Quality in Learning Management Systems
ERIC Educational Resources Information Center
Rybanov, Alexander Aleksandrovich
2014-01-01
The article offers the educational information quantization method for improving content quality in Learning Management Systems. The paper considers questions concerning analysis of quality of quantized presentation of educational information, based on quantitative text parameters: average frequencies of parts of speech, used in the text; formal…
Kamoi, Shun; Pretty, Christopher; Balmer, Joel; Davidson, Shaun; Pironet, Antoine; Desaive, Thomas; Shaw, Geoffrey M; Chase, J Geoffrey
2017-04-24
Pressure contour analysis is commonly used to estimate cardiac performance for patients suffering from cardiovascular dysfunction in the intensive care unit. However, the existing techniques for continuous estimation of stroke volume (SV) from pressure measurement can be unreliable during hemodynamic instability, which is inevitable for patients requiring significant treatment. For this reason, pressure contour methods must be improved to capture changes in vascular properties and thus provide accurate conversion from pressure to flow. This paper presents a novel pressure contour method utilizing pulse wave velocity (PWV) measurement to capture vascular properties. A three-element Windkessel model combined with the reservoir-wave concept is used to decompose the pressure contour into components related to storage and flow. The model parameters are identified beat-to-beat from the water-hammer equation using measured PWV, the wave component of the pressure, and an estimate of subject-specific aortic dimension. SV is then calculated by converting pressure to flow using the identified model parameters. The accuracy of this novel method is investigated using data from porcine experiments (N = 4 Pietrain pigs, 20-24.5 kg), where hemodynamic properties were significantly altered using dobutamine, fluid administration, and mechanical ventilation. In the experiment, left ventricular volume was measured using an admittance catheter, and aortic pressure waveforms were measured at two locations, the aortic arch and abdominal aorta. Bland-Altman analysis comparing gold-standard SV measured by the admittance catheter and estimated SV from the novel method showed average limits of agreement of ±26% across significant hemodynamic alterations. This result shows the method is capable of estimating clinically acceptable absolute SV values according to the criteria of Critchley and Critchley.
The novel pressure contour method presented can accurately estimate and track SV even when hemodynamic properties are significantly altered. Integrating PWV measurements into pressure contour analysis improves identification of beat-to-beat changes in Windkessel model parameters, and thus, provides accurate estimate of blood flow from measured pressure contour. The method has great potential for overcoming weaknesses associated with current pressure contour methods for estimating SV.
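The water-hammer relation underlying the method, dP = ρ · PWV · v, converts the wave component of pressure into flow velocity; integrating the resulting flow over ejection gives SV. A rough numerical sketch with entirely hypothetical beat parameters (not the paper's full Windkessel/reservoir-wave identification):

```python
import numpy as np

RHO = 1050.0  # blood density, kg/m^3

def stroke_volume(p_wave, dt, pwv, area):
    """Estimate stroke volume from the wave component of aortic pressure via
    the water-hammer equation: dP = rho * PWV * v, so q = dP * A / (rho * PWV).
    p_wave in Pa, dt in s, pwv in m/s, area in m^2. A sketch only; the paper
    identifies its model beat-to-beat rather than using fixed parameters."""
    q = p_wave * area / (RHO * pwv)       # flow in m^3/s
    q = np.clip(q, 0.0, None)             # keep forward (ejection) flow only
    return float(q.sum() * dt) * 1e6      # integrate over the beat -> mL

# Hypothetical beat: half-sine wave pressure, 20 mmHg peak over 0.3 s ejection
dt = 0.001
t = np.arange(0.0, 0.3, dt)
p_wave = 20 * 133.322 * np.sin(np.pi * t / 0.3)       # mmHg -> Pa
sv = stroke_volume(p_wave, dt, pwv=5.0, area=3.0e-4)  # aortic area ~3 cm^2
print(round(sv, 1))
```

With these made-up numbers the estimate lands near a physiological SV of roughly 30 mL; note that a stiffer aorta (higher PWV) yields a smaller flow for the same pressure wave, which is exactly the vascular-property dependence the PWV measurement captures.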
Hiroyasu, Tomoyuki; Hayashinuma, Katsutoshi; Ichikawa, Hiroshi; Yagi, Nobuaki
2015-08-01
A preprocessing method for endoscopy image analysis using texture analysis is proposed. In a previous study, we proposed a feature value that combines a co-occurrence matrix and a run-length matrix to analyze the extent of early gastric cancer from images taken with narrow-band imaging endoscopy. However, the obtained feature value does not identify lesion zones correctly due to the influence of noise and halation. Therefore, we propose a new preprocessing method with a non-local means filter for de-noising and contrast limited adaptive histogram equalization. We have confirmed that the pattern of gastric mucosa in images can be improved by the proposed method. Furthermore, the lesion zone is shown more correctly by the obtained color map.
Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
An investigation of improved airbag performance by vent control and gas injection
NASA Astrophysics Data System (ADS)
Lee, Calvin; Rosato, Nick; Lai, Francis
Airbags are currently being investigated as an impact energy absorber for U.S. Army airdrop. Simple airbags with constant vent areas have been found to be unsatisfactory, yielding high G forces. In this paper, a method of controlling the vent area and a method of injecting gas into the airbag during its compression stroke to improve airbag performance are presented. Theoretical analysis of complex airbags using these two methods shows that they provide lower G forces than simple airbags. Vertical drop tests of a vent-control airbag confirm this result. Gas-injection airbags are currently being tested.
The scalar and electromagnetic form factors of the nucleon in dispersively improved Chiral EFT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alarcon, Jose Manuel
We present a method for calculating the nucleon form factors of G-parity-even operators. This method combines chiral effective field theory (χEFT) and dispersion theory. Through unitarity we factorize the imaginary part of the form factors into a perturbative part, calculable with χEFT, and a non-perturbative part, obtained through other methods. We consider the scalar and electromagnetic (EM) form factors of the nucleon. The results show an important improvement compared to standard chiral calculations, and can be used in analysis of the low-energy properties of the nucleon.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
An improved model for whole genome phylogenetic analysis by Fourier transform.
Yin, Changchuan; Yau, Stephen S-T
2015-10-07
DNA sequence similarity comparison is one of the major steps in computational phylogenetic studies. The sequence comparison of closely related DNA sequences and genomes is usually performed by multiple sequence alignment (MSA). While the MSA method is accurate for some types of sequences, it may produce incorrect results when DNA sequences have undergone rearrangements, as in many bacterial and viral genomes. It is also limited by its computational complexity for comparing large volumes of data. Previously, we proposed an alignment-free method that exploits the full information content of DNA sequences by Discrete Fourier Transform (DFT), but still with some limitations. Here, we present a significantly improved method for the similarity comparison of DNA sequences by DFT. In this method, we map DNA sequences into 2-dimensional (2D) numerical sequences and then apply DFT to transform the 2D numerical sequences into the frequency domain. In the 2D mapping, the nucleotide composition of a DNA sequence is a determinant factor, and the 2D mapping reduces the nucleotide composition bias in the distance measure, thereby improving the similarity measure of DNA sequences. To compare the DFT power spectra of DNA sequences with different lengths, we propose an improved even scaling algorithm to extend shorter DFT power spectra to the longest length of the underlying sequences. After the DFT power spectra are evenly scaled, the spectra are in the same dimensionality of the Fourier frequency space, and the Euclidean distances of the full Fourier power spectra of the DNA sequences are used as the dissimilarity metrics. The improved DFT method, with computational performance increased by the 2D numerical representation, is applicable to DNA sequences of any length range. We assess the accuracy of the improved DFT similarity measure in hierarchical clustering of different DNA sequences including simulated and real datasets.
The method yields accurate and reliable phylogenetic trees and demonstrates that the improved DFT dissimilarity measure is an efficient and effective similarity measure of DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure is successfully applied on phylogenetic analysis for individual genes and large whole bacterial genomes. Copyright © 2015 Elsevier Ltd. All rights reserved.
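The pipeline described, numerical mapping, DFT power spectrum, even scaling to a common length, then Euclidean distance, can be sketched as follows. The 4-channel binary mapping used here is a common simplification and stands in for the paper's own 2D mapping:

```python
import numpy as np

def dft_power_spectrum(seq):
    """Map a DNA sequence to numeric signals, DFT them, and return the summed
    power spectrum. Simplified sketch: one binary indicator signal per
    nucleotide (the paper's 2D mapping differs)."""
    spectra = []
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        spectra.append(np.abs(np.fft.fft(u)) ** 2)
    return np.sum(spectra, axis=0)

def even_scale(ps, m):
    """Stretch a power spectrum of length n (n <= m) to length m by mapping
    index k to floor(k * n / m), in the spirit of the paper's even scaling."""
    n = len(ps)
    idx = (np.arange(m) * n) // m
    return ps[idx]

def dft_distance(s1, s2):
    """Euclidean distance between evenly scaled DFT power spectra."""
    p1, p2 = dft_power_spectrum(s1), dft_power_spectrum(s2)
    m = max(len(p1), len(p2))
    return float(np.linalg.norm(even_scale(p1, m) - even_scale(p2, m)))

a = "ACGTACGTACGTACGT"
b = "ACGTACGTACGTACGA"   # one substitution away from a
c = "TTTTTTTTGGGGGGGG"   # very different composition and structure
print(dft_distance(a, b) < dft_distance(a, c))
```

The pairwise distances produced this way can be fed directly to any hierarchical clustering routine to build the phylogenetic tree.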
Topic modeling for cluster analysis of large biological and medical datasets
Zhao, Weizhong; Zou, Wen; Chen, James J
2014-01-01
Background The big data moniker is nowhere better deserved than to describe the ever-increasing prodigiousness and complexity of biological and medical datasets. New methods are needed to generate and test hypotheses, foster biological interpretation, and build validated predictors. Although multivariate techniques such as cluster analysis may allow researchers to identify groups, or clusters, of related variables, the accuracies and effectiveness of traditional clustering methods diminish for large and hyper dimensional datasets. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. Its ability to reduce high dimensionality to a small number of latent variables makes it suitable as a means for clustering or overcoming clustering difficulties in large biological and medical datasets. Results In this study, three topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, are proposed and tested on the cluster analysis of three large datasets: Salmonella pulsed-field gel electrophoresis (PFGE) dataset, lung cancer dataset, and breast cancer dataset, which represent various types of large biological or medical datasets. All three various methods are shown to improve the efficacy/effectiveness of clustering results on the three datasets in comparison to traditional methods. A preferable cluster analysis method emerged for each of the three datasets on the basis of replicating known biological truths. Conclusion Topic modeling could be advantageously applied to the large datasets of biological or medical research. The three proposed topic model-derived clustering methods, highest probable topic assignment, feature selection and feature extraction, yield clustering improvements for the three different data types. 
Clusters more efficaciously represent truthful groupings and subgroupings in the data than traditional methods, suggesting that topic model-based methods could provide an analytic advancement in the analysis of large biological or medical datasets. PMID:25350106
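Of the three proposed methods, highest probable topic assignment is the simplest: each document is assigned the topic with the largest fitted probability. A sketch over a hypothetical document-topic matrix (in practice this matrix would come from a fitted topic model such as LDA):

```python
import numpy as np

def topic_clusters(doc_topic):
    """Highest-probable-topic assignment: cluster each document (row of the
    fitted document-topic probability matrix) by its argmax topic."""
    doc_topic = np.asarray(doc_topic)
    assert np.allclose(doc_topic.sum(axis=1), 1.0), "rows must be distributions"
    return doc_topic.argmax(axis=1)

# Hypothetical document-topic matrix: 4 documents over 3 latent topics
theta = [[0.7, 0.2, 0.1],
         [0.1, 0.8, 0.1],
         [0.6, 0.3, 0.1],
         [0.2, 0.2, 0.6]]
print(topic_clusters(theta))   # -> [0 1 0 2]
```

The feature-selection and feature-extraction variants instead treat the topic distributions as a reduced feature space and hand them to a conventional clustering algorithm.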
Song, Ming-Fen; Li, Yun-Shan; Ootsuyama, Yuko; Kasai, Hiroshi; Kawai, Kazuaki; Ohta, Masanori; Eguchi, Yasumasa; Yamato, Hiroshi; Matsumoto, Yuki; Yoshida, Rie; Ogawa, Yasutaka
2009-07-01
Urinary 8-OH-dG is commonly analyzed as a marker of oxidative stress. For its analysis, ELISA and HPLC methods are generally used, although discrepancies in the data obtained by these methods have often been discussed. To clarify this problem, we fractionated human urine by reverse-phase HPLC and assayed each fraction by the ELISA method. In addition to the 8-OH-dG fraction, a positive reaction was observed in the first eluted fraction. The components in this fraction were examined by the ELISA. Urea was found to be the responsible component in this fraction. Urea is present in high concentrations in the urine of mice, rats, and humans, and its level is influenced by many factors. Therefore, certain improvements, such as a correction based on urea content or urease treatment, are required for the accurate analysis of urinary 8-OH-dG by the ELISA method. In addition, performance of the ELISA at 4 degrees C reduced the recognition of urea considerably and improved the 8-OH-dG analysis.
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Binienda, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and a lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material were examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates are examined using ballistic impact tests on 610- by 610- by 3.2-mm (24- by 24- by 0.125-in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements that are needed in test and analysis methods for better predictive capability are examined.
Li, Ziyi; Safo, Sandra E; Long, Qi
2017-07-11
Sparse principal component analysis (PCA) is a popular tool for dimensionality reduction, pattern recognition, and visualization of high dimensional data. It has been recognized that complex biological mechanisms occur through concerted relationships of multiple genes working in networks that are often represented by graphs. Recent work has shown that incorporating such biological information improves feature selection and prediction performance in regression analysis, but there has been limited work on extending this approach to PCA. In this article, we propose two new sparse PCA methods called Fused and Grouped sparse PCA that enable incorporation of prior biological information in variable selection. Our simulation studies suggest that, compared to existing sparse PCA methods, the proposed methods achieve higher sensitivity and specificity when the graph structure is correctly specified, and are fairly robust to misspecified graph structures. Application to a glioblastoma gene expression dataset identified pathways that are suggested in the literature to be related to glioblastoma. The proposed Fused and Grouped sparse PCA methods can effectively incorporate prior biological information in variable selection, leading to improved feature selection and more interpretable principal component loadings, and potentially providing insights on the molecular underpinnings of complex diseases.
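The basic sparse-PCA idea, sparse loadings for the leading component, can be sketched with a thresholded power iteration. This is a generic sketch, not the paper's Fused or Grouped penalties (which additionally encode a gene-network graph in the regularizer); the toy data and the penalty level lam are assumptions for illustration:

```python
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_pc1(X, lam=0.5, n_iter=200):
    """Leading sparse principal component via thresholded power iteration.

    Each power step on the sample covariance is followed by soft
    thresholding, which zeroes small loadings. X is assumed to be
    column-centered, with at least one informative feature.
    """
    S = X.T @ X / len(X)                       # sample covariance
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(n_iter):
        v = soft_threshold(S @ v, lam)
        n = np.linalg.norm(v)
        if n == 0:
            break                              # threshold too aggressive
        v /= n
    return v

# Toy data: features 0 and 1 share a strong signal; features 2-5 are noise.
rng = np.random.default_rng(1)
z = rng.standard_normal((500, 1))
X = np.hstack([2 * z, 2 * z, 0.1 * rng.standard_normal((500, 4))])
X -= X.mean(axis=0)
v = sparse_pc1(X)
print(np.round(np.abs(v), 2))
```

The loadings concentrate on the two signal features and are exactly zero on the noise features, which is the interpretability gain the abstract refers to.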
NASA Astrophysics Data System (ADS)
Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.
2015-08-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
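The screening step can be illustrated with a plain Monte Carlo estimate of first-order Sobol indices. This is a generic Saltelli-style sketch on an invented toy objective, not the reservoir operation model; the sample size and the 0.05 screening cutoff are assumptions:

```python
import numpy as np

def sobol_first_order(f, d, n=4096, seed=0):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli scheme).

    Sketches the screening idea: estimate S_i for each decision variable
    on [0, 1]^d and drop variables with negligible sensitivity before
    running the expensive multi-objective search.
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # column i swapped in from B
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Toy objective: dominated by x0, with x1 nearly irrelevant.
f = lambda X: X[:, 0] + 0.01 * X[:, 1]
S = sobol_first_order(f, d=2)
keep = [i for i in range(2) if S[i] > 0.05]  # screened variable set
print(np.round(S, 3), keep)
```

In the study's setting, the reduced problem is searched first and its solutions pre-condition the full-dimensional optimization.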
Reuse of imputed data in microarray analysis increases imputation efficiency
Kim, Ki-Yeol; Kim, Byoung-Jin; Yi, Gwan-Su
2004-01-01
Background The imputation of missing values is necessary for the efficient use of DNA microarray data, because many clustering algorithms and some statistical analyses require a complete data set. A few imputation methods for DNA microarray data have been introduced, but their efficiency was low and the validity of the imputed values had not been fully checked. Results We developed a new cluster-based imputation method called the sequential K-nearest neighbor (SKNN) method. It imputes missing values sequentially, starting from the gene with the fewest missing values, and reuses the imputed values in later imputations. Although it reuses imputed values, the new method greatly improves on the conventional KNN-based method and on methods based on maximum likelihood estimation, in both accuracy and computational complexity. The performance of SKNN was particularly strong relative to other imputation methods for data with high missing rates and a large number of experiments. Applying Expectation Maximization (EM) to the SKNN method improved the accuracy, but increased computational time in proportion to the number of iterations. The Multiple Imputation (MI) method, which is well known but had not previously been applied to microarray data, showed accuracy similarly high to that of the SKNN method, with slightly higher dependency on the type of data set. Conclusions Sequential reuse of imputed data in KNN-based imputation greatly increases the efficiency of imputation. The SKNN method should be practically useful for salvaging microarray experiments with high proportions of missing entries, and it generates reliable imputed values that can be used for further cluster-based analysis of microarray data. PMID:15504240
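The sequential-reuse idea can be sketched in a few lines: genes (rows) are imputed in order of increasing missingness, and each newly completed gene joins the reference set for later imputations. A simplified sketch assuming Euclidean distances over commonly observed columns and at least one complete gene; the toy matrix is invented:

```python
import numpy as np

def sknn_impute(X, k=3):
    """Sequential KNN imputation in the spirit of the SKNN method.

    Rows are imputed in order of increasing number of missing values;
    once a row is imputed it joins the reference set, so later rows can
    borrow from it. Assumes at least one initially complete row.
    """
    X = X.astype(float).copy()
    n_missing = np.isnan(X).sum(axis=1)
    complete = [i for i in range(len(X)) if n_missing[i] == 0]
    for i in np.argsort(n_missing):
        if n_missing[i] == 0:
            continue
        obs = ~np.isnan(X[i])
        # Distance to reference rows over this row's observed columns.
        d = [np.linalg.norm(X[i, obs] - X[j, obs]) for j in complete]
        nn = [complete[j] for j in np.argsort(d)[:k]]
        X[i, ~obs] = X[nn][:, ~obs].mean(axis=0)
        complete.append(i)                  # reuse the imputed row later
    return X

# Toy expression matrix: gene 3 is missing its third value.
X = np.array([
    [1.0, 2.0, 3.0],
    [1.1, 2.1, 3.1],
    [0.9, 1.9, 2.9],
    [1.0, 2.0, np.nan],
])
Xi = sknn_impute(X, k=3)
print(round(Xi[3, 2], 2))  # -> 3.0
```

The full method additionally weights neighbors and was evaluated against KNN, EM-based, and MI imputation as described above.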
Hao, Ruijie; Adoligbe, Camus; Jiang, Bijie; Zhao, Xianlin; Gui, Linsheng; Qu, Kaixing; Wu, Sen; Zan, Linsen
2015-01-01
Longissimus dorsi muscle (LD) proteomics provides a novel opportunity to reveal the molecular mechanisms behind intramuscular fat deposition. Unfortunately, the vast amounts of lipids and nucleic acids in this tissue hamper LD proteomics analysis. Trichloroacetic acid (TCA)/acetone precipitation is a widely used method to remove contaminants from protein samples. However, the high-speed centrifugation employed in this method produces hard precipitates, which restrict contaminant elimination and protein re-dissolution. To address the problem, in the present study the centrifugation precipitates were first ground with a glass tissue grinder and then washed with 90% acetone (TCA/acetone-G-W). According to our results, this treatment of the solid precipitate facilitated non-protein contaminant removal and protein re-dissolution, ultimately improving two-dimensional gel electrophoresis (2-DE) analysis. Additionally, we evaluated the effect of sample drying on the 2-DE profile as well as on protein yield. We found that 30 min of air-drying did not result in significant protein loss, but reduced horizontal streaking and smearing on the 2-DE gel compared to 10 min. In summary, we developed an optimized TCA/acetone precipitation method for protein extraction from LD, in which the modifications improved the effectiveness of the TCA/acetone method.
PMID:25893432
Improving wavelet denoising based on an in-depth analysis of the camera color processing
NASA Astrophysics Data System (ADS)
Seybold, Tamara; Plichta, Mathias; Stechele, Walter
2015-02-01
While denoising is an extensively studied task in signal processing research, most denoising methods are designed and evaluated using readily processed image data, e.g. the well-known Kodak data set. The noise model is usually additive white Gaussian noise (AWGN). This kind of test data does not correspond to today's real-world image data taken with a digital camera. Using such unrealistic data to test, optimize, and compare denoising algorithms may lead to incorrect parameter tuning or suboptimal choices in research on real-time camera denoising algorithms. In this paper we derive a precise analysis of the noise characteristics for the different steps in the color processing. Based on real camera noise measurements and simulation of the processing steps, we obtain a good approximation of the noise characteristics. We further show how this approximation can be used in standard wavelet denoising methods. We improve wavelet hard thresholding and bivariate thresholding based on our noise analysis results. Both the visual quality and objective quality metrics show the advantage of the proposed method. Since the method is implemented using look-up tables that are calculated before the denoising step, it has very low computational complexity and can process HD video sequences in real time on an FPGA.
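The baseline being improved, wavelet hard thresholding, can be illustrated with a single-level Haar transform. The sketch below uses a constant noise estimate sigma; the paper's contribution is, roughly, to replace that constant with signal- and processing-dependent estimates stored in look-up tables. The 1-D signal and the factor-of-3 threshold are assumptions for demonstration:

```python
import numpy as np

def haar_hard_denoise(x, sigma, t=3.0):
    """One-level Haar wavelet hard thresholding with a global threshold.

    Detail coefficients with magnitude below t * sigma are treated as
    noise and zeroed; the signal is then reconstructed. x must have
    even length.
    """
    a = (x[0::2] + x[1::2]) / np.sqrt(2)    # approximation band
    d = (x[0::2] - x[1::2]) / np.sqrt(2)    # detail band
    d[np.abs(d) < t * sigma] = 0.0          # hard threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)          # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 2 * np.pi, 256))
noisy = clean + 0.1 * rng.standard_normal(256)
den = haar_hard_denoise(noisy, sigma=0.1)
print(np.mean((den - clean) ** 2) < np.mean((noisy - clean) ** 2))  # -> True
```

With a correct sigma the thresholding removes most of the noise energy in the detail band while barely touching the smooth signal, which is why an accurate noise model for each processing stage matters.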
Bergquist, J; Vona, M J; Stiller, C O; O'Connor, W T; Falkenberg, T; Ekman, R
1996-03-01
The use of capillary electrophoresis with laser-induced fluorescence detection (CE-LIF) for the analysis of microdialysate samples from the periaqueductal grey matter (PAG) of freely moving rats is described. By employing 3-(4-carboxybenzoyl)-2-quinoline-carboxaldehyde (CBQCA) as a derivatization agent, we simultaneously monitored the concentrations of 8 amino acids (arginine, glutamine, valine, gamma-amino-n-butyric acid (GABA), alanine, glycine, glutamate, and aspartate), with nanomolar and subnanomolar detection limits. Two of the amino acids (GABA and glutamate) were analysed in parallel by conventional high-performance liquid chromatography (HPLC) in order to directly compare the two analytical methods. Other CE methods for analysis of microdialysate have been previously described, and this improved method offers greater sensitivity, ease of use, and the possibility to monitor several amino acids simultaneously. By using this technique together with an optimised form of microdialysis technique, the tiny sample consumption and the improved detection limits permit the detection of fast and transient transmitter changes.
Triangular Covariance Factorizations for Kalman Filtering. Ph.D. Thesis - Calif. Univ.
NASA Technical Reports Server (NTRS)
Thornton, C. L.
1976-01-01
An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of the original Kalman formulation. Moreover, this method is easily implemented and requires no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
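The factorization at the heart of the U-D filter can be sketched as follows. This is only the textbook UDU^T decomposition of a symmetric positive-definite matrix, not Thornton's full time- and measurement-update recursions; the test matrix is invented:

```python
import numpy as np

def udu_factor(P):
    """Factor symmetric positive-definite P as P = U @ diag(d) @ U.T,
    with U unit upper triangular and d the diagonal of D."""
    P = P.astype(float).copy()
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
        # Deflate: subtract d[j] * u_j u_j^T from the leading block.
        for i in range(j):
            for k in range(i + 1):
                P[k, i] -= U[k, j] * d[j] * U[i, j]
    return U, d

P = np.array([[4.0, 2.0, 1.0],
              [2.0, 3.0, 1.0],
              [1.0, 1.0, 2.0]])
U, d = udu_factor(P)
print(np.allclose(U @ np.diag(d) @ U.T, P))  # -> True
```

Propagating U and d instead of P avoids the loss of symmetry and positive definiteness that plagues the conventional covariance recursion in finite precision.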
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distribution collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
NASA Technical Reports Server (NTRS)
Shollenberger, C. A.; Smyth, D. N.
1978-01-01
A nonlinear, nonplanar three-dimensional jet flap analysis, applicable to the ground effect problem, is presented. Lifting surface methodology is developed for a wing with arbitrary planform operating in an inviscid and incompressible fluid. The classical, infinitely thin jet flap model is employed to simulate power-induced effects. An iterative solution procedure is applied within the analysis to successively approximate the jet shape until a converged solution is obtained which closely satisfies the jet and wing boundary conditions. Solution characteristics of the method are discussed and example results are presented for unpowered, basic powered, and complex powered configurations. Comparisons between predictions of the present method and experimental measurements indicate that the impingement of the jet on the ground plane is important in the analysis of powered lift systems operating in ground proximity. Further development of the method is suggested in the areas of improved solution convergence, more realistic modeling of jet impingement, and calculation efficiency enhancements.
Research on Robot Pose Control Technology Based on Kinematics Analysis Model
NASA Astrophysics Data System (ADS)
Liu, Dalong; Xu, Lijuan
2018-01-01
To improve the attitude stability of a robot, an attitude control method based on a kinematics analysis model is proposed, addressing the motion planning problems of walking posture transformation, grasping, and control. In a Cartesian-space analytical model, a three-axis accelerometer, a magnetometer, and a three-axis gyroscope are combined for attitude measurement; the gyroscope data are processed with a Kalman filter, and the quaternion method is used to compute the robot attitude angles. Stable inertia parameters are obtained from the centroids of the robot's moving parts, and a rapidly-exploring random tree (RRT) motion planning method based on random sampling is used to drive the space robot accurately to any commanded position, ensuring that the end effector follows a prescribed trajectory under attitude control. Positioning accuracy experiments were carried out using an MT-R robot as the test platform. The simulation results show that the proposed method has better robustness and higher positioning accuracy, improving the reliability and safety of robot operation.
Autoregressive modeling for the spectral analysis of oceanographic data
NASA Technical Reports Server (NTRS)
Gangopadhyay, Avijit; Cornillon, Peter; Jackson, Leland B.
1989-01-01
Over the last decade there has been a dramatic increase in the number and volume of data sets useful for oceanographic studies. Many of these data sets consist of long temporal or spatial series derived from satellites and large-scale oceanographic experiments. These data sets are, however, often 'gappy' in space, irregular in time, and always of finite length. The conventional Fourier transform (FT) approach to the spectral analysis is thus often inapplicable, or where applicable, it provides questionable results. Here, through comparative analysis with the FT for different oceanographic data sets, the possibilities offered by autoregressive (AR) modeling to perform spectral analysis of gappy, finite-length series, are discussed. The applications demonstrate that as the length of the time series becomes shorter, the resolving power of the AR approach as compared with that of the FT improves. For the longest data sets examined here, 98 points, the AR method performed only slightly better than the FT, but for the very short ones, 17 points, the AR method showed a dramatic improvement over the FT. The application of the AR method to a gappy time series, although a secondary concern of this manuscript, further underlines the value of this approach.
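The AR (Yule-Walker) spectral estimate discussed above can be sketched as follows: fit the autoregression from sample autocorrelations, then evaluate the all-pole power spectrum. The short noisy sinusoid, model order 4, and frequency grid are assumptions for illustration, not the oceanographic series of the paper:

```python
import numpy as np

def ar_psd(x, p, freqs):
    """AR (Yule-Walker) spectral estimate of a short time series.

    Fits x[n] = sum_k a_k x[n-k] + e[n] from sample autocorrelations
    and evaluates the AR power spectrum on the given normalized
    frequencies (cycles per sample).
    """
    n = len(x)
    r = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a, *_ = np.linalg.lstsq(R, r[1:], rcond=None)   # Yule-Walker solve
    sigma2 = r[0] - a @ r[1:]                       # innovation variance
    k = np.arange(1, p + 1)
    A = 1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a
    return sigma2 / np.abs(A) ** 2

# A 64-point noisy sinusoid at normalized frequency 0.125: the AR(4)
# spectrum localizes the peak sharply despite the short record.
rng = np.random.default_rng(0)
n = np.arange(64)
x = np.sin(2 * np.pi * 0.125 * n) + 0.1 * rng.standard_normal(64)
freqs = np.linspace(0.0, 0.5, 513)
psd = ar_psd(x, p=4, freqs=freqs)
print(freqs[np.argmax(psd)])
```

Because the AR model interpolates the autocorrelation rather than windowing the data, its resolution degrades far more gracefully than the FT as the record shortens, which is the behavior the comparison above reports.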
Rait, N.
1981-01-01
A modified method is described for 1-mg-sample multi-element semiquantitative spectrographic analysis. This method uses a direct-current arc source, carbon instead of graphite electrodes, and an 80% argon-20% oxygen atmosphere instead of air. Although this is a destructive method, an analysis can be made for 68 elements in all mineral and geochemical samples. Carbon electrodes have been an aid in improving the detection limits of many elements. The carbon has a greater resistance to heat conductance and develops a better tip, facilitating sample volatilization and counterbalancing the cooling effect of the flow of the argon-oxygen mixture around the anode. When such an argon-oxygen atmosphere is used instead of air, the cyanogen band lines are greatly diminished in intensity, so that more spectral lines of the analysis elements are available for use; the spectral background is also lower. The main advantage of using the carbon electrode and the 80% argon-20% oxygen atmosphere is the improved detection limits for 36 of the 68 elements. The detection limits remain the same for 23 elements and are worse for only nine elements. © 1981.
The Teaching Gap: Best Ideas from the World's Teachers for Improving Education in the Classroom.
ERIC Educational Resources Information Center
Stigler, James W.; Hiebert, James
This book is an action plan for improving education in the U.S., focusing on the key role of teachers in this improvement. It offers a detailed comparison of the educational methods of Germany, Japan, and the United States. The analysis begins with an international study of mathematics teaching in the three countries that was conducted as part of…
A global optimization approach to multi-polarity sentiment analysis.
Li, Xinmiao; Li, Jing; Wu, Yukeng
2015-01-01
Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis, with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of the two-polarity, three-polarity, and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement and the two-polarity sentiment analysis the smallest. We conclude that PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and the grid search method.
From the results of this comparison, we found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
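The global-optimization engine itself, particle swarm optimization, can be sketched generically. In PSOGO-Senti the objective would be cross-validated SVM accuracy as a function of the feature dimension and SVM parameters; the quadratic toy objective and the inertia/acceleration constants below are assumptions for illustration:

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200, seed=0):
    """Minimal particle swarm optimizer with a global-best topology."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    d = len(lo)
    x = rng.uniform(lo, hi, (n_particles, d))
    v = np.zeros((n_particles, d))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pval)]                 # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia, cognitive, social
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[np.argmin(pval)]
    return g, pval.min()

# Toy objective with known minimum at (1, -2).
f = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2
best, val = pso_minimize(f, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(np.round(best, 2))
```

Unlike grid search, the swarm explores the joint parameter space adaptively, which is why the paper reports its largest gains on the harder multi-polarity tasks.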
Steps towards Improving GNSS Systematic Errors and Biases
NASA Astrophysics Data System (ADS)
Herring, T.; Moore, M.
2017-12-01
Four general areas of analysis method improvements, three related to data analysis models and the fourth to calibration methods, have been recommended at the recent unified analysis workshop (UAW) and we discuss aspects of these areas for improvement. The gravity fields used in the GNSS orbit integrations should be updated to match modern fields to make them consistent with the fields being used by the other IAG services. The update would include the static part of the field and a time variable component. The force models associated with radiation forces are the most uncertain and modeling of these forces can be made more consistent with the exchange of attitude information. The international GNSS service (IGS) will develop an attitude format and make attitude information available so that analysis centers can validate their models. The IGS has noted the appearance of the GPS draconitic period and harmonics of this period in time series of various geodetic products (e.g., positions and Earth orientation parameters). An updated short-period (diurnal and semidiurnal) model is needed and a method to determine the best model developed. The final area, not directly related to analysis models, is the recommendation that site dependent calibration of GNSS antennas are needed since these have a direct effect on the ITRF realization and position offsets when antennas are changed. Evaluation of the effects of the use of antenna specific phase center models will be investigated for those sites where these values are available without disturbing an existing antenna installation. Potential development of an in-situ antenna calibration system is strongly encouraged. In-situ calibration would be deployed at core sites where GNSS sites are tied to other geodetic systems. 
With the recent expansion of the number of GPS satellites transmitting unencrypted codes on the GPS L2 frequency and the availability of software GNSS receivers, in-situ calibration between an existing installation and a movable directional antenna is now more likely to generate accurate results than earlier analog switching systems. With all of these improvements, there is the expectation of better agreement between the space geodetic methods, allowing more definitive assessment and modeling of the Earth's time-variable shape and gravity field.
Etchepareborda, Pablo; Vadnjal, Ana Laura; Federico, Alejandro; Kaufmann, Guillermo H
2012-09-15
We evaluate the extension of the exact nonlinear reconstruction technique developed for digital holography to the phase-recovery problems presented by other optical interferometric methods, which use carrier modulation. It is shown that the introduction of an analytic wavelet analysis in the ridge of the cepstrum transformation corresponding to the analyzed interferogram can be closely related to the well-known wavelet analysis of the interferometric intensity. Subsequently, the phase-recovery process is improved. The advantages and limitations of this framework are analyzed and discussed using numerical simulations in singular scalar light fields and in temporal speckle pattern interferometry.
Analysis of a virtual memory model for maintaining database views
NASA Technical Reports Server (NTRS)
Kinsley, Kathryn C.; Hughes, Charles E.
1992-01-01
This paper presents an analytical model for predicting the performance of a new support strategy for database views. This strategy, called the virtual method, is compared with traditional methods for supporting views. The analytical model's predictions of improved performance by the virtual method are then validated by comparing these results with those achieved in an experimental implementation.
Pina, Jamie; Massoudi, Barbara L; Chester, Kelley; Koyanagi, Mark
2018-06-07
Researchers and analysts have not completely examined word frequency analysis as an approach to creating a public health quality improvement taxonomy. To develop a taxonomy of public health quality improvement concepts for an online exchange of quality improvement work. We analyzed documents, conducted an expert review, and employed a user-centered design along with a faceted search approach to make online entries searchable for users. To provide the most targeted facets to users, we used word frequency to analyze 334 published public health quality improvement documents to find the most common clusters of word meanings. We then reviewed the highest-weighted concepts and categorized their relationships to quality improvement details in our taxonomy. Next, we mapped meanings to items in our taxonomy and presented them in order of their weighted percentages in the data. Using these methods, we developed and sorted concepts in the faceted search presentation so that online exchange users could access relevant search criteria. We reviewed 50 of the top synonym clusters and identified 12 categories for our taxonomy data. The final categories were as follows: Summary; Planning and Execution Details; Health Impact; Training and Preparation; Information About the Community; Information About the Health Department; Results; Quality Improvement (QI) Staff; Information; Accreditation Details; Collaborations; and Contact Information of the Submitter. Feedback about the elements in the taxonomy and presentation of elements in our search environment from users has been positive. When relevant data are available, the word frequency analysis method may be useful in other taxonomy development efforts for public health.
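The word-frequency step can be sketched with the standard library alone: tokenize each document, drop stopwords, and surface the most frequent terms as candidate facets for manual grouping into taxonomy categories. The paper additionally clusters synonyms into weighted meaning groups, which is not attempted here; the documents and stopword list are invented:

```python
import re
from collections import Counter

def facet_candidates(documents, stopwords, top_n=5):
    """Rank candidate taxonomy facets by corpus-wide word frequency."""
    counts = Counter()
    for doc in documents:
        words = re.findall(r"[a-z]+", doc.lower())
        counts.update(w for w in words if w not in stopwords)
    return counts.most_common(top_n)

docs = [
    "Quality improvement planning for the health department",
    "Community health results and accreditation details",
    "Training for quality improvement staff in the health department",
]
stop = {"the", "and", "for", "in", "of"}
print(facet_candidates(docs, stop, top_n=3))
```

In the study, the highest-weighted synonym clusters surfaced this way were reviewed by hand and mapped onto the 12 final taxonomy categories listed above.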
Sliding-mode control combined with improved adaptive feedforward for wafer scanner
NASA Astrophysics Data System (ADS)
Li, Xiaojie; Wang, Yiguang
2018-03-01
In this paper, a sliding-mode control method combined with improved adaptive feedforward is proposed for a wafer scanner to improve the tracking performance of the closed-loop system. In particular, in addition to the inverse model, the nonlinear force ripple effect, which may degrade the tracking accuracy of the permanent magnet linear motor (PMLM), is considered in the proposed method. The dominant position periodicity of the force ripple is determined by Fast Fourier Transform (FFT) analysis of experimental data, and the improved feedforward control is achieved by online recursive least-squares (RLS) estimation of the inverse model and the force ripple. The improved adaptive feedforward is given in the general form of an nth-order model with the force ripple effect. The proposed method is motivated by the motion controller design for the long-stroke PMLM and short-stroke voice coil motor of a wafer scanner. The stability of the closed-loop control system and the convergence of the motion tracking are guaranteed theoretically by the proposed sliding-mode feedback and adaptive feedforward methods. Comparative experiments on a precision linear motion platform verify the correctness and effectiveness of the proposed method. The experimental results show that, compared with the traditional method, the proposed one performs better in rapidity and robustness, especially for high-speed motion trajectories, and achieves improvements in both tracking accuracy and settling time.
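The RLS estimation of an inverse model plus a position-periodic ripple term can be sketched on a hypothetical first-order plant. Everything below is an assumption for illustration: the plant force = m*acc + c*vel + A*sin(2*pi*x/L) stands in for the real feedforward model, and the ripple period L is taken as known (in the paper it comes from FFT analysis of measured data):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step for y ≈ phi @ theta,
    with forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)
    P = (P - np.outer(k, phi @ P)) / lam
    return theta, P

L = 0.02                        # assumed ripple period (m)
m, c, A = 2.0, 0.5, 0.3         # true plant parameters (invented)
rng = np.random.default_rng(0)
theta = np.zeros(3)             # estimates of [m, c, A]
P = np.eye(3) * 100.0
for _ in range(2000):
    acc, vel, x = rng.uniform(-1, 1, 3)
    phi = np.array([acc, vel, np.sin(2 * np.pi * x / L)])
    y = m * acc + c * vel + A * phi[2] + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, phi, y)
print(np.round(theta, 2))  # ≈ [2.0, 0.5, 0.3]
```

Once the parameters converge, the estimated model is evaluated along the reference trajectory to form the feedforward force, with the sliding-mode feedback absorbing the residual error.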
Romppanen, T; Huttunen, E; Helminen, H J
1980-07-01
An improved light microscopical histoquantitative method for the analysis of the stereologic structure of the ventral lobe of the rat prostate is introduced. From paraffin-embedded tissue sections, the volumetric fractions of the acinar parenchyma, the glandular epithelium, the glandular lumen, and the interacinar tissue were determined. The surface density of the glandular epithelium and the length density of the glandular tubules per cubic millimeter of tissue were also calculated. The corresponding total quantity of each tissue compartment was computed for the whole ventral lobe based on the weight of the lobe. Using established stereologic laws, the mean height of the epithelium, the mean diameter of the glandular tubules, the mean free distance between the glandular tubules, and the mean distance between the glandular centers were determined. The fitness of the method was tested by analyzing, in addition to normal prostates, the ventral prostates of rats castrated 30 days before sacrifice.
Methods for evaluating a mature substance abuse prevention/early intervention program.
Becker, L R; Hall, M; Fisher, D A; Miller, T R
2000-05-01
The authors describe methods for work in progress to evaluate four workplace prevention and/or early intervention programs designed to change occupational norms and reduce substance abuse at a major U.S. transportation company. The four programs are an employee assistance program, random drug testing, managed behavioral health care, and a peer-led intervention program. An elaborate mixed-methods evaluation combines data collection and analysis techniques from several traditions. A process-improvement evaluation focuses on the peer-led component to describe its evolution, document the implementation process for those interested in replicating it, and provide information for program improvement. An outcome-assessment evaluation examines impacts of the four programs on job performance measures (e.g., absenteeism, turnover, injury, and disability rates) and includes a cost-offset and employer cost-savings analysis. Issues related to using archival data, combining qualitative and quantitative designs, and working in a corporate environment are discussed.
Improvements in soft gelatin capsule sample preparation for USP-based simethicone FTIR analysis.
Hargis, Amy D; Whittall, Linda B
2013-02-23
Due to the absence of a significant chromophore, simethicone raw material and finished product analysis is performed using an FTIR-based method that quantifies the polydimethylsiloxane (PDMS) component of the active ingredient. The method can be found in the USP monographs for several dosage forms of simethicone-containing pharmaceutical products. For soft gelatin capsules, the PDMS assay values determined using the procedure described in the USP method were variable (%RSDs from 2 to 9%) and often lower than expected based on raw material values. Investigation determined that the extraction procedure used for sample preparation was causing loss of material to the container walls due to the hydrophobic nature of PDMS. Evaluation revealed that a simple dissolution of the gelatin capsule fill in toluene provided improved assay results (%RSDs ≤ 0.5%) as well as a simplified and rapid sample preparation.
NASA Astrophysics Data System (ADS)
Ding, Hao; Cao, Ming; DuPont, Andrew W.; Scott, Larry D.; Guha, Sushovan; Singhal, Shashideep; Younes, Mamoun; Pence, Isaac; Herline, Alan; Schwartz, David; Xu, Hua; Mahadevan-Jansen, Anita; Bi, Xiaohong
2016-03-01
Inflammatory bowel disease (IBD) is an idiopathic disease typically characterized by chronic inflammation of the gastrointestinal tract. Recently, much effort has been devoted to the development of novel diagnostic tools that can assist physicians in fast, accurate, and automated diagnosis of the disease. Previous research based on Raman spectroscopy has shown promising results in differentiating IBD patients from normal screening cases. In the current study, we examined IBD patients in vivo through a colonoscope-coupled Raman system. Optical diagnosis for IBD discrimination was conducted on full-range spectra using multivariate statistical methods. Further, we incorporated several machine-learning feature-selection methods into the classification model. The diagnostic performance for disease differentiation was significantly improved after feature selection. Our results show that improved IBD diagnosis can be achieved using Raman spectroscopy in combination with multivariate analysis and feature selection.
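The abstract does not name the feature-selection methods used. As one common univariate example, a Fisher-score ranking over synthetic two-class "spectra" might look like the sketch below; the data, channel count, and discriminating channel are all made up for illustration:

```python
import random

def fisher_scores(class_a, class_b):
    """Univariate Fisher score per spectral channel:
    (mean difference)^2 / (sum of class variances)."""
    def stats(rows, j):
        vals = [r[j] for r in rows]
        m = sum(vals) / len(vals)
        v = sum((x - m) ** 2 for x in vals) / len(vals)
        return m, v
    scores = []
    for j in range(len(class_a[0])):
        m1, v1 = stats(class_a, j)
        m2, v2 = stats(class_b, j)
        scores.append((m1 - m2) ** 2 / (v1 + v2 + 1e-12))
    return scores

random.seed(0)
n_channels = 50
# Synthetic spectra: channel 10 discriminates "IBD" from "normal"
ibd    = [[random.gauss(1.0 + (0.8 if j == 10 else 0.0), 0.1)
           for j in range(n_channels)] for _ in range(30)]
normal = [[random.gauss(1.0, 0.1) for j in range(n_channels)] for _ in range(30)]
scores = fisher_scores(ibd, normal)
print(max(range(n_channels), key=scores.__getitem__))   # → 10
```

Keeping only the top-scoring channels before fitting the multivariate classifier is the kind of dimensionality reduction the abstract credits with the improved diagnostic performance.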
Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels
Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V.; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.
2018-01-01
Background: Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for classifying squamous epithelium into normal and cervical intraepithelial neoplasia (CIN) grades CIN1, CIN2, and CIN3. Methods: In this study, a deep learning (DL)-based nuclei segmentation approach is investigated that gathers localized information through the generation of superpixels with a simple linear iterative clustering (SLIC) algorithm and training with a convolutional neural network. Results: The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. Conclusions: The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods. PMID:29619277
Regional Scale Meteorological Analysis and Prediction Using GPS Occultation and EOS Data
NASA Technical Reports Server (NTRS)
Bromwich, David H.; Shum, C. K.; Zhao, Changyin; Kuo, Bill; Rocken, Chris
2004-01-01
The main objective of the research under this award is to improve regional meteorological analysis and prediction for traditionally data-limited regions, particularly over the Southern Ocean and Antarctica, using the remote sensing observations from current and upcoming GPS radio occultation missions and the EOS instrument suite. The major components of this project are: 1. Develop and improve the methods for retrieving temperature, moisture, and pressure profiles from GPS radio occultation data and EOS radiometer data. 2. Develop and improve a regional-scale data assimilation system (MM5 4DVAR). 3. Perform case studies involving data analysis and numerical modeling to investigate the impact of different data for regional meteorological analysis and the importance of data assimilation for regional meteorological simulation over the Antarctic region. 4. Apply the findings and improvements from the above studies to weather forecasting experiments. In the third year of the award we made significant progress toward the remaining goals of the project. The work included carefully evaluating the performance of an atmospheric mesoscale model, the Polar MM5, in Antarctic applications and improving the upper boundary condition.
Whole-Range Assessment: A Simple Method for Analysing Allelopathic Dose-Response Data
An, Min; Pratley, J. E.; Haig, T.; Liu, D.L.
2005-01-01
Based on the typical biological responses of an organism to allelochemicals (hormesis), concepts of whole-range assessment and inhibition index were developed for improved analysis of allelopathic data. Examples of their application are presented using data drawn from the literature. The method is concise and comprehensive, and makes data grouping and multiple comparisons simple, logical, and possible. It improves data interpretation, enhances research outcomes, and is a statistically efficient summary of the plant response profiles. PMID:19330165
Toward improved durability in advanced aircraft engine hot sections
NASA Technical Reports Server (NTRS)
Sokolowski, Daniel E. (Editor)
1989-01-01
The conference on durability improvement methods for advanced aircraft gas turbine hot-section components discussed NASA's Hot Section Technology (HOST) project, advanced high-temperature instrumentation for hot-section research, the development and application of combustor aerothermal models, and the evaluation of a data base and numerical model for turbine heat transfer. Also discussed are structural analysis methods for gas turbine hot section components, fatigue life-prediction modeling for turbine hot section materials, and the service life modeling of thermal barrier coatings for aircraft gas turbine engines.
NASA Technical Reports Server (NTRS)
Dinar, N.
1978-01-01
Several aspects of multigrid methods are briefly described. The main subjects include the development of very efficient multigrid algorithms for systems of elliptic equations (Cauchy-Riemann, Stokes, Navier-Stokes), as well as the development of control and prediction tools (based on local mode Fourier analysis), used to analyze, check and improve these algorithms. Preliminary research on multigrid algorithms for time dependent parabolic equations is also described. Improvements in existing multigrid processes and algorithms for elliptic equations were studied.
Saleh-Lakha, S.; Allen, V. G.; Li, J.; Pagotto, F.; Odumeru, J.; Taboada, E.; Lombos, M.; Tabing, K. C.; Blais, B.; Ogunremi, D.; Downing, G.; Lee, S.; Gao, A.; Nadon, C.
2013-01-01
Listeria monocytogenes is responsible for severe and often fatal food-borne infections in humans. A collection of 2,421 L. monocytogenes isolates originating from Ontario's food chain between 1993 and 2010, along with Ontario clinical isolates collected from 2004 to 2010, was characterized using an improved multilocus variable-number tandem-repeat analysis (MLVA). The MLVA method was established based on eight primer pairs targeting seven variable-number tandem-repeat (VNTR) loci in two 4-plex fluorescent PCRs. Diversity indices and amplification rates of the individual VNTR loci ranged from 0.38 to 0.92 and from 0.64 to 0.99, respectively. MLVA types and pulsed-field gel electrophoresis (PFGE) patterns were compared using Comparative Partitions analysis involving 336 clinical and 99 food and environmental isolates. The analysis yielded Simpson's diversity index values of 0.998 and 0.992 for MLVA and PFGE, respectively, and adjusted Wallace coefficients of 0.318 when MLVA was used as a primary subtyping method and 0.088 when PFGE was the primary typing method. Statistical data analysis using BioNumerics allowed identification of at least 8 predominant and persistent L. monocytogenes MLVA types in Ontario's food chain. The MLVA method correctly clustered epidemiologically related outbreak strains and separated unrelated strains in a subset analysis. An MLVA database was established for the 2,421 L. monocytogenes isolates, which allows comparison of data among historical and new isolates of different sources. The subtyping method, coupled with the MLVA database, will support effective monitoring and prevention approaches for identifying environmental contamination by pathogenic strains of L. monocytogenes and for investigating outbreaks. PMID:23956391
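Simpson's index of diversity, used above to compare the discriminatory power of MLVA and PFGE, can be computed directly from the type counts. A small sketch in the Hunter-Gaston form, with made-up type labels:

```python
from collections import Counter

def simpsons_di(type_labels):
    """Simpson's index of diversity (Hunter-Gaston form), the probability
    that two isolates drawn without replacement have different types."""
    counts = Counter(type_labels).values()
    n = sum(counts)
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Toy example: 6 isolates falling into 4 MLVA types (illustrative data only)
labels = ["T1", "T1", "T2", "T3", "T3", "T4"]
print(round(simpsons_di(labels), 3))   # → 0.867
```

A value near 1, such as the 0.998 reported for MLVA, means almost every pair of isolates is resolved into distinct types.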
A quality quantitative method of silicon direct bonding based on wavelet image analysis
NASA Astrophysics Data System (ADS)
Tan, Xiao; Tao, Zhi; Li, Haiwang; Xu, Tiantong; Yu, Mingxing
2018-04-01
The rapid development of MEMS (micro-electro-mechanical systems) has attracted significant attention from researchers in many fields. The MEMS fabrication process is elaborate and has therefore been the focus of extensive research. In MEMS fabrication, however, component bonding is difficult to achieve and requires a complex approach, so improvements in bonding quality are important objectives. A higher-quality bond can only be achieved with improved measurement and testing capabilities. Traditional testing methods, which include infrared testing, tensile testing, and strength testing, often suffer from low efficiency or are destructive. Therefore, this paper develops a precise, nondestructive visual testing method based on wavelet image analysis that is shown to be highly effective in practice. The wavelet image analysis comprises wavelet denoising, wavelet enhancement, and contrast enhancement, and yields an image with low background noise. In addition, because the wavelet analysis software was developed in MATLAB, it can reveal the bonding boundaries and bonding rates, precisely indicating the bond quality at all locations on the wafer. This work also presents a set of orthogonal experiments on three prebonding factors, the prebonding temperature, the positive pressure value, and the prebonding time, which are used to analyze prebonding quality. The method was used to quantify the quality of silicon-to-silicon wafer bonding, yielding standard treatment quantities that could be practical for large-scale use.
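The wavelet-denoising step can be illustrated in one dimension. Below is a minimal one-level Haar soft-threshold sketch; the paper works on 2-D wafer images in MATLAB, so this is only a stand-in showing why small noise is suppressed while sharp intensity steps (like bond boundaries) survive:

```python
def haar_denoise(signal, thresh):
    """One-level Haar wavelet soft-threshold denoising on a 1-D signal
    (illustrative stand-in for the paper's 2-D wavelet denoising)."""
    assert len(signal) % 2 == 0
    half = len(signal) // 2
    approx = [(signal[2*i] + signal[2*i+1]) / 2 for i in range(half)]
    detail = [(signal[2*i] - signal[2*i+1]) / 2 for i in range(half)]
    # Soft-threshold the detail coefficients: shrink toward zero
    soft = [max(abs(d) - thresh, 0.0) * (1 if d >= 0 else -1) for d in detail]
    out = []
    for a, d in zip(approx, soft):       # inverse Haar transform
        out.extend([a + d, a - d])
    return out

# Noisy trace with one large step (e.g. a bonded-to-unbonded boundary)
noisy = [10.0, 10.4, 10.1, 9.9, 50.0, 50.2, 10.2, 9.8]
smoothed = haar_denoise(noisy, 0.3)
print(smoothed)
```

Within-pair fluctuations smaller than the threshold are removed, while the large step between pairs is carried by the approximation coefficients and preserved.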
Environmental dynamics at orbital altitudes
NASA Technical Reports Server (NTRS)
Karr, G. R.
1976-01-01
The influence of real satellite aerodynamics on the determination of upper atmospheric density was investigated. A method of analysis of satellite drag data is presented which includes the effect of satellite lift and the variation in aerodynamic properties around the orbit. The studies indicate that satellite lift, rather than a super-rotation of the upper atmosphere, may be responsible for the observed orbit precession. The influence of simplifying assumptions concerning the aerodynamics of objects in falling-sphere analysis was evaluated, and an improved method of analysis was developed. Wind tunnel data were used to develop more accurate drag coefficient relationships for studying altitudes between 80 and 120 km. The improved drag coefficient relationships revealed a considerable error in previous falling-sphere drag interpretation, and these data were reanalyzed using the more accurate relationships. Theoretical investigations of the drag coefficient in the very low speed-ratio region were also conducted.
Challenges for Better thesis supervision
Ghadirian, Laleh; Sayarifard, Azadeh; Majdzadeh, Reza; Rajabi, Fatemeh; Yunesian, Masoud
2014-01-01
Background: Conduction of a thesis is one of the students' major academic activities. Thesis quality and the experience acquired are highly dependent on the supervision. Our study is aimed at identifying the challenges in thesis supervision from both the students' and faculty members' points of view. Methods: This study was conducted using individual in-depth interviews and Focus Group Discussions (FGD). The participants were 43 students and faculty members selected by purposive sampling. It was carried out at Tehran University of Medical Sciences in 2012. Data analysis was done concurrently with data gathering using the content analysis method. Results: Our data analysis resulted in 162 codes, 17 subcategories and 4 major categories: "supervisory knowledge and skills", "atmosphere", "bylaws and regulations relating to supervision" and "monitoring and evaluation". Conclusion: This study showed that more attention and planning are needed for modifying related rules and regulations, qualitative and quantitative improvement of mentorship training, improvement of the research atmosphere, and effective monitoring and evaluation of supervision. PMID:25250273
Advancing data management and analysis in different scientific disciplines
NASA Astrophysics Data System (ADS)
Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.
2017-10-01
Over the past several years, rapid growth of data has affected many fields of science. This has often resulted in the need to overhaul or exchange the tools and approaches in the disciplines' data life cycles. At the same time, it allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases covering all stages from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.
Laser-based methods for the analysis of low molecular weight compounds in biological matrices.
Kiss, András; Hopfgartner, Gérard
2016-07-15
Laser-based desorption and/or ionization methods play an important role in the analysis of low-molecular-weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements: in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization, and laser diode thermal desorption with atmospheric pressure chemical ionization; and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). The combination of ion mobility separation with laser-based ionization methods is also gaining popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic analytical approaches to quantification. This review presents these new developments in laser-based methods for the analysis of low-molecular-weight compounds by MS, along with several potential applications.
NASA Astrophysics Data System (ADS)
Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong
2018-02-01
Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth-profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means of converting intensities to compositions in place of relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method, and the total number counting (TNC) method. For SIMS, the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, but the results obtained with a cesium ion beam were highly quantitative. The quantitative SIMS results using MCs2+ ions are comparable to the results by XPS. For XPS, the measurement uncertainty was greatly improved by the AI and TNC methods.
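One plausible way an intensity-to-composition conversion factor can turn raw peak intensities into atomic fractions is sketched below. The form shown (a per-element factor calibrated on a reference film, then renormalized) is an assumption for illustration; the paper's exact ICF definition and its AI/TNC averaging schemes may differ, and all intensity values are hypothetical:

```python
def calibrate_icf(ref_intensities, ref_fractions):
    """Derive per-element intensity-to-composition conversion factors
    from a reference film of known composition (illustrative form)."""
    return {e: ref_fractions[e] / ref_intensities[e] for e in ref_intensities}

def compositions(intensities, icf):
    # Convert measured intensities to atomic fractions that sum to 1
    weighted = {e: icf[e] * i for e, i in intensities.items()}
    total = sum(weighted.values())
    return {e: w / total for e, w in weighted.items()}

# Hypothetical reference: a Si0.7Ge0.3 film with measured peak intensities
icf = calibrate_icf({"Si": 1400.0, "Ge": 900.0}, {"Si": 0.7, "Ge": 0.3})
comp = compositions({"Si": 1000.0, "Ge": 1500.0}, icf)
print(comp)
```

Because the factors are calibrated against a film of known composition, systematic sensitivity differences between elements cancel out of the final fractions.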
Algorithm for reducing false positives in IDS based on correlation analysis
NASA Astrophysics Data System (ADS)
Liu, Jianyi; Li, Sida; Zhang, Ru
2018-03-01
This paper proposes an algorithm for reducing false positives in intrusion detection systems (IDS) based on correlation analysis. First, the algorithm analyzes the characteristics that distinguish false positives from real alarms and performs a preliminary screening of false positives; it then clusters the remaining alarms by attribute similarity to further reduce their number; finally, exploiting the characteristics of multi-step attacks, it associates alarms through their causal relationships. The paper also proposes a reverse-causation algorithm, building on earlier attack-association methods, that turns alarm information into a complete attack path. Experiments show that the algorithm reduces the number of alarms, improves the efficiency of alarm processing, and contributes to identifying attack goals and improving alarm accuracy.
Energy resolution improvement of CdTe detectors by using the principal component analysis technique
NASA Astrophysics Data System (ADS)
Alharbi, T.
2018-02-01
In this paper, we report on the application of the principal component analysis (PCA) technique for improving the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge trapping reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and results obtained with a CdTe detector are shown. We achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector, compared with 4.5% (FWHM) for the same detector using the standard pulse-processing method.
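Extracting the leading principal component from a set of digitized pulses is the core PCA step. A dependency-free power-iteration sketch on synthetic pulses is shown below; how the PCA scores are then mapped to a trapping correction follows the paper and is not reproduced here, and the pulse shapes and amplitudes are invented for illustration:

```python
import random

def leading_component(pulses, iters=200):
    """First principal component of a set of pulse vectors via power
    iteration on the sample covariance matrix (no NumPy needed)."""
    n, m = len(pulses), len(pulses[0])
    mean = [sum(p[j] for p in pulses) / n for j in range(m)]
    x = [[p[j] - mean[j] for j in range(m)] for p in pulses]
    v = [1.0] * m
    for _ in range(iters):
        # Apply C v with C = X^T X / n, computed as X^T (X v)
        xv = [sum(row[j] * v[j] for j in range(m)) for row in x]
        w = [sum(x[i][j] * xv[i] for i in range(n)) / n for j in range(m)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

random.seed(1)
# Synthetic pulses: one shape with varying amplitude (loosely mimicking
# a charge-deficit effect) plus small noise; purely illustrative
shape = [0, 1, 3, 6, 8, 9, 9, 8]
pulses = [[a * s + random.gauss(0, 0.05) for s in shape]
          for a in [0.8, 0.9, 1.0, 1.1, 1.2]]
pc1 = leading_component(pulses)
print([round(c, 2) for c in pc1])
```

The recovered component aligns with the dominant mode of pulse-shape variation, and each pulse's projection onto it gives a per-event score usable for correction.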
A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, A. S. T.; Chapman, J. D.; Thomson, M. A.
This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented here, a Bayesian technique is used to estimate the Lν/Eν resolution of observed atmospheric neutrinos on an event-by-event basis. By separating the events into bins of Lν/Eν resolution in the oscillation analysis, a significant improvement in oscillation sensitivity can be achieved.
Samuel V. Glass; Stanley D. Gatland II; Kohta Ueno; Christopher J. Schumacher
2017-01-01
ASHRAE Standard 160, Criteria for Moisture-Control Design Analysis in Buildings, was published in 2009. The standard sets criteria for moisture design loads, hygrothermal analysis methods, and satisfactory moisture performance of the building envelope. One of the evaluation criteria specifies conditions necessary to avoid mold growth. The current standard requires that...
Improving nuclear data accuracy of 241Am and 237Np capture cross sections
NASA Astrophysics Data System (ADS)
Žerovnik, Gašper; Schillebeeckx, Peter; Cano-Ott, Daniel; Jandel, Marian; Hori, Jun-ichi; Kimura, Atsushi; Rossbach, Matthias; Letourneau, Alain; Noguere, Gilles; Leconte, Pierre; Sano, Tadafumi; Kellett, Mark A.; Iwamoto, Osamu; Ignatyuk, Anatoly V.; Cabellos, Oscar; Genreith, Christoph; Harada, Hideo
2017-09-01
In the framework of the OECD/NEA WPEC subgroup 41, ways to improve neutron induced capture cross sections for 241Am and 237Np are being sought. Decay data, energy dependent cross section data and neutron spectrum averaged data are important for that purpose and were investigated. New time-of-flight measurements were performed and analyzed, and considerable effort was put into development of methods for analysis of spectrum averaged data and re-analysis of existing experimental data.
Paul, Rick L
2011-01-01
Radiochemical neutron activation analysis (RNAA) with retention on hydrated manganese dioxide (HMD) has played a key role in the certification of As in biological materials at NIST. Although this method provides very high and reproducible yields and detection limits at low microgram/kilogram levels, counting-geometry uncertainties may arise from unequal distribution of As in the HMD, and arsenic detection limits may not be optimal due to significant retention of other elements. An alternate RNAA procedure with separation of arsenic by solvent extraction has been investigated. After digestion of samples in nitric and perchloric acids, As(III) is extracted from 2 M sulfuric acid solution into a solution of zinc diethyldithiocarbamate in chloroform. Counting of 76As allows quantitation of arsenic. Addition of a 77As tracer solution prior to dissolution allows correction for chemical yield and counting geometries, further improving reproducibility. The HMD and solvent extraction procedures for arsenic were compared through analysis of SRMs 1577c (bovine liver), 1547 (peach leaves), and 1575a (pine needles). Both methods gave As results in agreement with certified values with comparable reproducibility. However, the solvent extraction method yields a factor of 3 improvement in detection limits and is less time-consuming than the HMD method. The new method shows great promise for use in As certification in reference materials.
NASA Astrophysics Data System (ADS)
Liu, Ping; Qi, Chu-Bo; Zhu, Quan-Fei; Yuan, Bi-Feng; Feng, Yu-Qi
2016-02-01
Precursor ion scan and multiple reaction monitoring (MRM) scan are two typical scan modes in mass spectrometry analysis. Here, we developed a strategy combining stable isotope labeling (IL) with liquid chromatography-mass spectrometry (LC-MS) under double precursor ion scan (DPI) and MRM for the analysis of thiols in urine from patients with five types of cancer. First, the IL-LC-DPI-MS method was applied to non-targeted profiling of thiols from the cancer samples. Compared with the traditional full-scan mode, the DPI method significantly improved identification selectivity and accuracy; 103 thiol candidates were discovered across all cancers and 6 thiols were identified against their standards. Notably, pantetheine was identified in human urine for the first time. Second, the IL-LC-MRM-MS method was developed for relative quantification of thiols in cancers compared with healthy controls. All the MRM transitions of light- and heavy-labeled thiols were acquired from urine using the DPI method. Compared with the DPI method, the sensitivity of MRM improved 2.1- to 11.3-fold. In addition, the concentrations of homocysteine, γ-glutamylcysteine and pantetheine were more than twofold higher in cancer patients than in healthy controls. Taken together, the method is a promising strategy for identification and comprehensive quantification of thiols in human urine.
Synthesized airfoil data method for prediction of dynamic stall and unsteady airloads
NASA Technical Reports Server (NTRS)
Gangwani, S. T.
1983-01-01
A detailed analysis of dynamic stall experiments has led to a set of relatively compact analytical expressions, called synthesized unsteady airfoil data, which accurately describe in the time-domain the unsteady aerodynamic characteristics of stalled airfoils. An analytical research program was conducted to expand and improve this synthesized unsteady airfoil data method using additional available sets of unsteady airfoil data. The primary objectives were to reduce these data to synthesized form for use in rotor airload prediction analyses and to generalize the results. Unsteady drag data were synthesized which provided the basis for successful expansion of the formulation to include computation of the unsteady pressure drag of airfoils and rotor blades. Also, an improved prediction model for airfoil flow reattachment was incorporated in the method. Application of this improved unsteady aerodynamics model has resulted in an improved correlation between analytic predictions and measured full scale helicopter blade loads and stress data.
Improved brain tumor segmentation by utilizing tumor growth model in longitudinal brain MRI
NASA Astrophysics Data System (ADS)
Pei, Linmin; Reza, Syed M. S.; Li, Wei; Davatzikos, Christos; Iftekharuddin, Khan M.
2017-03-01
In this work, we propose a novel method to improve texture-based tumor segmentation by fusing cell density patterns that are generated from tumor growth modeling. To model tumor growth, we solve the reaction-diffusion equation using the Lattice-Boltzmann method (LBM). Computational tumor growth modeling obtains the cell density distribution that potentially indicates the predicted tissue locations in the brain over time. The density patterns are then considered as novel features, along with texture features (such as fractal and multifractal Brownian motion (mBm)) and intensity features in MRI, for improved brain tumor segmentation. We evaluate the proposed method on about one hundred longitudinal MRI scans from five patients obtained from the public BRATS 2015 data set, validated against the ground truth. The results show significant improvement of complete tumor segmentation, by ANOVA analysis, for the five patients in longitudinal MR images.
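The reaction-diffusion growth model can be sketched with a plain explicit finite-difference scheme instead of the Lattice-Boltzmann solver used in the paper. The 1-D Fisher-KPP sketch below, with purely illustrative parameters, shows how a seeded cell density spreads into a feature map:

```python
def grow_tumor_density(n=50, steps=300, d=0.1, rho=0.5, dt=0.1):
    """1-D reaction-diffusion (Fisher-KPP) model du/dt = d*u_xx + rho*u*(1-u),
    solved with explicit finite differences as a simple stand-in for the
    Lattice-Boltzmann solver used in the paper (parameters illustrative;
    grid spacing taken as 1, endpoints held at zero density)."""
    u = [0.0] * n
    u[n // 2] = 0.5                       # seed cell density at the centre
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, n - 1):
            lap = u[i - 1] - 2 * u[i] + u[i + 1]
            nxt[i] = u[i] + dt * (d * lap + rho * u[i] * (1 - u[i]))
        u = nxt
    return u

density = grow_tumor_density()
# `density` encodes the predicted spatial spread of tumour cells; in the
# paper's pipeline such a map would join the texture and intensity features
```

The chosen time step satisfies the explicit-scheme stability bound (dt*d below 0.5 for unit spacing), so the density stays in [0, 1] and forms the expected traveling front.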
Research on assessment and improvement method of remote sensing image reconstruction
NASA Astrophysics Data System (ADS)
Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping
2018-01-01
Remote sensing image quality assessment and improvement is an important part of image processing. The use of compressive sampling theory in a remote sensing imaging system can compress images while sampling, improving efficiency. In this paper, a two-dimensional principal component analysis (2DPCA) method is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; it preserves the useful image information while suppressing noise. The factors influencing remote sensing image quality are then analyzed, and evaluation parameters for quantitative assessment are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results are consistent with human visual perception and that the proposed method has good application value in the field of remote sensing image processing.
NASA Astrophysics Data System (ADS)
Lin, Wei; Li, Xizhe; Yang, Zhengming; Lin, Lijun; Xiong, Shengchun; Wang, Zhiyuan; Wang, Xiangyang; Xiao, Qianhua
Based on the basic principle of the porosity method in image segmentation, and considering the relationship between the porosity of rocks and the fractal characteristics of pore structures, a new improved image segmentation method is proposed that uses the calculated porosity of each core image as a constraint to obtain the best threshold. Comparative analysis shows that the porosity method is theoretically the best way to segment such images, but its actual segmentation results deviate from the real situation. Because of core heterogeneity and isolated pores, the porosity method, which takes the experimentally measured porosity of the whole core as its criterion, cannot achieve the desired segmentation. The new improved method overcomes this shortcoming and produces a more reasonable binary segmentation of the core grayscale images by segmenting each image according to its own calculated porosity. Moreover, basing segmentation on calculated rather than measured porosity also saves considerable manpower and material resources, especially for tight rocks.
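The porosity-constrained thresholding idea can be sketched as below: pick the gray-level threshold so that the pore (dark) fraction of each image matches that image's own calculated porosity. The quantile-based threshold search and the synthetic image are assumptions for illustration; the paper's actual search procedure may differ.

```python
import numpy as np

def porosity_threshold(image, porosity):
    """Choose the binarisation threshold so that the pore (dark) pixel
    fraction of THIS image matches its own calculated porosity,
    instead of applying one bulk-measured porosity to every image."""
    t = np.quantile(image.ravel(), porosity)
    pores = image <= t
    return t, pores

rng = np.random.default_rng(1)
img = rng.random((128, 128))          # stand-in grayscale core image
t, pores = porosity_threshold(img, 0.15)
```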
Improved first-order uncertainty method for water-quality modeling
Melching, C.S.; Anmangandla, S.
1992-01-01
Uncertainties are unavoidable in water-quality modeling and subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have frequently been used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: Monte Carlo simulation's is mainly computational time; first-order analysis's are mainly questions of accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distributions of the critical dissolved-oxygen deficit and critical dissolved oxygen, using two hypothetical examples from the literature. It closely approximates the exceedance probabilities estimated by Monte Carlo simulation for the Streeter-Phelps model output while using two orders of magnitude less computer time, regardless of the probability distributions assumed for the uncertain model parameters.
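The baseline method that the advanced variant improves upon can be sketched for the Streeter-Phelps deficit: linearize the model at the parameter means and propagate variances through the gradient. The parameter means and variances below are invented for illustration, and the sketch shows only the ordinary mean-value linearization, not the relinearization-per-exceedance-level step of the advanced method.

```python
import numpy as np

def sp_deficit(t, L0, kd, ka, D0=0.0):
    """Streeter-Phelps dissolved-oxygen deficit at travel time t."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)

def first_order_moments(t, means, variances, h=1e-6):
    """Mean-value first-order uncertainty analysis: linearise at the
    parameter means (numerical gradient) and propagate the variances.
    The 'advanced' method of the abstract instead moves the
    linearisation point to match each sought exceedance level."""
    means = np.asarray(means, dtype=float)
    mu = sp_deficit(t, *means)
    grad = np.empty_like(means)
    for i in range(means.size):
        p = means.copy()
        p[i] += h
        grad[i] = (sp_deficit(t, *p) - mu) / h
    return mu, float(np.sum(grad**2 * np.asarray(variances)))

# illustrative values: L0 = 10 mg/L, kd = 0.3/day, ka = 0.6/day
mu, var = first_order_moments(2.0, [10.0, 0.3, 0.6], [1.0, 0.0025, 0.0025])
```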
Wijetunge, Chalini D; Saeed, Isaam; Boughton, Berin A; Spraggins, Jeffrey M; Caprioli, Richard M; Bacic, Antony; Roessner, Ute; Halgamuge, Saman K
2015-10-01
Matrix Assisted Laser Desorption Ionization-Imaging Mass Spectrometry (MALDI-IMS) in 'omics' data acquisition generates detailed information about the spatial distribution of molecules in a given biological sample. Various data processing methods have been developed for exploring the resultant high-volume data. However, most of these methods process data in the spectral domain and do not make the most of the important spatial information available through this technology. We therefore propose a novel streamlined data analysis pipeline specifically developed for MALDI-IMS data that exploits this spatial information to identify hidden significant molecular distribution patterns in these complex datasets. The proposed unsupervised algorithm uses Sliding Window Normalization (SWN) and a new spatial-distribution-based peak picking method built on gray-level co-occurrence (GCO) matrices, followed by clustering of biomolecules. We also use gist descriptors and an improved version of GCO matrices to extract features from molecular images, and minimum medoid distance to automatically estimate the number of possible groups. We evaluated our algorithm using a new MALDI-IMS metabolomics dataset of a plant (Eucalypt) leaf. The algorithm revealed hidden significant molecular distribution patterns in the dataset, which the current Component Analysis and Segmentation Map based approaches failed to extract. We further demonstrate the performance of our peak picking method over other traditional approaches by using a publicly available MALDI-IMS proteomics dataset of a rat brain. Although SWN did not show any significant improvement compared with using no normalization, the visual assessment showed an improvement compared to using median normalization. The source code and sample data are freely available at http://exims.sourceforge.net/. Supplementary data are available at Bioinformatics online.
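The gray-level co-occurrence matrix underlying the peak-picking features above can be computed as follows. The quantization to 8 levels and the single (dx, dy) offset are illustrative choices, not the pipeline's actual settings.

```python
import numpy as np

def gco_matrix(img, levels=8, dx=1, dy=0):
    """Normalised gray-level co-occurrence matrix for one pixel offset
    (dx, dy): entry (a, b) is the relative frequency with which gray
    level a co-occurs with gray level b at that offset."""
    scale = levels / (float(img.max()) + 1e-12)
    q = np.clip((img * scale).astype(int), 0, levels - 1)
    M = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h - dy):
        for j in range(w - dx):
            M[q[i, j], q[i + dy, j + dx]] += 1
    return M / M.sum()

# a uniform molecular image co-occurs only with itself
M = gco_matrix(np.ones((4, 4)), levels=8)
```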
Apekey, Tanefa A; McSorley, Gerry; Tilling, Michelle; Siriwardena, A Niroshan
2011-04-01
Leadership and innovation are currently seen as essential elements for the development and maintenance of high-quality care. Little is known about the relationship between leadership and a culture of innovation, or about the extent to which quality improvement methods are used in general practice. This study aimed to assess the relationship between leadership behaviour, culture of innovation and adoption of quality improvement methods in general practice. Self-administered postal questionnaires were sent to general practitioner quality improvement leads in one county in the UK between June and December 2007. The questionnaire consisted of background information, a 12-item scale to assess leadership behaviour, a seven-dimension self-rating scale for culture of innovation and questions on current use of quality improvement tools and techniques. Sixty-three completed questionnaires (62%) were returned. Leadership behaviours were not commonly reported. Most practices reported a positive culture of innovation, scoring most strongly on relationships, followed by targets and information, but lower on the dimensions of rewards, risk and resources. There was a significant positive correlation between leadership behaviour and culture of innovation (r = 0.57; P < 0.001). Apart from clinical audit and significant event analysis, quality improvement methods were not adopted by most participating practices. Leadership behaviours were infrequently reported, and this was associated with a limited culture of innovation in participating general practices. There was little use of quality improvement methods beyond clinical and significant event audit. Practices need support to enhance leadership skills, encourage innovation and develop quality improvement skills if improvements in health care are to accelerate.
Effect of the absolute statistic on gene-sampling gene-set analysis methods.
Nam, Dougu
2017-06-01
Gene-set enrichment analysis and its modified versions have commonly been used to identify altered functions or pathways in disease from microarray data. In particular, simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is its highly inflated false-positive rate. In this paper, the effect of the absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary device for capturing bidirectional changes within a gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves overall discriminatory ability. The effect was assessed by power, false-positive rate, and receiver operating characteristic curves for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.
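A gene-sampling test with an absolute gene statistic can be sketched as below. The specific statistic (mean |score|), permutation count and simulated data are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def gene_sampling_pvalue(gene_scores, set_idx, absolute=True,
                         n_perm=2000, seed=0):
    """Gene-sampling gene-set test: compare the mean (absolute) gene
    score of the set against means of randomly sampled gene sets of
    the same size.  Using |score| captures bidirectional changes and,
    per the abstract, tempers the inflated false-positive rate of the
    signed statistic."""
    rng = np.random.default_rng(seed)
    s = np.abs(np.asarray(gene_scores)) if absolute else np.asarray(gene_scores)
    observed = s[set_idx].mean()
    k = len(set_idx)
    null = np.array([s[rng.choice(s.size, k, replace=False)].mean()
                     for _ in range(n_perm)])
    return (1 + np.sum(null >= observed)) / (1 + n_perm)

rng = np.random.default_rng(1)
scores = rng.normal(size=1000)
scores[:20] += 3.0                       # one genuinely altered gene set
p_alt = gene_sampling_pvalue(scores, np.arange(20))
p_null = gene_sampling_pvalue(scores, np.arange(500, 520))
```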
The Effect of Laminar Flow on Rotor Hover Performance
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Martin, Preston B.
2017-01-01
The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element and momentum method coupled to an airfoil analysis method that includes the full e(sup N) transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended as a guide for design studies and as a benchmark for comparison with higher-fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison with other approaches.
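The momentum-theory bound behind the disk-loading discussion can be written down directly. This is textbook hover momentum theory, not the paper's blade-element code; the sea-level density and the example disk loading and figure of merit are assumptions.

```python
import math

def ideal_power_loading(disk_loading, rho=0.002377):
    """Momentum-theory (zero profile drag) upper bound on hover power
    loading: PL = T / P_ideal = sqrt(2 * rho / DL).  With DL in lb/ft^2
    and rho in slug/ft^3 this gives s/ft; multiply by 550 for lb/hp."""
    return math.sqrt(2.0 * rho / disk_loading)

def power_loading(disk_loading, figure_of_merit, rho=0.002377):
    """Actual power loading: the figure of merit scales the ideal value."""
    return figure_of_merit * ideal_power_loading(disk_loading, rho)

# Power loading falls as disk loading rises; profile power is a larger
# share of the total at low disk loading, which is consistent with the
# abstract's point that laminar flow matters most there.
pl_hp = 550.0 * power_loading(10.0, 0.75)   # lb/hp at DL = 10 psf, FM = 0.75
```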
Wójcicki, Tomasz; Nowicki, Michał
2016-01-01
The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparing nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving the quality of the images, segmentation, morphological transformations and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. The validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed.
Multiscale Analysis of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C. A.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with a larger amount of more complex data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for the analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are potentially suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so they could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
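One level of the 2-d wavelet transform mentioned above can be sketched with the Haar basis. This is a generic textbook illustration, not the authors' analysis code.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar wavelet transform: average/difference
    along rows, then along columns, giving the approximation band (ll)
    and the horizontal/vertical/diagonal detail bands (lh, hl, hh)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

# a flat "image" puts all of its energy in the approximation band
flat = np.full((8, 8), 5.0)
ll, lh, hl, hh = haar2d(flat)
```

Applying the decomposition recursively to `ll` yields the multiresolution pyramid used to examine solar features scale by scale.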
Method for combined biometric and chemical analysis of human fingerprints.
Staymates, Jessica L; Orandi, Shahram; Staymates, Matthew E; Gillen, Greg
This paper describes a method for combining direct chemical analysis of latent fingerprints with subsequent biometric analysis within a single sample. The method described here uses ion mobility spectrometry (IMS) as a chemical detection method for explosives and narcotics trace contamination. A collection swab coated with a high-temperature adhesive has been developed to lift latent fingerprints from various surfaces. The swab is then directly inserted into an IMS instrument for a quick chemical analysis. After the IMS analysis, the lifted print remains intact for subsequent biometric scanning and analysis using matching algorithms. Several samples of explosive-laden fingerprints were successfully lifted and the explosives detected with IMS. Following explosive detection, the lifted fingerprints remained of sufficient quality for positive match scores using a prepared gallery consisting of 60 fingerprints. Based on our results (n = 1200), there was no significant decrease in the quality of the lifted print post IMS analysis. In fact, for a small subset of lifted prints, the quality was improved after IMS analysis. The described method can be readily applied to domestic criminal investigations, transportation security, terrorist and bombing threats, and military in-theatre settings.
Zhang, Feng; Liao, Xiangke; Peng, Shaoliang; Cui, Yingbo; Wang, Bingqiang; Zhu, Xiaoqian; Liu, Jie
2016-06-01
The de novo assembly of DNA sequences is increasingly important for biological research in the genomic era. More than a decade after the Human Genome Project, some challenges still remain, and new solutions are being explored to improve the de novo assembly of genomes. The string graph assembler (SGA), based on string graph theory, is a new method/tool developed to address these challenges. In this paper, based on an in-depth analysis of SGA, we prove that SGA-based sequence de novo assembly is an NP-complete problem. According to our analysis, SGA outperforms other similar methods/tools in memory consumption but costs much more time, of which 60-70% is spent on index construction. Building on this analysis, we introduce a hybrid parallel optimization algorithm and implement it within the TianHe-2 parallel framework. Simulations are performed with different datasets: for small datasets the optimized solution is 3.06 times faster than before, and for middle-sized datasets 1.60 times. The results demonstrate an evident performance improvement with linear scalability for parallel FM-index construction, contributing significantly to the efficiency of de novo assembly of DNA sequences.
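The index construction that dominates SGA's runtime can be illustrated with the textbook (serial, quadratic) BWT/FM-index construction and backward search; SGA's own construction, and the parallel TianHe-2 version above, are far more sophisticated.

```python
def bwt_index(text):
    """Burrows-Wheeler transform plus the C table of an FM-index,
    via the sort-all-rotations construction (O(n^2 log n)) -- the
    kind of cost that motivates parallel index construction."""
    s = text + '$'
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    bwt = ''.join(r[-1] for r in rotations)
    C, total = {}, 0
    for c in sorted(set(s)):
        C[c] = total                  # chars strictly smaller than c
        total += s.count(c)
    return bwt, C

def count_occurrences(bwt, C, pattern):
    """FM-index backward search (naive rank via slicing)."""
    lo, hi = 0, len(bwt)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + bwt[:lo].count(c)
        hi = C[c] + bwt[:hi].count(c)
        if lo >= hi:
            return 0
    return hi - lo

bwt, C = bwt_index('banana')
```

Once the index is built, each pattern query touches only the BWT, which is why assemblers pay the one-time construction cost up front.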
NASA Astrophysics Data System (ADS)
Iwaki, Y.
2010-07-01
The quality assurance (QA) of measurands has been discussed for many years within quality engineering (QE); further discussion in relation to the ISO standard is still needed. Improving measurement accuracy means finding and removing the root fault elements. Accuracy assurance requires investigating the reference material (RM) used for calibration and improving the accuracy of the data processing. This research addresses accuracy improvement in the data-processing stage. In many cases, more than one fault element relevant to measurement accuracy lies buried in the data. QE assumes the frequency of occurrence of fault states and resolves the fault factors, beginning with the highest ranked, by Failure Mode and Effects Analysis (FMEA). QE then investigates the root causes of the fault elements by Root Cause Analysis (RCA) and Fault Tree Analysis (FTA), and orders the elements generating an assumed specific fault. Today, accuracy assurance of measurement results has become a duty in proficiency testing (PT). For QA, the ISO Guide to the Expression of Uncertainty in Measurement (ISO-GUM) was issued as guidance on accuracy assurance in 1993 [1]. The analysis method of ISO-GUM shifts from analysis of variance (ANOVA) to exploratory data analysis (EDA). EDA calculates step by step, following the law of propagation of uncertainty, until an assured performance is obtained. If the true value is unknown, ISO-GUM substitutes a reference value; a reference value is set up by EDA and checked with a key comparison (KC) method, which compares a null hypothesis against a frequency hypothesis. Assurance under ISO-GUM proceeds from the standard uncertainty, through the combined uncertainty of the many fault elements, to an expanded uncertainty for assurance. The assurance value is authorized by multiplying the final expanded uncertainty [2] by the coverage factor k.
The k-value is calculated from the effective degrees of freedom, for which the number of samples is important. The degrees of freedom are based on the maximum-likelihood method of an improved information criterion (AIC) for quality control (QC). The assurance performance of ISO-GUM is determined by setting a confidence interval [3]. The results of research on the decision level and minimum detectable concentration (DL/MDC) benefited from this operation. QE was developed for industrial QC, where processing has relied on regression analysis treating the frequency distribution of a statistic as normal. The occurrence probability of statistics for fault elements accompanying natural phenomena, however, is often non-normal. Non-normal distributions require the assurance value to be obtained by methods other than the Type B statistical treatment of ISO-GUM. Combining these approaches with the improvement of workers' skills through QE has become important for securing the reliability of measurement accuracy and safety. This research applied these results to blood chemical analysis (BCA) in the field of clinical testing.
Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques
Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh
2016-01-01
Background: Liver ultrasound images are common and are often used to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, is to improve the contrast and quality of liver ultrasound images. Methods: In this study, a number of fuzzy-logic-based image contrast enhancement algorithms were applied in Matlab 2013b to liver ultrasound images in which the kidney is visible, in order to improve image contrast and quality: contrast improvement using a fuzzy intensification operator, contrast improvement using fuzzy image histogram hyperbolization, and contrast improvement using fuzzy IF-THEN rules. Results: By the Mean Squared Error and Peak Signal to Noise Ratio measured on different images, the fuzzy methods provided better results; compared with the histogram equalization method, their implementation improved both the contrast and visual quality of the images and the results of liver segmentation algorithms. Conclusion: Comparison of the four algorithms revealed the power of fuzzy logic in improving image contrast relative to traditional image processing algorithms. Moreover, the contrast improvement algorithm based on the fuzzy intensification operator was the strongest algorithm on the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and other image processing and analysis applications.
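The fuzzy intensification (INT) operator named in the conclusion can be sketched as below. The min-max membership mapping and single-pass setting are common textbook choices and may differ from the study's Matlab implementation.

```python
import numpy as np

def fuzzy_intensify(image, passes=1):
    """Contrast enhancement with the fuzzy intensification (INT)
    operator: map intensities to [0, 1] memberships, push memberships
    away from the crossover point 0.5, and map back to gray levels."""
    lo, hi = float(image.min()), float(image.max())
    mu = (image.astype(float) - lo) / (hi - lo)
    for _ in range(passes):
        mu = np.where(mu <= 0.5, 2.0 * mu**2, 1.0 - 2.0 * (1.0 - mu)**2)
    return mu * (hi - lo) + lo

img = np.array([[10.0, 100.0], [150.0, 240.0]])   # toy gray levels
out = fuzzy_intensify(img)
```

Dark pixels are pushed darker and bright pixels brighter, while the extreme gray levels are preserved; repeated passes intensify the effect.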
Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2008-01-01
Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…
Primordial power spectrum: a complete analysis with the WMAP nine-year data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun, E-mail: dhiraj@apctp.org, E-mail: arman@apctp.org, E-mail: tarun@iucaa.ernet.in
2013-07-01
We have further improved the error-sensitive Richardson-Lucy deconvolution algorithm, making it applicable directly to the un-binned measured angular power spectrum of Cosmic Microwave Background observations to reconstruct the form of the primordial power spectrum. This improvement makes the application of the method significantly more straightforward by removing some intermediate stages of analysis, allowing a reconstruction of the primordial spectrum with higher efficiency and precision and lower computational expense. Applying the modified algorithm, we fit the WMAP 9-year data using the optimized reconstructed form of the primordial spectrum, with an improvement of more than 300 in χ²_eff with respect to the best-fit power law. This is clearly beyond the reach of other alternative approaches and reflects the efficiency of the proposed method in the reconstruction process, allowing us to look for any possible feature in the primordial spectrum projected in the CMB data. Though the proposed method allows us to examine various possibilities for the form of the primordial spectrum, all fitting the data well, proper error analysis is needed to test the consistency of theoretical models since, along with possible physical artefacts, most of the features in the reconstructed spectrum might arise from fitting noise in the CMB data. The reconstructed error band for the form of the primordial spectrum, obtained from many bootstrapped realizations of the WMAP 9-year data, shows proper consistency of the power-law form of the primordial spectrum with the WMAP 9 data at all wave numbers. Including WMAP polarization data in the analysis did not improve our results much, owing to its low quality, but we expect Planck data to allow a full analysis of CMB observations on both temperature and polarization, separately and in combination.
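The Richardson-Lucy scheme at the heart of the method can be sketched for a generic non-negative linear model d = A u. This toy version omits the paper's error-sensitive modifications and runs on a small synthetic smoothing kernel rather than the CMB radiative transport kernel.

```python
import numpy as np

def richardson_lucy(d, A, n_iter=300):
    """Richardson-Lucy deconvolution for a non-negative linear model
    d = A @ u: multiplicative updates that keep u non-negative while
    driving the forward projection A @ u toward the data d."""
    u = np.ones(A.shape[1])
    col_sums = A.sum(axis=0)
    for _ in range(n_iter):
        ratio = d / (A @ u + 1e-12)
        u *= (A.T @ ratio) / col_sums
    return u

# synthetic smoothing kernel and a smooth positive "spectrum"
n = 30
i = np.arange(n)
A = np.exp(-0.5 * (i[:, None] - i[None, :])**2)
A /= A.sum(axis=1, keepdims=True)
u_true = 0.2 + np.exp(-0.5 * ((i - 12) / 3.0)**2)
d = A @ u_true
u = richardson_lucy(d, A)
```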
Hoffmann, Susanne; Dreher-Hummel, Thomas; Dollinger, Claudia; Frei, Irena Anna
2018-04-01
Background: Many hospitals have defined procedures for complaint management. A systematic analysis of patient complaints helps to identify similar complaints and patterns so that targeted improvement measures can be derived (Gallagher & Mazor, 2015). Aim: Our three-month, nurse-led practice development project aimed 1) to identify complaints regarding communication issues, 2) to systematise and prioritise complaints regarding communication issues, and 3) to derive clinic-specific recommendations for improvement. Method: We analysed 273 patient complaints documented by the quality management department (secondary data analysis). Using content analysis and applying the coding taxonomy for inpatient complaints by Reader, Gillespie and Roberts (2014), we distinguished the communication-related complaints. By further inductive differentiation of these complaints, we identified patterns and prioritised fields of action. Results: We identified 186 communication-related complaints divided into 16 subcategories. For each subcategory, improvement interventions were derived, discussed and prioritised. Conclusions: Patient complaints thus provided an excellent opportunity for reflection and workplace learning for nurses. The analysis gave impetus to making the subject “person-centered care” concrete for nurses.
System parameter identification from projection of inverse analysis
NASA Astrophysics Data System (ADS)
Liu, K.; Law, S. S.; Zhu, X. Q.
2017-05-01
The output of a system due to a change of its parameters is often approximated with the sensitivity matrix from the first-order Taylor series. The system output can be measured in practice, but the perturbation in the system parameters is usually not available. Inverse sensitivity analysis can be adopted to estimate the unknown system parameter perturbation from the difference between the observed output data and the corresponding analytical output data calculated from the original system model. Inverse sensitivity analysis is revisited in this paper with improvements based on principal component analysis of the analytical data calculated from the known system model. The identification equation is projected into a subspace of principal components of the system output, and the sensitivity of the inverse analysis is improved with an iterative model updating procedure. The proposed method is numerically validated with a planar truss structure and with dynamic experiments on a seven-storey planar steel frame. Results show that it is robust to measurement noise, and that the location and extent of stiffness perturbation can be identified with better accuracy than with the conventional response sensitivity-based method.
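One iteration of the PCA-projected inverse-sensitivity update can be sketched as follows. The ensemble used to build the principal components, the matrix sizes and the true perturbation are synthetic assumptions; the paper's iterative model updating wraps many such steps.

```python
import numpy as np

def projected_update(S, residual, output_ensemble, n_pc=4):
    """One inverse-sensitivity step projected onto the principal
    components of an ensemble of analytical outputs: solve
    P.T @ S @ dtheta = P.T @ residual by least squares, so that
    measurement noise outside the dominant response patterns is
    filtered out."""
    B = output_ensemble - output_ensemble.mean(axis=0)
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    P = Vt[:n_pc].T                          # (n_outputs, n_pc)
    dtheta, *_ = np.linalg.lstsq(P.T @ S, P.T @ residual, rcond=None)
    return dtheta

rng = np.random.default_rng(0)
S = rng.normal(size=(50, 4))                 # sensitivity matrix
ensemble = rng.normal(size=(30, 4)) @ S.T    # outputs spanning range(S)
true_dtheta = np.array([0.1, -0.2, 0.05, 0.0])
residual = S @ true_dtheta                   # noise-free observation diff
dtheta = projected_update(S, residual, ensemble)
```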
How mental health nurses improve their critical thinking through problem-based learning.
Hung, Tsui-Mei; Tang, Lee-Chun; Ko, Chen-Ju
2015-01-01
Critical thinking has been regarded as one of the most important elements for nurses in improving the quality of patient care. The aim of this study was to use problem-based learning (PBL) in a continuing education program and to evaluate nurses' critical thinking skills. A quasi-experimental study design was carried out. The Chinese version of the "Critical Thinking Disposition Inventory" was used for data collection. The results indicated significant improvement after PBL continuing education, notably in the dimensions of systematic analysis and curiosity. Content analysis extracted four themes: (a) changes in the linear thinking required, (b) improved performance in logical and systematic thinking, (c) integration of prior knowledge with clinical application, and (d) brainstorming as a learning strategy. The study supports PBL as a continuing education strategy for mental health nurses, one in which systematic analysis and curiosity effectively facilitate the development of critical thinking.
Improved analysis of ground vibrations produced by man-made sources.
Ainalis, Daniel; Ducarne, Loïc; Kaufmann, Olivier; Tshibangu, Jean-Pierre; Verlinden, Olivier; Kouroussis, Georges
2018-03-01
Man-made sources of ground vibration must be carefully monitored in urban areas in order to ensure that structural damage and discomfort to residents is prevented or minimised. The research presented in this paper provides a comparative evaluation of various methods used to analyse a series of tri-axial ground vibration measurements generated by rail, road, and explosive blasting. The first part of the study is focused on comparing various techniques to estimate the dominant frequency, including time-frequency analysis. The comparative evaluation of the various methods to estimate the dominant frequency revealed that, depending on the method used, there can be significant variation in the estimates obtained. A new and improved analysis approach using the continuous wavelet transform was also presented, using the time-frequency distribution to estimate the localised dominant frequency and peak particle velocity. The technique can be used to accurately identify the level and frequency content of a ground vibration signal as it varies with time, and identify the number of times the threshold limits of damage are exceeded.
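Tracking a localised dominant frequency from a time-frequency distribution can be sketched with a simplified complex-Morlet CWT: correlate the signal with wavelets at each trial frequency and take, at every instant, the frequency of maximum magnitude. The wavelet width rule, frequency grid and test signal are illustrative assumptions, not the paper's exact transform.

```python
import numpy as np

def dominant_frequency(signal, fs, freqs):
    """Localised dominant frequency from a simplified complex-Morlet
    CWT: at each sample, the trial frequency whose wavelet response
    has the largest magnitude."""
    mags = []
    for f in freqs:
        sigma = 1.0 / f                       # envelope width ~ one period
        half = int(3 * sigma * fs)
        tw = np.arange(-half, half + 1) / fs
        w = (np.exp(-tw**2 / (2 * sigma**2)) *
             np.exp(2j * np.pi * f * tw)) / (sigma * fs)
        mags.append(np.abs(np.convolve(signal, np.conj(w), mode='same')))
    mags = np.array(mags)                     # (n_freqs, n_samples)
    return np.asarray(freqs)[np.argmax(mags, axis=0)]

# a test record whose dominant frequency jumps from 20 Hz to 60 Hz
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.where(t < 0.5, np.sin(2 * np.pi * 20 * t), np.sin(2 * np.pi * 60 * t))
dom = dominant_frequency(x, fs, [10.0, 20.0, 40.0, 60.0, 80.0])
```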
Improved analysis of palm creases
Park, Jin Seo; Shin, Dong Sun; Jung, Wonsug
2010-01-01
Palm creases are helpful in revealing anthropologic characteristics and diagnosing chromosomal aberrations, and have been analyzed both qualitatively and quantitatively. However, previous methods of analyzing palm creases were not objective, so reproducibility could not be guaranteed. In this study, a more objective morphologic analysis of palm creases was developed. The features of the improved method include the strict definition of major and minor palm creases and the systematic classification of major palm creases based on their relationships, branches, and variants. Furthermore, based on the analysis of 3,216 Koreans, palm creases were interpreted anthropologically. There was a tendency for palm creases to be evenly distributed on the palm, which was supported by the relationship between major and minor creases as well as by the incidences of major crease types. This tendency is consistent with the role of palm creases in facilitating the folding of palm skin. The union of major palm creases was more frequent in males and in right palms, consistent with a powerful hand grip. The new method of analyzing palm creases is expected to be widely used for anthropologic investigation and chromosomal diagnosis.
Laurila, J; Standertskjöld-Nordenstam, C G; Suramo, I; Tolppanen, E M; Tervonen, O; Korhola, O; Brommels, M
2001-01-01
To study the efficacy of continuous quality improvement (CQI) compared to ordinary management in an on-duty radiology department. Because of complaints regarding the delivery of on-duty radiological services, an improvement project was initiated simultaneously at two hospitals: at the HUCH (Helsinki University Central Hospital) utilising the CQI method, and at the OUH (Oulu University Hospital) with a traditional management process. For the CQI project, a team was formed to evaluate the process with flow charts, cause-and-effect diagrams, Pareto analysis and control charts. Interventions to improve the process were based on the results of these analyses. The team at the HUCH implemented the following changes: a radiologist was added to the evening shift between 15:00 and 22:00, and a radiographer was moved from the morning shift to 15:00-22:00. A clear improvement was achieved in the turn-around time, but some of the gains were lost during follow-up. Only minimal changes were achieved at the OUH, where the intervention was based on traditional management processes. CQI was an effective method for improving the performance of a radiology department compared with ordinary management methods, but some of this improvement may subsequently be lost without a continuous measurement system.
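The control charts used by the CQI team can be illustrated with the standard individuals-chart limits (centre line ± 2.66 × average moving range, the usual 3-sigma rule with d2 = 1.128 for moving ranges of two). The sample turn-around times below are invented for illustration.

```python
import numpy as np

def individuals_chart_limits(samples):
    """Shewhart individuals-chart limits: centre line at the mean and
    3-sigma limits estimated from the average moving range
    (sigma ~= MR-bar / d2 with d2 = 1.128 for n = 2, giving the
    familiar centre +/- 2.66 * MR-bar)."""
    x = np.asarray(samples, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()
    centre = x.mean()
    half_width = 2.66 * mr_bar
    return centre - half_width, centre, centre + half_width

# hypothetical daily turn-around times (minutes) for an on-duty shift
times = [42.0, 38.0, 45.0, 40.0, 44.0, 39.0, 41.0, 43.0]
lcl, cl, ucl = individuals_chart_limits(times)
```

Points falling outside the limits (or systematic runs inside them) would signal that the process has changed, which is how sustained gains, or their loss at follow-up, are detected.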
Improving designer productivity [artificial intelligence]
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting these challenges.
NASA Astrophysics Data System (ADS)
Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun
2017-12-01
For laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part for the LIBS data preprocessing. As the widely existing cases, the phenomenon of baseline drift is generated by the fluctuation of laser energy, inhomogeneity of sample surfaces and the background noise, which has aroused the interest of many researchers. Most of the prevalent algorithms usually need to preset some key parameters, such as the suitable spline function and the fitting order, thus do not have adaptability. Based on the characteristics of LIBS, such as the sparsity of spectral peaks and the low-pass filtered feature of baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technology utilizes convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, asymmetric punish function is conducted to enhance signal-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure the convergence of this algorithm. To validate the proposed method, the concentration analysis of Chromium (Cr),Manganese (Mn) and Nickel (Ni) contained in 23 certified high alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because there is no prior knowledge of sample composition and mathematical hypothesis, compared with other methods, the method proposed in this paper has better accuracy in quantitative analysis, and fully reflects its adaptive ability.