Sample records for obtaining reliable predictions

  1. Reliability of Long-Term Wave Conditions Predicted with Data Sets of Short Duration

    DTIC Science & Technology

    1985-03-01

    the validity and reliability of predicted probable wave heights obtained from data of limited duration. BACKGROUND: The basic steps listed by...interest to perform the analysis outlined in steps 2 to 5, the prediction would only be reliable for up to a 3-year return period. For a 5-year data set...for long-term hindcast data. The data retrieval and analysis program known as the Sea State Engineering Analysis System (SEAS) makes handling of the

  2. Predicting Intent to Get a College Degree.

    ERIC Educational Resources Information Center

    Staats, Sara; Partlo, Christie

    1990-01-01

    Examined reliability and validity of Perceived Quality of Academic Life (PQAL) instrument with data collected from 218 midwestern commuter college students. Extended existing research by studying PQAL scores as predictors of intent to remain in college. Findings showed that the PQAL was reliable, valid, and predictive of future intent to obtain a…

  3. Alberta infant motor scale: reliability and validity when used on preterm infants in Taiwan.

    PubMed

    Jeng, S F; Yau, K I; Chen, L C; Hsiao, S F

    2000-02-01

    The goal of this study was to examine the reliability and validity of measurements obtained with the Alberta Infant Motor Scale (AIMS) for evaluation of preterm infants in Taiwan. Two independent groups of preterm infants were used to investigate the reliability (n=45) and validity (n=41) for the AIMS. In the reliability study, the AIMS was administered to the infants by a physical therapist, and infant performance was videotaped. The performance was then rescored by the same therapist and by 2 other therapists to examine the intrarater and interrater reliability. In the validity study, the AIMS and the Bayley Motor Scale were administered to the infants at 6 and 12 months of age to examine criterion-related validity. Intraclass correlation coefficients (ICCs) for intrarater and interrater reliability of measurements obtained with the AIMS were high (ICC=.97-.99). The AIMS scores correlated with the Bayley Motor Scale scores at 6 and 12 months (r=.78 and .90), although the AIMS scores at 6 months were only moderately predictive of the motor function at 12 months (r=.56). The results suggest that measurements obtained with the AIMS have acceptable reliability and concurrent validity but limited predictive value for evaluating preterm Taiwanese infants.

  4. Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A): Reliability prediction report for module A1 (channels 3 through 15) and module A2 (channels 1 and 2)

    NASA Technical Reports Server (NTRS)

    Geimer, W.

    1995-01-01

    This report documents the final reliability prediction performed on the Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A). The A1 Module contains Channels 3 through 15, and is referred to herein as 'EOS/AMSU-A1'. The A2 Module contains Channels 1 and 2, and is referred to herein as 'EOS/AMSU-A2'. The 'specified' figures were obtained from Aerojet Reports 8897-1 and 9116-1. The predicted reliability figure for the EOS/AMSU-A1 meets the specified value and provides a Mean Time Between Failures (MTBF) of 74,390 hours. The predicted reliability figure for the EOS/AMSU-A2 meets the specified value and provides an MTBF of 193,110 hours.

  5. Height prediction equations for even-aged upland oak stands

    Treesearch

    Donald E. Hilt; Martin E. Dale

    1982-01-01

    Forest growth models that use predicted tree diameters or diameter distributions require a reliable height-prediction model to obtain volume estimates because future height-diameter relationships will not necessarily be the same as the present height-diameter relationship. A total tree height prediction equation for even-aged upland oak stands is presented. Predicted...

  6. Estimation of reliability of predictions and model applicability domain evaluation in the analysis of acute toxicity (LD50).

    PubMed

    Sazonovas, A; Japertas, P; Didziapetris, R

    2010-01-01

    This study presents a new type of acute toxicity (LD(50)) prediction that enables automated assessment of the reliability of predictions (which is synonymous with assessment of the Model Applicability Domain as defined by the Organization for Economic Cooperation and Development). The analysis involved nearly 75,000 compounds from six animal systems (acute rat toxicity after oral and intraperitoneal administration; acute mouse toxicity after oral, intraperitoneal, intravenous, and subcutaneous administration). Fragmental Partial Least Squares (PLS) with 100 bootstraps yielded baseline predictions that were automatically corrected for non-linear effects in local chemical spaces, a combination called the Global, Adjusted Locally According to Similarity (GALAS) modelling methodology. Each prediction obtained in this manner is provided with a reliability index value that depends on both the compound's similarity to the training set (which accounts for similar trends in LD(50) variations within multiple bootstraps) and the consistency of experimental results with regard to the baseline model in the local chemical environment. The actual performance of the Reliability Index (RI) was proven by its good (and uniform) correlations with the Root Mean Square Error (RMSE) in all validation sets, thus providing a quantitative assessment of the Model Applicability Domain. The obtained models can be used for compound screening in the early stages of drug development and for prioritization for experimental in vitro testing or later in vivo animal acute toxicity studies.

  7. Measuring acuity of the approximate number system reliably and validly: the evaluation of an adaptive test procedure

    PubMed Central

    Lindskog, Marcus; Winman, Anders; Juslin, Peter; Poom, Leo

    2013-01-01

    Two studies investigated the reliability and predictive validity of commonly used measures and models of Approximate Number System (ANS) acuity. Study 1 investigated reliability by both an empirical approach and a simulation of the maximum obtainable reliability under ideal conditions. Results showed that common measures of the Weber fraction (w) are reliable only when using a substantial number of trials, even under ideal conditions. Study 2 compared different purported measures of ANS acuity with respect to convergent and predictive validity in a within-subjects design and evaluated an adaptive test using the ZEST algorithm. Results showed that the adaptive measure can reduce the number of trials needed to reach acceptable reliability. Only direct tests with non-symbolic numerosity discriminations of stimuli presented simultaneously were related to arithmetic fluency. This correlation remained when controlling for general cognitive ability and perceptual speed. Further, the purported indirect measure of ANS acuity in terms of the Numeric Distance Effect (NDE) was not reliable and showed no sign of predictive validity. The non-symbolic NDE for reaction time was significantly related to direct w estimates in a direction contrary to that expected. Easier stimuli were found to be more reliable, but only harder (7:8 ratio) stimuli contributed to predictive validity. PMID:23964256

  8. Medicine is not science: guessing the future, predicting the past.

    PubMed

    Miller, Clifford

    2014-12-01

    Irregularity limits human ability to know, understand and predict. A better understanding of irregularity may improve the reliability of knowledge. Irregularity and its consequences for knowledge are considered. Reliable predictive empirical knowledge of the physical world has always been obtained by observation of regularities, without needing science or theory. Prediction from observational knowledge can remain reliable despite some theories based on it proving false. A naïve theory of irregularity is outlined. Reducing irregularity and/or increasing regularity can increase the reliability of knowledge. Beyond long experience and specialization, improvements include implementing supporting knowledge systems of libraries of appropriately classified prior cases and clinical histories and education about expertise, intuition and professional judgement. A consequence of irregularity and complexity is that classical reductionist science cannot provide reliable predictions of the behaviour of complex systems found in nature, including of the human body. Expertise, expert judgement and their exercise appear overarching. Diagnosis involves predicting the past will recur in the current patient applying expertise and intuition from knowledge and experience of previous cases and probabilistic medical theory. Treatment decisions are an educated guess about the future (prognosis). Benefits of the improvements suggested here are likely in fields where paucity of feedback for practitioners limits development of reliable expert diagnostic intuition. Further analysis, definition and classification of irregularity is appropriate. Observing and recording irregularities are initial steps in developing irregularity theory to improve the reliability and extent of knowledge, albeit some forms of irregularity present inherent difficulties. © 2014 John Wiley & Sons, Ltd.

  9. Calculations of reliability predictions for the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Amstadter, B. L.

    1966-01-01

    A new method of reliability prediction for complex systems is defined. Calculations of both upper and lower bounds are involved, and a procedure for combining the two to yield an approximately true prediction value is presented. Both mission success and crew safety predictions can be calculated, and success probabilities can be obtained for individual mission phases or subsystems. Primary consideration is given to evaluating cases involving zero or one failure per subsystem, and the results of these evaluations are then used for analyzing multiple-failure cases. Extensive development is provided for the overall mission success and crew safety equations for both the upper and lower bounds.

  10. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  11. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
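The time-series step of the framework above can be illustrated with a minimal sketch. This is not the authors' implementation: it fits a plain AR(1) model (a simplified stand-in for the ARMA fit) to a hypothetical response-time series of one service component and converts the one-step forecast into a firing rate for the Petri-net model.

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast_next(series):
    """One-step-ahead forecast of the next response time."""
    c, phi = fit_ar1(series)
    return c + phi * series[-1]

# Hypothetical response times (seconds) of one service component.
times = [0.50, 0.52, 0.49, 0.55, 0.53, 0.56, 0.54]
next_rt = forecast_next(times)
firing_rate = 1.0 / next_rt  # transitions per second, used as NMSPN input
```

A full ARMA fit would also model the moving-average part of the residuals; the AR(1) stand-in only captures first-order autocorrelation.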

  12. Bootstrap study of genome-enabled prediction reliabilities using haplotype blocks across Nordic Red cattle breeds.

    PubMed

    Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D

    2015-10-01

    This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks against predictions using a single breed as the training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distance were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessing predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis for assessing whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used, but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy over predictions using individual SNP and training the model with a single breed. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
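The bootstrap comparison of prediction reliabilities can be sketched as follows. This is an illustrative reconstruction, not the paper's code: reliability is taken here as the squared correlation between predicted and observed values, and a percentile interval is formed from bootstrap resamples of the test set.

```python
import random

def corr2(pred, obs):
    """Squared Pearson correlation, used here as a reliability measure."""
    n = len(pred)
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    vp = sum((p - mp) ** 2 for p in pred)
    vo = sum((o - mo) ** 2 for o in obs)
    if vp == 0 or vo == 0:  # degenerate resample guard
        return 0.0
    return cov * cov / (vp * vo)

def bootstrap_ci(pred, obs, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap interval for the reliability estimate."""
    rng = random.Random(seed)
    n = len(pred)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(corr2([pred[i] for i in idx], [obs[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical predicted vs. observed breeding values.
pred = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
obs = [1.1, 1.9, 3.2, 3.8, 5.1, 6.0, 6.9, 8.2, 8.8, 10.1]
point = corr2(pred, obs)
lo, hi = bootstrap_ci(pred, obs)
```

The paper goes further, drawing joint confidence ellipses over paired reliability samples; the one-dimensional percentile interval above is the simplest version of the same idea.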

  13. VerSeDa: vertebrate secretome database

    PubMed Central

    Cortazar, Ana R.; Oguiza, José A.

    2017-01-01

    Based on current tools, de novo secretome (the full set of proteins secreted by an organism) prediction is a time-consuming bioinformatic task that requires a multifactorial analysis in order to obtain reliable in silico predictions. Hence, to accelerate this process and offer researchers a reliable repository where secretome information can be obtained for vertebrates and model organisms, we have developed VerSeDa (Vertebrate Secretome Database). This freely available database stores information about proteins that are predicted to be secreted through the classical and non-classical mechanisms, for the wide range of vertebrate species deposited at the NCBI, UCSC and ENSEMBL sites. To our knowledge, VerSeDa is the only state-of-the-art database designed to store secretome data from multiple vertebrate genomes, thus saving a substantial amount of time otherwise spent predicting protein features that can be retrieved directly from this repository. Database URL: VerSeDa is freely available at http://genomics.cicbiogune.es/VerSeDa/index.php PMID:28365718

  14. VerSeDa: vertebrate secretome database.

    PubMed

    Cortazar, Ana R; Oguiza, José A; Aransay, Ana M; Lavín, José L

    2017-01-01

    Based on current tools, de novo secretome (the full set of proteins secreted by an organism) prediction is a time-consuming bioinformatic task that requires a multifactorial analysis in order to obtain reliable in silico predictions. Hence, to accelerate this process and offer researchers a reliable repository where secretome information can be obtained for vertebrates and model organisms, we have developed VerSeDa (Vertebrate Secretome Database). This freely available database stores information about proteins that are predicted to be secreted through the classical and non-classical mechanisms, for the wide range of vertebrate species deposited at the NCBI, UCSC and ENSEMBL sites. To our knowledge, VerSeDa is the only state-of-the-art database designed to store secretome data from multiple vertebrate genomes, thus saving a substantial amount of time otherwise spent predicting protein features that can be retrieved directly from this repository. VerSeDa is freely available at http://genomics.cicbiogune.es/VerSeDa/index.php. © The Author(s) 2017. Published by Oxford University Press.

  15. Reliability prediction of large fuel cell stack based on structure stress analysis

    NASA Astrophysics Data System (ADS)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a Proton Exchange Membrane Fuel Cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. The stack reliability is directly determined by the component reliability, which is affected by the material properties and contact stress. The component contact stress is a random variable because it is usually affected by many uncertain factors in the production and clamping processes. We have investigated the influence of the parameter variation coefficient on the probability distribution of contact stress using the equivalent stiffness model and the first-order second-moment method. The optimal contact stress, at which the component attains its highest reliability, is obtained from the stress-strength interference model. To achieve the optimal contact stress between the contact components, the optimal thickness of the component and the stack clamping force are designed accordingly. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work can provide valuable guidance in the design of the stack structure for high fuel cell stack reliability.
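The stress-strength interference calculation referred to above has a standard closed form when both contact stress and component strength are modelled as independent normal variables: R = Phi((mu_strength - mu_stress) / sqrt(sd_stress^2 + sd_strength^2)). A minimal sketch with illustrative numbers, not the paper's stack parameters:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def interference_reliability(mu_stress, sd_stress, mu_strength, sd_strength):
    """P(strength > stress) for independent normal stress and strength."""
    z = (mu_strength - mu_stress) / math.hypot(sd_stress, sd_strength)
    return normal_cdf(z)

# Illustrative contact stress and strength values (MPa), not from the paper.
r = interference_reliability(mu_stress=1.2, sd_stress=0.1,
                             mu_strength=1.5, sd_strength=0.1)
```

When the means coincide the component fails half the time, which is a quick sanity check on the formula.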

  16. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429

  17. Rollover risk prediction of heavy vehicles by reliability index and empirical modelling

    NASA Astrophysics Data System (ADS)

    Sellami, Yamine; Imine, Hocine; Boubezoul, Abderrahmane; Cadiou, Jean-Charles

    2018-03-01

    This paper focuses on a combination of a reliability-based approach and an empirical modelling approach for rollover risk assessment of heavy vehicles. A reliability-based warning system is developed to alert the driver to a potential rollover before entering a bend. The idea behind the proposed methodology is to estimate the rollover risk by the probability that the vehicle load transfer ratio (LTR) exceeds a critical threshold. Accordingly, a so-called reliability index may be used as a measure to assess the vehicle's safe functioning. In the reliability method, computing the maximum of the LTR requires predicting the vehicle dynamics over the bend, which can in some cases be intractable or time-consuming. With the aim of improving the reliability computation time, an empirical model is developed to substitute for the vehicle dynamics and rollover models, using the SVM (Support Vector Machines) algorithm. The preliminary results demonstrate the effectiveness of the proposed approach.
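The exceedance probability at the heart of the warning system can be sketched with a simple Monte Carlo estimate. This is an illustrative stand-in, not the authors' method: the predicted maximum LTR is treated as a normal random variable, and a first-order reliability index is reported as the distance to the threshold in standard deviations.

```python
import random

def rollover_risk(mu_ltr, sd_ltr, threshold=0.9, n=100_000, seed=7):
    """Monte Carlo estimate of P(LTR > threshold), plus the first-order
    reliability index beta = (threshold - mu) / sd."""
    rng = random.Random(seed)
    exceed = sum(1 for _ in range(n) if rng.gauss(mu_ltr, sd_ltr) > threshold)
    beta = (threshold - mu_ltr) / sd_ltr
    return exceed / n, beta

# Hypothetical LTR prediction for an upcoming bend (all values made up).
p_fail, beta = rollover_risk(mu_ltr=0.6, sd_ltr=0.15)
```

A larger beta means the predicted LTR sits further below the critical threshold, i.e. a safer bend entry.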

  18. Binding free energy predictions of farnesoid X receptor (FXR) agonists using a linear interaction energy (LIE) approach with reliability estimation: application to the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Rifai, Eko Aditya; van Dijk, Marc; Vermeulen, Nico P. E.; Geerke, Daan P.

    2018-01-01

    Computational protein binding affinity prediction can play an important role in drug research, but performing efficient and accurate binding free energy calculations is still challenging. In the context of phase 2 of the Drug Design Data Resource (D3R) Grand Challenge 2, we used our automated eTOX ALLIES approach to apply the (iterative) linear interaction energy (LIE) method, and we evaluated its performance in predicting binding affinities for farnesoid X receptor (FXR) agonists. Efficiency was obtained through our pre-calibrated LIE models and molecular dynamics (MD) simulations at the nanosecond scale, while predictive accuracy was obtained for a small subset of compounds. Using our recently introduced reliability estimation metrics, we could classify predictions with higher confidence by featuring an applicability domain (AD) analysis in combination with protein-ligand interaction profiling. The outcomes of, and agreement between, our AD and interaction-profile analyses in distinguishing and rationalizing the performance of our predictions highlighted the relevance of sufficiently exploring protein-ligand interactions during training, and demonstrated the possibility of quantitatively and efficiently evaluating whether this is achieved, using simulation data only.
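For context, the LIE estimate itself is a linear combination of ensemble-averaged interaction-energy differences between the bound and free states. The sketch below uses the classic illustrative coefficients (alpha = 0.18, beta = 0.5), which are assumptions for demonstration only, not the calibrated values of the authors' pre-trained models:

```python
def lie_binding_free_energy(d_vdw, d_elec, alpha=0.18, beta=0.5, gamma=0.0):
    """LIE estimate: dG_bind ~ alpha * dV_vdW + beta * dV_elec + gamma,
    where d_vdw and d_elec are ensemble-averaged van der Waals and
    electrostatic interaction-energy differences (bound minus free).
    Coefficients here are illustrative defaults, not the paper's values."""
    return alpha * d_vdw + beta * d_elec + gamma

# Hypothetical ensemble-averaged energy differences (kcal/mol).
dg = lie_binding_free_energy(d_vdw=-10.0, d_elec=-4.0)
```

In the iterative variant used by eTOX ALLIES, the coefficients themselves are calibrated per protein model rather than fixed as above.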

  19. The Reliability and Validity of Using Regression Residuals to Measure Institutional Effectiveness in Promoting Degree Completion

    ERIC Educational Resources Information Center

    Horn, Aaron S.; Lee, Giljae

    2016-01-01

    A relatively simple way of measuring institutional effectiveness in relation to degree completion is to estimate the difference between an actual and predicted graduation rate, but the reliability and validity of this method have not been thoroughly examined. Longitudinal data were obtained from IPEDS for both public and private not-for-profit…

  20. Audience preferences are predicted by temporal reliability of neural processing

    PubMed Central

    Dmochowski, Jacek P.; Bezdek, Matthew A.; Abelson, Brian P.; Johnson, John S.; Schumacher, Eric H.; Parra, Lucas C.

    2014-01-01

    Naturalistic stimuli evoke highly reliable brain activity across viewers. Here we record neural activity from a group of naive individuals while viewing popular, previously-broadcast television content for which the broad audience response is characterized by social media activity and audience ratings. We find that the level of inter-subject correlation in the evoked encephalographic responses predicts the expressions of interest and preference among thousands. Surprisingly, ratings of the larger audience are predicted with greater accuracy than those of the individuals from whom the neural data is obtained. An additional functional magnetic resonance imaging study employing a separate sample of subjects shows that the level of neural reliability evoked by these stimuli covaries with the amount of blood-oxygenation-level-dependent (BOLD) activation in higher-order visual and auditory regions. Our findings suggest that stimuli which we judge favourably may be those to which our brains respond in a stereotypical manner shared by our peers. PMID:25072833
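The inter-subject correlation measure used in this line of work is, at its core, the average pairwise correlation between viewers' response time series. A minimal sketch with hypothetical data, not the paper's EEG pipeline:

```python
def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def inter_subject_correlation(responses):
    """Mean pairwise Pearson correlation across subjects' time series."""
    total, pairs = 0.0, 0
    for i in range(len(responses)):
        for j in range(i + 1, len(responses)):
            total += pearson(responses[i], responses[j])
            pairs += 1
    return total / pairs

# Three hypothetical viewers' responses to the same stimulus.
subjects = [[0.1, 0.5, 0.3, 0.9], [0.2, 0.6, 0.4, 1.0], [0.0, 0.4, 0.2, 0.8]]
isc = inter_subject_correlation(subjects)
```

In the study, high ISC across naive viewers is what predicts the broader audience's expressed interest and preference.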

  1. Audience preferences are predicted by temporal reliability of neural processing.

    PubMed

    Dmochowski, Jacek P; Bezdek, Matthew A; Abelson, Brian P; Johnson, John S; Schumacher, Eric H; Parra, Lucas C

    2014-07-29

    Naturalistic stimuli evoke highly reliable brain activity across viewers. Here we record neural activity from a group of naive individuals while viewing popular, previously-broadcast television content for which the broad audience response is characterized by social media activity and audience ratings. We find that the level of inter-subject correlation in the evoked encephalographic responses predicts the expressions of interest and preference among thousands. Surprisingly, ratings of the larger audience are predicted with greater accuracy than those of the individuals from whom the neural data is obtained. An additional functional magnetic resonance imaging study employing a separate sample of subjects shows that the level of neural reliability evoked by these stimuli covaries with the amount of blood-oxygenation-level-dependent (BOLD) activation in higher-order visual and auditory regions. Our findings suggest that stimuli which we judge favourably may be those to which our brains respond in a stereotypical manner shared by our peers.

  2. The reliable solution and computation time of variable parameters logistic model

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed T_c) obtained by applying a double-precision computation of a variable-parameters logistic map (VPLM). First, by using the proposed method, we obtain the reliable solutions for the logistic map. Second, we construct 10,000 samples of reliable experiments from a time-dependent, non-stationary-parameters VPLM and then calculate the mean T_c. The results indicate that, for each different initial value, the T_c values of the VPLM are generally different. However, the mean T_c tends to a constant value when the sample number is large enough. The maximum, minimum, and probable distribution functions of T_c are also obtained, which can help us to identify the robustness of applying nonlinear time series theory to forecasting with the VPLM output. In addition, the T_c of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T_c matches the value predicted by the theoretical formula.
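The RCT idea can be sketched by running the same logistic map in double precision and in a high-precision reference, and recording the first iteration at which the two trajectories disagree beyond a tolerance. This is an illustrative reconstruction (fixed parameter, arbitrary tolerance), not the paper's procedure:

```python
from decimal import Decimal, getcontext

def reliable_steps(x0="0.3", r="3.9", tol="1e-3", max_iter=200):
    """First iteration where the double-precision logistic map trajectory
    deviates from a 50-digit reference trajectory by more than tol."""
    getcontext().prec = 50
    xd = float(x0)                     # double-precision trajectory
    rd = float(r)
    xp, rp = Decimal(x0), Decimal(r)   # high-precision reference
    for t in range(1, max_iter + 1):
        xd = rd * xd * (1.0 - xd)
        xp = rp * xp * (1 - xp)
        if abs(Decimal(xd) - xp) > Decimal(tol):
            return t
    return max_iter

t_c = reliable_steps()
```

Because the map is chaotic, rounding error grows roughly exponentially, so the double-precision trajectory stays reliable only for a few dozen iterations at this tolerance.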

  3. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combined free-replacement and pro-rata warranty policy is analysed as the warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such an analysis. A neural network model is used to predict light bulb reliability characteristics based on data from tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in Monte Carlo simulation for the prediction of times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In such a way, the manufacturer can lower the costs and increase the profit.
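The Monte Carlo step of such an analysis can be sketched as follows. This is purely illustrative (Weibull failure times and made-up policy parameters, not the paper's neural-network-derived reliability characteristics): under a combined policy, a failure before the free-replacement period costs the full unit price, and a failure in the pro-rata period costs a linearly decreasing share.

```python
import random

def expected_warranty_cost(shape, scale, w_free, w_total, unit_cost,
                           n=50_000, seed=3):
    """Monte Carlo mean warranty cost per unit sold under a combined
    free-replacement / pro-rata policy."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        t = rng.weibullvariate(scale, shape)  # time to first failure
        if t < w_free:
            total += unit_cost                # free replacement
        elif t < w_total:
            # pro-rata refund, linear from full cost down to zero
            total += unit_cost * (w_total - t) / (w_total - w_free)
    return total / n

# Hypothetical light-bulb lifetimes: Weibull(shape=2, scale=2.0) in
# thousands of hours; 0.5 free period, 1.0 total warranty, unit cost 10.
cost = expected_warranty_cost(2.0, 2.0, 0.5, 1.0, 10.0)
```

Sweeping `w_free` and `w_total` over candidate policies and comparing the resulting expected costs is the optimisation step the paper describes.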

  4. Effects of turbulence modelling on prediction of flow characteristics in a bench-scale anaerobic gas-lift digester.

    PubMed

    Coughtrie, A R; Borman, D J; Sleigh, P A

    2013-06-01

    Flow in a gas-lift digester with a central draft-tube was investigated using computational fluid dynamics (CFD) and different turbulence closure models. The k-ω Shear-Stress-Transport (SST), Renormalization-Group (RNG) k-∊, Linear Reynolds-Stress-Model (RSM) and Transition-SST models were tested for a gas-lift loop reactor under Newtonian flow conditions, validated against published experimental work. The results identify that flow predictions within the reactor (where flow is transitional) are particularly sensitive to the turbulence model implemented; the Transition-SST model was found to be the most robust for capturing mixing behaviour and predicting separation reliably. Therefore, Transition-SST is recommended over k-∊ models for use in comparable mixing problems. A comparison of results obtained using multiphase Euler-Lagrange and single-phase approaches is presented. The results support the validity of the single-phase modelling assumptions in obtaining reliable predictions of the reactor flow. Solver independence of results was verified by comparing two independent finite-volume solvers (Fluent-13.0sp2 and OpenFOAM-2.0.1). Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Bayesian quantitative precipitation forecasts in terms of quantiles

    NASA Astrophysics Data System (ADS)

    Bentzien, Sabrina; Friederichs, Petra

    2014-05-01

    Ensemble prediction systems (EPS) for numerical weather predictions on the mesoscale are particularly developed to obtain probabilistic guidance for high impact weather. An EPS not only issues a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focus on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particular suitable to describe precipitation at various locations, since no assumption is required on the distribution of precipitation. The focus is on the prediction during high-impact events and related to the Volkswagen Stiftung funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. Neighborhood method and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Since an EPS provides a large amount of potentially informative predictors, a variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates by the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension is illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which runs (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. 
The QS is a proper scoring function and can be decomposed into reliability, resolution, and uncertainty parts. A quantile reliability plot gives detailed insight into the predictive performance of the quantile forecasts.
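
The QS for quantile level τ is the mean pinball (check-function) loss, and the climatological τ-quantile is the constant forecast to beat. A minimal sketch of the score (the function name and synthetic "precipitation" data are ours, not from the study):

```python
import numpy as np

def quantile_score(obs, pred, tau):
    """Mean pinball (quantile) loss for quantile level tau."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    diff = obs - pred
    return float(np.mean(np.where(diff >= 0, tau * diff, (tau - 1) * diff)))

# The climatological tau-quantile minimizes the empirical QS,
# so any other constant forecast scores worse.
rng = np.random.default_rng(0)
y = rng.gamma(2.0, 2.0, size=10_000)          # synthetic "precipitation"
q90 = np.quantile(y, 0.9)
qs_clim = quantile_score(y, np.full_like(y, q90), 0.9)
qs_bad = quantile_score(y, np.full_like(y, q90 + 3.0), 0.9)
assert qs_clim < qs_bad
```

Skill of a quantile forecast is then typically reported as a quantile skill score relative to this climatological reference.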

  6. Multi-model ensemble hydrologic prediction using Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Duan, Qingyun; Ajami, Newsha K.; Gao, Xiaogang; Sorooshian, Soroosh

    2007-05-01

    A multi-model ensemble strategy is a means to exploit the diversity of skillful predictions from different models. This paper studies the use of a Bayesian model averaging (BMA) scheme to develop more skillful and reliable probabilistic hydrologic predictions from multiple competing predictions made by several hydrologic models. BMA is a statistical procedure that infers consensus predictions by weighting individual predictions based on their probabilistic likelihood measures, with the better performing predictions receiving higher weights than the worse performing ones. Furthermore, BMA provides a more reliable description of the total predictive uncertainty than the original ensemble, leading to a sharper and better calibrated probability density function (PDF) for the probabilistic predictions. In this study, a nine-member ensemble of hydrologic predictions was used to test and evaluate the BMA scheme. This ensemble was generated by calibrating three different hydrologic models using three distinct objective functions. These objective functions were chosen in a way that forces the models to capture certain aspects of the hydrograph well (e.g., peaks, mid-flows and low flows). Two sets of numerical experiments were carried out on three test basins in the US to explore the best way of using the BMA scheme. In the first set, a single set of BMA weights was computed to obtain BMA predictions, while the second set employed multiple sets of weights, with distinct sets corresponding to different flow intervals. In both sets, the streamflow values were transformed using the Box-Cox transformation to ensure that the probability distribution of the prediction errors is approximately Gaussian. A split-sample approach was used to obtain and validate the BMA predictions. The test results showed that the BMA scheme has the advantage of generating more skillful and equally reliable probabilistic predictions than the original ensemble.
The performance of the expected BMA predictions in terms of daily root mean square error (DRMS) and daily absolute mean error (DABS) is generally superior to that of the best individual predictions. Furthermore, the BMA predictions employing multiple sets of weights are generally better than those using a single set of weights.
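
The BMA predictive distribution is a weighted mixture of the members' conditional densities, and its mean is the weighted average of the member forecasts. A minimal sketch with Gaussian kernels (the forecasts, weights, and spreads below are invented for illustration; the study trains the weights by likelihood maximization and works in Box-Cox-transformed space):

```python
import numpy as np

def bma_predict(member_forecasts, weights, sigmas, y_grid):
    """BMA predictive mean and density: a weighted mixture of Gaussian
    kernels, each centred on one member's forecast."""
    f = np.asarray(member_forecasts, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    mean = float(np.sum(w * f))
    pdf = sum(wk * np.exp(-0.5 * ((y_grid - fk) / sk) ** 2) / (sk * np.sqrt(2.0 * np.pi))
              for wk, fk, sk in zip(w, f, sigmas))
    return mean, pdf

f = [12.0, 15.0, 9.0]          # three members' streamflow forecasts (made up)
w = [0.5, 0.3, 0.2]            # weights as trained from likelihood measures
grid = np.linspace(0.0, 30.0, 601)
mean, pdf = bma_predict(f, w, [2.0, 2.0, 2.0], grid)
dx = grid[1] - grid[0]
assert abs(mean - 12.3) < 1e-9          # weighted mean of the members
assert abs(np.sum(pdf) * dx - 1.0) < 1e-3   # mixture integrates to ~1
```

The "multiple sets of weights" variant simply selects a different `(w, sigmas)` pair depending on which flow interval the forecast falls into.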

  7. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    NASA Astrophysics Data System (ADS)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained reliability estimates for the SPTRs using the Weibull distribution. The proposed methodology makes it possible to estimate, in less than one year of testing, the reliability of SPTRs designed to operate for more than 10 years.
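
Under the Arrhenius model, a lifetime observed at elevated temperature is extrapolated to use conditions through an acceleration factor. A minimal sketch (the activation energy and lifetimes below are placeholder values, not the paper's data):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a stress and a use temperature:
    AF = exp(Ea/k * (1/T_use - 1/T_stress)), temperatures in kelvin."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical numbers: Ea = 0.7 eV, use at 20 degC, ADT at 60 degC.
af = acceleration_factor(0.7, 20.0, 60.0)
life_at_stress_h = 8000.0                  # degradation-limited life seen in ADT
predicted_use_life_h = af * life_at_stress_h
assert af > 1.0
assert predicted_use_life_h > life_at_stress_h
```

This one-line extrapolation is why a sub-year ADT campaign can speak to a 10-year design life, provided the failure mechanism is unchanged at the stress temperature.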

  8. Application of Grey Model GM(1, 1) to Ultra Short-Term Predictions of Universal Time

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Guo, Min; Zhao, Danning; Cai, Hongbing; Hu, Dandan

    2016-03-01

    A mathematical model known as the one-order one-variable grey differential equation model GM(1, 1) has been employed successfully for ultra short-term (<10 days) predictions of universal time (UT1-UTC). The results of the predictions are analyzed and compared with those obtained by other methods, and their accuracy is shown to be comparable. The proposed method is able to yield accurate predictions even when only a few observations are available. It is therefore very valuable for small datasets, since traditional methods, e.g., least-squares (LS) extrapolation, require a longer data span to make a good forecast. In addition, these results can be obtained without making any assumption about the original dataset, which makes the method highly reliable. Another advantage is that the developed method is easy to use. All of this reveals the great potential of the GM(1, 1) model for UT1-UTC predictions.
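
The GM(1,1) recipe is short: accumulate the series, fit the grey differential equation by least squares, solve it in closed form, and difference back. A minimal sketch (the toy series is ours, not UT1-UTC data):

```python
import numpy as np

def gm11_forecast(x0, n_ahead):
    """One-order one-variable grey model GM(1,1) point forecasts."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)                         # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])              # mean generating sequence
    # Fit x0(k) = -a*z1(k) + b by least squares.
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    # Closed-form solution of the whitened equation, then difference back.
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
    return x0_hat[len(x0):]

# A short, nearly exponential series is where GM(1,1) is at its best.
series = [10.0, 11.0, 12.1, 13.3, 14.6]
pred = gm11_forecast(series, 2)
assert len(pred) == 2
assert pred[0] > series[-1]      # the fitted growth trend continues
```

Five points suffice here, which illustrates the small-sample appeal of the model relative to LS extrapolation.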

  9. Prediction of Process-Induced Distortions in L-Shaped Composite Profiles Using Path-Dependent Constitutive Law

    NASA Astrophysics Data System (ADS)

    Ding, Anxin; Li, Shuxin; Wang, Jihui; Ni, Aiqing; Sun, Liangliang; Chang, Lei

    2016-10-01

    In this paper, the corner spring-in angles of AS4/8552 L-shaped composite profiles with different thicknesses are predicted using a path-dependent constitutive law that accounts for the variation of material properties due to phase change during curing. The prediction accuracy mainly depends on the properties in the rubbery and glassy states, which are obtained by homogenization rather than by experimental measurement. Both analytical and finite element (FE) homogenization methods are applied to predict the overall properties of the AS4/8552 composite. The effect of fiber volume fraction on the properties is investigated for both the rubbery and glassy states using both methods, and the predicted results are compared with experimental measurements for the glassy state. Good agreement is achieved between the predicted results and the available experimental data, showing the reliability of the homogenization methods. Furthermore, the corner spring-in angles of the L-shaped composite profiles are measured experimentally, validating the path-dependent constitutive law as well as the property predictions of the FE homogenization method.

  10. Prediction models for CO2 emission in Malaysia using best subsets regression and multi-linear regression

    NASA Astrophysics Data System (ADS)

    Tan, C. H.; Matjafri, M. Z.; Lim, H. S.

    2015-10-01

    This paper presents prediction models which analyze and compute the CO2 emission in Malaysia. Each prediction model for CO2 emission is analyzed for three main groups, which are transportation, electricity and heat production, and residential buildings together with commercial and public services. The prediction models were generated using data obtained from World Bank Open Data. The best-subsets method was used to remove irrelevant predictors, followed by multi-linear regression to produce the prediction models. From the results, high R-squared (prediction) values were obtained, implying that the models are reliable for predicting CO2 emission from the given data. In addition, the CO2 emissions from these three groups are forecasted using trend analysis plots for observation purposes.
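
Best-subsets selection can be sketched as an exhaustive search over predictor combinations, each scored by adjusted R-squared from an ordinary multi-linear fit. The variables below are synthetic stand-ins, not the World Bank series:

```python
import itertools
import numpy as np

def best_subset(X, y, names):
    """Exhaustive best-subsets OLS: return the predictor set with the
    highest adjusted R-squared."""
    n = len(y)
    best = (-np.inf, None, None)
    for r in range(1, X.shape[1] + 1):
        for cols in itertools.combinations(range(X.shape[1]), r):
            A = np.column_stack([np.ones(n), X[:, cols]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            ss_res = float(resid @ resid)
            ss_tot = float(np.sum((y - y.mean()) ** 2))
            r2 = 1.0 - ss_res / ss_tot
            adj = 1.0 - (1.0 - r2) * (n - 1) / (n - len(cols) - 1)
            if adj > best[0]:
                best = (adj, [names[c] for c in cols], beta)
    return best

rng = np.random.default_rng(1)
n = 200
transport = rng.normal(size=n)
electricity = rng.normal(size=n)
noise_var = rng.normal(size=n)                 # deliberately irrelevant column
y = 3.0 * transport + 2.0 * electricity + 0.1 * rng.normal(size=n)
adj_r2, chosen, _ = best_subset(
    np.column_stack([transport, electricity, noise_var]),
    y, ["transport", "electricity", "noise"])
assert "transport" in chosen and "electricity" in chosen
```

Exhaustive search is feasible here because the number of candidate predictors is small; with many predictors a stepwise or penalized method would replace the inner loops.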

  11. Short communication: Improving accuracy of predicting breeding values in Brazilian Holstein population by adding data from Nordic and French Holstein populations.

    PubMed

    Li, X; Lund, M S; Zhang, Q; Costa, C N; Ducrocq, V; Su, G

    2016-06-01

    The present study investigated the improvement of prediction reliabilities for 3 production traits in Brazilian Holsteins that had no genotype information by adding information from Nordic and French Holstein bulls that had genotypes. The estimated across-country genetic correlations (ranging from 0.604 to 0.726) indicated that an important genotype by environment interaction exists between Brazilian and Nordic (or Nordic and French) populations. Prediction reliabilities for Brazilian genotyped bulls were greatly increased by including data of Nordic and French bulls, and a 2-trait single-step genomic BLUP performed much better than the corresponding pedigree-based BLUP. However, only a minor improvement in prediction reliabilities was observed in nongenotyped Brazilian cows. The results indicate that although there is a large genotype by environment interaction, inclusion of a foreign reference population can improve accuracy of genetic evaluation for the Brazilian Holstein population. However, a Brazilian reference population is necessary to obtain a more accurate genomic evaluation. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. The predictive information obtained by testing multiple software versions

    NASA Technical Reports Server (NTRS)

    Lee, Larry D.

    1987-01-01

    Multiversion programming is a redundancy approach to developing highly reliable software. In applications of this method, two or more versions of a program are developed independently by different programmers and the versions are combined to form a redundant system. One variation of this approach consists of developing a set of n program versions and testing the versions to predict the failure probability of a particular program or of a system formed from a subset of the programs. The precision that might be obtained is examined, along with the effect of programmer variability when predictions are made over repetitions of the process of generating different program versions.
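
The textbook motivation for multiversion redundancy is easy to sketch: with a 2-out-of-3 majority system and fully independent version failures, the system failure probability drops from p to roughly 3p². The paper's concern is precisely that programmer variability and correlated faults can undermine this independence assumption; the sketch below shows only the idealized independent case:

```python
from itertools import product

def majority_failure_prob(p_fail):
    """Probability that a majority of versions fail on an input,
    assuming (optimistically) fully independent failures."""
    n = len(p_fail)
    total = 0.0
    for outcome in product([0, 1], repeat=n):   # 1 = that version fails
        prob = 1.0
        for fails, p in zip(outcome, p_fail):
            prob *= p if fails else (1.0 - p)
        if sum(outcome) > n // 2:
            total += prob
    return total

# Three versions, each failing on 1% of inputs:
p = majority_failure_prob([0.01, 0.01, 0.01])
# Closed form for n = 3: 3*p^2*(1-p) + p^3
assert abs(p - (3 * 0.01**2 * 0.99 + 0.01**3)) < 1e-12
```

Replacing the independent product with a correlated failure model shows how quickly the 3p² benefit erodes, which is what testing multiple versions is meant to quantify.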

  13. Predicting wettability behavior of fluorosilica coated metal surface using optimum neural network

    NASA Astrophysics Data System (ADS)

    Taghipour-Gorjikolaie, Mehran; Valipour Motlagh, Naser

    2018-02-01

    The interactions among the variables that affect surface wettability make the contact angles and sliding angles of liquid drops very difficult to predict. In this paper, artificial neural networks were used to develop reliable models for predicting the angles of liquid drops and thereby resolve this complexity. The experimental data were divided into training data and testing data. Using the training data, a feed-forward structure for the neural network, and particle swarm optimization for training the neural-network-based models, the optimum models were developed. The obtained results showed that the regression indices of the proposed models for the contact angles and sliding angles are 0.9874 and 0.9920, respectively. As these values are close to unity, they indicate the reliable performance of the models. It can also be inferred from the results that the proposed models perform more reliably than multi-layer perceptron and radial basis function based models.
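
Particle swarm optimization trains a model by moving a population of candidate weight vectors under inertia plus attraction toward personal and global bests. A minimal sketch on a toy surrogate problem (the normalized "roughness vs. angle" relation and all numbers are made up; the paper's actual model is a feed-forward neural network, not this linear fit):

```python
import numpy as np

def pso_minimize(loss, dim, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer (inertia + pbest/gbest attraction)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([loss(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, float(loss(g))

# Toy surrogate: a normalized "angle" as a linear function of a
# normalized roughness feature (invented relation).
rough = np.linspace(0.0, 1.0, 50)
angle = 0.5 - 0.4 * rough
loss = lambda w: float(np.mean((angle - (w[0] + w[1] * rough)) ** 2))
w_best, final_loss = pso_minimize(loss, dim=2)
assert final_loss < 1e-2
```

For a real network, `dim` would be the flattened weight count and `loss` the training-set mean squared error of the forward pass.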

  14. Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model

    NASA Astrophysics Data System (ADS)

    Al Sobhi, Mashail M.

    2015-02-01

    Bayesian estimation for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from exponentiated Weibull model are obtained. The symmetric and asymmetric loss functions are considered for Bayesian computations. The Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
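
The MCMC machinery used for the Bayes estimates can be illustrated with a random-walk Metropolis sampler for an ordinary two-parameter Weibull model (a simplified stand-in for the exponentiated Weibull and DGOS setting of the paper; the priors, step size, and data are our choices):

```python
import numpy as np

def weibull_loglik(t, shape, scale):
    """Log-likelihood of complete two-parameter Weibull data."""
    if shape <= 0 or scale <= 0:
        return -np.inf
    z = t / scale
    return float(np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape))

def metropolis_weibull(t, n_iter=20000, step=0.05, seed=0):
    """Random-walk Metropolis on (log shape, log scale), flat priors on
    the log scale -- a minimal sketch of the MCMC machinery."""
    rng = np.random.default_rng(seed)
    theta = np.array([0.0, np.log(np.mean(t))])     # crude starting point
    ll = weibull_loglik(t, np.exp(theta[0]), np.exp(theta[1]))
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal(size=2)
        ll_prop = weibull_loglik(t, np.exp(prop[0]), np.exp(prop[1]))
        if np.log(rng.random()) < ll_prop - ll:     # accept/reject
            theta, ll = prop, ll_prop
        draws.append(np.exp(theta))
    return np.array(draws[n_iter // 2:])            # discard burn-in

rng = np.random.default_rng(42)
data = 3.0 * rng.weibull(2.0, size=300)             # true shape 2, scale 3
draws = metropolis_weibull(data)
shape_hat, scale_hat = draws.mean(axis=0)
assert 1.7 < shape_hat < 2.3
assert 2.7 < scale_hat < 3.3
```

Posterior draws of the parameters immediately give draws of the reliability function R(t) and of prediction bounds for future order statistics, which is how the paper's quantities are computed.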

  15. Use of Landsat data to predict the trophic state of Minnesota lakes

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Johnson, W. L.; Deuell, R. L.; Lindstrom, O. M.; Meisner, D. E.

    1983-01-01

    Near-concurrent Landsat Multispectral Scanner (MSS) and ground data were obtained for 60 lakes distributed in two Landsat scene areas. The ground data included measurement of Secchi disk depth, chlorophyll-a, total phosphorus, turbidity, color, and total nitrogen, as well as Carlson Trophic State Index (TSI) values derived from the first three parameters. The Landsat data correlated best with the TSI values. Prediction models were developed to classify some 100 'test' lakes appearing in the two analysis scenes on the basis of TSI estimates. Clouds, wind, poor image data, small lake size, and shallow lake depth caused some problems in lake TSI prediction. Overall, however, the Landsat-predicted TSI estimates were judged to be very reliable for the Secchi-derived TSI estimation, moderately reliable for prediction of the chlorophyll-a TSI, and unreliable for the phosphorus value. Numerous Landsat data extraction procedures were compared, and the success of the Landsat TSI prediction models was a strong function of the procedure employed.
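
The Carlson TSI values referenced here are simple logarithmic transforms of the three ground measurements. To our understanding, the standard Carlson (1977) forms are:

```python
import math

def tsi_secchi(sd_m):
    """Carlson trophic state index from Secchi depth (m)."""
    return 60.0 - 14.41 * math.log(sd_m)

def tsi_chlorophyll(chl_ug_l):
    """TSI from chlorophyll-a (ug/L)."""
    return 9.81 * math.log(chl_ug_l) + 30.6

def tsi_total_p(tp_ug_l):
    """TSI from total phosphorus (ug/L)."""
    return 14.42 * math.log(tp_ug_l) + 4.15

# A 1 m Secchi reading maps to TSI 60; a 2 m reading to roughly 50,
# near the mesotrophic/eutrophic boundary.
assert abs(tsi_secchi(1.0) - 60.0) < 1e-9
assert abs(tsi_secchi(2.0) - 50.0) < 1.0
```

The three indices are constructed so that, on an "average" lake, they agree; the abstract's finding is that Landsat reproduces the Secchi-based index far better than the phosphorus-based one.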

  16. Study of complete interconnect reliability for a GaAs MMIC power amplifier

    NASA Astrophysics Data System (ADS)

    Lin, Qian; Wu, Haifeng; Chen, Shan-ji; Jia, Guoqing; Jiang, Wei; Chen, Chao

    2018-05-01

    By combining finite element analysis (FEA) and artificial neural network (ANN) techniques, a complete prediction of interconnect reliability for a monolithic microwave integrated circuit (MMIC) power amplifier (PA) under both direct current (DC) and alternating current (AC) operating conditions is achieved effectively in this article. As an example, an MMIC PA is modelled to study electromigration failure of the interconnect. This is the first study of interconnect reliability for an MMIC PA under DC and AC operating conditions simultaneously. By training on the data from the FEA, a high-accuracy ANN model for PA reliability is constructed. The reliability database obtained from the ANN model then provides important guidance for improving reliability-oriented IC design.

  17. Multi-scale Modeling of the Impact Response of a Strain Rate Sensitive High-Manganese Austenitic Steel

    NASA Astrophysics Data System (ADS)

    Önal, Orkun; Ozmenci, Cemre; Canadinc, Demircan

    2014-09-01

    A multi-scale modeling approach was applied to predict the impact response of a strain rate sensitive high-manganese austenitic steel. The roles of texture, geometry and strain rate sensitivity were successfully taken into account all at once by coupling crystal plasticity and finite element (FE) analysis. Specifically, crystal plasticity was utilized to obtain the multi-axial flow rule at different strain rates based on the experimental deformation response under uniaxial tensile loading. The equivalent stress - equivalent strain response was then incorporated into the FE model for the sake of a more representative hardening rule under impact loading. The current results demonstrate that reliable predictions can be obtained by proper coupling of crystal plasticity and FE analysis even if the experimental flow rule of the material is acquired under uniaxial loading and at moderate strain rates that are significantly slower than those attained during impact loading. Furthermore, the current findings also demonstrate the need for an experiment-based multi-scale modeling approach for the sake of reliable predictions of the impact response.

  18. Probabilistic Analysis of Aircraft Gas Turbine Disk Life and Reliability

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.; Zaretsky, Erwin V.; August, Richard

    1999-01-01

    Two series of low cycle fatigue (LCF) test data for two groups of different aircraft gas turbine engine compressor disk geometries were reanalyzed and compared using Weibull statistics. Both groups of disks were manufactured from titanium (Ti-6Al-4V) alloy. A probabilistic computer code, Probable Cause, developed at the NASA Glenn Research Center was used to predict disk life and reliability. A material-life factor A was determined for titanium (Ti-6Al-4V) alloy based upon fatigue disk data and successfully applied to predict the life of the disks as a function of speed. A comparison was made with the currently used life prediction method based upon crack growth rate. Applying an endurance limit to the computer code did not significantly affect the predicted lives under engine operating conditions. The predicted failure locations correlate with those experimentally observed in the LCF tests. A reasonable correlation was obtained between the predicted disk lives using the Probable Cause code and a modified crack growth method for life prediction. Both methods slightly overpredict life for one disk group and significantly underpredict it for the other.
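
Weibull statistics for LCF lives are commonly extracted by median-rank regression on a Weibull probability plot. A minimal sketch (the simulated lives are ours, not the disk data):

```python
import numpy as np

def weibull_fit_median_ranks(lives):
    """Two-parameter Weibull fit by median-rank regression:
    ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)."""
    t = np.sort(np.asarray(lives, float))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)                 # Benard's median-rank estimate
    x, y = np.log(t), np.log(-np.log(1.0 - F))
    beta, c = np.polyfit(x, y, 1)             # slope = shape, intercept = -beta*ln(eta)
    eta = np.exp(-c / beta)
    return float(beta), float(eta)

rng = np.random.default_rng(7)
lives = 5000.0 * rng.weibull(3.0, size=500)   # true shape 3, scale 5000 cycles
beta, eta = weibull_fit_median_ranks(lives)
assert 2.5 < beta < 3.5
assert 4500 < eta < 5500
```

The shape parameter beta quantifies life scatter, which is exactly what makes a probabilistic code preferable to a single deterministic life number for disks.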

  19. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in welded area affected by geometrical inhomogeneity, irregular welded surface and weld toe radius is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigations, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using the three different methods: strain gauges measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to obtained results, it may be stated that TSA, as a relatively new measurement technique may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  20. Predicting Spike Occurrence and Neuronal Responsiveness from LFPs in Primary Somatosensory Cortex

    PubMed Central

    Storchi, Riccardo; Zippo, Antonio G.; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E. M.

    2012-01-01

    Local Field Potentials (LFPs) integrate multiple neuronal events like synaptic inputs and intracellular potentials. LFP spatiotemporal features are particularly relevant in view of their applications both in research (e.g. for understanding brain rhythms, inter-areal neural communication and neuronal coding) and in the clinics (e.g. for improving invasive Brain-Machine Interface devices). However the relation between LFPs and spikes is complex and not fully understood. As spikes represent the fundamental currency of neuronal communication this gap in knowledge strongly limits our comprehension of neuronal phenomena underlying LFPs. We investigated the LFP-spike relation during tactile stimulation in primary somatosensory (S-I) cortex in the rat. First we quantified how reliably LFPs and spikes code for a stimulus occurrence. Then we used the information obtained from our analyses to design a predictive model for spike occurrence based on LFP inputs. The model was endowed with a flexible meta-structure whose exact form, both in parameters and structure, was estimated by using a multi-objective optimization strategy. Our method provided a set of nonlinear simple equations that maximized the match between models and true neurons in terms of spike timings and Peri Stimulus Time Histograms. We found that both LFPs and spikes can code for stimulus occurrence with millisecond precision, showing, however, high variability. Spike patterns were predicted significantly above chance for 75% of the neurons analysed. Crucially, the level of prediction accuracy depended on the reliability in coding for the stimulus occurrence. The best predictions were obtained when both spikes and LFPs were highly responsive to the stimuli. Spike reliability is known to depend on neuron intrinsic properties (i.e. on channel noise) and on spontaneous local network fluctuations. Our results suggest that the latter, measured through the LFP response variability, play a dominant role.
PMID:22586452

  2. Evaluation of stroke volume variation obtained by arterial pulse contour analysis to predict fluid responsiveness intraoperatively.

    PubMed

    Lahner, D; Kabon, B; Marschalek, C; Chiari, A; Pestel, G; Kaider, A; Fleischmann, E; Hetz, H

    2009-09-01

    Fluid management guided by oesophageal Doppler monitoring has been reported to improve perioperative outcome. Stroke volume variation (SVV) is considered a reliable clinical predictor of fluid responsiveness. Consequently, the aim of the present trial was to evaluate the accuracy of SVV determined by arterial pulse contour (APCO) analysis, using the FloTrac/Vigileo system, to predict fluid responsiveness as measured by the oesophageal Doppler. Patients undergoing major abdominal surgery received intraoperative fluid management guided by oesophageal Doppler monitoring. Fluid boluses of 250 ml each were administered in case of a decrease in corrected flow time (FTc) to <350 ms. Patients were connected to a monitoring device, obtaining SVV by APCO. Haemodynamic variables were recorded before and after fluid bolus application. Fluid responsiveness was defined as an increase in stroke volume index >10%. The ability of SVV to predict fluid responsiveness was assessed by calculation of the area under the receiver operating characteristic (ROC) curve. Twenty patients received 67 fluid boluses. Fifty-two of the 67 fluid boluses administered resulted in fluid responsiveness. SVV achieved an area under the ROC curve of 0.512 [confidence interval (CI) 0.32-0.70]. A cut-off point for fluid responsiveness was found at SVV ≥8.5% (sensitivity: 77%; specificity: 43%; positive predictive value: 84%; negative predictive value: 33%). This prospective, interventional observer-blinded study demonstrates that SVV obtained by APCO, using the FloTrac/Vigileo system, is not a reliable predictor of fluid responsiveness in the setting of major abdominal surgery.
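
The ROC analysis used here reduces to ranking: the AUC is the probability that a randomly chosen responder has a higher SVV than a randomly chosen non-responder, and a cutoff fixes sensitivity and specificity. A minimal sketch (the SVV values and labels below are invented, not the trial data):

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) identity, ties counted as 1/2."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

def sens_spec_at(scores, labels, cutoff):
    """Sensitivity and specificity of the rule 'predict responder if
    score >= cutoff'."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    pred = scores >= cutoff
    return float(np.mean(pred[labels])), float(np.mean(~pred[~labels]))

# Hypothetical SVV values (%) for responders (1) and non-responders (0):
svv = [12, 9, 11, 7, 10, 6, 8, 5]
resp = [1, 1, 1, 0, 1, 0, 1, 0]
auc = roc_auc(svv, resp)
sens, spec = sens_spec_at(svv, resp, 8.5)
assert auc == 1.0                 # perfect separation in this toy data
assert abs(sens - 0.8) < 1e-12
```

An AUC near 0.5, as reported in the trial, means the scores rank responders no better than a coin flip, regardless of the cutoff chosen.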

  3. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
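
The core of probabilistic brittle-material design is the weakest-link Weibull relation between applied stress and failure probability. A minimal two-parameter sketch (CARES/Life itself integrates this over the stressed volume from the finite element results; the strength numbers below are illustrative, not from any dataset):

```python
import math

def weibull_failure_prob(stress_mpa, sigma0_mpa, m, volume_ratio=1.0):
    """Weakest-link (two-parameter Weibull) probability of failure:
    Pf = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-volume_ratio * (stress_mpa / sigma0_mpa) ** m)

# Hypothetical ceramic: characteristic strength 400 MPa, Weibull modulus 10.
pf_at_char = weibull_failure_prob(400.0, 400.0, 10)
assert abs(pf_at_char - (1.0 - math.exp(-1.0))) < 1e-12   # ~63.2% by definition
low = weibull_failure_prob(200.0, 400.0, 10)
assert low < 0.01   # halving the stress drops Pf by roughly 2^10
```

The steep stress dependence (controlled by the Weibull modulus m) is why small design changes can move a ceramic part from unacceptable to acceptable failure probability.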

  4. On the reliability of computed chaotic solutions of non-linear differential equations

    NASA Astrophysics Data System (ADS)

    Liao, Shijun

    2009-08-01

    A new concept, namely the critical predictable time Tc, is introduced to give a more precise description of computed chaotic solutions of non-linear differential equations: it is suggested that computed chaotic solutions are unreliable and doubtful when t > Tc. This provides a strategy for detecting the reliable part of a given computed result. In this way, computational phenomena such as computational chaos (CC), computational periodicity (CP) and computational prediction uncertainty, which are mainly based on long-term properties of computed time series, can be completely avoided. Using this concept, the famous conclusion 'accurate long-term prediction of chaos is impossible' should be replaced by the more precise conclusion that 'accurate prediction of chaos beyond the critical predictable time Tc is impossible'. The concept thus also provides a timescale for determining whether or not a particular time is long enough for a given non-linear dynamic system. Besides, the influence of data inaccuracy and of various numerical schemes on the critical predictable time is investigated in detail using symbolic computation software as a tool. A reliable chaotic solution of the Lorenz equation in the rather large interval 0 <= t < 1200 non-dimensional Lorenz time units is obtained for the first time. It is found that the precision of the initial condition and of the computed data at each time step, which is mathematically necessary to obtain such a reliable chaotic solution over such a long time, is so high that it is physically unattainable due to the Heisenberg uncertainty principle in quantum physics. This provides a so-called 'precision paradox of chaos', which suggests that the prediction uncertainty of chaos is physically unavoidable, and that even macroscopic phenomena might be essentially stochastic and thus more economically described by probability.
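
The idea behind Tc can be demonstrated numerically: integrate the same chaotic system twice with a tiny difference standing in for finite precision, and record when the trajectories diverge to O(1). A minimal sketch with the Lorenz equations (the perturbation size, threshold, and integrator settings are our choices):

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4(s0, dt, n):
    """Classical 4th-order Runge-Kutta integration of the Lorenz system."""
    s, out = np.array(s0, float), [np.array(s0, float)]
    for _ in range(n):
        k1 = lorenz_rhs(s)
        k2 = lorenz_rhs(s + 0.5 * dt * k1)
        k3 = lorenz_rhs(s + 0.5 * dt * k2)
        k4 = lorenz_rhs(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(s.copy())
    return np.array(out)

dt, n = 0.01, 5000                        # 50 Lorenz time units
a = rk4([1.0, 1.0, 1.0], dt, n)
b = rk4([1.0 + 1e-10, 1.0, 1.0], dt, n)   # 1e-10 stands in for data inaccuracy
sep = np.linalg.norm(a - b, axis=1)
tc = dt * np.argmax(sep > 1.0)            # first time the trajectories disagree O(1)
assert 0.0 < tc < 50.0
```

Shrinking the initial perturbation only pushes this empirical Tc out logarithmically, which is the mechanism behind the "precision paradox" in the abstract: reaching t = 1200 requires precision far beyond anything physically attainable.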

  5. Evaluation of Secretion Prediction Highlights Differing Approaches Needed for Oomycete and Fungal Effectors.

    PubMed

    Sperschneider, Jana; Williams, Angela H; Hane, James K; Singh, Karam B; Taylor, Jennifer M

    2015-01-01

    The steadily increasing number of sequenced fungal and oomycete genomes has enabled detailed studies of how these eukaryotic microbes infect plants and cause devastating losses in food crops. During infection, fungal and oomycete pathogens secrete effector molecules which manipulate host plant cell processes to the pathogen's advantage. Proteinaceous effectors are synthesized intracellularly and must be externalized to interact with host cells. Computational prediction of secreted proteins from genomic sequences is an important technique to narrow down the candidate effector repertoire for subsequent experimental validation. In this study, we benchmark secretion prediction tools on experimentally validated fungal and oomycete effectors. We observe that for a set of fungal SwissProt protein sequences, SignalP 4 and the neural network predictors of SignalP 3 (D-score) and SignalP 2 perform best. For effector prediction in particular, the use of a sensitive method can be desirable to obtain the most complete candidate effector set. We show that the neural network predictors of SignalP 2 and 3, as well as TargetP were the most sensitive tools for fungal effector secretion prediction, whereas the hidden Markov model predictors of SignalP 2 and 3 were the most sensitive tools for oomycete effectors. Thus, previous versions of SignalP retain value for oomycete effector prediction, as the current version, SignalP 4, was unable to reliably predict the signal peptide of the oomycete Crinkler effectors in the test set. Our assessment of subcellular localization predictors shows that cytoplasmic effectors are often predicted as not extracellular. This limits the reliability of secretion predictions that depend on these tools. We present our assessment with a view to informing future pathogenomics studies and suggest revised pipelines for secretion prediction to obtain optimal effector predictions in fungi and oomycetes.

  6. A scoring function based on solvation thermodynamics for protein structure prediction

    PubMed Central

    Du, Shiqiao; Harano, Yuichi; Kinoshita, Masahiro; Sakurai, Minoru

    2012-01-01

    We predict protein structure using our recently developed free energy function for describing protein stability, which is focused on solvation thermodynamics. The function is combined with the currently most reliable sampling methods, i.e., fragment assembly (FA) and comparative modeling (CM). The prediction is tested using 11 small proteins for which high-resolution crystal structures are available. For 8 of these proteins, sequence similarities are found in the database, and the prediction is performed with CM. Fairly accurate models with average Cα root mean square deviation (RMSD) ∼ 2.0 Å are successfully obtained for all cases. For the rest of the target proteins, we perform the prediction following FA protocols. For 2 cases, we obtain predicted models with an RMSD ∼ 3.0 Å as the best-scored structures. For the other case, the RMSD remains larger than 7 Å. For all 11 target proteins, our scoring function identifies the experimentally determined native structure as the best structure. Starting from the predicted structures, replica exchange molecular dynamics is performed to refine them further; however, this does not improve the RMSD toward the experimental structures. The exhaustive sampling by coarse-grained normal mode analysis around the native structures reveals that our function has a linear correlation with RMSDs < 3.0 Å. These results suggest that the function is quite reliable for protein structure prediction, while the sampling method remains one of the major limiting factors. Aspects through which the methodology could be further improved are discussed. PMID:27493529
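
The Cα RMSD used to judge the predicted models is computed after optimal superposition, typically with the Kabsch algorithm. A minimal sketch on synthetic coordinates (not actual protein data):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between point sets after optimal rotation (Kabsch, via SVD)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # remove translation
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)               # covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))                # forbid improper rotation
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    diff = Pc @ R - Qc
    return float(np.sqrt((diff ** 2).sum() / len(P)))

# A rotated + translated copy of a "structure" should give RMSD ~ 0.
rng = np.random.default_rng(3)
native = rng.normal(size=(50, 3)) * 10.0              # 50 fake C-alpha positions
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
model = native @ rot + np.array([5.0, -2.0, 1.0])
assert kabsch_rmsd(model, native) < 1e-8
```

The abstract's observation that the scoring function correlates linearly with RMSD only below ~3.0 Å is a statement about exactly this superposed-coordinate metric.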

  7. Prediction of friction pressure drop for low pressure two-phase flows on the basis of approximate analytical models

    NASA Astrophysics Data System (ADS)

    Zubov, N. O.; Kaban'kov, O. N.; Yagov, V. V.; Sukomel, L. A.

    2017-12-01

    Wide use of natural circulation loops operating at low reduced pressures generates a real need for reliable methods of predicting flow regimes and friction pressure drop for two-phase flows in this region of parameters. Although water-air flows at close-to-atmospheric pressures are the most widely studied subject in the field of two-phase hydrodynamics, the problem of reliably calculating friction pressure drop can hardly be regarded as fully solved. The specific volumes of liquid differ greatly from those of steam (gas) under such conditions, so even a small change in flow quality may alter the flow pattern very significantly. Frequently made attempts to apply one or another universal approach to calculating friction pressure drop over a wide range of steam quality values do not seem to be justified and yield predicted values that are poorly consistent with experimentally measured data. The article analyzes the existing methods used to calculate friction pressure drop for two-phase flows at low pressures by comparing their results with experimentally obtained data. The advisability of elaborating calculation procedures that determine the friction pressure drop and void fraction for two-phase flows with their pattern (flow regime) taken into account is demonstrated. It is shown that, for flows at low reduced pressures, a homogeneous model yields satisfactory results for quasi-homogeneous flows, whereas an annular flow model yields satisfactory results for flows with high void fraction. Recommendations for shifting from one model to the other in engineering calculations are formulated and tested.
By using the modified annular flow model, it is possible to obtain reliable predictions not only for the pressure gradient but also for the liquid film thickness; accounting for droplet entrainment and deposition phenomena allows reasonable corrections to be introduced into the calculations. To the best of the authors' knowledge, this is the first time that the entrainment of droplets from the film surface has been taken into consideration in the dispersed-annular flow model.

  8. Determination of Turboprop Reduction Gearbox System Fatigue Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Lewicki, David G.; Savage, Michael; Vlcek, Brian L.

    2007-01-01

    Two computational models to determine the fatigue life and reliability of a commercial turboprop gearbox are compared with each other and with field data. These models are (1) Monte Carlo simulation of randomly selected lives of individual bearings and gears comprising the system and (2) two-parameter Weibull distribution function for bearings and gears comprising the system using strict-series system reliability to combine the calculated individual component lives in the gearbox. The Monte Carlo simulation included the virtual testing of 744,450 gearboxes. Two sets of field data were obtained from 64 gearboxes that were first-run to removal for cause, were refurbished and placed back in service, and then were second-run until removal for cause. A series of equations were empirically developed from the Monte Carlo simulation to determine the statistical variation in predicted life and Weibull slope as a function of the number of gearboxes failed. The resultant L(sub 10) life from the field data was 5,627 hr. From strict-series system reliability, the predicted L(sub 10) life was 774 hr. From the Monte Carlo simulation, the median value for the L(sub 10) gearbox lives equaled 757 hr. Half of the gearbox L(sub 10) lives will be less than this value and the other half more. The resultant L(sub 10) life of the second-run (refurbished) gearboxes was 1,334 hr. The apparent load-life exponent p for the roller bearings is 5.2. Were the bearing lives to be recalculated with a load-life exponent p equal to 5.2, the predicted L(sub 10) life of the gearbox would be equal to the actual life obtained in the field. The component failure distribution of the gearbox from the Monte Carlo simulation was nearly identical to that using the strict-series system reliability analysis, proving the compatibility of these methods.
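
    The two approaches compared in this record can be sketched in a few lines. The component Weibull parameters below are made up, not the turboprop gearbox values from the study; the point is only to show that a Monte Carlo minimum-life simulation and a strict-series combination of component survival functions yield comparable system L10 lives:

```python
import numpy as np

# Illustrative sketch only -- the component Weibull parameters are hypothetical.
rng = np.random.default_rng(0)
components = {          # name: (Weibull slope beta, characteristic life eta, hr)
    "bearing_a": (1.5, 9000.0),
    "bearing_b": (1.5, 12000.0),
    "gear":      (2.5, 15000.0),
}

# (1) Monte Carlo: a virtual gearbox fails when its first component fails
n_sim = 100_000
lives = np.min([eta * rng.weibull(beta, n_sim)
                for beta, eta in components.values()], axis=0)
l10_mc = np.percentile(lives, 10)             # life at 90% survival

# (2) Strict-series reliability: system survival = product of component survivals
t = np.linspace(1.0, 20000.0, 20000)
surv = np.ones_like(t)
for beta, eta in components.values():
    surv *= np.exp(-(t / eta) ** beta)
l10_series = t[np.searchsorted(-surv, -0.9)]  # first t with survival <= 0.9
```

    With enough virtual gearboxes the two estimates converge, mirroring the compatibility of the two methods reported above.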

  9. An Assertiveness Scale for Adolescents.

    ERIC Educational Resources Information Center

    Lee, Dong Yul; And Others

    1985-01-01

    Developed a 33-item, situation-specific instrument that measures assertiveness of adolescents. Based on data from 682 elementary and secondary school students, adequate reliability and validity of the Assertiveness Scale for Adolescents (ASA) were obtained when tested against several variables about which predictions could be made. (BH)

  10. Single point estimation of phenytoin dosing: a reappraisal.

    PubMed

    Koup, J R; Gibaldi, M; Godolphin, W

    1981-11-01

    A previously proposed method for estimation of phenytoin dosing requirement using a single serum sample obtained 24 hours after intravenous loading dose (18 mg/Kg) has been re-evaluated. Using more realistic values for the volume of distribution of phenytoin (0.4 to 1.2 L/Kg), simulations indicate that the proposed method will fail to consistently predict dosage requirements. Additional simulations indicate that two samples obtained during the 24 hour interval following the iv loading dose could be used to more reliably predict phenytoin dose requirement. Because of the nonlinear relationship which exists between phenytoin dose administration rate (RO) and the mean steady state serum concentration (CSS), small errors in prediction of the required RO result in much larger errors in CSS.
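
    The nonlinearity referred to here is Michaelis-Menten elimination, under which the steady-state concentration grows hyperbolically as the dosing rate approaches Vmax. A small sketch with illustrative (not patient-specific) parameters shows why small dosing-rate errors are amplified:

```python
# Michaelis-Menten elimination at steady state: RO = Vmax*Css/(Km + Css),
# so Css = Km*RO/(Vmax - RO).  Vmax and Km below are illustrative values,
# not estimates from the study.
VMAX = 500.0   # mg/day, maximum elimination rate
KM = 4.0       # mg/L

def css(ro):
    """Mean steady-state serum concentration for dosing rate ro (mg/day)."""
    return KM * ro / (VMAX - ro)

# A 10% overestimate of the required dosing rate inflates Css far more than 10%
c_nominal = css(400.0)   # 16.0 mg/L
c_high = css(440.0)      # ~29.3 mg/L, an ~83% increase
```
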

  11. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    NASA Astrophysics Data System (ADS)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A reliability mathematical model for a high temperature and high pressure multi-stage decompression control valve (HMDCV) is established on the basis of stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key danger-prone components and the fatigue sensitivity curve of each component are calculated by analyzing the fatigue life of the control valve in combination with the reliability model, and the proportion of each component's influence on the fatigue failure of the control valve system is obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life expectancy of the main pressure parts meets the technical requirements, and that the valve body and the sleeve have an obvious influence on control system reliability; stress concentration in key parts of the control valve can therefore be reduced during the design process by improving the structure.

  12. Structural Life and Reliability Metrics: Benchmarking and Verification of Probabilistic Life Prediction Codes

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Soditus, Sherry; Hendricks, Robert C.; Zaretsky, Erwin V.

    2002-01-01

    Over the past two decades there has been considerable effort by NASA Glenn and others to develop probabilistic codes to predict with reasonable engineering certainty the life and reliability of critical components in rotating machinery and, more specifically, in the rotating sections of airbreathing and rocket engines. These codes have, to a very limited extent, been verified with relatively small bench rig type specimens under uniaxial loading. Because of the small and very narrow database the acceptance of these codes within the aerospace community has been limited. An alternate approach to generating statistically significant data under complex loading and environments simulating aircraft and rocket engine conditions is to obtain, catalog and statistically analyze actual field data. End users of the engines, such as commercial airlines and the military, record and store operational and maintenance information. This presentation describes a cooperative program between the NASA GRC, United Airlines, USAF Wright Laboratory, U.S. Army Research Laboratory and Australian Aeronautical & Maritime Research Laboratory to obtain and analyze these airline data for selected components such as blades, disks and combustors. These airline data will be used to benchmark and compare existing life prediction codes.

  13. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    NASA Astrophysics Data System (ADS)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted for in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures.
We propose an approach where the likelihood function for the signatures is derived from the likelihood for streamflow (rather than using an "ad-hoc" likelihood for the signatures as done in previous approaches). This likelihood is not easily tractable analytically and we therefore cannot apply "simple" MCMC methods. This numerical problem is solved using Approximate Bayesian Computation (ABC). Our results indicate that the proposed approach is suitable for producing reliable streamflow predictive distributions based on calibration to signature data. Moreover, our results provide indications of which signatures are more appropriate for representing the information content of the hydrograph.
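
    The core ABC idea — keep prior draws whose simulated signature is close to the observed one — can be shown on a toy problem. This is not the paper's hydrological model; it calibrates a single rate parameter of an exponential model to a summary statistic (the sample mean) by plain rejection ABC:

```python
import numpy as np

# Toy rejection-ABC sketch (not the study's model): calibrate a rate
# parameter k to a "signature" (the sample mean) instead of the full series.
rng = np.random.default_rng(1)
k_true = 0.5
observed = rng.exponential(1.0 / k_true, 500)
s_obs = observed.mean()                           # the observed signature

def simulate_signature(k, n=500):
    return rng.exponential(1.0 / k, n).mean()

prior_draws = rng.uniform(0.1, 2.0, 20_000)       # uniform prior on k
sims = np.array([simulate_signature(k) for k in prior_draws])
posterior = prior_draws[np.abs(sims - s_obs) < 0.05]  # keep close matches

k_hat = posterior.mean()                          # approximate posterior mean
```

    The accepted draws approximate the posterior of k given the signature, from which predictive distributions can then be generated.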

  14. A survey and new measurements of ice vapor pressure at temperatures between 170 and 250K

    NASA Technical Reports Server (NTRS)

    Marti, James; Mauersberger, Konrad

    1993-01-01

    New measurements of ice vapor pressures at temperatures between 170 and 250 K are presented and published vapor pressure data are summarized. An empirical vapor pressure equation was derived that allows prediction of vapor pressures between 170 K and the triple point of water with an accuracy of approximately 2 percent. The predictions obtained agree, within experimental uncertainty, with the most reliable equation derived from thermodynamic principles.
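
    The widely quoted form of the Marti-Mauersberger fit is a single-term expression in 1/T; a minimal sketch (quoted from memory of the published form, so treat the constants as a reference rather than an authoritative transcription):

```python
# Widely quoted form of the Marti & Mauersberger fit (p in Pa, T in K),
# valid roughly from 170 K to the triple point with ~2% accuracy:
#   log10(p) = -2663.5 / T + 12.537
def ice_vapor_pressure(t_kelvin: float) -> float:
    return 10.0 ** (-2663.5 / t_kelvin + 12.537)

p_triple = ice_vapor_pressure(273.16)  # ~611 Pa, consistent with the triple point
p_200 = ice_vapor_pressure(200.0)      # ~0.17 Pa
```
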

  15. Technologies for Developing Predictive Atomistic and Coarse-Grained Force Fields for Ionic Liquid Property Prediction

    DTIC Science & Technology

    2008-07-29

    minimization is performed. It is critical that all other force field parameters (for bonds, angles, charges, and Lennard-Jones interactions) be pre...and tailoring the parameterization accordingly may be critical . For Phase I, the above described procedure was performed manually to obtain dihedral... critical that a reliable approach is available to guide experimental efforts and design. In addition, the automation of force field development will

  16. Predictions of the residue cross-sections for the elements Z = 113 and Z = 114

    NASA Astrophysics Data System (ADS)

    Bouriquet, B.; Abe, Y.; Kosenko, G.

    2004-10-01

    A good reproduction of experimental excitation functions is obtained for the 1 n reactions producing the elements with Z = 108, 110, 111 and 112 by the combined usage of the two-step model for fusion and the statistical decay code KEWPIE. Furthermore, the model provides reliable predictions of the production of the elements with Z = 113 and Z = 114, which will be a useful guide for the planning of experiments.

  17. Prediction of the thickness of the compensator filter in radiation therapy using computational intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehlaghi, Vahab; Taghipour, Mostafa; Haghparast, Abbas

    In this study, artificial neural networks (ANNs) and adaptive neuro-fuzzy inference system (ANFIS) are investigated to predict the thickness of the compensator filter in radiation therapy. In the proposed models, the input parameters are field size (S), off-axis distance, and relative dose (D/D{sub 0}), and the output is the thickness of the compensator. The obtained results show that the proposed ANN and ANFIS models are useful, reliable, and cheap tools to predict the thickness of the compensator filter in intensity-modulated radiation therapy.

  18. Evaluating the Effect of Minimizing Screws on Stabilization of Symphysis Mandibular Fracture by 3D Finite Element Analysis.

    PubMed

    Kharmanda, Ghias; Kharma, Mohamed-Yaser

    2017-06-01

    The objective of this work is to integrate structural optimization and reliability concepts into the mini-plate fixation strategy used in symphysis mandibular fractures. The structural reliability levels are then estimated considering a single failure mode and multiple failure modes. A 3-dimensional finite element model is developed in order to evaluate the ability to reduce the negative effects of stabilizing the fracture. A topology optimization process is used in the conceptual design stage to predict possible fixation layouts. In the detailed design stage, suitable mini-plates are selected taking into account the resulting topology and different anatomical considerations. Several muscle forces are considered in order to obtain realistic predictions. Since some muscles can be cut or harmed during surgery and cannot operate at their maximum capacity, there is a strong motivation to introduce loading uncertainties in order to obtain reliable designs. The structural reliability analysis is carried out for a single failure mode and for multiple failure modes. The results are validated with a clinical case of a male patient with a symphysis fracture. In this case, although the upper plate fixation had four holes, only two screws were applied in order to protect an adjacent vital structure; this did not affect the stability of the fracture. The proposed strategy to optimize bone plates leads to fewer complications and second surgeries, less patient discomfort, and shorter healing time.

  19. Accuracy of band dendrometers

    Treesearch

    L. R. Auchmoody

    1976-01-01

    A study to determine the reliability of first-year growth measurements obtained from aluminum band dendrometers showed that growth was underestimated for black cherry trees growing less than 0.5 inch in diameter or accumulating less than 0.080 square foot of basal area. Prediction equations to correct for these errors are given.

  20. Genomic predictions can accelerate selection for resistance against Piscirickettsia salmonis in Atlantic salmon (Salmo salar).

    PubMed

    Bangera, Rama; Correa, Katharina; Lhorente, Jean P; Figueroa, René; Yáñez, José M

    2017-01-31

    Salmon Rickettsial Syndrome (SRS) caused by Piscirickettsia salmonis is a major disease affecting the Chilean salmon industry. Genomic selection (GS) is a method wherein genome-wide markers and phenotype information of full-sibs are used to predict genomic EBV (GEBV) of selection candidates and is expected to have increased accuracy and response to selection over traditional pedigree based Best Linear Unbiased Prediction (PBLUP). Widely used GS methods such as genomic BLUP (GBLUP), SNPBLUP, Bayes C and Bayesian Lasso may perform differently with respect to accuracy of GEBV prediction. Our aim was to compare the accuracy, in terms of reliability of genome-enabled prediction, from different GS methods with PBLUP for resistance to SRS in an Atlantic salmon breeding program. Number of days to death (DAYS), binary survival status (STATUS) phenotypes, and 50 K SNP array genotypes were obtained from 2601 smolts challenged with P. salmonis. The reliability of different GS methods at different SNP densities with and without pedigree were compared to PBLUP using a five-fold cross validation scheme. Heritability estimated from GS methods was significantly higher than PBLUP. Pearson's correlation between predicted GEBV from PBLUP and GS models ranged from 0.79 to 0.91 and 0.79-0.95 for DAYS and STATUS, respectively. The relative increase in reliability from different GS methods for DAYS and STATUS with 50 K SNP ranged from 8 to 25% and 27-30%, respectively. All GS methods outperformed PBLUP at all marker densities. DAYS and STATUS showed superior reliability over PBLUP even at the lowest marker density of 3 K and 500 SNP, respectively. 20 K SNP showed close to maximal reliability for both traits with little improvement using higher densities. These results indicate that genomic predictions can accelerate genetic progress for SRS resistance in Atlantic salmon and implementation of this approach will contribute to the control of SRS in Chile. 
We recommend GBLUP for routine GS evaluation because this method is computationally faster and its results are very similar to those of the other GS methods. The use of lower density SNP panels, or the combination of low density SNP panels with an imputation strategy, may help to reduce genotyping costs without compromising the gain in reliability.
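
    The prediction step underlying these results can be sketched as SNP-BLUP (ridge regression on markers, mathematically equivalent to GBLUP). Population size, marker count, and variance components below are made up; accuracy is the correlation between predicted GEBV and phenotypes of held-out animals, as in the cross-validation above:

```python
import numpy as np

# SNP-BLUP / ridge sketch with invented sizes and variance components.
rng = np.random.default_rng(2)
n, p = 400, 1000
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
X -= X.mean(axis=0)                                   # center each marker
beta_true = rng.normal(0.0, 0.05, p)                  # true marker effects
y = X @ beta_true + rng.normal(0.0, 1.0, n)           # simulated phenotypes

train, test = np.arange(300), np.arange(300, 400)
lam = 400.0   # ridge penalty = sigma_e^2 / sigma_beta^2 = 1 / 0.05**2
beta_hat = np.linalg.solve(X[train].T @ X[train] + lam * np.eye(p),
                           X[train].T @ y[train])
gebv = X[test] @ beta_hat                             # GEBV of candidates
accuracy = np.corrcoef(gebv, y[test])[0, 1]           # validation accuracy
```
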

  1. Assessment of quantitative structure-activity relationship of toxicity prediction models for Korean chemical substance control legislation

    PubMed Central

    Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai

    2015-01-01

    Objectives For successful adoption of legislation controlling the registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. In particular, methods for predicting the toxicities of chemical substances during acquisition of the required data ultimately become an economical way of dealing with new substances in the future. Although the need for such methods is gradually increasing, the required information about their reliability and applicability range has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured 10 representative QSAR-based prediction models, together with their accompanying information, that can make predictions about substances expected to be regulated. We used the models to predict and confirm the usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and the relevant data, we prepared methods for quantifying the scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared the adequacies of the models using the Alternative non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared the inclusion rates of substances expected to be regulated, to confirm the applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database.
Based on this data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368

  2. Cross-validation of bioelectrical impedance analysis of body composition in children and adolescents.

    PubMed

    Wu, Y T; Nielsen, D H; Cassady, S L; Cook, J S; Janz, K F; Hansen, J R

    1993-05-01

    The reliability and validity of measurements obtained with two bioelectrical impedance analyzers (BIAs), an RJL Systems model BIA-103 and a Berkeley Medical Research BMR-2000, were investigated using the manufacturers' prediction equations for the assessment of fat-free mass (FFM) (in kilograms) in children and adolescents. Forty-seven healthy children and adolescents (23 male, 24 female), ranging in age from 8 to 20 years (mean = 12.1, SD = 2.3), participated. In the context of a repeated-measures design, the data were analyzed according to gender and maturation (Tanner staging). Hydrostatic weighing (HYDRO) and Lohman's Siri age-adjusted body density prediction equation served as the criteria for validating the BIA-obtained measurements. High intraclass correlation coefficients (ICC > or = .987) demonstrated good test-retest (between-week) measurement reliability for HYDRO and both BIA methods. Between-method (HYDRO versus BIA) correlation coefficients were high for both boys and girls (r > or = .97). The standard errors of estimate (SEEs) for FFM were slightly larger for boys than for girls and were consistently smaller for the RJL system than for the BMR system (RJL SEE = 1.8 kg for boys, 1.3 kg for girls; BMR SEE = 2.4 kg for boys, 1.9 kg for girls). The coefficients of determination were high for both BIA methods (r2 > or = .929). Total prediction errors (TEs) for FFM showed similar between-method trends (RJL TE = 2.1 kg for boys, 1.5 kg for girls; BMR TE = 4.4 kg for boys, 1.9 kg for girls). This study demonstrated that the RJL BIA with the manufacturer's prediction equations can be used to reliably and accurately assess FFM in 8- to 20-year-old children and adolescents. The prediction of FFM by the BMR system was acceptable for girls, but significant overprediction of FFM for boys was noted.
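
    The distinction between SEE and total error (TE) reported above is worth making concrete: SEE measures scatter about the regression line, while TE also counts systematic bias, which is why a biased analyzer can show a small SEE but a large TE. A hypothetical sketch with invented data:

```python
import numpy as np

# Hypothetical validation sketch -- the FFM values are simulated, not study data.
rng = np.random.default_rng(3)
ffm_hydro = rng.uniform(25.0, 60.0, 40)               # criterion FFM (kg)
ffm_bia = ffm_hydro + 1.5 + rng.normal(0.0, 1.2, 40)  # BIA estimate with +1.5 kg bias

slope, intercept = np.polyfit(ffm_bia, ffm_hydro, 1)
residuals = ffm_hydro - (slope * ffm_bia + intercept)
see = residuals.std(ddof=2)                           # standard error of estimate
te = np.sqrt(np.mean((ffm_bia - ffm_hydro) ** 2))     # total prediction error
```

    Here TE exceeds SEE because the regression absorbs the constant bias that TE still penalizes.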

  3. Testing the reliability of ice-cream cone model

    NASA Astrophysics Data System (ADS)

    Pan, Zonghao; Shen, Chenglong; Wang, Chuanbing; Liu, Kai; Xue, Xianghui; Wang, Yuming; Wang, Shui

    2015-04-01

    Coronal Mass Ejection (CME) properties are important not only for the physics itself but also for space-weather prediction. Several models (such as the cone model, the GCS model, and so on) have been proposed to remove the projection effects from the properties observed by spacecraft. From SOHO/LASCO observations, we obtain the 'real' 3D parameters of all the FFHCMEs (front-side full halo Coronal Mass Ejections) within the 24th solar cycle up to July 2012 using the ice-cream cone model. Because 3D parameters obtained from multi-satellite, multi-angle CME observations have higher accuracy, we use the GCS model to obtain the real propagation parameters of these CMEs in 3D space and compare the results with those from the ice-cream cone model. We then discuss the reliability of the ice-cream cone model.

  4. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

    An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities for both specific test conditions as well as commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)

  5. Reliability of history of acetaminophen ingestion in intentional drug overdose patients.

    PubMed

    Bentur, Yedidia; Lurie, Yael; Tamir, Ada; Keyes, Daniel C; Basis, Fuad

    2011-01-01

    The objective of this study was to determine the reliability of denial of acetaminophen ingestion in intentional drug overdose patients. All intentional drug overdose patients admitted to an emergency department who were able to provide a history were included. A detailed history was obtained on the names, timing and number of medications ingested, and serum acetaminophen was assayed. Multidrug ingestion was defined as the reporting of ≥2 medications. Patients were considered 'reliable' if their reported ingestion (or denial) of acetaminophen agreed with whether acetaminophen was detectable in serum. Validity parameters of the acetaminophen history were assessed by sensitivity, specificity and positive and negative predictive values. A total of 154 patients were included. History was significantly more reliable in patients who denied ingestion of acetaminophen (n = 107) compared with patients who reported it (n = 47; 95.3% vs 65.9%, respectively; p < 0.0001, 95% CI of the difference 17.5%-41.2%). No suicidal patient who denied both acetaminophen and multidrug ingestions had a detectable acetaminophen level (negative predictive value 1, 95% CI 0.93-1.0). It is suggested that denial of both acetaminophen and multidrug ingestions by intentional drug overdose patients after a thorough history taking can be considered reliable. In facilities with limited resources, these patients may not require routine acetaminophen screening.

  6. Determination of polyphenolic compounds of red wines by UV-VIS-NIR spectroscopy and chemometrics tools.

    PubMed

    Martelo-Vidal, M J; Vázquez, M

    2014-09-01

    Spectral analysis is a quick and non-destructive method to analyse wine. In this work, trans-resveratrol, oenin, malvin, catechin, epicatechin, quercetin and syringic acid were determined in commercial red wines from DO Rías Baixas and DO Ribeira Sacra (Spain) by UV-VIS-NIR spectroscopy. Calibration models were developed using principal component regression (PCR) or partial least squares (PLS) regression, with HPLC as the reference method. The results showed that reliable PLS models were obtained to quantify all polyphenols in the Rías Baixas wines. For Ribeira Sacra, feasible models were obtained for quercetin, epicatechin, oenin and syringic acid. The PCR calibration models showed worse prediction reliability than the PLS models. For red wines from mencía grapes, feasible models were obtained for catechin and oenin regardless of geographical origin. These results demonstrate that UV-VIS-NIR spectroscopy can be used to determine individual polyphenolic compounds in red wines. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. The number of measurements needed to obtain high reliability for traits related to enzymatic activities and photosynthetic compounds in soybean plants infected with Phakopsora pachyrhizi.

    PubMed

    Oliveira, Tássia Boeno de; Azevedo Peixoto, Leonardo de; Teodoro, Paulo Eduardo; Alvarenga, Amauri Alves de; Bhering, Leonardo Lopes; Campo, Clara Beatriz Hoffmann

    2018-01-01

    Asian rust affects the physiology of soybean plants and causes losses in yield. Repeatability coefficients may help breeders to know how many measurements are needed to obtain a suitable reliability for a target trait. Therefore, the objectives of this study were to determine the repeatability coefficients of 14 traits in soybean plants inoculated with Phakopsora pachyrhizi and to establish the minimum number of measurements needed to predict the breeding value with high accuracy. Experiments were performed in a 3x2 factorial arrangement with three treatments and two inoculations in a random block design. Repeatability coefficients, coefficients of determination and number of measurements needed to obtain a certain reliability were estimated using ANOVA, principal component analysis based on the covariance matrix and the correlation matrix, structural analysis and mixed model. It was observed that the principal component analysis based on the covariance matrix out-performed other methods for almost all traits. Significant differences were observed for all traits except internal CO2 concentration for the treatment effects. For the measurement effects, all traits were significantly different. In addition, significant differences were found for all Treatment x Measurement interaction traits except coumestrol, chitinase and chlorophyll content. Six measurements were suitable to obtain a coefficient of determination higher than 0.7 for all traits based on principal component analysis. The information obtained from this research will help breeders and physiologists determine exactly how many measurements are needed to evaluate each trait in soybean plants infected by P. pachyrhizi with a desirable reliability.
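
    The core calculation — how many measurements m achieve a target reliability given a single-measurement repeatability coefficient r — follows a Spearman-Brown style step-up. A minimal sketch with a hypothetical repeatability value (not one of the study's estimates):

```python
import math

# Reliability of the mean of m measurements given repeatability r:
#   R_m = m*r / (1 + (m - 1)*r)
# Solving for m gives the number of measurements needed for a target R:
#   m = R*(1 - r) / (r*(1 - R))
def measurements_needed(r: float, target_r: float) -> int:
    return math.ceil(target_r * (1.0 - r) / (r * (1.0 - target_r)))

# e.g. a trait with hypothetical repeatability 0.35 needs 5 measurements
# to reach a coefficient of determination of at least 0.7
m = measurements_needed(0.35, 0.7)
```
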

  8. The number of measurements needed to obtain high reliability for traits related to enzymatic activities and photosynthetic compounds in soybean plants infected with Phakopsora pachyrhizi

    PubMed Central

    de Oliveira, Tássia Boeno; Teodoro, Paulo Eduardo; de Alvarenga, Amauri Alves; Bhering, Leonardo Lopes; Campo, Clara Beatriz Hoffmann

    2018-01-01

    Asian rust affects the physiology of soybean plants and causes losses in yield. Repeatability coefficients may help breeders to know how many measurements are needed to obtain a suitable reliability for a target trait. Therefore, the objectives of this study were to determine the repeatability coefficients of 14 traits in soybean plants inoculated with Phakopsora pachyrhizi and to establish the minimum number of measurements needed to predict the breeding value with high accuracy. Experiments were performed in a 3x2 factorial arrangement with three treatments and two inoculations in a random block design. Repeatability coefficients, coefficients of determination and number of measurements needed to obtain a certain reliability were estimated using ANOVA, principal component analysis based on the covariance matrix and the correlation matrix, structural analysis and mixed model. It was observed that the principal component analysis based on the covariance matrix out-performed other methods for almost all traits. Significant differences were observed for all traits except internal CO2 concentration for the treatment effects. For the measurement effects, all traits were significantly different. In addition, significant differences were found for all Treatment x Measurement interaction traits except coumestrol, chitinase and chlorophyll content. Six measurements were suitable to obtain a coefficient of determination higher than 0.7 for all traits based on principal component analysis. The information obtained from this research will help breeders and physiologists determine exactly how many measurements are needed to evaluate each trait in soybean plants infected by P. pachyrhizi with a desirable reliability. PMID:29438380

  9. Prediction of School Performance from the Minnesota Child Development Inventory: Implications for Preschool Screening.

    ERIC Educational Resources Information Center

    Colligan, Robert C.

    Almost all preschool screening programs depend entirely on information and observations obtained during a brief evaluative session with the child. However, the logistics involved in managing large numbers of parents and children, the use of volunteers having varying degrees of sophistication or competency in assessment, the reliability and…

  10. Analysis of factors influencing hydration site prediction based on molecular dynamics simulations.

    PubMed

    Yang, Ying; Hu, Bingjie; Lill, Markus A

    2014-10-27

Water contributes significantly to the binding of small molecules to proteins in biochemical systems. Molecular dynamics (MD) simulation-based programs such as WaterMap and WATsite have been used to probe the locations and thermodynamic properties of hydration sites at the surface or in the binding site of proteins, generating important information for structure-based drug design. However, questions remain about the influence of the simulation protocol on hydration site analysis. In this study, we use WATsite to investigate the influence of factors such as simulation length and variations in initial protein conformations on hydration site prediction. We find that a 4 ns MD simulation is sufficient to obtain a reliable prediction of the locations and thermodynamic properties of hydration sites. In addition, hydration site prediction can be strongly affected by the initial protein conformation used for the MD simulation. Here, we provide a first quantification of this effect and show that similar conformations of binding site residues (RMSD < 0.5 Å) are required to obtain consistent hydration site predictions.
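The RMSD < 0.5 Å similarity criterion can be checked with a simple heavy-atom RMSD over matched, pre-superposed binding-site coordinates (an illustrative sketch; WATsite's internal procedure may differ):

```python
import numpy as np

def binding_site_rmsd(coords_a, coords_b):
    """Heavy-atom RMSD between two matched, pre-superposed sets of
    binding-site coordinates (N x 3 arrays, in Angstroms)."""
    diff = np.asarray(coords_a, float) - np.asarray(coords_b, float)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def conformations_similar(coords_a, coords_b, cutoff=0.5):
    """Apply the RMSD < 0.5 Angstrom consistency criterion."""
    return binding_site_rmsd(coords_a, coords_b) < cutoff
```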

  11. Limited evidence of individual differences in holistic processing in different versions of the part-whole paradigm.

    PubMed

    Sunday, Mackenzie A; Richler, Jennifer J; Gauthier, Isabel

    2017-07-01

The part-whole paradigm was one of the first measures of holistic processing, and it has been used to address several topics in face recognition, including its development, other-race effects, and, more recently, whether holistic processing is correlated with face recognition ability. However, the task was not designed to measure individual differences, and it has produced measurements with low reliability. We created a new holistic processing test designed to measure individual differences based on the part-whole paradigm, the Vanderbilt Part-Whole Test (VPWT). Measurements in the part and whole conditions were reliable, but, surprisingly, there was no evidence for reliable individual differences in the part-whole index (how well a person can take advantage of a face part presented within a whole-face context compared with the part presented alone), because the part and whole conditions were strongly correlated. The same result was obtained in a version of the original part-whole task that was modified to increase its reliability. Controlling for object recognition ability, we found that variance in the whole condition does not predict any additional variance in face recognition beyond what is already predicted by performance in the part condition.

  12. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    PubMed

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilizes topology filtering and a gapless threading algorithm. It ranks template structures, selects the most likely candidates and estimates the reliability of the obtained lowest-energy model. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. In benchmark tests, TMFoldRec correctly predicts the fold class of a given transmembrane protein sequence in about 77% of cases. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and systematic sequence-to-structure alignment (threading). The resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. TMFoldWeb is currently the only web server able to predict the fold class of transmembrane proteins while assigning reliability scores to the prediction. The method is prepared for genome-wide analysis with its easy-to-use interface, informative result page and programmatic access. The web server, as well as its molecule viewer, is responsive and fully compatible with current tablets and mobile devices.

  13. Identifying and classifying hyperostosis frontalis interna via computerized tomography.

    PubMed

    May, Hila; Peled, Nathan; Dar, Gali; Hay, Ori; Abbas, Janan; Masharawi, Youssef; Hershkovitz, Israel

    2010-12-01

The aim of this study was to identify the radiological characteristics of hyperostosis frontalis interna (HFI) and to establish a valid and reliable method for its identification and classification. A reliability test was carried out on 27 individuals who had undergone a head computerized tomography (CT) scan. Intra-observer reliability was assessed by having the same researcher examine the images three times, with a 2-week interval between rankings. The inter-observer test was performed by three independent researchers. A validity test was carried out using two methods for identifying and classifying HFI: 46 cadaver skullcaps were ranked twice, first via CT scans and then by direct observation. Reliability and validity were calculated using the kappa statistic (SPSS 15.0). Reliability of ranking HFI via CT scans was good (K > 0.7). As for validity, very good agreement was obtained between CT and direct observation when moderate and advanced types of HFI were present (K = 0.82). The suggested classification method for HFI using CT demonstrated a sensitivity of 84%, a specificity of 90.5% and a positive predictive value of 91.3%. In conclusion, volume rendering is a reliable and valid tool for identifying HFI, and the suggested three-scale classification is most suitable for radiological diagnosis of the phenomenon. Considering the increasing awareness of HFI as an early indicator of a developing malady, this study may assist radiologists in identifying and classifying it.
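The kappa agreement statistic used for the reliability tests can be computed from two raters' categorical rankings as follows (a generic Cohen's kappa sketch, not the SPSS implementation):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' categorical rankings
    (e.g. HFI grades): chance-corrected agreement."""
    n = len(ratings_a)
    # Observed proportion of agreement.
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement by chance from each rater's marginals.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (po - pe) / (1 - pe)
```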

  14. Assessing the reliability, predictive and construct validity of historical, clinical and risk management-20 (HCR-20) in Mexican psychiatric inpatients.

    PubMed

    Sada, Andrea; Robles-García, Rebeca; Martínez-López, Nicolás; Hernández-Ramírez, Rafael; Tovilla-Zarate, Carlos-Alfonso; López-Munguía, Fernando; Suárez-Alvarez, Enrique; Ayala, Xochitl; Fresán, Ana

    2016-08-01

Assessing dangerousness to gauge the likelihood of future violent behaviour has become an integral part of clinical mental health practice in forensic and non-forensic psychiatric settings; one of the most effective instruments for this purpose is the Historical, Clinical and Risk Management-20 (HCR-20). Our aims were to examine the HCR-20 factor structure in Mexican psychiatric inpatients and to obtain its predictive validity and reliability for use in this population. In total, 225 patients diagnosed with psychotic, affective or personality disorders were included. The HCR-20 was administered at hospital admission, and violent behaviours were assessed during psychiatric hospitalization using the Overt Aggression Scale (OAS). Construct validity, predictive validity and internal consistency were determined. Violent behaviour remained more severe during hospitalization in patients classified in the high-risk group. Fifteen items displayed adequate communalities in the originally designated domains of the HCR-20, and the internal consistency of the instrument was high. The HCR-20 is a suitable instrument for predicting violence risk in Mexican psychiatric inpatients.

  15. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    PubMed Central

    Li, Mengmeng; Feng, Qiang; Yang, Dezhen

    2018-01-01

In the degradation process, the randomness and multiplicity of variables are difficult to describe with mathematical models. However, they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the analysis object, and the time-variant reliability is calculated by solving the interference of limit strength and maximum stress. We performed degradation and tensile experiments on the copper material to obtain the limit strength at each point in time. In addition, degradation experiments on the copper bending pipe yielded the wall thickness at each point in time, from which the maximum stress response was calculated by simulation. Further, using a Monte Carlo method we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified against maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper is more applicable in reliability analysis, and it can more conveniently and accurately predict the replacement cycle of copper bending pipe under seawater-active corrosion. PMID:29584695
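The stress-strength interference calculation can be sketched with a small Monte Carlo estimate of P(strength > stress); the normal distributions and parameter values below are illustrative assumptions, not the paper's fitted degradation model:

```python
import random

def interference_reliability(mu_s, sd_s, mu_l, sd_l, n=100_000, seed=1):
    """Monte Carlo estimate of reliability R = P(strength > load)
    under stress-strength interference, with normally distributed
    strength (mu_s, sd_s) and load (mu_l, sd_l)."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l)
               for _ in range(n))
    return hits / n
```

In a time-variant setting this estimate would be repeated with the strength distribution degraded at each time step.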

  16. A Bayesian modification to the Jelinski-Moranda software reliability growth model

    NASA Technical Reports Server (NTRS)

    Littlewood, B.; Sofer, A.

    1983-01-01

The Jelinski-Moranda (JM) model for software reliability was examined. It is suggested that a major reason for the poor results given by this model is the poor performance of the maximum likelihood (ML) method of parameter estimation. A reparameterization and Bayesian analysis, involving a slight modelling change, are proposed. It is shown that this new Bayesian Jelinski-Moranda (BJM) model is mathematically quite tractable, and several metrics of interest to practitioners are obtained. The BJM and JM models are compared using several sets of real software failure data, and in all cases the BJM model gives superior reliability predictions. A change to an assumption underlying both models, intended to represent the debugging process more accurately, is also discussed.
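Under the JM model the program's failure rate after i-1 fixes is φ(N - i + 1), so inter-failure times are exponential with a linearly decreasing rate. A minimal simulation sketch (parameter values hypothetical):

```python
import random

def jm_interfailure_times(n_faults, phi, seed=0):
    """Simulate one realization of the Jelinski-Moranda model: the i-th
    inter-failure time is exponential with rate phi * (n_faults - i + 1),
    i.e. each (perfect) fix removes one fault and lowers the rate."""
    rng = random.Random(seed)
    return [rng.expovariate(phi * (n_faults - i + 1))
            for i in range(1, n_faults + 1)]
```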

  17. Predicting the aquatic toxicity mode of action using logistic regression and linear discriminant analysis.

    PubMed

    Ren, Y Y; Zhou, L C; Yang, L; Liu, P Y; Zhao, B W; Liu, H X

    2016-09-01

The paper highlights the use of the logistic regression (LR) method in the construction of statistically significant, robust and predictive models for classifying chemicals according to their aquatic toxic modes of action. The essentials of a reliable model were all considered carefully. The model predictors were selected by stepwise forward linear discriminant analysis (LDA) from a combined pool of experimental data and chemical structure-based descriptors calculated with the CODESSA and DRAGON software packages. Model predictive ability was validated both internally and externally. The applicability domain was checked by the leverage approach to verify prediction reliability. The obtained models are simple and easy to interpret. In general, LR performs much better than LDA and seems more attractive for predicting the more toxic compounds, i.e. compounds that exhibit excess toxicity versus non-polar narcotic compounds and more reactive compounds versus less reactive compounds. In addition, model fit and regression diagnostics were examined through an influence plot that reflects the hat values, studentized residuals and Cook's distance statistic of each sample. Overdispersion was also checked for the LR model. The relationships between the descriptors and the aquatic toxic behaviour of the compounds are also discussed.
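The leverage approach to the applicability domain flags compounds whose descriptor vectors give hat values above a threshold, conventionally h* = 3p'/n. A small sketch (assuming a descriptor matrix that already includes the intercept column):

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X'X)^-1 X' for a descriptor
    matrix X (rows = compounds, intercept column included)."""
    X = np.asarray(X, float)
    return np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)

def warning_leverage(X):
    """Conventional applicability-domain threshold h* = 3 p' / n."""
    n, p = np.asarray(X).shape
    return 3 * p / n
```

A prediction for a compound with leverage above h* falls outside the model's applicability domain and should be treated as an extrapolation.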

  18. Early detection of Alzheimer disease: methods, markers, and misgivings.

    PubMed

    Green, R C; Clarke, V C; Thompson, N J; Woodard, J L; Letz, R

    1997-01-01

There is at present no reliable predictive test for most forms of Alzheimer disease (AD). Although some information about future disease risk is available in theory through ApoE genotyping, it is of limited accuracy and utility. Once neuroprotective treatments are available for AD, reliable early detection will become a key component of the treatment strategy. We recently conducted a pilot survey eliciting attitudes and beliefs toward an unspecified, hypothetical predictive test for AD. The survey was completed by a convenience sample of 176 individuals aged 22-77; 75% were female, 30% were African-American, and 33% had a family member with AD. The survey revealed that 69% of this sample would elect to obtain predictive testing for AD if the test were 100% accurate. Individuals were more likely to desire predictive testing if they had an a priori belief that they would develop AD (p = 0.0001), had a lower educational level (p = 0.003), were worried that they would develop AD (p = 0.02), had a self-defined history of depression (p = 0.04), or had a family member with AD (p = 0.04). However, the desire for predictive testing was not significantly associated with age, gender, ethnicity, or income. The desire to obtain predictive testing for AD decreased as the assumed accuracy of the hypothetical test decreased. A better short-term strategy for early detection of AD may be computer-based neuropsychological screening of at-risk (older-aged) individuals to identify very early cognitive impairment. Individuals identified in this manner could be referred for diagnostic evaluation, and early cases of AD could be identified and treated. A new self-administered, touch-screen, computer-based neuropsychological screening instrument called the Neurobehavioral Evaluation System-3 is described, which may facilitate this type of screening.

  19. Reliability of Strength Testing using the Advanced Resistive Exercise Device and Free Weights

    NASA Technical Reports Server (NTRS)

    English, Kirk L.; Loehr, James A.; Laughlin, Mitzi A.; Lee, Stuart M. C.; Hagan, R. Donald

    2008-01-01

The Advanced Resistive Exercise Device (ARED) was developed for use on the International Space Station as a countermeasure against muscle atrophy and decreased strength. This investigation examined the reliability of one-repetition maximum (1RM) strength testing using ARED and traditional free weight (FW) exercise. Methods: Six males (180.8 +/- 4.3 cm, 83.6 +/- 6.4 kg, 36 +/- 8 y, mean +/- SD) who had not engaged in resistive exercise for at least six months volunteered to participate in this project. Subjects completed four 1RM testing sessions each for FW and ARED (eight total sessions) using a balanced, randomized, crossover design. All testing on one device was completed before progressing to the other. During each session, 1RM was measured for the squat, heel raise, and deadlift exercises. Generalizability (G) and intraclass correlation coefficients (ICC) were calculated for each exercise on each device and were used to predict the number of sessions needed to obtain a reliable 1RM measurement (G ≥ 0.90). Interclass reliability coefficients and Pearson's correlation coefficients (R) also were calculated for the highest 1RM value (1RM(peak)) obtained for each exercise on each device to quantify 1RM relationships between devices.

  20. Interpolation/extrapolation technique with application to hypervelocity impact of space debris

    NASA Technical Reports Server (NTRS)

    Rule, William K.

    1992-01-01

    A new technique for the interpolation/extrapolation of engineering data is described. The technique easily allows for the incorporation of additional independent variables, and the most suitable data in the data base is automatically used for each prediction. The technique provides diagnostics for assessing the reliability of the prediction. Two sets of predictions made for known 5-degree-of-freedom, 15-parameter functions using the new technique produced an average coefficient of determination of 0.949. Here, the technique is applied to the prediction of damage to the Space Station from hypervelocity impact of space debris. A new set of impact data is presented for this purpose. Reasonable predictions for bumper damage were obtained, but predictions of pressure wall and multilayer insulation damage were poor.

  1. A review on data and predictions of water dielectric spectra for calculations of van der Waals surface forces.

    PubMed

    Wang, Jianlong; Nguyen, Anh V

    2017-12-01

Van der Waals forces are an important component of the intermolecular, colloidal and surface forces governing many phenomena and processes. Recent examples include colloidal interactions between hydrophobic colloids and interfaces in ambient (non-degassed) water, in which dissolved gases and nanobubbles are shown to affect the van der Waals attractions significantly. Advanced computation of van der Waals forces in aqueous systems with the Lifshitz theory requires reliable data for the water dielectric spectra. In this paper we review the available predictions of water dielectric spectra for calculating colloidal and surface van der Waals forces. Specifically, the available experimental data for the real and imaginary parts of the complex dielectric function of liquid water in the microwave, IR and UV regions, and the various corresponding predictions of the water spectra, are critically reviewed. The data in the UV region are critical, but the available predictions are still based on outdated data obtained in 1974 (for frequencies only up to 25.5 eV). We also reviewed and analysed the experimental data obtained for the UV region in 2000 (for frequencies up to 50 eV) and 2015 (for frequencies up to 100 eV). The 1974 and 2000 data require extrapolation to the higher frequencies needed for calculating the van der Waals forces, and remain inaccurate. Our analysis shows that the latest data of 2015 do not require extrapolation and can be used to reliably calculate van der Waals forces. The most recent water dielectric spectra give a (non-retarded) Hamaker constant of A = 5.20×10⁻²⁰ J for foam films of liquid water. This review provides the most up-to-date and reliable water dielectric spectra for computing van der Waals forces in aqueous systems.
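For scale, a non-retarded Hamaker constant such as the one quoted above translates into a sphere-plate van der Waals force via the standard expression F = -AR/(6D²) (an illustrative use of the reported constant; the geometry values in the test are hypothetical):

```python
def vdw_sphere_plate_force(hamaker, radius, separation):
    """Non-retarded van der Waals force (N) between a sphere of radius R
    and a flat plate at separation D: F = -A R / (6 D**2).
    Negative values are attractive; SI units throughout."""
    return -hamaker * radius / (6 * separation ** 2)
```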

  2. Benchmark analysis of forecasted seasonal temperature over different climatic areas

    NASA Astrophysics Data System (ADS)

    Giunta, G.; Salerno, R.; Ceppi, A.; Ercolani, G.; Mancini, M.

    2015-12-01

From a long-term perspective, an improvement in seasonal forecasting, which is often based exclusively on climatology, could provide a new capability for managing energy resources on a time scale of a few months. This paper presents a benchmark analysis of long-term temperature forecasts over Italy in the year 2010, comparing the eni-kassandra meteo forecast (e-kmf®) model, the Climate Forecast System-National Centers for Environmental Prediction (CFS-NCEP) model, and a climatological reference (based on 25 years of data) with observations. Statistical indexes are used to assess the reliability of predictions of 2-m monthly air temperature up to 12 weeks ahead. The results show that the best performance is achieved by the e-kmf® system, which improves the reliability of long-term forecasts relative to climatology and the CFS-NCEP model. A reliable high-performance forecast system makes it possible to optimize the natural gas portfolio and management operations, thereby obtaining a competitive advantage in the European energy market.
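A benchmark of this kind is often summarized with an RMSE-based skill score relative to climatology (a generic sketch, not the paper's specific statistical indexes):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error of predictions against observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def skill_score(forecast, reference, obs):
    """RMSE-based skill: 1 = perfect, 0 = no better than the reference
    (e.g. climatology), negative = worse than the reference."""
    return 1.0 - rmse(forecast, obs) / rmse(reference, obs)
```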

  3. SCARE: A post-processor program to MSC/NASTRAN for the reliability analysis of structural ceramic components

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J. P.

    1985-01-01

A computer program was developed for calculating the statistical fast-fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume-distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or the critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters can also be calculated from modulus-of-rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output obtained from three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast-fracture theories employed, along with selected input and output formats and options, are summarized. An example problem demonstrating various features of the program is included.
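For a uniformly stressed volume, the two-parameter Weibull model gives the fast-fracture failure probability as Pf = 1 - exp[-V(σ/σ₀)^m]; a minimal sketch (SCARE itself integrates element-by-element over finite-element stress fields):

```python
import math

def weibull_failure_probability(sigma, sigma0, m, volume=1.0):
    """Two-parameter Weibull fast-fracture failure probability for a
    uniformly stressed volume: Pf = 1 - exp(-V * (sigma/sigma0)**m).
    sigma0 is the characteristic strength, m the Weibull modulus."""
    return 1 - math.exp(-volume * (sigma / sigma0) ** m)
```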

  4. Extracting More Information from Passive Optical Tracking Observations for Reliable Orbit Element Generation

    NASA Astrophysics Data System (ADS)

    Bennett, J.; Gehly, S.

    2016-09-01

This paper presents results from a preliminary method for extracting more orbital information from low-rate passive optical tracking data. An improvement in the accuracy of the observation data yields more accurate and reliable orbital elements. For several objects, orbit propagations from the element generated using the new data-processing method are compared with those generated from the raw observation data. Optical tracking data collected by EOS Space Systems, located on Mount Stromlo, Australia, is fitted to provide a new orbital element. The element accuracy is determined by comparing the predicted orbit with subsequent tracking data, or with a reference orbit if available. The new method is shown to produce a better orbit prediction, which has important implications for conjunction assessments and the Space Environment Research Centre space object catalogue. The focus is on obtaining reliable orbital solutions from sparse data. This work forms part of the collaborative effort of the Space Environment Management Cooperative Research Centre, which is developing new technologies and strategies to preserve the space environment (www.serc.org.au).

  5. Fatigue reliability of deck structures subjected to correlated crack growth

    NASA Astrophysics Data System (ADS)

    Feng, G. Q.; Garbatov, Y.; Guedes Soares, C.

    2013-12-01

The objective of this work is to analyse the fatigue reliability of deck structures subjected to correlated crack growth. The stress intensity factors of the correlated cracks are obtained by finite element analysis, and from these the geometry correction functions are derived. Monte Carlo simulations are applied to predict the statistical descriptors of correlated cracks based on the Paris-Erdogan equation. A probabilistic model of crack growth as a function of time is used to analyse the fatigue reliability of deck structures accounting for crack propagation correlation. A deck structure is modelled as a series system of stiffened panels, where each stiffened panel is regarded as a parallel system composed of plates and longitudinals. It is shown that the method developed here can be conveniently applied to the fatigue reliability assessment of structures subjected to correlated crack growth.
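Crack growth under the Paris-Erdogan equation, da/dN = C(ΔK)^m with ΔK = YΔσ√(πa), can be sketched with a simple cycle-block integration (the constants below are hypothetical, and the paper's Monte Carlo treatment samples them as correlated random variables):

```python
import math

def grow_crack(a0, c, m, delta_sigma, n_cycles, y=1.0, block=1000):
    """Integrate da/dN = C (dK)**m with dK = Y * d_sigma * sqrt(pi * a),
    updating the crack length a in blocks of cycles (forward Euler).
    Units: a in m, d_sigma in MPa, C in (m/cycle)/(MPa sqrt(m))**m."""
    a = a0
    for _ in range(n_cycles // block):
        delta_k = y * delta_sigma * math.sqrt(math.pi * a)
        a += c * delta_k ** m * block
    return a
```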

  6. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction.
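Genomic BLUP is equivalent, under standard assumptions, to ridge regression on the marker matrix; a minimal SNP-BLUP sketch (not the Bayesian mixture model used in the study):

```python
import numpy as np

def snp_blup(Z, y, lam):
    """Ridge-regression SNP-BLUP: solve (Z'Z + lam*I) g = Z'y for marker
    effects g, where Z is the (animals x markers) genotype matrix and
    lam the shrinkage parameter. Genomic values are Z @ g."""
    p = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ y)
```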

  7. Predicting the onset and persistence of episodes of depression in primary health care. The predictD-Spain study: Methodology

    PubMed Central

    Bellón, Juan Ángel; Moreno-Küstner, Berta; Torres-González, Francisco; Montón-Franco, Carmen; GildeGómez-Barragán, María Josefa; Sánchez-Celaya, Marta; Díaz-Barreiros, Miguel Ángel; Vicens, Catalina; de Dios Luna, Juan; Cervilla, Jorge A; Gutierrez, Blanca; Martínez-Cañavate, María Teresa; Oliván-Blázquez, Bárbara; Vázquez-Medrano, Ana; Sánchez-Artiaga, María Soledad; March, Sebastia; Motrico, Emma; Ruiz-García, Victor Manuel; Brangier-Wainberg, Paulette Renée; del Mar Muñoz-García, María; Nazareth, Irwin; King, Michael

    2008-01-01

    Background The effects of putative risk factors on the onset and/or persistence of depression remain unclear. We aim to develop comprehensive models to predict the onset and persistence of episodes of depression in primary care. Here we explain the general methodology of the predictD-Spain study and evaluate the reliability of the questionnaires used. Methods This is a prospective cohort study. A systematic random sample of general practice attendees aged 18 to 75 has been recruited in seven Spanish provinces. Depression is being measured with the CIDI at baseline, and at 6, 12, 24 and 36 months. A set of individual, environmental, genetic, professional and organizational risk factors are to be assessed at each follow-up point. In a separate reliability study, a proportional random sample of 401 participants completed the test-retest (251 researcher-administered and 150 self-administered) between October 2005 and February 2006. We have also checked 118,398 items for data entry from a random sample of 480 patients stratified by province. Results All items and questionnaires had good test-retest reliability for both methods of administration, except for the use of recreational drugs over the previous six months. Cronbach's alphas were good and their factorial analyses coherent for the three scales evaluated (social support from family and friends, dissatisfaction with paid work, and dissatisfaction with unpaid work). There were 191 (0.16%) data entry errors. Conclusion The items and questionnaires were reliable and data quality control was excellent. When we eventually obtain our risk index for the onset and persistence of depression, we will be able to determine the individual risk of each patient evaluated in primary health care. PMID:18657275
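Cronbach's alpha for a subjects-by-items score matrix can be computed as follows (a generic sketch of the statistic reported for the three scales):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```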

  8. Mechanistic Approach to Stability Studies as a Tool for the Optimization and Development of New Products Based on L. rhamnosus Lcr35® in Compliance with Current Regulations

    PubMed Central

    Muller, Claudia; Busignies, Virginie; Mazel, Vincent; Forestier, Christiane; Nivoliez, Adrien; Tchoreloff, Pierre

    2013-01-01

Probiotics are of great current interest in the pharmaceutical industry because of their multiple effects on human health. To beneficially affect the host, an adequate dosage of the probiotic bacteria in the product must be guaranteed from the time of manufacture to the expiration date. Stability test guidelines as laid down in ICH Q1A stipulate a minimum testing period of 12 months, and the challenge for producers is to reduce this time. In this paper, a mechanistic approach using the Arrhenius model is proposed to predict stability. Applied for the first time to laboratory and industrial probiotic powders, the model provided a reliable mathematical representation of the effect of temperature on bacterial death (R² > 0.9). The destruction rate (k) was determined as a function of manufacturing process, strain and storage conditions. The marketed product demonstrated better stability (k = 0.08 month⁻¹) than the laboratory sample (k = 0.80 month⁻¹). With industrial batches, the k obtained after 6 months of study was comparable to that obtained at 12 months, evidence of the model's robustness. In addition, values predicted at 12 months were very similar (±30%) to those obtained in real time, confirming the model's reliability. This method could be an interesting approach to predicting probiotic stability and could reduce the length of stability studies to 6 months, as against 12 (ICH guideline) or 24 months (expiration date). PMID:24244412
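The mechanistic approach combines an Arrhenius temperature dependence of the death-rate constant with log-linear survival; a minimal sketch (the pre-exponential factor, activation energy and counts below are hypothetical, not the paper's fitted values):

```python
import math

R_GAS = 8.314  # J mol^-1 K^-1

def death_rate(pre_exp, e_act, temp_k):
    """Arrhenius death-rate constant k = A * exp(-Ea / (R T))."""
    return pre_exp * math.exp(-e_act / (R_GAS * temp_k))

def viable_count(n0, k, months):
    """First-order (log-linear) survival N(t) = N0 * exp(-k t),
    with k expressed in month^-1."""
    return n0 * math.exp(-k * months)
```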

  9. Predicting elastic properties of β-HMX from first-principles calculations.

    PubMed

    Peng, Qing; Rahul; Wang, Guangyu; Liu, Gui-Rong; Grimme, Stefan; De, Suvranu

    2015-05-07

We investigate the performance of van der Waals (vdW) corrections in predicting the elastic constants of β-cyclotetramethylene tetranitramine (β-HMX) energetic molecular crystals using density functional theory (DFT) calculations. We confirm that the accuracy of the elastic constants is significantly improved by using vdW corrections with environment-dependent C6 coefficients together with the PBE and revised PBE exchange-correlation functionals. The elastic constants obtained from PBE-D3(0) calculations yield the most accurate mechanical response of β-HMX when compared with experimental stress-strain data. Our results suggest that PBE-D3 calculations are reliable for predicting the elastic constants of this material.

  10. Influence of flowfield and vehicle parameters on engineering aerothermal methods

    NASA Technical Reports Server (NTRS)

    Wurster, Kathryn E.; Zoby, E. Vincent; Thompson, Richard A.

    1989-01-01

The reliability and flexibility of three engineering codes used in the aerospace industry (AEROHEAT, INCHES, and MINIVER) were investigated by comparing their results with Reentry F flight data, with ground-test heat-transfer data for a range of cone angles, and with predictions obtained using the detailed VSL3D code; the engineering solutions were also compared with each other. In particular, the impact of several vehicle and flow-field parameters on heat transfer, and the capability of the engineering codes to predict these effects, were determined. It was found that entropy, pressure gradient, nose bluntness, gas chemistry, and angle of attack all affect heating levels. A comparison of the results of the three engineering codes with Reentry F flight data and with VSL3D predictions showed very good agreement within the codes' regions of applicability. It is emphasized that the parameters used in this study can significantly influence the actual heating levels and the predictive capability of a code.

  11. Henry's Constants of Persistent Organic Pollutants by a Group-Contribution Method Based on Scaled-Particle Theory.

    PubMed

    Razdan, Neil K; Koshy, David M; Prausnitz, John M

    2017-11-07

A group-contribution method based on scaled-particle theory was developed to predict Henry's constants for six families of persistent organic pollutants: polychlorinated benzenes, polychlorinated biphenyls, polychlorinated dibenzodioxins, polychlorinated dibenzofurans, polychlorinated naphthalenes, and polybrominated diphenyl ethers. The group-contribution model uses limited experimental data to obtain group-interaction parameters for an easy-to-use method to predict Henry's constants for systems where reliable experimental data are scarce. By using group-interaction parameters obtained from data reduction, scaled-particle theory gives the partial molar Gibbs energy of dissolution, Δg̅₂, allowing calculation of Henry's constant, H₂, for more than 700 organic pollutants. The average deviation between predicted values of log H₂ and experiment is 4%. Application of an approximate van't Hoff equation gives the temperature dependence of Henry's constants for polychlorinated biphenyls, polychlorinated naphthalenes, and polybrominated diphenyl ethers in the environmentally relevant range 0-40 °C.
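The two-step calculation described above, a group-contribution estimate of Δg̅₂ followed by an approximate van't Hoff temperature correction, can be sketched as follows. The group names, contribution values, and enthalpy of solution used here are illustrative placeholders, not the fitted parameters of the paper, and the sign conventions are schematic.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical group contributions to the partial molar Gibbs energy of
# dissolution (J/mol); real values come from data reduction against experiment.
GROUP_DG = {"aromatic_C": 900.0, "Cl": -1500.0, "O": -2500.0}

def henry_constant(groups, T=298.15):
    """Henry's constant from a group-contribution estimate of dg2_bar."""
    dg2 = sum(GROUP_DG[g] * n for g, n in groups.items())
    return math.exp(dg2 / (R * T))

def henry_at_T(H_ref, dH_solution, T_ref=298.15, T=273.15):
    """Approximate van't Hoff temperature correction of a Henry's constant."""
    return H_ref * math.exp(-dH_solution / R * (1.0 / T - 1.0 / T_ref))
```

A pollutant is described simply by counting its groups, e.g. `henry_constant({"aromatic_C": 6, "Cl": 2})`, which is what makes the method easy to apply when experimental data are scarce.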

  12. Fat grafting and breast reconstruction: tips for ensuring predictability.

    PubMed

    Gabriel, Allen; Champaneria, Manish C; Maxwell, G Patrick

    2015-06-01

    Autologous fat grafting is widely used in breast surgery to refine and optimize aesthetic outcomes. Despite its widespread use, obtaining predictable, reliable, and consistent outcomes remains a significant challenge and is influenced by the technique used for procurement, processing, and placement of the fat. At present, there is no published consensus on the optimal technique. The purpose of this article is to review current techniques at each stage of fat grafting and provide tips on best practices based on the published literature as well as our extensive clinical experience.

  13. Wealth and price distribution by diffusive approximation in a repeated prediction market

    NASA Astrophysics Data System (ADS)

    Bottazzi, Giulio; Giachini, Daniele

    2017-04-01

The approximate invariant densities of agents' wealth and of prices in a repeated prediction market model are derived using the Fokker-Planck equation of the associated continuous-time jump process. We show that the approximation obtained from the evolution of the log-wealth difference can be reliably exploited to compute all quantities of interest over the whole acceptable parameter space. When the risk aversion of the trader is high enough, we derive an explicit closed-form solution for the price distribution that is asymptotically correct.

  14. Evaluation of 3D-Jury on CASP7 models.

    PubMed

    Kaján, László; Rychlewski, Leszek

    2007-08-21

3D-Jury, the structure prediction consensus method publicly available in the Meta Server http://meta.bioinfo.pl/, was evaluated using models gathered in the 7th round of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7). 3D-Jury is an automated expert process that generates protein structure meta-predictions from sets of models obtained from partner servers. The performance of 3D-Jury was analysed in three respects. First, we examined the correlation between the 3D-Jury score and a model quality measure, the number of correctly predicted residues: the 3D-Jury score was shown to correlate significantly with the number of correctly predicted residues, and the correlation is good enough to be used for prediction. Second, 3D-Jury was found to improve upon the competing servers' choice of the best structure model in most cases. Third, the value of the 3D-Jury score as a generic reliability measure was examined: we found that the 3D-Jury score separates bad models from good models better than the reliability score of the original server in 27 cases and falls short of it in only 5 cases out of a total of 38. We also report the release of a new Meta Server feature: instant 3D-Jury scoring of uploaded user models. The 3D-Jury score continues to be a good indicator of structural model quality. It also provides a generic reliability score, especially important for models that were not assigned one by the original server. Individual structure modellers can also benefit from the 3D-Jury scoring system by testing their models with the new instant scoring feature http://meta.bioinfo.pl/compare_your_model_example.pl available in the Meta Server.

  16. Perceived noisiness under anechoic, semi-reverberant and earphone listening conditions

    NASA Technical Reports Server (NTRS)

    Clarke, F. R.; Kryter, K. D.

    1972-01-01

Magnitude estimates were obtained from each of 31 listeners for a variety of noise sources under three methods of stimulus presentation: loudspeaker presentation in an anechoic chamber, loudspeaker presentation in a normal semi-reverberant room, and earphone presentation. The comparability of ratings obtained in these environments was evaluated with respect to the predictability of ratings from physical measures, the reliability of ratings, and the scale values assigned to the various noise stimuli. The acoustic environment was found to have little effect on the physical predictive measures, and ratings of perceived noisiness were little affected by the acoustic environment in which they were obtained. The finding that the subjects, though instructed otherwise, apparently judged the maximum rather than the effective magnitude of steady-state noises indicates the need for further study of possible differing interactions between judged noisiness of steady-state sound and the methods of magnitude estimation and paired comparisons.

  17. Prediction of wastewater quality indicators at the inflow to the wastewater treatment plant using data mining methods

    NASA Astrophysics Data System (ADS)

    Szeląg, Bartosz; Barbusiński, Krzysztof; Studziński, Jan; Bartkiewicz, Lidia

    2017-11-01

In this study, models developed using data mining methods are proposed for predicting wastewater quality indicators: biochemical and chemical oxygen demand, total suspended solids, total nitrogen, and total phosphorus at the inflow to a wastewater treatment plant (WWTP). The models are based on values measured in previous time steps and on daily wastewater inflows. Independent prediction systems that can be used in case of monitoring device malfunction are also provided. Models of the wastewater quality indicators were developed using the multivariate adaptive regression spline (MARS) method, artificial neural networks (ANN) of the multilayer perceptron type combined with a classification model (SOM), and cascade neural networks (CNN). The lowest absolute and relative errors were obtained using ANN+SOM, whereas the MARS method produced the highest error values. It was shown that, for the analysed WWTP, continuous prediction of selected wastewater quality indicators is possible using the two independent prediction systems developed. Such models can ensure reliable WWTP operation when wastewater quality monitoring systems become inoperable or are under maintenance.
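A minimal stand-in for the prediction models described above, a linear one-step-ahead predictor driven by previous indicator values and the daily inflow, might look like the sketch below. The weights, bias, and input values are hypothetical; the study's MARS and neural-network models learn such relationships from measured data rather than taking them as given.

```python
def predict_indicator(history, inflow, weights, w_inflow, bias):
    """One-step-ahead prediction of a wastewater quality indicator
    (e.g. COD in mg/L) from its values in previous time steps and the
    daily inflow. A linear placeholder for the MARS/ANN models."""
    return bias + w_inflow * inflow + sum(w * h for w, h in zip(weights, history))
```

Such a backup predictor needs only the stored history and the inflow meter, which is why it can keep running when the quality-monitoring instruments themselves are out of service.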

  18. Analysis of Factors Influencing Hydration Site Prediction Based on Molecular Dynamics Simulations

    PubMed Central

    2015-01-01

    Water contributes significantly to the binding of small molecules to proteins in biochemical systems. Molecular dynamics (MD) simulation based programs such as WaterMap and WATsite have been used to probe the locations and thermodynamic properties of hydration sites at the surface or in the binding site of proteins generating important information for structure-based drug design. However, questions associated with the influence of the simulation protocol on hydration site analysis remain. In this study, we use WATsite to investigate the influence of factors such as simulation length and variations in initial protein conformations on hydration site prediction. We find that 4 ns MD simulation is appropriate to obtain a reliable prediction of the locations and thermodynamic properties of hydration sites. In addition, hydration site prediction can be largely affected by the initial protein conformations used for MD simulations. Here, we provide a first quantification of this effect and further indicate that similar conformations of binding site residues (RMSD < 0.5 Å) are required to obtain consistent hydration site predictions. PMID:25252619

  19. A study of fault prediction and reliability assessment in the SEL environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Patnaik, Debabrata

    1986-01-01

An empirical study on the estimation and prediction of faults, prediction of fault detection and correction effort, and reliability assessment in the Software Engineering Laboratory (SEL) environment is presented. Fault estimation using empirical relationships and fault prediction using a curve-fitting method are investigated. Relationships between debugging efforts (fault detection and correction effort) in different test phases are provided in order to make an early estimate of future debugging effort. The study concludes with a fault analysis, the application of a reliability model, and the analysis of a normalized metric for reliability assessment and reliability monitoring during software development.

  20. Feasibility of the Two-Point Method for Determining the One-Repetition Maximum in the Bench Press Exercise.

    PubMed

    García-Ramos, Amador; Haff, Guy Gregory; Pestaña-Melero, Francisco Luis; Pérez-Castilla, Alejandro; Rojas, Francisco Javier; Balsalobre-Fernández, Carlos; Jaric, Slobodan

    2017-09-05

This study compared the concurrent validity and reliability of previously proposed generalized group equations for estimating the bench press (BP) one-repetition maximum (1RM) with the individualized load-velocity relationship modelled with a two-point method. Thirty men (BP 1RM relative to body mass: 1.08 ± 0.18 kg·kg⁻¹) performed two incremental loading tests in the concentric-only BP exercise and another two in the eccentric-concentric BP exercise to assess their actual 1RM and load-velocity relationships. A high velocity (≈1 m·s⁻¹) and a low velocity (≈0.5 m·s⁻¹) were selected from their load-velocity relationships to estimate the 1RM from generalized group equations and through an individual linear model obtained from the two velocities. The directly measured 1RM was highly correlated with all predicted 1RMs (r range: 0.847-0.977). The generalized group equations systematically underestimated the actual 1RM when predicted from the concentric-only BP (P < 0.001; effect size [ES] range: 0.15-0.94), but overestimated it when predicted from the eccentric-concentric BP (P < 0.001; ES range: 0.36-0.98). Conversely, a low systematic bias (range: −2.3 to 0.5 kg) and random errors (range: 3.0-3.8 kg), no heteroscedasticity of errors (r² range: 0.053-0.082), and trivial ES (range: −0.17 to 0.04) were observed when the prediction was based on the two-point method. Although all examined methods reported the 1RM with high reliability (CV ≤ 5.1%; ICC ≥ 0.89), the direct method was the most reliable (CV < 2.0%; ICC ≥ 0.98). The quick, fatigue-free, and practical two-point method was able to predict the BP 1RM with high reliability and practically perfect validity, and therefore we recommend its use over generalized group equations.
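As a rough illustration of the two-point method, the individual load-velocity line can be fitted through the light and heavy measurements and extrapolated to an assumed velocity at 1RM. The 0.17 m·s⁻¹ minimum-velocity threshold below is an illustrative default, not a value reported by the study.

```python
def estimate_1rm_two_point(load_light, v_light, load_heavy, v_heavy, v_1rm=0.17):
    """Estimate the bench-press 1RM from two load-velocity points.

    Fits the individual linear load-velocity relationship through the two
    measured (load, mean velocity) pairs and extrapolates to v_1rm, the
    velocity assumed at the 1RM (0.17 m/s is an illustrative threshold).
    """
    slope = (load_heavy - load_light) / (v_heavy - v_light)  # kg per (m/s); negative
    intercept = load_light - slope * v_light                 # load-axis intercept
    return slope * v_1rm + intercept
```

For example, lifting 40 kg at 1.0 m·s⁻¹ and 70 kg at 0.5 m·s⁻¹ gives a line of slope −60 kg per (m/s) and an estimated 1RM just under 90 kg; only two fatigue-free submaximal sets are required, which is the method's practical appeal.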

1. ASME V&V challenge problem: Surrogate-based V&V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beghini, Lauren L.; Hough, Patricia D.

    2015-12-18

The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores, coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess the credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification in order to reduce the number of simulations that must be done. The goal is to tip the tradeoff balance toward improving accuracy without increasing the computational demands.

  2. Bayes Analysis and Reliability Implications of Stress-Rupture Testing a Kevlar/Epoxy COPV Using Temperature and Pressure Acceleration

    NASA Technical Reports Server (NTRS)

    Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.

    2009-01-01

Composite Overwrapped Pressure Vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Flight certification depends on a reliability analysis to quantify the risk of stress-rupture failure in existing flight vessels. Full certification of this reliability model would require a statistically significant number of lifetime tests and is impractical given the cost and the limited flight hardware available for certification testing. One approach to confirm the reliability model is to perform a stress-rupture test on a flight COPV. Currently, testing of such a Kevlar 49 (DuPont)/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyze the possible failure-time results of this test and to assess the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test, the latter due to major differences between the COPVs in the database and the actual COPVs in service. Any information obtained that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely it is that the more optimistic stress-ratio model is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one "nine," that is, reducing the predicted probability of failure by an order of magnitude. However, testing one vessel does not change the uncertainty in the Weibull shape parameter for lifetime, since that would require testing several vessels.
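The Bayesian updating logic of the test, namely that the longer the vessel survives the more the posterior shifts toward the optimistic stress-ratio model, can be sketched with two competing Weibull lifetime models. Every numerical parameter below (priors, scale parameters, shape) is illustrative, not taken from the COPV database.

```python
import math

def weibull_survival(t, eta, beta):
    """Weibull survival function: probability of no failure before time t."""
    return math.exp(-((t / eta) ** beta))

def posterior_optimistic(t_survived, prior_opt=0.5,
                         eta_opt=50.0, eta_pes=5.0, beta=1.2):
    """Posterior probability of the optimistic stress-ratio model after the
    test vessel survives to t_survived (years), by Bayes' rule over two
    hypotheses with different Weibull scale parameters (all values illustrative)."""
    like_opt = weibull_survival(t_survived, eta_opt, beta)
    like_pes = weibull_survival(t_survived, eta_pes, beta)
    num = prior_opt * like_opt
    return num / (num + (1.0 - prior_opt) * like_pes)
```

Because the optimistic model assigns higher survival probability at every time, the posterior for it increases monotonically with observed survival time, which is the qualitative behavior the abstract describes.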

  3. Improved Rubin-Bodner Model for the Prediction of Soft Tissue Deformations

    PubMed Central

    Zhang, Guangming; Xia, James J.; Liebschner, Michael; Zhang, Xiaoyan; Kim, Daeseung; Zhou, Xiaobo

    2016-01-01

In craniomaxillofacial (CMF) surgery, a reliable way of simulating the soft tissue deformation resulting from skeletal reconstruction is vitally important for preventing the risk of postoperative facial distortion. However, it is difficult to simulate soft tissue behavior under different types of CMF surgery. This study presents an integrated biomechanical and statistical learning model to improve the accuracy and reliability of predictions of soft facial tissue behavior. The Rubin-Bodner (RB) model is initially used to describe the biomechanical behavior of the soft facial tissue. Subsequently, a finite element model (FEM) computes the stress at each node of the soft facial tissue mesh resulting from bone displacement. Next, the Generalized Regression Neural Network (GRNN) method is implemented to obtain the relationship between the facial soft tissue deformation and the stress distribution corresponding to different CMF surgical types, and to improve the evaluation of the elastic parameters included in the RB model. The soft facial tissue deformation can therefore be predicted from biomechanical properties and the statistical model. Leave-one-out cross-validation is used on eleven patients. As a result, the average prediction error of our model (0.7035 mm) is lower than those resulting from other approaches. This also demonstrates that the more accurate the biomechanical information available to the model, the better the prediction performance it can achieve. PMID:27717593

  4. Use of differential scanning calorimetry to detect canola oil (Brassica napus L.) adulterated with lard stearin.

    PubMed

    Marikkar, Jalaldeen Mohammed Nazrim; Rana, Sohel

    2014-01-01

A study was conducted to detect and quantify lard stearin (LS) content in canola oil (CaO) using differential scanning calorimetry (DSC). Authentic samples of CaO were obtained from a reliable supplier, and the adulterant LS was obtained through a fractional crystallization procedure as reported previously. Pure CaO samples spiked with LS at levels ranging from 5 to 15% (w/w) were analyzed using DSC to obtain their cooling and heating profiles. The results showed that samples contaminated with LS at the 5% (w/w) level can be detected using characteristic contaminant peaks appearing in the higher temperature regions (0 to 70°C) of the cooling and heating curves. Pearson correlation analysis of LS content against individual DSC parameters of the adulterant peak, namely peak temperature, peak area, and peak onset temperature, indicated strong correlations between these parameters and the LS content of the CaO admixtures. When these three parameters were used as variables in a stepwise regression procedure, predictive models for the determination of LS content in CaO were obtained. The predictive models obtained with a single DSC parameter had a relatively lower coefficient of determination (R² value) and higher standard error than the models obtained using two DSC parameters in combination. The study concluded that the predictive models obtained with the peak area and peak onset temperature of the adulteration peak would be more accurate for prediction of LS content in CaO, based on the highest coefficient of determination (R² value) and smallest standard error.
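The final step of such a stepwise procedure, an ordinary least-squares fit of LS content on the selected DSC parameters, can be sketched with a generic normal-equations solver. The solver below is a plain multiple-regression routine and the example data in the usage note are hypothetical, not the paper's measurements.

```python
def fit_linear(X, y):
    """Least-squares fit y ≈ b0 + b1*x1 + b2*x2 + ... via normal equations.

    X is a list of predictor rows (e.g. hypothetical DSC parameters such as
    adulterant-peak area and onset temperature); returns the coefficient
    vector [b0, b1, ...]."""
    rows = [[1.0] + list(r) for r in X]          # prepend intercept column
    n = len(rows[0])
    # Normal equations A b = c, with A = X^T X and c = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        c[k], c[p] = c[p], c[k]
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            c[i] -= f * c[k]
    b = [0.0] * n
    for i in reversed(range(n)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, n))) / A[i][i]
    return b
```

With two predictors per sample, `fit_linear([(area1, onset1), (area2, onset2), ...], [ls1, ls2, ...])` returns the intercept and the two regression coefficients, mirroring the two-parameter models the study found most accurate.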

  5. Improving the reliability of female fertility breeding values using type and milk yield traits that predict energy status in Australian Holstein cattle.

    PubMed

    González-Recio, O; Haile-Mariam, M; Pryce, J E

    2016-01-01

The objectives of this study were (1) to propose changing the selection criterion trait for evaluating fertility in Australia from calving interval to conception rate at d 42 after the beginning of the mating season and (2) to use type traits as early fertility predictors, to increase the reliability of estimated breeding values for fertility. The breeding goal in Australia is conception within 6 wk of the start of the mating season. Currently, the Australian model to predict fertility breeding values (expressed as a linear transformation of calving interval) is a multitrait model that includes calving interval (CVI), lactation length (LL), calving to first service (CFS), first nonreturn rate (FNRR), and conception rate. However, CVI has a lower genetic correlation with the breeding goal (conception within 6 wk of the start of the mating season) than conception rate. Milk yield, type, and fertility data from 164,318 cows sired by 4,766 bulls were used. Principal component analysis and genetic correlation estimates between type and fertility traits were used to select type traits that could subsequently be used in a multitrait analysis. Angularity, foot angle, and pin set were chosen as the type traits to include in an index with the traits of the multitrait fertility model: CVI, LL, CFS, FNRR, and conception rate at d 42 (CR42). An index with these 8 traits is expected to achieve an average bull first-proof reliability of 0.60 on the breeding objective (conception within 6 wk of the start of the mating season), compared with reliabilities of 0.39 and 0.45 for CR42 only or the current 5-trait Australian model. Subsequently, we used the first eigenvector of a principal component analysis with udder texture, bone quality, angularity, and body condition score to calculate an energy status indicator trait.
The inclusion of the energy status indicator trait composite in a multitrait index with CVI, LL, CFS, FNRR, and CR42 achieved a 12-point increase in fertility breeding value reliability (i.e., increased by 30%; up to 0.72 points of reliability), whereas a lower increase in reliability (4 points, i.e., increased by 10%) was obtained by including angularity, foot angle, and pin set in the index. In situations when a limited number of daughters have been phenotyped for CR42, including type data for sires increased reliabilities compared with when type data were omitted. However, sires with more than 80 daughters with CR42 records achieved reliability estimates close to 80% on average, and there did not appear to be a benefit from having daughters with type records. The cost of phenotyping to obtain such reliabilities (assuming a cost of AU$14 per cow with type data and AU$5 per cow with pregnancy diagnosed) is lower if more pregnancy data are collected in preference to type data. That is, efforts to increase the reliability of fertility EBV are most cost effective when directed at obtaining a larger number of pregnancy tests. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  6. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have conventionally been used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and are sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second-moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
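As a minimal illustration of the FOSM approximation mentioned above, the mean and standard deviation of a response g(X) with independent inputs can be propagated through a first-order Taylor expansion about the input means. The battery state-space model itself is not reproduced here; `g` is any user-supplied function, and the gradient is taken by central finite differences.

```python
def fosm(g, means, stds, h=1e-6):
    """First-order second-moment (FOSM) approximation of the mean and
    standard deviation of g(X) for independent inputs.

    Linearizes g about the input means; the output variance is the sum of
    (dg/dx_i * sigma_i)^2 with derivatives from central finite differences."""
    mu = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        up = list(means); up[i] = m + h
        dn = list(means); dn[i] = m - h
        grad = (g(up) - g(dn)) / (2.0 * h)
        var += (grad * s) ** 2
    return mu, var ** 0.5
```

For a linear g with Gaussian inputs the FOSM result is exact, which is the special case the abstract singles out; for nonlinear models it is the cheap first-order approximation.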

  7. Independent-Trajectory Thermodynamic Integration: a practical guide to protein-drug binding free energy calculations using distributed computing.

    PubMed

    Lawrenz, Morgan; Baron, Riccardo; Wang, Yi; McCammon, J Andrew

    2012-01-01

    The Independent-Trajectory Thermodynamic Integration (IT-TI) approach for free energy calculation with distributed computing is described. IT-TI utilizes diverse conformational sampling obtained from multiple, independent simulations to obtain more reliable free energy estimates compared to single TI predictions. The latter may significantly under- or over-estimate the binding free energy due to finite sampling. We exemplify the advantages of the IT-TI approach using two distinct cases of protein-ligand binding. In both cases, IT-TI yields distributions of absolute binding free energy estimates that are remarkably centered on the target experimental values. Alternative protocols for the practical and general application of IT-TI calculations are investigated. We highlight a protocol that maximizes predictive power and computational efficiency.
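The core of IT-TI, combining independent TI free-energy estimates into one value with an uncertainty, can be sketched as follows; the inputs would be the ΔG estimates from the individual simulations, in any consistent energy unit.

```python
import statistics

def it_ti_estimate(ti_runs):
    """Combine independent TI free-energy estimates: the mean is the IT-TI
    estimate and the standard error of the mean is its uncertainty."""
    mean = statistics.fmean(ti_runs)
    sem = statistics.stdev(ti_runs) / len(ti_runs) ** 0.5
    return mean, sem
```

The distribution of the individual run values also shows directly how far a single TI calculation can stray from the consensus, which is the finite-sampling problem the approach addresses.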

  8. Optimal level of continuous positive airway pressure: auto-adjusting titration versus titration with a predictive equation.

    PubMed

    Choi, Ji Ho; Jun, Young Joon; Oh, Jeong In; Jung, Jong Yoon; Hwang, Gyu Ho; Kwon, Soon Young; Lee, Heung Man; Kim, Tae Hoon; Lee, Sang Hag; Lee, Seung Hoon

    2013-05-01

The aims of the present study were twofold. We sought to compare two methods of titrating the level of continuous positive airway pressure (CPAP), auto-adjusting titration and titration using a predictive equation, with full-night manual titration used as the benchmark. We also investigated the reliability of the two methods in patients with obstructive sleep apnea syndrome (OSAS). Twenty consecutive adult patients with OSAS who had successful full-night manual and auto-adjusting CPAP titration participated in this study. The titration pressure level was calculated with a previously developed predictive equation based on body mass index and apnea-hypopnea index. The mean titration pressure levels obtained with the manual, auto-adjusting, and predictive-equation methods were 9.0 ± 3.6, 9.4 ± 3.0, and 8.1 ± 1.6 cm H2O, respectively. There was a significant difference in the concordance within the range of ±2 cm H2O (p = 0.019) between both the auto-adjusting titration and the titration using the predictive equation compared to the full-night manual titration. However, there was no significant difference in the concordance within the range of ±1 cm H2O (p > 0.999). When compared to full-night manual titration as the standard method, auto-adjusting titration appears to be more reliable than using a predictive equation for determining the optimal CPAP level in patients with OSAS.
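A predictive-equation titration of the kind compared here is simply a linear formula evaluated per patient. The sketch below assumes a generic form P = a·BMI + b·AHI + c with made-up coefficients (the study's actual equation and coefficients are not reproduced), clamped to the usual clinical CPAP range.

```python
def predictive_cpap(bmi, ahi, a=0.16, b=0.04, c=2.0):
    """Hypothetical linear CPAP titration equation P = a*BMI + b*AHI + c
    (coefficients illustrative, not the study's), with the result clamped
    to the customary 4-20 cm H2O CPAP pressure range."""
    return min(20.0, max(4.0, a * bmi + b * ahi + c))
```

Because such an equation collapses every patient to two covariates, its spread is narrow (note the 8.1 ± 1.6 cm H2O above versus 9.0 ± 3.6 for manual titration), which helps explain its poorer concordance at the ±2 cm H2O criterion.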

  9. Analysis of linear measurements on 3D surface models using CBCT data segmentation obtained by automatic standard pre-set thresholds in two segmentation software programs: an in vitro study.

    PubMed

    Poleti, Marcelo Lupion; Fernandes, Thais Maria Freire; Pagin, Otávio; Moretti, Marcela Rodrigues; Rubira-Bullen, Izabel Regina Fischer

    2016-01-01

The aim of this in vitro study was to evaluate the reliability and accuracy of linear measurements on three-dimensional (3D) surface models obtained with standard pre-set thresholds in two segmentation software programs. Ten mandibles with 17 silica markers were scanned at 0.3-mm voxels in the i-CAT Classic (Imaging Sciences International, Hatfield, PA, USA). Twenty linear measurements were carried out twice by two observers on the 3D surface models: in Dolphin Imaging 11.5 (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA), using two filters (Translucent and Solid-1), and in InVesalius 3.0.0 (Centre for Information Technology Renato Archer, Campinas, SP, Brazil). The physical measurements were made twice by another observer using a digital caliper on the dry mandibles. Excellent intra- and inter-observer reliability was found for the markers, the physical measurements, and the 3D surface models (intra-class correlation coefficient (ICC) and Pearson's r ≥ 0.91). The linear measurements on 3D surface models in the Dolphin and InVesalius software programs were accurate (Dolphin Solid-1 > InVesalius > Dolphin Translucent). The highest absolute and percentage errors were obtained for the variables R1-R1 (1.37 mm) and MF-AC (2.53%) in the Dolphin Translucent and InVesalius software, respectively. Linear measurements on 3D surface models obtained by standard pre-set thresholds in the Dolphin and InVesalius software programs are reliable and accurate compared with physical measurements. Studies that evaluate the reliability and accuracy of 3D models are necessary to ensure error predictability and to establish diagnosis, treatment plan, and prognosis in a more realistic way.
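The accuracy comparison described above reduces to absolute and percentage errors of the 3D-model measurements against the caliper reference. A minimal sketch, with hypothetical variable names and values:

```python
def mean_errors(model, physical):
    """Mean absolute (mm) and mean percentage error of 3D-surface-model
    linear measurements against the digital-caliper reference values."""
    abs_errs = [abs(m - p) for m, p in zip(model, physical)]
    pct_errs = [100.0 * e / p for e, p in zip(abs_errs, physical)]
    n = len(abs_errs)
    return sum(abs_errs) / n, sum(pct_errs) / n
```

Computed per measurement variable and per software/filter combination, these errors yield rankings of the kind reported (Dolphin Solid-1 > InVesalius > Dolphin Translucent).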

  10. Mining data from hemodynamic simulations for generating prediction and explanation models.

    PubMed

    Bosnić, Zoran; Vračar, Petar; Radović, Milos D; Devedžić, Goran; Filipović, Nenad D; Kononenko, Igor

    2012-03-01

One of the most common causes of human death is stroke, which can be caused by carotid bifurcation stenosis. In our work, we aim at proposing a prototype of a medical expert system that could significantly aid medical experts in detecting hemodynamic abnormalities (increased artery wall shear stress). Based on the acquired simulated data, we apply several methodologies for (1) predicting magnitudes and locations of maximum wall shear stress in the artery, (2) estimating the reliability of computed predictions, and (3) providing user-friendly explanation of the model's decisions. The obtained results indicate that the evaluated methodologies can provide a useful tool for the given problem domain. © 2012 IEEE

  11. On the Tradeoff Between Altruism and Selfishness in MANET Trust Management

    DTIC Science & Technology

    2016-04-07

to discourage selfish behaviors, using a hidden Markov model (HMM) to quantitatively measure the trustworthiness of nodes. Adams et al. [18...based reliability metric to predict trust-based system survivability. Section 4 analyzes numerical results obtained through the evaluation of our SPN...concepts in MANETs, trust management for MANETs should consider the following design features: trust metrics must be customizable, evaluation of

  12. Peer Ratings: Scoring Strategy Development and Reliability Demonstration on Air Force Basic Trainees. Final Report.

    ERIC Educational Resources Information Center

    Borman, Walter C.; Rosse, Rodney L.

    As an alternative for or adjunct to paper-and-pencil tests for predicting personnel performance, the United States Air Force studied the use of peer ratings as an evaluative tool. Purpose of this study was to evaluate the psychometric characteristics of peer ratings among Air Force basic trainees. Peer ratings were obtained from more than 27,000…

  13. Modeling and simulation of reliability of unmanned intelligent vehicles

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet; Dixit, Arati M.; Mustapha, Adam; Singh, Kuldip; Aggarwal, K. K.; Gerhart, Grant R.

    2008-04-01

    Unmanned ground vehicles have a large number of scientific, military and commercial applications. A convoy of such vehicles can involve collaboration and coordination. For the movement of such a convoy, it is important to predict the reliability of the system. A number of approaches are available in the literature that describe techniques for determining the reliability of a system. Graph theoretic approaches are popular in determining terminal reliability and system reliability. In this paper we propose to exploit Fuzzy and Neuro-Fuzzy approaches for predicting the node and branch reliability of the system, while Boolean algebra approaches are used to determine terminal reliability and system reliability. Hence a combination of intelligent approaches like Fuzzy, Neuro-Fuzzy and Boolean approaches is used to predict the overall system reliability of a convoy of vehicles. The node reliabilities may correspond to the collaboration of vehicles, while branch reliabilities will determine the terminal reliabilities between different nodes. An algorithm is proposed for determining the system reliabilities of a convoy of vehicles. The simulation of the overall system is proposed. Such simulation should be helpful to the commander in taking an appropriate action depending on the predicted reliability in different terrain and environmental conditions. It is hoped that the results of this paper will lead to more important techniques for obtaining a reliable convoy of vehicles in a battlefield.
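    The graph-theoretic step described above, computing terminal reliability from branch reliabilities, can be sketched by exhaustive enumeration of branch states; the small convoy topology and reliability values below are illustrative, not taken from the paper:

```python
from itertools import product

def terminal_reliability(edges, p, s, t, nodes):
    """Exact source-terminal reliability by enumerating branch up/down states.

    edges: list of (u, v) branches; p: dict of branch reliabilities;
    s, t: source and terminal nodes. Exponential in len(edges), so this
    is only suitable for small convoy graphs.
    """
    total = 0.0
    for states in product([True, False], repeat=len(edges)):
        prob = 1.0
        for e, up in zip(edges, states):
            prob *= p[e] if up else 1.0 - p[e]
        # check s-t connectivity over the surviving branches
        adj = {n: set() for n in nodes}
        for (u, v), up in zip(edges, states):
            if up:
                adj[u].add(v)
                adj[v].add(u)
        seen, stack = {s}, [s]
        while stack:
            n = stack.pop()
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        if t in seen:
            total += prob
    return total

# two parallel two-hop paths from source to terminal, each branch 0.9 reliable
edges = [("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")]
p = {e: 0.9 for e in edges}
r = terminal_reliability(edges, p, "s", "t", ["s", "a", "b", "t"])
```

Each path survives with probability 0.81, and the two disjoint paths in parallel give 1 − (1 − 0.81)² = 0.9639, which the enumeration reproduces.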

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puskar, Joseph David; Quintana, Michael A.; Sorensen, Neil Robert

    A program is underway at Sandia National Laboratories to predict long-term reliability of photovoltaic (PV) systems. The vehicle for the reliability predictions is a Reliability Block Diagram (RBD), which models system behavior. Because this model is based mainly on field failure and repair times, it can be used to predict current reliability, but it cannot currently be used to accurately predict lifetime. In order to be truly predictive, physics-informed degradation processes and failure mechanisms need to be included in the model. This paper describes accelerated life testing of metal foil tapes used in thin-film PV modules, and how tape joint degradation, a possible failure mode, can be incorporated into the model.
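    Accelerated life testing of this kind is commonly analyzed with an Arrhenius acceleration model. The sketch below shows that standard calculation; the activation energy and temperatures are hypothetical placeholders, not values from the paper:

```python
import math

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a stress and a use temperature.

    ea_ev: activation energy in eV; temperatures in degrees Celsius.
    AF = exp((Ea/k) * (1/T_use - 1/T_stress)) with T in kelvin.
    """
    k = 8.617e-5  # Boltzmann constant, eV/K
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / k * (1.0 / t_use - 1.0 / t_stress))

# hypothetical: 0.7 eV mechanism, 45 C module use vs. 85 C chamber stress
af = arrhenius_af(0.7, 45.0, 85.0)
```

An acceleration factor of roughly 17 here means each chamber hour stands in for about 17 field hours under these assumed conditions.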

  15. Star-shaped discotic compounds with tetrazole and oxadiazole fragments

    NASA Astrophysics Data System (ADS)

    Usol'tseva, Nadezhda V.; Akopova, Olga B.; Smirnova, Antonina I.; Kovaleva, Maria I.; Bumbina, Natalia V.; Zharnikova, Nataliia V.

    2017-08-01

    Two series of star-shaped discotic compounds (A and B) were studied to establish the relationship between their molecular structure and mesogenity. Series A included 19 three-arm compounds with known mesomorphism. Series B consisted of 132 new compounds with unknown mesomorphism: pyromellitic and cyanuric acid derivatives, 5,5′-azo-bis-isophthalic and 4,4′-azodiphthalic acids and triphenylene derivatives. The columnar mesomorphism prediction data for both series were obtained using the original program CMP ChemCard. The prediction data for series A are in good agreement with the experimental results and the reliability of the prediction was estimated to be 89.5%. The same method was applied for series B. The prediction results were confirmed by the synthesis of individual representatives of series B. A good correlation of the prediction with the experimental data was revealed.

  16. Predicting Cost/Reliability/Maintainability of Advanced General Aviation Avionics Equipment

    NASA Technical Reports Server (NTRS)

    Davis, M. R.; Kamins, M.; Mooz, W. E.

    1978-01-01

    A methodology is provided for assisting NASA in estimating the cost, reliability, and maintenance (CRM) requirements for general aviation avionics equipment operating in the 1980s. Practical problems of predicting these factors are examined. The usefulness and shortcomings of different approaches for modeling cost and reliability estimates are discussed, together with special problems caused by the lack of historical data on the cost of maintaining general aviation avionics. Suggestions are offered on how NASA might proceed in assessing CRM implications in the absence of reliable generalized predictive models.

  17. Application of common y-intercept regression parameters for log Kp vs 1/T for predicting gas-particle partitioning in the urban environment

    NASA Astrophysics Data System (ADS)

    Pankow, James F.

    Gas-particle partitioning is examined using a partitioning constant Kp = (F/TSP)/A, where F (ng m⁻³) and A (ng m⁻³) are the particulate-associated and gas-phase concentrations, respectively, and TSP is the total suspended particulate matter level (μg m⁻³). Compound-dependent values of Kp depend on temperature (T) according to Kp = mp/T + bp. Limitations in data quality can cause errors in estimates of mp and bp obtained by simple linear regression (SLR). However, within a group of similar compounds, the bp values will be similar. By pooling data, an improved set of mp and a single bp can be obtained by common y-intercept regression (CYIR). SLR estimates of mp and bp for polycyclic aromatic hydrocarbons (PAHs) sorbing to urban Osaka particulate matter are available (Yamasaki et al., 1982, Envir. Sci. Technol. 16, 189-194), as are CYIR estimates for the same particulate matter (Pankow, 1991, Atmospheric Environment 25A, 2229-2239). In this work, a comparison was conducted of the ability of these two sets of mp and bp to predict A/F ratios for PAHs based on measured T and TSP values for data obtained in other urban locations, specifically: (1) in and near the Baltimore Harbor Tunnel by Benner (1988, Ph.D. thesis, University of Maryland) and Benner et al. (1989, Envir. Sci. Technol. 23, 1269-1278); and (2) in Chicago by Cotham (1990, Ph.D. thesis, University of South Carolina). In general, the CYIR estimates of mp and bp obtained for Osaka particulate matter were found to be at least as reliable, and for some compounds more reliable, than their SLR counterparts in predicting gas-particle ratios for PAHs. This result provides further evidence of the utility of the CYIR approach in quantitating the dependence of log Kp values on 1/T.
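    As a sketch of how CYIR differs from per-compound SLR, the pooled fit with one slope per compound and a single shared intercept can be written as one least-squares problem; the two "compounds" and their values below are synthetic, for illustration only:

```python
import numpy as np

def cyir(datasets):
    """Common y-intercept regression: fit y = m_c * x + b with one slope
    m_c per compound and a single shared intercept b by pooled least squares.

    datasets: list of (x, y) arrays, one pair per compound.
    Returns (slopes, b).
    """
    n_c = len(datasets)
    rows, rhs = [], []
    for c, (x, y) in enumerate(datasets):
        for xi, yi in zip(x, y):
            row = [0.0] * (n_c + 1)
            row[c] = xi        # slope column for this compound only
            row[n_c] = 1.0     # shared intercept column
            rows.append(row)
            rhs.append(yi)
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return coef[:n_c], coef[n_c]

# two synthetic compounds sharing intercept b = -10 (x plays the role of 1/T)
x = np.array([0.0032, 0.0034, 0.0036])
slopes, b = cyir([(x, 4000 * x - 10), (x, 5000 * x - 10)])
```

Because the data are exactly linear with a common intercept, the pooled fit recovers both slopes and the shared intercept; with noisy data the shared-intercept constraint is what stabilizes the slope estimates.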

  18. Reliable and fast quantitative analysis of active ingredient in pharmaceutical suspension using Raman spectroscopy.

    PubMed

    Park, Seok Chan; Kim, Minjung; Noh, Jaegeun; Chung, Hoeil; Woo, Youngah; Lee, Jonghwa; Kemper, Mark S

    2007-06-12

    The concentration of acetaminophen in a turbid pharmaceutical suspension has been measured successfully using Raman spectroscopy. The spectrometer was equipped with a large spot probe which enabled the coverage of a representative area during sampling. This wide area illumination (WAI) scheme (coverage area 28.3 mm²) for Raman data collection proved to be more reliable for the compositional determination of these pharmaceutical suspensions, especially when the samples were turbid. The reproducibility of measurement using the WAI scheme was compared to that of a conventional small-spot scheme which employed a much smaller illumination area (about 100 μm spot size). A layer of isobutyric anhydride was placed in front of the sample vials to correct the variation in the Raman intensity due to the fluctuation of laser power. Corrections were accomplished using the isolated carbonyl band of isobutyric anhydride. The acetaminophen concentrations of prediction samples were accurately estimated using a partial least squares (PLS) calibration model. The prediction accuracy was maintained even with changes in laser power. It was noted that the prediction performance was somewhat degraded for turbid suspensions with high acetaminophen contents. When comparing the results of reproducibility obtained with the WAI scheme and those obtained using the conventional scheme, it was concluded that the quantitative determination of the active pharmaceutical ingredient (API) in turbid suspensions is much improved when employing a larger laser coverage area. This is presumably due to the improvement in representative sampling.
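    The internal-standard correction described above amounts to rescaling each spectrum by the intensity of the reference band; a minimal sketch, with made-up spectra and band positions:

```python
import numpy as np

def correct_laser_power(spectra, ref_band):
    """Correct Raman intensity drift by scaling each spectrum so that an
    internal-reference band (e.g. an isolated carbonyl band) has unit area.

    spectra: (n_samples, n_points) array; ref_band: slice over the
    reference-band channels. Band positions here are illustrative.
    """
    ref_area = spectra[:, ref_band].sum(axis=1, keepdims=True)
    return spectra / ref_area

# the same sample measured at full power and after a 30% laser-power drop
base = np.array([1.0, 3.0, 2.0, 5.0])
spectra = np.vstack([base, 0.7 * base])
corrected = correct_laser_power(spectra, slice(2, 4))
```

After normalization the two rows coincide, i.e. the power fluctuation is removed before the PLS step.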

  19. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, and probability of damage effects and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
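    A Markov evaluation model of this kind propagates state probabilities through a transition matrix and reads reliability off the non-failed states. A minimal three-state sketch (a toy stand-in for the 27-state RSDIMU model, with invented transition probabilities):

```python
import numpy as np

def markov_reliability(p_transition, p0, failed_states, n_steps):
    """Propagate state probabilities of a discrete-time Markov model and
    return reliability = probability of not being in a failed state.

    p_transition: (n, n) row-stochastic matrix; p0: initial distribution.
    States here: 0 = fully operational, 1 = one failure detected and
    isolated, 2 = system failed (absorbing).
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ p_transition
    return 1.0 - p[list(failed_states)].sum()

P = np.array([[0.98, 0.015, 0.005],
              [0.00, 0.970, 0.030],
              [0.00, 0.000, 1.000]])  # failure state is absorbing
r = markov_reliability(P, [1.0, 0.0, 0.0], [2], 100)
```

Because state 2 is absorbing with positive inflow, reliability decreases monotonically with mission time, which is the behavior the parametric studies trace out.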

  20. Evaluation and comparison of the ability of online available prediction programs to predict true linear B-cell epitopes.

    PubMed

    Costa, Juan G; Faccendini, Pablo L; Sferco, Silvano J; Lagier, Claudia M; Marcipar, Iván S

    2013-06-01

    This work deals with the use of predictors to identify useful B-cell linear epitopes to develop immunoassays. Experimental techniques to meet this goal are quite expensive and time consuming. Therefore, we tested 5 free, online prediction methods (AAPPred, ABCpred, BcePred, BepiPred and Antigenic) widely used for predicting linear epitopes, using the primary structure of the protein as the only input. We chose a set of 65 experimentally well documented epitopes obtained by the most reliable experimental techniques as our true positive set. To compare the quality of the predictor methods we used their positive predictive value (PPV), i.e. the proportion of the predicted epitopes that are true, experimentally confirmed epitopes, in relation to all the epitopes predicted. We conclude that AAPPred and ABCpred yield the best results as compared with the other programs and with a random prediction procedure. Our results also indicate that considering the consensual epitopes predicted by several programs does not improve the PPV.
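    The PPV criterion used above is simply the fraction of predicted epitopes that are experimentally confirmed; a minimal sketch with hypothetical epitope identifiers:

```python
def positive_predictive_value(predicted, confirmed):
    """PPV = |predicted ∩ confirmed| / |predicted|: the proportion of
    predicted epitopes that are true, experimentally confirmed epitopes."""
    predicted, confirmed = set(predicted), set(confirmed)
    if not predicted:
        return 0.0
    return len(predicted & confirmed) / len(predicted)

# hypothetical identifiers: 4 predictions, 2 of which are confirmed
ppv = positive_predictive_value({"ep1", "ep2", "ep3", "ep4"},
                                {"ep2", "ep4", "ep9"})
```

Here 2 of the 4 predictions are confirmed, so PPV = 0.5; confirmed epitopes that were never predicted ("ep9") do not enter the score.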

  1. Test-Retest Reliability and Predictive Validity of the Implicit Association Test in Children

    ERIC Educational Resources Information Center

    Rae, James R.; Olson, Kristina R.

    2018-01-01

    The Implicit Association Test (IAT) is increasingly used in developmental research despite minimal evidence of whether children's IAT scores are reliable across time or predictive of behavior. When test-retest reliability and predictive validity have been assessed, the results have been mixed, and because these studies have differed on many…

  2. Prediction task guided representation learning of medical codes in EHR.

    PubMed

    Cui, Liwen; Xie, Xiaolei; Shen, Zuojun

    2018-06-18

    There have been rapidly growing applications using machine learning models for predictive analytics in Electronic Health Records (EHR) to improve the quality of hospital services and the efficiency of healthcare resource utilization. A fundamental and crucial step in developing such models is to convert medical codes in EHR to feature vectors. These medical codes are used to represent diagnoses or procedures. Their vector representations have a tremendous impact on the performance of machine learning models. Recently, some researchers have utilized representation learning methods from Natural Language Processing (NLP) to learn vector representations of medical codes. However, most previous approaches are unsupervised, i.e. the generation of medical code vectors is independent from prediction tasks. Thus, the obtained feature vectors may be inappropriate for a specific prediction task. Moreover, unsupervised methods often require a lot of samples to obtain reliable results, but most practical problems have very limited patient samples. In this paper, we develop a new method called Prediction Task Guided Health Record Aggregation (PTGHRA), which aggregates health records guided by prediction tasks, to construct training corpus for various representation learning models. Compared with unsupervised approaches, representation learning models integrated with PTGHRA yield a significant improvement in predictive capability of generated medical code vectors, especially for limited training samples. Copyright © 2018. Published by Elsevier Inc.

  3. A novel approach based on KATZ measure to predict associations of human microbiota with non-infectious diseases.

    PubMed

    Chen, Xing; Huang, Yu-An; You, Zhu-Hong; Yan, Gui-Ying; Wang, Xue-Song

    2017-03-01

    Accumulating clinical observations have indicated that microbes living in the human body are closely associated with a wide range of human noninfectious diseases, which provides promising insights into complex disease mechanism understanding. Predicting microbe-disease associations could not only boost human disease diagnostics and prognostics, but also improve new drug development. However, little effort has been made to understand and predict human microbe-disease associations on a large scale until now. In this work, we constructed a microbe-human disease association network and further developed a novel computational model of KATZ measure for Human Microbe-Disease Association prediction (KATZHMDA) based on the assumption that functionally similar microbes tend to have similar interaction and non-interaction patterns with noninfectious diseases, and vice versa. To our knowledge, KATZHMDA is the first tool for microbe-disease association prediction. The reliable prediction performance could be attributed to the use of the KATZ measure, and the introduction of Gaussian interaction profile kernel similarity for microbes and diseases. LOOCV and k-fold cross validation were implemented to evaluate the effectiveness of this novel computational model based on known microbe-disease associations obtained from the HMDAD database. As a result, KATZHMDA achieved reliable performance with average AUCs of 0.8130 ± 0.0054, 0.8301 ± 0.0033 and 0.8382 in 2-fold and 5-fold cross validation and the LOOCV framework, respectively. It is anticipated that KATZHMDA could be used to obtain more novel microbes associated with important noninfectious human diseases and therefore benefit drug discovery and human medical improvement. Matlab codes and the dataset explored in this work are available at http://dwz.cn/4oX5mS. Contact: xingchen@amss.ac.cn or zhuhongyou@gmail.com or wangxuesongcumt@163.com. Supplementary data are available at Bioinformatics online.
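    The KATZ measure underlying KATZHMDA scores node pairs by summing damped walk counts over the association graph. A generic sketch on a toy bipartite microbe-disease matrix (the damping parameter, walk length, and data are illustrative, not those of KATZHMDA, which additionally uses Gaussian kernel similarities):

```python
import numpy as np

def katz_scores(adj_md, beta=0.01, max_walk=3):
    """KATZ association scores on a bipartite microbe-disease graph.

    adj_md: (n_microbes, n_diseases) known-association matrix. Walks of
    length l <= max_walk are summed with damping beta**l over the full
    bipartite adjacency; the microbe-disease block is returned.
    """
    n_m, n_d = adj_md.shape
    A = np.zeros((n_m + n_d, n_m + n_d))
    A[:n_m, n_m:] = adj_md
    A[n_m:, :n_m] = adj_md.T
    S = np.zeros_like(A)
    Ak = np.eye(len(A))
    for l in range(1, max_walk + 1):
        Ak = Ak @ A                # walks of length l
        S += beta ** l * Ak        # shorter walks weigh more
    return S[:n_m, n_m:]

# toy data: 3 microbes x 2 diseases, four known associations
adj = np.array([[1.0, 0.0],
                [1.0, 1.0],
                [0.0, 1.0]])
scores = katz_scores(adj)
```

The unknown pair (microbe 0, disease 1) still receives a positive score through the length-3 walk via microbe 1, which is exactly how the measure proposes novel associations, while known pairs score higher via their length-1 walks.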

  4. ARACNe-based inference, using curated microarray data, of Arabidopsis thaliana root transcriptional regulatory networks

    PubMed Central

    2014-01-01

    Background Uncovering the complex transcriptional regulatory networks (TRNs) that underlie plant and animal development remains a challenge. However, a vast amount of data from public microarray experiments is available, which can be subject to inference algorithms in order to recover reliable TRN architectures. Results In this study we present a simple bioinformatics methodology that uses public, carefully curated microarray data and the mutual information algorithm ARACNe in order to obtain a database of transcriptional interactions. We used data from Arabidopsis thaliana root samples to show that the transcriptional regulatory networks derived from this database successfully recover previously identified root transcriptional modules and to propose new transcription factors for the SHORT ROOT/SCARECROW and PLETHORA pathways. We further show that these networks are a powerful tool to integrate and analyze high-throughput expression data, as exemplified by our analysis of a SHORT ROOT induction time-course microarray dataset, and are a reliable source for the prediction of novel root gene functions. In particular, we used our database to predict novel genes involved in root secondary cell-wall synthesis and identified the MADS-box TF XAL1/AGL12 as an unexpected participant in this process. Conclusions This study demonstrates that network inference using carefully curated microarray data yields reliable TRN architectures. In contrast to previous efforts to obtain root TRNs, that have focused on particular functional modules or tissues, our root transcriptional interactions provide an overview of the transcriptional pathways present in Arabidopsis thaliana roots and will likely yield a plethora of novel hypotheses to be tested experimentally. PMID:24739361
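    The pairwise quantity at the heart of ARACNe-style inference is the mutual information between (discretized) expression profiles; a plug-in estimator sketch on toy profiles (ARACNe itself adds kernel estimation and the data-processing-inequality pruning step, which this sketch omits):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in mutual information estimate (in nats) between two
    discretized expression profiles of equal length."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y) * log( p(x,y) / (p(x) * p(y)) )
        mi += p * math.log(p * n * n / (px[x] * py[y]))
    return mi

mi_dep = mutual_information([0, 0, 1, 1] * 25, [0, 0, 1, 1] * 25)  # identical profiles
mi_ind = mutual_information([0, 1, 0, 1] * 25, [0, 0, 1, 1] * 25)  # independent profiles
```

Identical binary profiles give MI = ln 2, independent ones give 0; in the inference setting, high-MI gene pairs become candidate edges of the transcriptional network.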

  5. Genomic prediction using imputed whole-genome sequence data in Holstein Friesian cattle.

    PubMed

    van Binsbergen, Rianne; Calus, Mario P L; Bink, Marco C A M; van Eeuwijk, Fred A; Schrooten, Chris; Veerkamp, Roel F

    2015-09-17

    In contrast to currently used single nucleotide polymorphism (SNP) panels, the use of whole-genome sequence data is expected to enable the direct estimation of the effects of causal mutations on a given trait. This could lead to higher reliabilities of genomic predictions compared to those based on SNP genotypes. Also, at each generation of selection, recombination events between a SNP and a mutation can cause decay in reliability of genomic predictions based on markers rather than on the causal variants. Our objective was to investigate the use of imputed whole-genome sequence genotypes versus high-density SNP genotypes on (the persistency of) the reliability of genomic predictions using real cattle data. Highly accurate phenotypes based on daughter performance and Illumina BovineHD Beadchip genotypes were available for 5503 Holstein Friesian bulls. The BovineHD genotypes (631,428 SNPs) of each bull were used to impute whole-genome sequence genotypes (12,590,056 SNPs) using the Beagle software. Imputation was done using a multi-breed reference panel of 429 sequenced individuals. Genomic estimated breeding values for three traits were predicted using a Bayesian stochastic search variable selection (BSSVS) model and a genome-enabled best linear unbiased prediction model (GBLUP). Reliabilities of predictions were based on 2087 validation bulls, while the other 3416 bulls were used for training. Prediction reliabilities ranged from 0.37 to 0.52. BSSVS performed better than GBLUP in all cases. Reliabilities of genomic predictions were slightly lower with imputed sequence data than with BovineHD chip data. Also, the reliabilities tended to be lower for both sequence data and BovineHD chip data when relationships between training animals were low. No increase in persistency of prediction reliability using imputed sequence data was observed. Compared to BovineHD genotype data, using imputed sequence data for genomic prediction produced no advantage. 
To investigate the putative advantage of genomic prediction using (imputed) sequence data, a training set with a larger number of individuals that are distantly related to each other and genomic prediction models that incorporate biological information on the SNPs or that apply stricter SNP pre-selection should be considered.

  6. Using ensemble of classifiers for predicting HIV protease cleavage sites in proteins.

    PubMed

    Nanni, Loris; Lumini, Alessandra

    2009-03-01

    The focus of this work is the use of ensembles of classifiers for predicting HIV protease cleavage sites in proteins. Due to the complex relationships in the biological data, several recent works show that ensembles of learning algorithms often outperform stand-alone methods. We show that the fusion of approaches based on different encoding models can be useful for improving the performance of this classification problem. In particular, in this work four different feature encodings for peptides are described and tested. An extensive evaluation on a large dataset according to a blind testing protocol is reported, which demonstrates how different feature extraction methods and classifiers can be combined to obtain a robust and reliable system. The comparison with other stand-alone approaches allows quantifying the performance improvement obtained by the ensembles proposed in this work.
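    One simple fusion rule for such an ensemble is majority voting over the individual classifiers' predictions; a minimal sketch (the classifiers and peptides are hypothetical, and the paper's actual fusion rule may differ):

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse binary cleavage/non-cleavage predictions from several
    classifiers by per-peptide majority vote.

    predictions: list of per-classifier prediction lists over the same peptides.
    """
    fused = []
    for votes in zip(*predictions):
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

# three hypothetical classifiers scoring four peptides (1 = cleavage site)
fused = majority_vote([[1, 0, 1, 0],
                       [1, 1, 0, 0],
                       [1, 0, 1, 1]])
```

With an odd number of voters there are no ties; each peptide's fused label is whatever at least two of the three classifiers agreed on.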

  7. Effects of uncertainties in hydrological modelling. A case study of a mountainous catchment in Southern Norway

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjørn; Steinsland, Ingelin; Johansen, Stian Solvang; Petersen-Øverleir, Asgeir; Kolberg, Sjur

    2016-05-01

    In this study, we explore the effect of uncertainty and poor observation quality on hydrological model calibration and predictions. The Osali catchment in Western Norway was selected as a case study and an elevation-distributed HBV model was used. We systematically evaluated the effect of accounting for uncertainty in parameters, precipitation input, temperature input and streamflow observations. For precipitation and temperature we accounted for the interpolation uncertainty, and for streamflow we accounted for rating curve uncertainty. Further, the effects of poorer quality of precipitation input and streamflow observations were explored. Less information about precipitation was obtained by excluding the nearest precipitation station from the analysis, while reduced information about the streamflow was obtained by omitting the highest and lowest streamflow observations when estimating the rating curve. The results showed that including uncertainty in the precipitation and temperature inputs has a negligible effect on the posterior distribution of parameters and on the Nash-Sutcliffe (NS) efficiency for the predicted flows, while the reliability and the continuous rank probability score (CRPS) improve. Less information in the precipitation input resulted in a shift in the water balance parameter Pcorr, a model producing smoother streamflow predictions, giving poorer NS and CRPS, but higher reliability. The effect of calibrating the hydrological model using streamflow observations based on different rating curves is mainly seen as variability in the water balance parameter Pcorr. When evaluating predictions, the best evaluation scores were not achieved for the rating curve used for calibration, but for rating curves giving smoother streamflow observations. Less information in streamflow influenced the water balance parameter Pcorr, and increased the spread in evaluation scores by giving both better and worse scores.
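    The CRPS used for evaluation above can be computed for an ensemble forecast with the standard empirical formula CRPS = E|X − y| − ½ E|X − X′|; a minimal sketch with illustrative streamflow values:

```python
import numpy as np

def crps_ensemble(ensemble, obs):
    """Empirical CRPS of one ensemble forecast against one observation.

    CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j|; lower is better, and
    the second term rewards sharp (low-spread) ensembles.
    """
    x = np.asarray(ensemble, dtype=float)
    term1 = np.abs(x - obs).mean()
    term2 = np.abs(x[:, None] - x[None, :]).mean()
    return term1 - 0.5 * term2

# four-member streamflow ensemble (m3/s, illustrative) vs. one observation
score = crps_ensemble([10.0, 12.0, 11.0, 13.0], 11.5)
```

For a deterministic forecast the spread term vanishes and CRPS reduces to the absolute error, which is why it can be compared across probabilistic and point predictions.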

  8. Tracking reliability for space cabin-borne equipment in development by Crow model.

    PubMed

    Chen, J D; Jiao, S J; Sun, H L

    2001-12-01

    Objective. To study and track the reliability growth of manned spaceflight cabin-borne equipment in the course of its development. Method. A new technique of reliability growth estimation and prediction, composed of the Crow model and the test data conversion (TDC) method, was used. Result. The estimated and predicted values of reliability growth conformed to expectations. Conclusion. The method could dynamically estimate and predict the reliability of the equipment by making full use of various test information in the course of its development. It offered not only a possibility of tracking the equipment reliability growth, but also a reference for quality control in the manned spaceflight cabin-borne equipment design and development process.
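    The Crow (AMSAA) model tracks reliability growth through a power-law failure intensity, with expected cumulative failures N(t) = λt^β; a sketch of the instantaneous-MTBF calculation with illustrative parameters:

```python
def crow_amsaa_mtbf(lam, beta, t):
    """Instantaneous MTBF under the Crow (AMSAA) reliability-growth model.

    Expected cumulative failures: N(t) = lam * t**beta, so the failure
    intensity is lam * beta * t**(beta - 1) and the instantaneous MTBF
    is its reciprocal. beta < 1 indicates reliability growth.
    """
    return 1.0 / (lam * beta * t ** (beta - 1.0))

# illustrative parameters: growth case (beta < 1) at two test times
m_early = crow_amsaa_mtbf(0.5, 0.6, 100.0)
m_late = crow_amsaa_mtbf(0.5, 0.6, 1000.0)
```

With β < 1 the failure intensity decays as testing and corrective actions accumulate, so the instantaneous MTBF at 1000 h exceeds that at 100 h, which is the "growth" being tracked.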

  9. Student assessment by objective structured examination in a neurology clerkship

    PubMed Central

    Adesoye, Taiwo; Smith, Sandy; Blood, Angela; Brorson, James R.

    2012-01-01

    Objectives: We evaluated the reliability and predictive ability of an objective structured clinical examination (OSCE) in the assessment of medical students at the completion of a neurology clerkship. Methods: We analyzed data from 195 third-year medical students who took the OSCE. For each student, the OSCE consisted of 2 standardized patient encounters. The scores obtained from each encounter were compared. Faculty clinical evaluations of each student for 2 clinical inpatient rotations were also compared. Hierarchical regression analysis was applied to test the ability of the averaged OSCE scores to predict standardized written examination scores and composite clinical scores. Results: Students' OSCE scores from the 2 standardized patient encounters were significantly correlated with each other (r = 0.347, p < 0.001), and the scores for all students were normally distributed. In contrast, students' faculty clinical evaluation scores from 2 different clinical inpatient rotations were uncorrelated, and scores were skewed toward the highest ratings. After accounting for clerkship order, better OSCE scores were predictive of better National Board of Medical Examiners standardized examination scores (R2Δ = 0.131, p < 0.001) and of better faculty clinical scores (R2Δ = 0.078, p < 0.001). Conclusions: Student assessment by an OSCE provides a reliable and predictive objective assessment of clinical performance in a neurology clerkship. PMID:22855865
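    The hierarchical regression used above compares R² between nested models, entering clerkship order first and the OSCE score second; a sketch with synthetic stand-in data (the variable names and effect sizes are invented for illustration):

```python
import numpy as np

def r_squared(X, y):
    """In-sample R^2 of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
order = rng.integers(0, 2, 80).astype(float)   # step 1 covariate: clerkship order
osce = rng.normal(size=80)                     # step 2 predictor: OSCE score
y = 0.2 * order + 0.8 * osce + rng.normal(scale=0.5, size=80)  # outcome

r2_step1 = r_squared(order[:, None], y)
r2_step2 = r_squared(np.column_stack([order, osce]), y)
r2_change = r2_step2 - r2_step1                # the reported R^2-change
```

The R²-change is the additional variance explained by the OSCE score after accounting for clerkship order, the quantity the abstract reports as significant.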

  10. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
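    The Weibull arithmetic behind such system-life numbers follows from S(t) = exp(-(t/η)^β) and, for a series system of n identical independent components, S_sys = S_comp^n; a sketch with illustrative values (not the E³-Engine's):

```python
import math

def weibull_life(eta, beta, survival):
    """Life t at which P(T > t) = survival for a Weibull(eta, beta) component:
    S(t) = exp(-(t/eta)**beta)  =>  t = eta * (-ln(survival))**(1/beta)."""
    return eta * (-math.log(survival)) ** (1.0 / beta)

def series_system_life(eta, beta, n, survival):
    """Life of n identical independent components in series: S_sys = S_comp**n,
    so the system life solves exp(-n * (t/eta)**beta) = survival."""
    return eta * (-math.log(survival) / n) ** (1.0 / beta)

# illustrative: one blade vs. a 60-blade row, slope 3, characteristic life 50,000 hr
l01_blade = weibull_life(50000.0, 3.0, 0.999)
l01_row = series_system_life(50000.0, 3.0, 60, 0.999)
```

The single-blade and system lives differ by exactly n^(1/β), which is why blade lives must far exceed the required blade-row system life, and why a steeper Weibull slope (larger β) narrows that gap, as the abstract's 47,391 / 20,652 / 15,658 hr progression illustrates.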

  11. The Shutdown Dissociation Scale (Shut-D)

    PubMed Central

    Schalinski, Inga; Schauer, Maggie; Elbert, Thomas

    2015-01-01

    The evolutionary model of the defense cascade by Schauer and Elbert (2010) provides a theoretical frame for a short interview to assess problems underlying and leading to the dissociative subtype of posttraumatic stress disorder. Based on known characteristics of the defense stages “fright,” “flag,” and “faint,” we designed a structured interview to assess the vulnerability for the respective types of dissociation. Most of the scales that assess dissociative phenomena are designed as self-report questionnaires. Their items are usually selected based on heuristic considerations rather than a theoretical model and thus include anything from minor dissociative experiences to major pathological dissociation. The shutdown dissociation scale (Shut-D) was applied in several studies in patients with a history of multiple traumatic events and different disorders that have been shown previously to be prone to symptoms of dissociation. The goal of the present investigation was to obtain psychometric characteristics of the Shut-D (including factor structure, internal consistency, retest reliability, predictive, convergent and criterion-related concurrent validity). A total population of 225 patients and 68 healthy controls was assessed. Shut-D appears to have sufficient internal reliability, excellent retest reliability, high convergent validity, and satisfactory predictive validity, while the summed score of the scale reliably separates patients with exposure to trauma (in different diagnostic groups) from healthy controls. The Shut-D is a brief structured interview for assessing the vulnerability to dissociate as a consequence of exposure to traumatic stressors. The scale demonstrates high-quality psychometric properties and may be useful for researchers and clinicians in assessing shutdown dissociation as well as in predicting the risk of dissociative responding. PMID:25976478

  12. Predictive ability of genomic selection models for breeding value estimation on growth traits of Pacific white shrimp Litopenaeus vannamei

    NASA Astrophysics Data System (ADS)

    Wang, Quanchao; Yu, Yang; Li, Fuhua; Zhang, Xiaojun; Xiang, Jianhai

    2017-09-01

    Genomic selection (GS) can be used to accelerate genetic improvement by shortening the selection interval. The successful application of GS depends largely on the accuracy of the prediction of genomic estimated breeding value (GEBV). This study is a first attempt to understand the practicality of GS in Litopenaeus vannamei and aims to evaluate models for GS on growth traits. The performance of GS models in L. vannamei was evaluated in a population consisting of 205 individuals, which were genotyped for 6 359 single nucleotide polymorphism (SNP) markers by specific length amplified fragment sequencing (SLAF-seq) and phenotyped for body length and body weight. Three GS models (RR-BLUP, BayesA, and Bayesian LASSO) were used to obtain the GEBV, and their predictive ability was assessed by the reliability of the GEBV and the bias of the predicted phenotypes. The mean reliability of the GEBVs for body length and body weight predicted by the different models was 0.296 and 0.411, respectively. For each trait, the performances of the three models were very similar to each other with respect to predictability. The regression coefficients estimated by the three models were close to one, suggesting near to zero bias for the predictions. Therefore, when GS was applied in a L. vannamei population for the studied scenarios, all three models appeared practicable. Further analyses suggested that improved estimation of the genomic prediction could be realized by increasing the size of the training population as well as the density of SNPs.
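    Of the three models, RR-BLUP has the simplest closed form: marker effects solve a ridge system. A minimal sketch on simulated genotypes (population size, marker count, effect sizes and the variance ratio lam are all illustrative, not the study's):

```python
import numpy as np

def rr_blup(Z, y, lam):
    """Ridge-regression BLUP sketch: with column-centered genotypes Zc,
    marker effects u solve (Zc'Zc + lam*I) u = Zc'(y - mean(y)), and
    GEBV = Zc @ u. lam = sigma_e^2 / sigma_u^2 is treated as known here."""
    Zc = Z - Z.mean(axis=0)
    yc = y - y.mean()
    u = np.linalg.solve(Zc.T @ Zc + lam * np.eye(Z.shape[1]), Zc.T @ yc)
    return Zc @ u + y.mean(), u

rng = np.random.default_rng(1)
Z = rng.integers(0, 3, size=(200, 50)).astype(float)  # SNP genotypes coded 0/1/2
true_u = rng.normal(scale=0.3, size=50)               # simulated marker effects
y = Z @ true_u + rng.normal(scale=1.0, size=200)      # simulated phenotypes
gebv, u_hat = rr_blup(Z, y, lam=10.0)

# squared GEBV-phenotype correlation as a crude in-sample reliability proxy
r2 = np.corrcoef(gebv, y)[0, 1] ** 2
```

In practice the reliability reported in such studies comes from held-out validation animals rather than the in-sample correlation used in this toy check, and BayesA/Bayesian LASSO replace the common shrinkage lam with marker-specific effect-variance priors.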

  13. Comparisons Between Experimental and Semi-theoretical Cutting Forces of CCS Disc Cutters

    NASA Astrophysics Data System (ADS)

    Xia, Yimin; Guo, Ben; Tan, Qing; Zhang, Xuhui; Lan, Hao; Ji, Zhiyong

    2018-05-01

    This paper focuses on comparisons between the experimental and semi-theoretical forces of CCS disc cutters acting on different rocks. The experimental forces obtained from LCM tests were used to evaluate the prediction accuracy of a semi-theoretical CSM model. The results show that the CSM model reliably predicts the normal forces acting on red sandstone and granite, but underestimates the normal forces acting on marble. Some additional LCM test data from the literature were collected to further explore the ability of the CSM model to predict the normal forces acting on rocks of different strengths. The CSM model underestimates the normal forces acting on soft rocks, semi-hard rocks and hard rocks by approximately 38, 38 and 10%, respectively, but very accurately predicts those acting on very hard and extremely hard rocks. A calibration factor is introduced to modify the normal forces estimated by the CSM model. The overall trend of the calibration factor is characterized by an exponential decrease with increasing rock uniaxial compressive strength. The mean fitting ratios between the normal forces estimated by the modified CSM model and the experimental normal forces acting on soft rocks, semi-hard rocks and hard rocks are 1.076, 0.879 and 1.013, respectively. The results indicate that the prediction accuracy and the reliability of the CSM model have been improved.
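A calibration factor that decays exponentially with uniaxial compressive strength, as described above, can be fitted by simple log-linear least squares. The (UCS, factor) pairs below are invented for illustration and are not the paper's data:

```python
import numpy as np

# hypothetical (UCS [MPa], calibration factor) pairs mimicking the reported trend:
# a factor well above 1 for soft rock, decaying toward ~1 for very hard rock
ucs = np.array([30.0, 60.0, 90.0, 120.0, 150.0, 200.0])
k   = np.array([1.62, 1.45, 1.28, 1.15, 1.08, 1.02])

# fit k = a * exp(-b * ucs) by linear least squares on log(k)
b_fit, log_a = np.polyfit(ucs, np.log(k), 1)
a, b = np.exp(log_a), -b_fit

def calibration(ucs_mpa):
    """Calibration factor applied to the CSM normal-force estimate."""
    return a * np.exp(-b * ucs_mpa)

corrected_force = calibration(45.0) * 120.0   # e.g. scale a hypothetical 120 kN CSM estimate
```

The pure-exponential form is a simplification (the paper's fitted expression may include an offset or different functional details); the sketch only shows how such a decreasing correction is obtained and applied.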

  14. Advances in ultrasonic testing of austenitic stainless steel welds. Towards a 3D description of the material including attenuation and optimisation by inversion

    NASA Astrophysics Data System (ADS)

    Moysan, J.; Gueudré, C.; Ploix, M.-A.; Corneloup, G.; Guy, Ph.; Guerjouma, R. El; Chassignole, B.

In the case of multi-pass welds, the material is very difficult to describe because of its anisotropic and heterogeneous properties. Anisotropy results from the metal solidification and is correlated with the grain orientation. A precise description of the material is one of the key points for obtaining reliable results with wave propagation codes. A first advance is the MINA model, which predicts the grain orientations in multi-pass 316L steel welds. For flat-position welding, good predictions of the grain orientations were obtained using 2D modelling. For positional welding, the resulting grain structure may be oriented in 3D. We indicate how the MINA model can be improved to give a 3D description. A second advance is an accurate quantification of the attenuation. Precise measurements are obtained using a plane-wave angular spectrum method together with the computation of the transmission coefficients for triclinic material. With these first two advances, a third becomes possible: developing an inverse method to obtain the material description from ultrasonic measurements at different positions.

  15. Microcomputer-based tests for repeated-measures: Metric properties and predictive validities

    NASA Technical Reports Server (NTRS)

    Kennedy, Robert S.; Baltzley, Dennis R.; Dunlap, William P.; Wilkes, Robert L.; Kuntz, Lois-Ann

    1989-01-01

A menu of psychomotor and mental acuity tests was refined. Field applications of such a battery include, for example, studying the effects of toxic agents or exotic environments on performance readiness, or determining fitness for duty. The key requirement of these tasks is that they be suitable for repeated-measures applications, so questions of stability and reliability are a continuing, central focus of this work. After the initial (practice) session, seven replications of 14 microcomputer-based performance tests (32 measures) were completed by 37 subjects. Each test in the battery had previously been shown to stabilize in fewer than five 90-second administrations and to possess retest reliabilities greater than r = 0.707 for three minutes of testing. However, the tests had never been administered together as a battery, nor had they been self-administered. In order to provide predictive validity for intelligence measurement, the Wechsler Adult Intelligence Scale-Revised and the Wonderlic Personnel Test were administered to the same subjects.

  16. Stochastic estimation of plant-available soil water under fluctuating water table depths

    NASA Astrophysics Data System (ADS)

    Or, Dani; Groeneveld, David P.

    1994-12-01

    Preservation of native valley-floor phreatophytes while pumping groundwater for export from Owens Valley, California, requires reliable predictions of plant water use. These predictions are compared with stored soil water within well field regions and serve as a basis for managing groundwater resources. Soil water measurement errors, variable recharge, unpredictable climatic conditions affecting plant water use, and modeling errors make soil water predictions uncertain and error-prone. We developed and tested a scheme based on soil water balance coupled with implementation of Kalman filtering (KF) for (1) providing physically based soil water storage predictions with prediction errors projected from the statistics of the various inputs, and (2) reducing the overall uncertainty in both estimates and predictions. The proposed KF-based scheme was tested using experimental data collected at a location on the Owens Valley floor where the water table was artificially lowered by groundwater pumping and later allowed to recover. Vegetation composition and per cent cover, climatic data, and soil water information were collected and used for developing a soil water balance. Predictions and updates of soil water storage under different types of vegetation were obtained for a period of 5 years. The main results show that: (1) the proposed predictive model provides reliable and resilient soil water estimates under a wide range of external conditions; (2) the predicted soil water storage and the error bounds provided by the model offer a realistic and rational basis for decisions such as when to curtail well field operation to ensure plant survival. The predictive model offers a practical means for accommodating simple aspects of spatial variability by considering the additional source of uncertainty as part of modeling or measurement uncertainty.
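The core of the Kalman-filtering scheme described above, reduced to a single soil-water-storage state with a weekly water-balance prediction and a noisy storage measurement, can be sketched as follows; all variances, fluxes, and initial values are hypothetical:

```python
import random

random.seed(1)

# --- one-state Kalman filter for soil water storage (all numbers hypothetical) ---
Q_proc = 4.0      # process-noise variance (recharge / plant-use model error), mm^2
R_meas = 25.0     # soil-water measurement-error variance, mm^2

x_true = 300.0               # true storage, mm
x_est, P = 280.0, 100.0      # initial estimate and its error variance

for week in range(20):
    # water balance: recharge minus plant water use (hypothetical rates, mm/week)
    flux = 2.0 - 5.0
    x_true += flux + random.gauss(0.0, Q_proc ** 0.5)

    # predict step: propagate the estimate and inflate its uncertainty
    x_est += flux
    P += Q_proc

    # update step: blend in a noisy storage measurement
    z = x_true + random.gauss(0.0, R_meas ** 0.5)
    K = P / (P + R_meas)      # Kalman gain
    x_est += K * (z - x_est)
    P *= (1.0 - K)            # posterior error variance shrinks after each update
```

The variance recursion converges to a steady state below the raw measurement variance, which is the sense in which the filter "reduces the overall uncertainty in both estimates and predictions."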

  17. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    PubMed

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

Many models have been developed for predicting software reliability, but each is restricted to particular methodologies and a limited number of parameters. Because the reliability of a system may increase or decrease depending on the parameters selected, there is a need to identify the factors that most heavily affect system reliability. Reusability, now widely used across research areas, is the basis of Component-Based Systems (CBS); cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics can be used to assess which techniques are most suitable for estimating system reliability. Soft computing is used for small and large-scale problems where uncertainty or randomness makes accurate results difficult to obtain. Soft computing techniques are also widely applied in medicine: clinical medicine makes significant use of fuzzy logic and neural-network methodology, basic medical science most frequently uses neural-network and genetic-algorithm methods, and medical scientists have shown strong interest in applying soft computing in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse existing software when building new products, providing quality while saving time, memory space, and money. This paper assesses commonly used soft computing techniques for reliability prediction: Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents how these techniques work, assesses their use for predicting reliability, and discusses the parameters considered in reliability estimation and prediction. The study can be applied to estimating and predicting the reliability of instruments used in medical systems and in software, computer, and mechanical engineering; the concepts apply to both software and hardware, predicting reliability using CBSE.
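As a minimal illustration of one of the surveyed techniques (GA), the sketch below fits a Goel-Okumoto reliability growth curve to hypothetical cumulative-failure data; the model choice, data, and all GA settings are assumptions for illustration, not taken from the paper:

```python
import math
import random

random.seed(7)

def mu(t, a, b):
    """Goel-Okumoto mean cumulative failures by time t."""
    return a * (1.0 - math.exp(-b * t))

# hypothetical cumulative-failure observations generated from a=100, b=0.05
times = list(range(1, 31))
observed = [mu(t, 100.0, 0.05) + random.gauss(0.0, 1.0) for t in times]

def sse(params):
    a, b = params
    return sum((mu(t, a, b) - y) ** 2 for t, y in zip(times, observed))

# --- minimal genetic algorithm: truncation selection, blend crossover, mutation ---
pop = [(random.uniform(10, 300), random.uniform(0.001, 0.5)) for _ in range(40)]
for gen in range(60):
    pop.sort(key=sse)
    parents = pop[:10]                       # keep the 10 fittest (elitism)
    children = []
    while len(children) < 30:
        (a1, b1), (a2, b2) = random.sample(parents, 2)
        w = random.random()                  # blend crossover
        a, b = w * a1 + (1 - w) * a2, w * b1 + (1 - w) * b2
        if random.random() < 0.2:            # Gaussian mutation
            a += random.gauss(0.0, 5.0)
            b = abs(b + random.gauss(0.0, 0.01))
        children.append((a, b))
    pop = parents + children

best = min(pop, key=sse)                     # fitted (a, b)
```

The same loop works for any parametric reliability model: only `mu` and the parameter bounds change, which is why GA-style search appears so often in this literature.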

  18. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

A model is proposed to estimate reliability for stress rupture of composite overwrapped pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model is shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would otherwise fail early. The model also predicts significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model are made. If the model's predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, enabling the production of lighter structures.
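The qualitative claims above (proof screening removes weak vessels; decay is concentrated late in life) can be illustrated with a Monte Carlo sketch. The decay law, stress levels, and Weibull strength distribution below are all assumed for illustration and are not the paper's model parameters:

```python
import random

random.seed(3)

# --- Monte Carlo sketch of the strength-decay reliability idea (values hypothetical) ---
N = 20000
T_life = 1.0                  # normalized design life
op_stress = 0.55              # operating stress / mean initial strength
proof_stress = 0.75           # proof-test stress level

def strength(s0, t):
    """Strength decays slowly at first and rapidly late in life (power-law decay)."""
    return s0 * (1.0 - 0.4 * (t / T_life) ** 8)

vessels = [random.weibullvariate(1.0, 10.0) for _ in range(N)]   # initial strengths
survivors = [s0 for s0 in vessels if s0 > proof_stress]          # proof screening

def reliability(pop, t):
    """Fraction of the population whose decayed strength still exceeds the load."""
    return sum(1 for s0 in pop if strength(s0, t) > op_stress) / len(pop)

r_unscreened = reliability(vessels, 0.9)     # late in life, no proof test
r_screened = reliability(survivors, 0.9)     # late in life, after proof screening
```

Because the decayed strength required at t = 0.9 is still below the proof level in this toy setup, every proof survivor remains safe, reproducing the model's prediction of a safe-life window after proof loading. This sketch ignores any strength loss caused by the proof load itself.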

  19. Statistics of return intervals between long heartbeat intervals and their usability for online prediction of disorders

    NASA Astrophysics Data System (ADS)

    Bogachev, Mikhail I.; Kireenkov, Igor S.; Nifontov, Eugene M.; Bunde, Armin

    2009-06-01

    We study the statistics of return intervals between large heartbeat intervals (above a certain threshold Q) in 24 h records obtained from healthy subjects. We find that both the linear and the nonlinear long-term memory inherent in the heartbeat intervals lead to power-laws in the probability density function PQ(r) of the return intervals. As a consequence, the probability WQ(t; Δt) that at least one large heartbeat interval will occur within the next Δt heartbeat intervals, with an increasing elapsed number of intervals t after the last large heartbeat interval, follows a power-law. Based on these results, we suggest a method of obtaining a priori information about the occurrence of the next large heartbeat interval, and thus to predict it. We show explicitly that the proposed method, which exploits long-term memory, is superior to the conventional precursory pattern recognition technique, which focuses solely on short-term memory. We believe that our results can be straightforwardly extended to obtain more reliable predictions in other physiological signals like blood pressure, as well as in other complex records exhibiting multifractal behaviour, e.g. turbulent flow, precipitation, river flows and network traffic.
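Extracting return intervals above a threshold Q and empirically estimating the exceedance probability W_Q(t; Δt) can be sketched as follows (toy data, not the 24 h records used in the study):

```python
def return_intervals(series, q):
    """Gaps (in number of samples) between successive exceedances of threshold q."""
    hits = [i for i, x in enumerate(series) if x > q]
    return [b - a for a, b in zip(hits, hits[1:])]

def exceedance_probability(intervals, t, dt):
    """Empirical W_Q(t; dt): probability that, given t steps have already passed
    since the last exceedance, the next one occurs within the next dt steps."""
    waited = [r for r in intervals if r > t]
    if not waited:
        return 0.0
    return sum(1 for r in waited if r <= t + dt) / len(waited)

# toy record of interbeat intervals (seconds, hypothetical)
rr = [0.8, 0.9, 1.4, 0.7, 0.8, 1.5, 0.9, 0.8, 0.9, 1.6, 0.8]
ivals = return_intervals(rr, 1.2)    # exceedances at indices 2, 5, 9 -> intervals [3, 4]
```

On long-term-correlated records, the histogram of `ivals` follows the power-law form discussed above, and `exceedance_probability` is the quantity the proposed online prediction method thresholds.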

  20. An adaptive data-driven method for accurate prediction of remaining useful life of rolling bearings

    NASA Astrophysics Data System (ADS)

    Peng, Yanfeng; Cheng, Junsheng; Liu, Yanfei; Li, Xuejun; Peng, Zhihua

    2018-06-01

    A novel data-driven method based on Gaussian mixture model (GMM) and distance evaluation technique (DET) is proposed to predict the remaining useful life (RUL) of rolling bearings. The data sets are clustered by GMM to divide all data sets into several health states adaptively and reasonably. The number of clusters is determined by the minimum description length principle. Thus, either the health state of the data sets or the number of the states is obtained automatically. Meanwhile, the abnormal data sets can be recognized during the clustering process and removed from the training data sets. After obtaining the health states, appropriate features are selected by DET for increasing the classification and prediction accuracy. In the prediction process, each vibration signal is decomposed into several components by empirical mode decomposition. Some common statistical parameters of the components are calculated first and then the features are clustered using GMM to divide the data sets into several health states and remove the abnormal data sets. Thereafter, appropriate statistical parameters of the generated components are selected using DET. Finally, least squares support vector machine is utilized to predict the RUL of rolling bearings. Experimental results indicate that the proposed method reliably predicts the RUL of rolling bearings.
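One common form of per-feature distance evaluation, ranking features by the ratio of between-class to within-class spread, can be sketched as follows; the exact DET criterion used in the paper may differ, and the data below are simulated rather than bearing-vibration features:

```python
import numpy as np

rng = np.random.default_rng(5)

def det_scores(X, labels):
    """Distance-evaluation-style score per feature: between-class spread divided
    by mean within-class spread; larger means a more discriminative feature."""
    classes = np.unique(labels)
    centroids = np.array([X[labels == c].mean(axis=0) for c in classes])
    within = np.mean([X[labels == c].std(axis=0) for c in classes], axis=0)
    between = centroids.std(axis=0)
    return between / (within + 1e-12)

# two hypothetical health states; feature 0 separates them, feature 1 is pure noise
X = np.vstack([
    np.column_stack([rng.normal(0.0, 1.0, 200), rng.normal(0.0, 1.0, 200)]),
    np.column_stack([rng.normal(5.0, 1.0, 200), rng.normal(0.0, 1.0, 200)]),
])
labels = np.array([0] * 200 + [1] * 200)

scores = det_scores(X, labels)   # scores[0] should far exceed scores[1]
```

In the pipeline above, the class labels would come from the GMM health-state clustering, and only the top-scoring statistical features would be passed to the LS-SVM predictor.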

  1. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells.

    PubMed

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-24

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.

  2. Prediction of Multidimensional Fatigue After Childhood Brain Injury.

    PubMed

    Crichton, Alison J; Babl, Franz; Oakley, Ed; Greenham, Mardee; Hearps, Stephen; Delzoppo, Carmel; Hutchison, Jamie; Beauchamp, Miriam; Anderson, Vicki A

    To determine (1) the presence of fatigue symptoms and predictors of fatigue after childhood brain injury and examine (2) the feasibility, reliability, and validity of a multidimensional fatigue measure (PedsQL Multidimensional Fatigue Scale [MFS]) obtained from parent and child perspectives. Emergency and intensive care units of a hospital in Melbourne, Australia. Thirty-five families (34 parent-proxies and 32 children) aged 8 to 18 years (mean child age = 13.29 years) with traumatic brain injury (TBI) of all severities (27 mild, 5 moderate, and 3 severe) admitted to the Royal Children's Hospital. Longitudinal prospective study. Fatigue data collected at 6-week follow-up (mean = 6.9 weeks). Postinjury child- and parent-rated fatigue (PedsQL MFS), mood, sleep, and pain based on questionnaire report: TBI severity (mild vs moderate/severe TBI). A score greater than 2 standard deviations below healthy control data indicated the presence of abnormal fatigue, rates of which were higher compared with normative data for both parent and child reports (47% and 29%). Fatigue was predicted by postinjury depression and sleep disturbance for parent, but not child ratings. Fatigue, as rated by children, was not significantly predicted by TBI severity or other symptoms. The PedsQL MFS demonstrated acceptable measurement properties in child TBI participants, evidenced by good feasibility and reliability (Cronbach α values >0.90). Interrater reliability between parent and child reports was poor to moderate. Results underscore the need to assess fatigue and associated sleep-wake disturbance and depression after child TBI from both parent and child perspectives.

  3. Humanized mouse lines and their application for prediction of human drug metabolism and toxicological risk assessment

    PubMed Central

    Cheung, Connie; Gonzalez, Frank J

    2008-01-01

    Cytochrome P450s (P450s) are important enzymes involved in the metabolism of xenobiotics, particularly clinically used drugs, and are also responsible for metabolic activation of chemical carcinogens and toxins. Many xenobiotics can activate nuclear receptors that in turn induce the expression of genes encoding xenobiotic metabolizing enzymes and drug transporters. Marked species differences in the expression and regulation of cytochromes P450 and xenobiotic nuclear receptors exist. Thus obtaining reliable rodent models to accurately reflect human drug and carcinogen metabolism is severely limited. Humanized transgenic mice were developed in an effort to create more reliable in vivo systems to study and predict human responses to xenobiotics. Human P450s or human xenobiotic-activated nuclear receptors were introduced directly or replaced the corresponding mouse gene, thus creating “humanized” transgenic mice. Mice expressing human CYP1A1/CYP1A2, CYP2E1, CYP2D6, CYP3A4, CY3A7, PXR, PPARα were generated and characterized. These humanized mouse models offers a broad utility in the evaluation and prediction of toxicological risk that may aid in the development of safer drugs. PMID:18682571

  4. Do in-training evaluation reports deserve their bad reputations? A study of the reliability and predictive ability of ITER scores and narrative comments.

    PubMed

    Ginsburg, Shiphra; Eva, Kevin; Regehr, Glenn

    2013-10-01

    Although scores on in-training evaluation reports (ITERs) are often criticized for poor reliability and validity, ITER comments may yield valuable information. The authors assessed across-rotation reliability of ITER scores in one internal medicine program, ability of ITER scores and comments to predict postgraduate year three (PGY3) performance, and reliability and incremental predictive validity of attendings' analysis of written comments. Numeric and narrative data from the first two years of ITERs for one cohort of residents at the University of Toronto Faculty of Medicine (2009-2011) were assessed for reliability and predictive validity of third-year performance. Twenty-four faculty attendings rank-ordered comments (without scores) such that each resident was ranked by three faculty. Mean ITER scores and comment rankings were submitted to regression analyses; dependent variables were PGY3 ITER scores and program directors' rankings. Reliabilities of ITER scores across nine rotations for 63 residents were 0.53 for both postgraduate year one (PGY1) and postgraduate year two (PGY2). Interrater reliabilities across three attendings' rankings were 0.83 for PGY1 and 0.79 for PGY2. There were strong correlations between ITER scores and comments within each year (0.72 and 0.70). Regressions revealed that PGY1 and PGY2 ITER scores collectively explained 25% of variance in PGY3 scores and 46% of variance in PGY3 rankings. Comment rankings did not improve predictions. ITER scores across multiple rotations showed decent reliability and predictive validity. Comment ranks did not add to the predictive ability, but correlation analyses suggest that trainee performance can be measured through these comments.
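Across-rotation reliability of the kind reported above is commonly summarized with Cronbach's alpha, treating rotations as "items." A minimal sketch, using invented ITER scores rather than the study's data:

```python
def cronbach_alpha(ratings):
    """ratings[i][j]: score of resident i on rotation j."""
    k = len(ratings[0])                      # number of rotations ("items")

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in ratings]) for j in range(k)]
    total_var = var([sum(row) for row in ratings])
    return k / (k - 1) * (1.0 - sum(item_vars) / total_var)

# hypothetical ITER scores for 4 residents across 3 rotations
scores = [
    [4.1, 4.3, 4.0],
    [3.2, 3.4, 3.1],
    [4.8, 4.6, 4.9],
    [3.9, 3.7, 4.0],
]
alpha = cronbach_alpha(scores)   # high: residents are rank-ordered consistently
```

With more rotations per resident, alpha rises even when single-rotation agreement is modest, which is why nine rotations yielded usable (0.53) reliabilities here despite ITERs' poor single-form reputation.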

  5. Mobility assessment: Sensitivity and specificity of measurement sets in older adults

    PubMed Central

    Panzer, Victoria P.; Wakefield, Dorothy B.; Hall, Charles B.; Wolfson, Leslie I.

    2011-01-01

Objective: To identify quantitative measurement variables that characterize mobility in older adults, meet reliability and validity criteria, distinguish fall risk, and predict future falls. Design: Observational study with 1-year weekly falls follow-up. Setting: Mobility laboratory. Participants: Community-dwelling volunteers (n=74; 65–94 years old) categorized at entry as 27 ‘Non-fallers’ or 47 ‘Fallers’ by Medicare criteria (1 injury fall or >1 non-injury falls in the previous year). Interventions: None. Outcome Measures: Test-retest and within-subject reliability; criterion and concurrent validity; predictive ability indicated by observed sensitivity and specificity to entry fall-risk group (Falls-status), Tinetti Performance Oriented Mobility Assessment (POMA), Computerized Dynamic Posturography Sensory Organization Test (SOT), and subsequent falls reported weekly. Results: Measurement variables were selected that met reliability (ICC > 0.6) and/or discrimination (p < .01) criteria (clinical variables: turn steps and time, gait velocity, step-in-tub time, and downstairs time; force plate variables: quiet-standing Romberg ratio sway area, maximal-lean anterior-posterior excursion, sit-to-stand medial-lateral excursion and sway area). Sets were created (3 clinical, 2 force plate) utilizing combinations of variables appropriate for older adults with different functional activity levels, and composite scores were calculated. Scores identified entry Falls-status and concurred with the POMA and SOT. The full clinical set (5 measurement variables) produced sensitivity/specificity of .80/.74 against Falls-status. Composite scores were sensitive and specific in predicting subsequent injury falls and multiple falls compared with Falls-status, POMA, or SOT. 
Conclusions: Sets of quantitative measurement variables obtained with this mobility battery provided sensitive prediction of future injury falls and screening for multiple subsequent falls, using tasks that should be appropriate for diverse participants. PMID:21621667
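The sensitivity/specificity figures reported above follow from the standard confusion-matrix definitions; the sketch below reproduces the arithmetic with hypothetical classification counts chosen to be consistent with the reported .80/.74:

```python
def sensitivity_specificity(predicted, actual):
    """predicted / actual: booleans, True = (classified / known) faller."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical composite-score classification vs entry Falls-status
# (47 fallers, 27 non-fallers, as in the study; the split of hits/misses is assumed)
actual    = [True] * 47 + [False] * 27
predicted = [True] * 38 + [False] * 9 + [False] * 20 + [True] * 7

sens, spec = sensitivity_specificity(predicted, actual)   # ~0.81, ~0.74
```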

  6. Prediction of Aerodynamic Coefficient using Genetic Algorithm Optimized Neural Network for Sparse Data

    NASA Technical Reports Server (NTRS)

    Rajkumar, T.; Bardina, Jorge; Clancy, Daniel (Technical Monitor)

    2002-01-01

Wind tunnels use scale models to characterize aerodynamic coefficients. Wind tunnel testing can be slow and costly due to high personnel overhead and intensive power utilization. Although manual curve fitting can be done, it is far more efficient to use a neural network to define the complex relationship between variables. Numerical simulation of complex vehicles over the wide range of conditions required for flight simulation requires static and dynamic data. Static data at low Mach numbers and angles of attack may be obtained with simpler Euler codes. Static data for stalled vehicles, where zones of flow separation are usually present at higher angles of attack, require Navier-Stokes simulations, which are costly due to the large processing time required to attain convergence. Preliminary dynamic data may be obtained with simpler methods based on correlations and vortex methods; however, accurate prediction of the dynamic coefficients requires complex and costly numerical simulations. A reliable and fast method of predicting complex aerodynamic coefficients for flight simulation is presented using a neural network. The training data for the neural network are derived from numerical simulations and wind-tunnel experiments. The aerodynamic coefficients are modeled as functions of the flow characteristics and the control surfaces of the vehicle. The basic coefficients of lift, drag, and pitching moment are expressed as functions of angle of attack and Mach number. The modeled and training aerodynamic coefficients show good agreement. This method shows excellent potential for rapid development of aerodynamic models for flight simulation. Genetic Algorithms (GA) are used to optimize a previously built Artificial Neural Network (ANN) that reliably predicts aerodynamic coefficients. Results indicate that the GA provided an efficient method of optimizing the ANN model to predict aerodynamic coefficients. 
The GA-optimized ANN predicts aerodynamic coefficients to an accuracy of ±10%. The goal was to obtain an optimized neural network architecture and a minimum data set; this was accomplished within 500 training cycles of the neural network. After removing outlier training pairs, the GA produced much better results. The network is a feed-forward neural network with a back-propagation learning mechanism. The main aim has been to free the network design process from the constraints of human biases and to discover better forms of neural network architectures; automating the architecture search with genetic algorithms appears to be the best way to achieve this goal.

  7. Parts and Components Reliability Assessment: A Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia

    2009-01-01

System reliability assessment is a methodology that incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks and perform design tradeoffs, and therefore to ensure effective productivity and/or mission success. System reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standards-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published in United States (U.S.) military or commercial standards and handbooks, many of which are globally accepted and recognized. The reliability assessment is especially useful during the initial design stages, when hard failure data are not yet available or manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology for assessing system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success efficiently, at low cost, and on a tight schedule.

  8. How to Compute Electron Ionization Mass Spectra from First Principles.

    PubMed

    Bauer, Christoph Alexander; Grimme, Stefan

    2016-06-02

    The prediction of electron ionization (EI) mass spectra (MS) from first principles has been a major challenge for quantum chemistry (QC). The unimolecular reaction space grows rapidly with increasing molecular size. On the one hand, statistical models like Eyring's quasi-equilibrium theory and Rice-Ramsperger-Kassel-Marcus theory have provided valuable insight, and some predictions and quantitative results can be obtained from such calculations. On the other hand, molecular dynamics-based methods are able to explore automatically the energetically available regions of phase space and thus yield reaction paths in an unbiased way. We describe in this feature article the status of both methodologies in relation to mass spectrometry for small to medium sized molecules. We further present results obtained with the QCEIMS program developed in our laboratory. Our method, which incorporates stochastic and dynamic elements, has been a significant step toward the reliable routine calculation of EI mass spectra.

  9. Comparison Between Predicted and Experimentally Measured Flow Fields at the Exit of the SSME HPFTP Impeller

    NASA Technical Reports Server (NTRS)

    Bache, George

    1993-01-01

    Validation of CFD codes is a critical first step in the process of developing CFD design capability. The MSFC Pump Technology Team has recognized the importance of validation and has thus funded several experimental programs designed to obtain CFD quality validation data. The first data set to become available is for the SSME High Pressure Fuel Turbopump Impeller. LDV Data was taken at the impeller inlet (to obtain a reliable inlet boundary condition) and three radial positions at the impeller discharge. Our CFD code, TASCflow, is used within the Propulsion and Commercial Pump industry as a tool for pump design. The objective of this work, therefore, is to further validate TASCflow for application in pump design. TASCflow was used to predict flow at the impeller discharge for flowrates of 80, 100 and 115 percent of design flow. Comparison to data has been made with encouraging results.

  10. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  11. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, crack development in concrete, yielding of reinforcement, concrete crushing, and shear failure. Nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models that take into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis appears to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for through random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be taken into account in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
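The safety format described above, failure probability and the corresponding reliability index, can be sketched with a crude Monte Carlo randomization of a limit state g = R − S; the resistance and load distributions below are illustrative assumptions, not values from the paper's studies:

```python
import random
from statistics import NormalDist

def failure_probability(n_samples=200_000, seed=1):
    """Estimate P(R < S) for a randomized limit state g = R - S.

    R: structural resistance (kN), lognormal to reflect material scatter;
    S: load effect (kN), normal. All parameter values are illustrative.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.lognormvariate(mu=6.0, sigma=0.10)   # resistance, median ~403 kN
        s = rng.gauss(300.0, 30.0)                   # load effect
        if r < s:
            failures += 1
    return failures / n_samples

pf = failure_probability()
beta = -NormalDist().inv_cdf(pf)  # reliability index from failure probability
```

In a real assessment the resistance samples would come from randomized nonlinear finite element runs rather than a closed-form distribution; the conversion between pf and the reliability index is the same.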

  12. Modeling heterogeneous (co)variances from adjacent-SNP groups improves genomic prediction for milk protein composition traits.

    PubMed

    Gebreyesus, Grum; Lund, Mogens S; Buitenhuis, Bart; Bovenhuis, Henk; Poulsen, Nina A; Janss, Luc G

    2017-12-05

    Accurate genomic prediction requires a large reference population, which is problematic for traits that are expensive to measure. Traits related to milk protein composition are not routinely recorded due to costly procedures and are considered to be controlled by a few quantitative trait loci of large effect. The amount of variation explained may vary between regions leading to heterogeneous (co)variance patterns across the genome. Genomic prediction models that can efficiently take such heterogeneity of (co)variances into account can result in improved prediction reliability. In this study, we developed and implemented novel univariate and bivariate Bayesian prediction models, based on estimates of heterogeneous (co)variances for genome segments (BayesAS). Available data consisted of milk protein composition traits measured on cows and de-regressed proofs of total protein yield derived for bulls. Single-nucleotide polymorphisms (SNPs), from 50K SNP arrays, were grouped into non-overlapping genome segments. A segment was defined as one SNP, or a group of 50, 100, or 200 adjacent SNPs, or one chromosome, or the whole genome. Traditional univariate and bivariate genomic best linear unbiased prediction (GBLUP) models were also run for comparison. Reliabilities were calculated through a resampling strategy and using deterministic formula. BayesAS models improved prediction reliability for most of the traits compared to GBLUP models and this gain depended on segment size and genetic architecture of the traits. The gain in prediction reliability was especially marked for the protein composition traits β-CN, κ-CN and β-LG, for which prediction reliabilities were improved by 49 percentage points on average using the MT-BayesAS model with a 100-SNP segment size compared to the bivariate GBLUP. Prediction reliabilities were highest with the BayesAS model that uses a 100-SNP segment size. 
The bivariate versions of our BayesAS models resulted in extra gains of up to 6% in prediction reliability compared to the univariate versions. Substantial improvement in prediction reliability was possible for most of the traits related to milk protein composition using our novel BayesAS models. Grouping adjacent SNPs into segments provided enhanced information for estimating parameters, and allowing the segments to have different (co)variances helped disentangle heterogeneous (co)variances across the genome.
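The segment definitions used by BayesAS (one SNP; 50, 100, or 200 adjacent SNPs; one chromosome; or the whole genome) amount to a simple partition of the SNP index. A minimal sketch of that bookkeeping, with `segment_variances` as a hypothetical stand-in for the Bayesian (co)variance estimation itself:

```python
def make_segments(n_snps, segment_size):
    """Group SNP indices into non-overlapping adjacent segments.

    segment_size=1 gives per-SNP variances; segment_size=n_snps gives a
    single genome-wide variance, mirroring the BayesAS segment choices.
    """
    return [list(range(start, min(start + segment_size, n_snps)))
            for start in range(0, n_snps, segment_size)]

def segment_variances(effects, segments):
    """Crude empirical variance of (hypothetical) SNP effects per segment;
    a placeholder for the model-based heterogeneous (co)variance estimates."""
    out = []
    for seg in segments:
        vals = [effects[i] for i in seg]
        mean = sum(vals) / len(vals)
        out.append(sum((v - mean) ** 2 for v in vals) / len(vals))
    return out
```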

  13. Automated Segmentability Index for Layer Segmentation of Macular SD-OCT Images.

    PubMed

    Lee, Kyungmoo; Buitendijk, Gabriëlle H S; Bogunovic, Hrvoje; Springelkamp, Henriët; Hofman, Albert; Wahle, Andreas; Sonka, Milan; Vingerling, Johannes R; Klaver, Caroline C W; Abràmoff, Michael D

    2016-03-01

    To automatically identify which spectral-domain optical coherence tomography (SD-OCT) scans will provide reliable automated layer segmentations for more accurate layer thickness analyses in population studies. Six hundred ninety macular SD-OCT image volumes (6.0 × 6.0 × 2.3 mm³) were obtained from one eye each of 690 subjects (mean ± SD age 74.6 ± 9.7 years; 37.8% male) randomly selected from the population-based Rotterdam Study. The dataset consisted of 420 OCT volumes with successful automated retinal nerve fiber layer (RNFL) segmentations obtained from our previously reported graph-based segmentation method and 270 volumes with failed segmentations. To evaluate the reliability of the layer segmentations, we developed a new metric, the segmentability index SI, which is obtained from a random forest regressor based on 12 features using OCT voxel intensities, edge-based costs, and on-surface costs. The SI was compared with the well-known quality indices, quality index (QI) and maximum tissue contrast index (mTCI), using receiver operating characteristic (ROC) analysis. The 95% confidence interval (CI) for the QI was 0.621 to 0.805 with an AUC of 0.713; for the mTCI, 0.673 to 0.838 with an AUC of 0.756; and for the SI, 0.784 to 0.920 with an AUC of 0.852. The SI AUC is significantly larger than either the QI or mTCI AUC (P < 0.01). The segmentability index SI is well suited to identifying SD-OCT scans for which successful automated intraretinal layer segmentations can be expected. Interpreting the quantification of SD-OCT images requires the underlying segmentation to be reliable, but standard SD-OCT quality metrics do not predict which segmentations are reliable and which are not. The segmentability index SI presented in this study does allow reliable segmentations to be identified, which is important for more accurate layer thickness analyses in research and population studies.
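The ROC comparison of SI against QI and mTCI rests on the AUC, which can be computed directly as a rank statistic over scans with successful and failed segmentations; a minimal sketch (the scores below are illustrative, not study data):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a positive (here: reliably segmentable)
    scan scores above a negative one -- the Mann-Whitney statistic
    U / (n_pos * n_neg); ties count 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Comparing two indices on the same scan set then reduces to comparing their `roc_auc` values, as the study does for SI vs. QI and mTCI.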

  14. Effects of feather wear and temperature on prediction of food intake and residual food consumption.

    PubMed

    Herremans, M; Decuypere, E; Siau, O

    1989-03-01

    Heat production, which accounts for 0.6 of gross energy intake, is insufficiently represented in predictions of food intake. Especially when heat production is elevated (for example, by lower temperature or poor feathering), the classical predictions based on body weight, body-weight change, and egg mass are inadequate. Heat production was reliably estimated as [35.5 − environmental temperature (°C)] × [defeathering (%IBPW) + 21]. Including this term (PHP: predicted heat production) in equations predicting food intake significantly increased the accuracy of prediction, especially under suboptimal conditions. Within the range of body weights tested (from 1.6 kg in brown layers to 2.8 kg in dwarf broiler breeders), body weight as an independent variable contributed little to the prediction of food intake; especially within strains, its effect was better included in the intercept. Significantly reduced absolute values of residual food consumption were obtained over a wide range of conditions by using predictions of food intake based on body-weight change, egg mass, predicted heat production (PHP), and an intercept, instead of body weight, body-weight change, egg mass, and an intercept.
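The heat-production estimate is a one-line formula; a sketch with illustrative inputs (units follow the paper's term and are not restated here):

```python
def predicted_heat_production(temp_c, defeathering_pct):
    """PHP term from the paper's estimate:
    PHP = [35.5 - environmental temperature (degC)] * [defeathering (%IBPW) + 21].
    """
    return (35.5 - temp_c) * (defeathering_pct + 21.0)
```

For example, a fully feathered bird at 25.5 °C gives a PHP term of 210, while a bird with 19 %IBPW defeathering at 15.5 °C gives 800, nearly four times larger, which is the elevation the classical prediction equations miss.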

  15. The harmonic frequencies of benzene

    NASA Astrophysics Data System (ADS)

    Handy, Nicholas C.; Maslen, Paul E.; Amos, Roger D.; Andrews, Jamie S.; Murray, Christopher W.; Laming, Gregory J.

    1992-09-01

    We report calculations for the harmonic frequencies of C6H6 and C6D6. Our most sophisticated quantum chemistry values are obtained with the MP2 method and a TZ2P+f basis set (288 basis functions), which are the largest such calculations reported on benzene to date. Using the SCF density, we also calculate the frequencies using the exchange and correlation expressions of density functional theory. We compare our calculated harmonic frequencies with those deduced from experiment by Goodman, Ozkabak, and Thakur. The density functional frequencies appear to be more reliable predictions than the MP2 frequencies, and they are obtained at significantly less cost.

  16. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng

    2016-10-01

    Groundwater systems are intrinsically heterogeneous, with dynamic temporal-spatial patterns that make their complex processes difficult to quantify, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we proposed a novel and flexible soft-computing technique that could effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combined the Self-Organizing Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM classified the temporal-spatial patterns of regional groundwater levels; the NARX predicted the mean of regional groundwater levels, which was used to adjust the selected SOM; and Kriging interpolated the predictions of the adjusted SOM onto finer grids of locations, so that a monthly regional groundwater level map could be obtained. The Zhuoshui River basin in Taiwan was the study case, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations, and 6 flow stations from 2000 through 2013 were used for modelling purposes. The results demonstrated that the hybrid SOM-NARX model could reliably predict monthly basin-wide groundwater levels with high correlations (R² > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.
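The SOM stage of the hybrid model can be illustrated with a minimal one-dimensional self-organizing map. This sketch covers only the pattern-classification idea; the NARX correction and Kriging interpolation stages are omitted, and all parameters and data are illustrative:

```python
import math
import random

def train_som(data, n_nodes=4, epochs=200, seed=0):
    """Minimal 1-D self-organizing map for scalar samples: competitive
    learning with a decaying Gaussian neighborhood, so nearby nodes end
    up representing nearby data patterns."""
    rng = random.Random(seed)
    weights = [rng.uniform(min(data), max(data)) for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = 0.5 * (1.0 - epoch / epochs)                 # decaying learning rate
        radius = max(1.0, n_nodes / 2 * (1.0 - epoch / epochs))
        for x in rng.sample(data, len(data)):             # shuffled pass over data
            bmu = min(range(n_nodes), key=lambda i: abs(weights[i] - x))
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] += lr * h * (x - weights[i])
    return weights
```

Trained on bimodal data (two clusters of values), the node weights spread to cover both clusters, which is the clustering behavior the paper exploits for temporal-spatial groundwater patterns.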

  17. Methodology for Software Reliability Prediction. Volume 1.

    DTIC Science & Technology

    1987-11-01

    [Garbled application list; recoverable categories: manned and unmanned spacecraft, airborne avionics, batch systems, event control, real-time closed-loop operations] ...software reliability. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were

  18. Obtaining Reliable Predictions of Terrestrial Energy Coupling From Real-Time Solar Wind Measurements

    NASA Technical Reports Server (NTRS)

    Weimer, Daniel R.

    2002-01-01

    Measurements of the interplanetary magnetic field (IMF) from the ACE (Advanced Composition Explorer), Wind, IMP-8 (Interplanetary Monitoring Platform), and Geotail spacecraft have revealed that IMF variations are contained in phase planes that are tilted with respect to the propagation direction, resulting in continuously variable propagation times between spacecraft and, therefore, to the Earth. Techniques using minimum variance analysis have been developed to measure the phase-front tilt angles and better predict the actual propagation times from the L1 orbit to the Earth, using only the real-time IMF measurements from one spacecraft. The use of empirical models with IMF measurements at L1 from ACE (or future satellites) for predicting 'space weather' effects has also been demonstrated.
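The practical consequence of a tilted phase plane is a modified advection delay: for a planar front with unit normal n convecting with the solar wind velocity v, the time for the front seen at the spacecraft to reach a target is n·Δr / (n·v). A minimal sketch under that planar-front assumption (the geometry and velocity values in the test are illustrative, not mission data):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def propagation_delay(r_sc, r_target, v_sw, normal):
    """Time (s) for a planar IMF phase front observed at spacecraft
    position r_sc to convect to r_target, given solar wind velocity v_sw
    (km/s) and the front's unit normal (e.g., from minimum variance
    analysis). Positions in km. A normal tilted away from the flow
    direction changes the delay relative to the flat-front assumption."""
    dr = [t - s for t, s in zip(r_target, r_sc)]
    return dot(normal, dr) / dot(normal, v_sw)
```

With a flat front (normal along the Sun-Earth line) the result reduces to the usual distance-over-speed estimate; a tilted normal combined with an off-axis spacecraft position shifts the arrival time, which is exactly the correction the minimum variance technique supplies.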

  19. Statistical model selection for better prediction and discovering science mechanisms that affect reliability

    DOE PAGES

    Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.

    2015-08-19

    Understanding the impact of production, environmental exposure, and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage, and environmental exposure and reliability. First, potential candidate inputs are identified; second, the data are organized for analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. Fifth, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction, and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
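Steps three and four of the process can be sketched with ordinary least squares over all candidate-input subsets and AIC standing in for the paper's "flexible metrics"; the input names and data below are hypothetical:

```python
import itertools
import math

def ols_rss(X, y):
    """Residual sum of squares for a least-squares fit with intercept,
    via normal equations solved by Gaussian elimination."""
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for i in range(p):                       # forward elimination with pivoting
        piv = max(range(i, p), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for k in range(i + 1, p):
            f = A[k][i] / A[i][i]
            for j in range(i, p):
                A[k][j] -= f * A[i][j]
            c[k] -= f * c[i]
    b = [0.0] * p
    for i in reversed(range(p)):             # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    preds = [sum(bi * ri for bi, ri in zip(b, r)) for r in rows]
    return sum((yi - pi) ** 2 for yi, pi in zip(y, preds))

def rank_models(inputs, y, names):
    """Fit every non-empty subset of the candidate inputs and rank by AIC
    (lower is better); AIC here is one example of a comparison metric."""
    n = len(y)
    scored = []
    for k in range(1, len(names) + 1):
        for combo in itertools.combinations(range(len(names)), k):
            X = [[row[i] for i in combo] for row in inputs]
            rss = ols_rss(X, y)
            aic = n * math.log(max(rss, 1e-12) / n) + 2 * (k + 1)
            scored.append((aic, [names[i] for i in combo]))
    return sorted(scored)
```

The sorted list is the raw material for step five: the top few contenders are plotted and handed to subject matter experts, with model complexity entering through the 2(k + 1) penalty.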

  20. Life prediction and mechanical reliability of NT551 silicon nitride

    NASA Astrophysics Data System (ADS)

    Andrews, Mark Jay

    The inert strength and fatigue performance of a diesel engine exhaust valve made from silicon nitride (Si3N4) ceramic were assessed. The Si3N4 characterized in this study was manufactured by Saint Gobain/Norton Industrial Ceramics and was designated NT551. The evaluation was made utilizing a probabilistic life prediction algorithm that combined censored test specimen strength data with a Weibull distribution function and the stress field of the ceramic valve obtained from finite element analysis. The major assumptions of the life prediction algorithm are that the bulk ceramic material is isotropic and homogeneous and that the strength-limiting flaws are uniformly distributed. The results from mechanical testing indicated that NT551 was not a homogeneous ceramic and that its strength was a function of temperature, loading rate, and machining orientation. Fractographic analysis identified four different failure modes; two were identified as inhomogeneities that were located throughout the bulk of NT551 and were due to processing operations. The fractographic analysis concluded that the strength degradation of NT551 observed with the temperature and loading-rate test parameters was due to a change of state that occurred in its secondary phase. Pristine and engine-tested valves made from NT551 were loaded to failure and the inert strengths were obtained. Fractographic analysis of the valves identified the same four failure mechanisms as found with the test specimens. The fatigue performance and the inert strength of the Si3N4 valves were assessed from censored and uncensored test specimen strength data, respectively. The inert strength failure probability predictions were compared to the inert strength of the Si3N4 valves. The inert strength failure probability predictions were more conservative than the strength of the valves.
The lack of correlation between predicted and actual valve strength was due to the nonuniform distribution of inhomogeneities present in NT551. For the same reasons, the predicted and actual fatigue performance did not correlate well. The results of this study should not be considered a limitation of the life prediction algorithm but emphasize the requirement that ceramics be homogeneous and strength-limiting flaws uniformly distributed as a prerequisite for accurate life prediction and reliability analyses.
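The Weibull treatment underlying such a life prediction algorithm can be sketched as a two-parameter fit to specimen strengths plus the resulting failure-probability curve; this is the textbook formulation (uncensored, equal-volume specimens), not the paper's exact algorithm:

```python
import math

def fit_weibull(strengths):
    """Two-parameter Weibull fit via the usual linearization:
    ln(-ln(1 - F_i)) = m ln(sigma_i) - m ln(sigma_0),
    with the median-rank probability estimator F_i = (i - 0.3)/(n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs, ys = [], []
    for i, sigma in enumerate(s, start=1):
        f = (i - 0.3) / (n + 0.4)
        xs.append(math.log(sigma))
        ys.append(math.log(-math.log(1.0 - f)))
    mx = sum(xs) / n
    my = sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))      # Weibull modulus (slope)
    sigma0 = math.exp(mx - my / m)              # characteristic strength
    return m, sigma0

def failure_probability(sigma, m, sigma0):
    """P_f = 1 - exp(-(sigma / sigma0)^m): valid for a homogeneous material
    with uniformly distributed flaws -- exactly the assumption the study
    found violated for NT551."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))
```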

  1. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  2. Improved RMR Rock Mass Classification Using Artificial Intelligence Algorithms

    NASA Astrophysics Data System (ADS)

    Gholami, Raoof; Rasouli, Vamegh; Alimoradi, Andisheh

    2013-09-01

    Rock mass classification systems such as rock mass rating (RMR) are very reliable means to provide information about the quality of rocks surrounding a structure as well as to propose suitable support systems for unstable regions. Many correlations have been proposed to relate measured quantities such as wave velocity to rock mass classification systems to limit the associated time and cost of conducting the sampling and mechanical tests conventionally used to calculate RMR values. However, these empirical correlations have been found to be unreliable, as they usually overestimate or underestimate the RMR value. The aim of this paper is to compare the results of RMR classification obtained from the use of empirical correlations versus machine-learning methodologies based on artificial intelligence algorithms. The proposed methods were verified based on two case studies located in northern Iran. Relevance vector regression (RVR) and support vector regression (SVR), as two robust machine-learning methodologies, were used to predict the RMR for tunnel host rocks. RMR values already obtained by sampling and site investigation at one tunnel were taken into account as the output of the artificial networks during training and testing phases. The results reveal that use of empirical correlations overestimates the predicted RMR values. RVR and SVR, however, showed more reliable results, and are therefore suggested for use in RMR classification for design purposes of rock structures.

  3. A Reliability Estimation in Modeling Watershed Runoff With Uncertainties

    NASA Astrophysics Data System (ADS)

    Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.

    1990-10-01

    The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
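The first-order second-moment technique mentioned above propagates input variances through the model using local sensitivities. A minimal sketch, using the rational-method peak discharge Q = C·i·A as a hypothetical stand-in for a watershed model (HEC-1 itself is not reproduced, and all values are illustrative):

```python
def fosm(g, x_mean, x_std, h=1e-5):
    """First-order second-moment estimate of the mean and variance of
    g(x) for independent inputs, using central finite differences for
    the sensitivities dg/dx_i evaluated at the mean point."""
    mean = g(x_mean)
    var = 0.0
    for i, (m, s) in enumerate(zip(x_mean, x_std)):
        step = h * max(1.0, abs(m))
        hi = list(x_mean); hi[i] = m + step
        lo = list(x_mean); lo[i] = m - step
        dgdx = (g(hi) - g(lo)) / (2.0 * step)
        var += (dgdx * s) ** 2
    return mean, var

def peak_q(x):
    """Rational-method peak discharge Q = C * i * A (illustrative model)."""
    c, i, a = x
    return c * i * a
```

The resulting mean and variance describe the probability distribution of the estimated hydrologic variable, which is how the prediction reliability is expressed for a given event.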

  4. A Microstructure-Based Time-Dependent Crack Growth Model for Life and Reliability Prediction of Turbopropulsion Systems

    NASA Astrophysics Data System (ADS)

    Chan, Kwai S.; Enright, Michael P.; Moody, Jonathan; Fitch, Simeon H. K.

    2014-01-01

    The objective of this investigation was to develop an innovative methodology for life and reliability prediction of hot-section components in advanced turbopropulsion systems. A set of generic microstructure-based time-dependent crack growth (TDCG) models was developed and used to assess the sources of material variability due to microstructure and material parameters such as grain size, activation energy, and crack growth threshold for TDCG. A comparison of model predictions and experimental data obtained in air and in vacuum suggests that oxidation is responsible for higher crack growth rates at high temperatures, low frequencies, and long dwell times, but oxidation can also induce higher crack growth thresholds (ΔKth or Kth) under certain conditions. Using the enhanced risk analysis tool and material constants calibrated to IN 718 data, the effect of TDCG on the risk of fracture in turboengine components was demonstrated for a generic rotor design and a realistic mission profile using the DARWIN® probabilistic life-prediction code. The results of this investigation confirmed that TDCG and cycle-dependent crack growth in IN 718 can be treated by a simple summation of the crack increments over a mission. For the temperatures considered, TDCG in IN 718 can be considered a K-controlled or a diffusion-controlled oxidation-induced degradation process. This methodology provides a pathway for evaluating microstructural effects on multiple damage modes in hot-section components.

  5. Sonic boom predictions using a modified Euler code

    NASA Technical Reports Server (NTRS)

    Siclari, Michael J.

    1992-01-01

    The environmental impact of a next generation fleet of high-speed civil transports (HSCT) is of great concern in the evaluation of the commercial development of such a transport. One of the potential environmental impacts of a high speed civilian transport is the sonic boom generated by the aircraft and its effects on the population, wildlife, and structures in the vicinity of its flight path. If an HSCT aircraft is restricted from flying overland routes due to excessive booms, the commercial feasibility of such a venture may be questionable. NASA has taken the lead in evaluating and resolving the issues surrounding the development of a high speed civilian transport through its High-Speed Research Program (HSRP). The present paper discusses the usage of a Computational Fluid Dynamics (CFD) nonlinear code in predicting the pressure signature and ultimately the sonic boom generated by a high speed civilian transport. NASA had designed, built, and wind tunnel tested two low boom configurations for flight at Mach 2 and Mach 3. Experimental data was taken at several distances from these models up to a body length from the axis of the aircraft. The near field experimental data serves as a test bed for computational fluid dynamic codes in evaluating their accuracy and reliability for predicting the behavior of future HSCT designs. Sonic boom prediction methodology exists which is based on modified linear theory. These methods can be used reliably if near field signatures are available at distances from the aircraft where nonlinear and three dimensional effects have diminished in importance. Up to the present time, the only reliable method to obtain this data was via the wind tunnel with costly model construction and testing. It is the intent of the present paper to apply a modified three dimensional Euler code to predict the near field signatures of the two low boom configurations recently tested by NASA.

  6. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    PubMed

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

    Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed cell processor accuracy of predicted platelet (PLT) yields with the goal of better predicting DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield, and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis, and its optimal cut-off to obtain a DP was assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486, P < .001). Means of software-derived values differed significantly from actual PLT yield, 4.72 × 10¹¹ vs. 6.12 × 10¹¹, respectively (P < .001). The following equation was developed to adjust these values: actual PLT yield = 0.221 + (1.254 × theoretical platelet yield). The ROC curve model showed an optimal apheresis device software prediction cut-off of 4.65 × 10¹¹ to obtain a DP, with a sensitivity of 82.2%, a specificity of 93.3%, and an area under the curve (AUC) of 0.909. Trima Accel v6.0 software consistently underestimated PLT yields. A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
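The reported correction and cut-off can be applied directly; a minimal sketch encoding the study's regression equation and ROC-derived threshold (all quantities in units of 10¹¹ platelets):

```python
def correct_yield(software_yield):
    """Apply the study's linear correction to the Trima software
    prediction: actual PLT yield = 0.221 + 1.254 * software yield."""
    return 0.221 + 1.254 * software_yield

def predicts_double_product(software_yield, cutoff=4.65):
    """ROC-derived software cut-off (4.65e11) above which a double
    product was reliably expected."""
    return software_yield >= cutoff
```

Applying the correction to the mean software prediction of 4.72 recovers roughly the observed mean actual yield of 6.12, which is the consistency check behind the published equation.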

  7. Detection of β-Thalassemia Carriers by Red Cell Parameters Obtained from Automatic Counters using Mathematical Formulas

    PubMed Central

    Roth, Idit Lachover; Lachover, Boaz; Koren, Guy; Levin, Carina; Zalman, Luci; Koren, Ariel

    2018-01-01

    Background β-thalassemia major is a severe disease with high morbidity. The world prevalence of carriers is around 1.5–7%. The present study aimed to find a reliable formula for detecting β-thalassemia carriers using an extensive database of more than 22,000 samples obtained from a homogeneous population of childbearing-age women, of whom 3161 (13.6%) were β-thalassemia carriers, and to check previously published formulas. Methods We applied a mathematical method based on the support vector machine (SVM) algorithm in the search for a reliable formula that can differentiate between thalassemia carriers and non-carriers, including normal counts or counts suspected to belong to iron-deficient women. Results Shine's formula and our SVM formula showed >98% sensitivity and >99.77% negative predictive value (NPV). All other published formulas gave inferior results. Conclusions We found a reliable formula that can be incorporated into any automatic blood counter to alert health providers to the possibility of a woman being a β-thalassemia carrier. A further simple hemoglobin characterization by HPLC analysis should be performed to confirm the diagnosis, and subsequent family studies should be carried out. Our SVM formula is currently limited to women of childbearing age until further analysis in other groups can be performed. PMID:29326805
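Shine's formula, one of the previously published formulas the study checked, is commonly given in the literature as MCV² × MCH / 100, with values below about 1530 flagging possible β-thalassemia trait. A sketch under that commonly published form (the threshold is the generally cited one, not necessarily the study's cut-off, and the study's own SVM formula is not restated here):

```python
def shine_lal_index(mcv_fl, mch_pg):
    """Shine & Lal index: MCV^2 * MCH / 100, with MCV in fL and MCH in pg."""
    return mcv_fl ** 2 * mch_pg / 100.0

def flag_possible_carrier(mcv_fl, mch_pg, threshold=1530.0):
    """Flag a sample for follow-up (HPLC confirmation) when the index
    falls below the commonly cited threshold."""
    return shine_lal_index(mcv_fl, mch_pg) < threshold
```

A flagged count is a screening alert only; as the abstract notes, hemoglobin characterization by HPLC is still needed to confirm the diagnosis.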

  8. Numerical simulation of turbulent gas flames in tubes.

    PubMed

    Salzano, E; Marra, F S; Russo, G; Lee, J H S

    2002-12-02

    Computational fluid dynamics (CFD) is an emerging technique for predicting the possible consequences of gas explosions, and it is often considered a powerful and accurate tool for obtaining detailed results. However, systematic analyses of the reliability of this approach for real-scale industrial configurations are still needed. Furthermore, few experimental data are available for comparison and validation. In this work, a set of well-documented experimental data related to flame acceleration within obstacle-filled tubes filled with flammable gas-air mixtures has been simulated. In these experiments, terminal steady flame speeds corresponding to different propagation regimes were observed, allowing a clear and prompt characterisation of the numerical results with respect to numerical parameters (grid definition), geometrical parameters (blockage ratio), and mixture parameters (mixture reactivity). The CFD code AutoReaGas was used for the simulations. Numerical predictions were compared with available experimental data and some insights into the code accuracy were determined. Computational results are satisfactory for the relatively slower turbulent deflagration regimes and become only fair when the choking regime is observed, whereas transition to quasi-detonation or Chapman-Jouguet (CJ) detonation was never predicted.

  9. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used to identify parts of the code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling the cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
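The kind of source-level modification such a model evaluates can be illustrated with loop-order restructuring of the matrix-matrix multiplication example; Python shows only the access-pattern change (real cache gains appear in compiled, row-major code), and both variants compute the same product:

```python
def matmul_ijk(A, B):
    """Naive i-j-k order: the inner loop strides down a column of B,
    which has poor spatial locality for row-major storage."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i][j] += A[i][k] * B[k][j]
    return C

def matmul_ikj(A, B):
    """i-k-j order: the inner loop walks rows of B and C sequentially,
    the classic cache-friendly rewrite a cache model would flag."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            a_ik = A[i][k]
            row_b, row_c = B[k], C[i]
            for j in range(p):
                row_c[j] += a_ik * row_b[j]
    return C
```

A model fed the arrays' base addresses and the loop bounds can predict which ordering incurs fewer cache misses without executing either version, which is the "what-if" exploration described above.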

  10. A group electronegativity equalization scheme including external potential effects.

    PubMed

    Leyssens, Tom; Geerlings, Paul; Peeters, Daniel

    2006-07-20

    By calculating the electron affinity and ionization energy of different functional groups, CCSD electronegativity values are obtained which implicitly account for the effect of the molecular environment. The latter is approximated using a chemically justified point-charge model. On the basis of Sanderson's electronegativity equalization principle, this approach is shown to lead to reliable "group in molecule" electronegativities. Using a slight adjustment of the modeled environment and first-order principles, an electronegativity equalization scheme is obtained which implicitly accounts for the major part of the external potential effect. This scheme can be applied in a predictive manner to estimate the charge transfer between two functional groups without having to rely on cumbersome calibrations. A very satisfactory correlation is obtained between these charge transfers and those obtained from an ab initio calculation of the entire molecule.
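Electronegativity equalization between two groups has a standard closed form: with the Mulliken electronegativity χ = (I + A)/2 and the Parr-Pearson hardness η = (I − A)/2, the transferred charge is ΔN = (χ_A − χ_B) / (2(η_A + η_B)). A sketch of that textbook formula, without the paper's external-potential corrections (numeric values used below are illustrative):

```python
def mulliken_electronegativity(ionization_energy, electron_affinity):
    """chi = (I + A) / 2 (Mulliken definition; energies in eV)."""
    return (ionization_energy + electron_affinity) / 2.0

def hardness(ionization_energy, electron_affinity):
    """eta = (I - A) / 2 (Parr-Pearson hardness; energies in eV)."""
    return (ionization_energy - electron_affinity) / 2.0

def charge_transfer(chi_a, eta_a, chi_b, eta_b):
    """Electrons transferred toward the more electronegative group A:
    dN = (chi_a - chi_b) / (2 * (eta_a + eta_b)). This is the textbook
    atomic/group result, not the paper's environment-corrected scheme."""
    return (chi_a - chi_b) / (2.0 * (eta_a + eta_b))
```

When both groups already have equal electronegativity the transfer vanishes, which is Sanderson's equalization condition.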

  11. Development and Validation of an HIV Risk Exposure and Indicator Conditions Questionnaire to Support Targeted HIV Screening.

    PubMed

    Elías, María Jesús Pérez; Gómez-Ayerbe, Cristina; Elías, Pilar Pérez; Muriel, Alfonso; de Santiago, Alberto Diaz; Martinez-Colubi, María; Moreno, Ana; Santos, Cristina; Polo, Lidia; Barea, Rafa; Robledillo, Gema; Uranga, Almudena; Espín, Agustina Cano; Quereda, Carmen; Dronda, Fernando; Casado, Jose Luis; Moreno, Santiago

    2016-02-01

    The aim of our study was to develop a Spanish-structured HIV risk of exposure and indicator conditions (RE&IC) questionnaire. People attending an emergency room or a primary clinical care center were invited to participate in a prospective, one-arm, open-label study, in which all enrolled patients filled out our questionnaire and were HIV tested. Questionnaire accuracy, feasibility, and reliability were evaluated. A total of 5329 valid paired HIV RE&IC questionnaires and rapid HIV tests were performed, 69.3% in the primary clinical care center; 49.6% of participants were women, median age was 37 years, 74.9% were Spaniards, and 20.1% were Latin Americans. Confirmed hidden HIV infection was detected in 4.1%, while the HIV RE&IC questionnaire was positive in 51.2%. HIV RE&IC questionnaire sensitivity was 100% to predict HIV infection, with a 100% negative predictive value. When considered separately, RE or IC items' sensitivity decreased to 86.4% or 91%, respectively, and similarly their negative predictive values to 99.9% for both. The majority of people studied, 90.8%, self-completed the HIV RE&IC questionnaire. Median time to complete was 3 minutes. Overall HIV RE&IC questionnaire test-retest Kappa agreement was 0.82 (almost perfect); likewise for IC items 0.89, while for RE items it was lower, 0.78 (substantial). A feasible and reliable Spanish HIV RE&IC self-questionnaire accurately discriminated all non-HIV-infected people without missing any HIV diagnoses, in a low-prevalence HIV infection area. The best accuracy and reliability were obtained when combining HIV RE&IC items.

  12. Development and Validation of an HIV Risk Exposure and Indicator Conditions Questionnaire to Support Targeted HIV Screening

    PubMed Central

    Elías, María Jesús Pérez; Gómez-Ayerbe, Cristina; Elías, Pilar Pérez; Muriel, Alfonso; de Santiago, Alberto Diaz; Martinez-Colubi, María; Moreno, Ana; Santos, Cristina; Polo, Lidia; Barea, Rafa; Robledillo, Gema; Uranga, Almudena; Espín, Agustina Cano; Quereda, Carmen; Dronda, Fernando; Casado, Jose Luis; Moreno, Santiago

    2016-01-01

    The aim of our study was to develop a Spanish-structured HIV risk of exposure and indicator conditions (RE&IC) questionnaire. People attending an emergency room or a primary clinical care center were invited to participate in a prospective, 1-arm, open-label study, in which all enrolled patients filled out our questionnaire and were HIV tested. Questionnaire accuracy, feasibility, and reliability were evaluated. A total of 5329 valid paired HIV RE&IC questionnaires and rapid HIV tests were performed, 69.3% in the primary clinical care center; 49.6% of participants were women, the median age was 37 years, 74.9% were Spaniards, and 20.1% were Latin-Americans. Confirmed hidden HIV infection was detected in 4.1%, while the HIV RE&IC questionnaire was positive in 51.2%. The questionnaire's sensitivity for predicting HIV infection was 100%, with a 100% negative predictive value. When considered separately, the sensitivity of the RE or IC items decreased to 86.4% or 91%, respectively, and their negative predictive values to 99.9% for both. The majority of people studied (90.8%) self-completed the HIV RE&IC questionnaire; the median time to complete it was 3 minutes. Overall test-retest Kappa agreement for the HIV RE&IC questionnaire was 0.82 (almost perfect); likewise for the IC items it was 0.89, while for the RE items it was lower, 0.78 (substantial). A feasible and reliable Spanish HIV RE&IC self-questionnaire accurately discriminated all non-HIV-infected people without missing any HIV diagnoses, in a low-prevalence HIV infection area. The best accuracy and reliability were obtained when combining HIV RE&IC items. PMID:26844471

  13. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews in college admissions decisions is controversial, given the lack of validity and reliability evidence for most college admission processes. This study investigated the reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  14. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2013-07-01

    The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families, and their modelling experience differed widely. The prediction exercise was organized in three steps: (1) for the 1st prediction, modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than is usually available for a priori predictions in ungauged catchments); they did not obtain time series of stream flow, soil moisture, or groundwater response. (2) Before the 2nd, improved prediction they inspected the catchment on-site and attended a workshop where they presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data, being charged pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modellers' decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3, when the modellers made use of additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd steps, the progress in prediction quality could be evaluated in relation to individual modelling experience and the costs of added information. 
We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.

  15. Poor outcome prediction by burst suppression ratio in adults with post-anoxic coma without hypothermia.

    PubMed

    Yang, Qinglin; Su, Yingying; Hussain, Mohammed; Chen, Weibi; Ye, Hong; Gao, Daiquan; Tian, Fei

    2014-05-01

    Burst suppression ratio (BSR) is a quantitative electroencephalography (qEEG) parameter. The purpose of our study was to compare the accuracy of BSR with that of other EEG parameters in predicting poor outcomes in adults who sustained post-anoxic coma and were not treated with therapeutic hypothermia. EEG was registered and recorded at least once within 7 days of post-anoxic coma onset. Electrodes were placed according to the international 10-20 system, using a 16-channel layout. Each EEG expert scored raw EEG using a grading scale adapted from Young and scored amplitude-integrated electroencephalography tracings, in addition to obtaining qEEG parameters defined as BSR with a defined threshold. Glasgow outcome scales of 1 and 2 at 3 months, determined by two blinded neurologists, were defined as poor outcome. Sixty patients with a Glasgow coma scale score of 8 or less after an anoxic event were included. The sensitivity (97.1%), specificity (73.3%), positive predictive value (82.5%), and negative predictive value (95.0%) of BSR in predicting poor outcome were higher than those of other EEG variables. BSR1 and BSR2 were reliable in predicting death (area under the curve > 0.8, P < 0.05), with respective cutoff points of 39.8% and 61.6%. BSR1 was reliable in predicting poor outcome (area under the curve = 0.820, P < 0.05) with a cutoff point of 23.9%. BSR1 was also an independent predictor of increased risk of death (odds ratio = 1.042, 95% confidence intervals: 1.012-1.073, P = 0.006). BSR may be a better predictor than other qEEG parameters for prognosticating poor outcomes in patients with post-anoxic coma who do not undergo therapeutic hypothermia.
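
A BSR of the kind described can be sketched as the fraction of the record spent in sustained low-amplitude (suppressed) EEG; the amplitude threshold and minimum suppression duration below are illustrative assumptions, not the paper's exact algorithm:

```python
# Sketch of a burst suppression ratio computation (threshold and minimum
# suppression duration are illustrative assumptions, not the paper's exact
# algorithm): the fraction of the record spent in sustained low amplitude.
def burst_suppression_ratio(signal_uv, fs_hz, threshold_uv=5.0, min_sup_s=0.5):
    """Percentage of samples lying in suppression runs of >= min_sup_s."""
    flags = [abs(x) < threshold_uv for x in signal_uv]
    min_len = int(min_sup_s * fs_hz)
    total, run = 0, 0
    for flag in flags + [False]:          # sentinel flushes the final run
        if flag:
            run += 1
        else:
            if run >= min_len:            # only sustained flatness counts
                total += run
            run = 0
    return 100.0 * total / len(signal_uv)

# 1 s of bursting, 1 s of suppression, 1 s of bursting at 10 Hz (toy data)
bsr = burst_suppression_ratio([10.0] * 10 + [1.0] * 10 + [10.0] * 10, fs_hz=10)
```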

  16. Influences on the Test-Retest Reliability of Functional Connectivity MRI and its Relationship with Behavioral Utility.

    PubMed

    Noble, Stephanie; Spann, Marisa N; Tokoglu, Fuyuze; Shen, Xilin; Constable, R Todd; Scheinost, Dustin

    2017-11-01

    Best practices are currently being developed for the acquisition and processing of resting-state magnetic resonance imaging data used to estimate brain functional organization, or "functional connectivity." Standards have been proposed based on test-retest reliability, but open questions remain. These include how the amount of data per subject influences whole-brain reliability, the influence of increasing runs versus sessions, the spatial distribution of reliability, the reliability of multivariate methods, and, crucially, how reliability maps onto prediction of behavior. We collected a dataset of 12 extensively sampled individuals (144 min of data each across 2 identically configured scanners) to assess test-retest reliability of whole-brain connectivity within the generalizability theory framework. We used Human Connectome Project data to replicate these analyses and relate reliability to behavioral prediction. Overall, the historical 5-min scan produced poor reliability averaged across connections. Increasing the number of sessions was more beneficial than increasing runs. Reliability was lowest for subcortical connections and highest for within-network cortical connections. Multivariate reliability was greater than univariate. Finally, reliability could not be used to improve prediction; these findings are among the first to underscore this distinction for functional connectivity. A comprehensive understanding of test-retest reliability, including its limitations, supports the development of best practices in the field. © The Author 2017. Published by Oxford University Press.
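
The generalizability theory framework mentioned above partitions variance across subjects and measurement occasions; in the simplest two-session case the resulting reliability coefficient reduces to a one-way intraclass correlation. A minimal sketch (the edge-strength values are hypothetical, not the study's data):

```python
# One-way random-effects ICC, a common scalar summary of test-retest
# reliability. Scores below are hypothetical connectivity strengths.
def icc_oneway(measurements):
    """measurements: list of per-subject lists, k repeated measurements each."""
    n = len(measurements)
    k = len(measurements[0])
    grand = sum(sum(row) for row in measurements) / (n * k)
    ms_between = k * sum(
        (sum(row) / k - grand) ** 2 for row in measurements
    ) / (n - 1)
    ms_within = sum(
        (x - sum(row) / k) ** 2 for row in measurements for x in row
    ) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical edge strengths for 4 subjects, 2 sessions each
scores = [[0.52, 0.50], [0.31, 0.33], [0.44, 0.40], [0.25, 0.27]]
icc = icc_oneway(scores)  # high: between-subject variance dominates
```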

  17. Mapping SOC content and bulk density of a disturbed peatland relict with electromagnetic induction and DEM data

    NASA Astrophysics Data System (ADS)

    Altdorff, Daniel; Bechtold, Michel; van der Kruk, Jan; Tiemeyer, Bärbel; von Hebel, Christian; Huisman, Johan Alexander

    2014-05-01

    Peatlands represent a huge store of soil organic carbon (SOC), and there is considerable interest in assessing the total amount of carbon stored in these ecosystems. However, reliable field-scale information about peat properties, particularly the SOC content and bulk density (BD) needed to estimate C stocks, remains difficult to obtain. A potential way to acquire information on these properties and their spatial variation is the non-invasive mapping of easily recordable physical variables that correlate with peat properties, such as bulk electrical conductivity (ECa) measured with electromagnetic induction (EMI). However, ECa depends on a range of soil properties, including BD, soil and water chemistry, and water content, and the results therefore often show complex and site-specific relationships. A reliable prediction of SOC and BD from ECa data is thus not guaranteed. In this study, we aim to explore the usefulness of Multiple Linear Regression (MLR) models to predict the peat soil properties SOC and BD from multi-offset EMI and high-resolution DEM data. The quality of the MLR models is assessed by cross-validation. We use data from a medium-scale disturbed peat relict (approximately 35 ha) in Northern Germany. The potential explanatory variables considered in the MLR were: EMI data of six different integral depths (approximately 0.25, 0.5, 0.6, 0.9, 1, and 1.80 m), their vertical heterogeneity, and several topographical variables extracted from the DEM. Ground truth information on SOC, BD, and peat layer thickness was obtained from 34 soil cores of 1 m depth. Each core was divided into several 5- to 20-cm-thick layers so that integral information for the upper 0.25, 0.5, and 1 m as well as for the total peat layer was obtained. For cross-validation of the results, we clustered the 34 soil cores into 4 classes using K-means clustering and selected 8 cores for validation from the clusters with a probability that depended on cluster size. 
With the remaining 26 samples, we performed a stepwise MLR and generated separate models for each depth and soil property. Preliminary results indicate reliable model predictions for SOC and BD (R² = 0.83-0.95). The RMSE values of the validation ranged between 3.5 and 7.2 vol.% for SOC and between 0.13 and 0.37 g/cm³ for BD for the independent samples. This is roughly comparable to the quality of SOC predictions obtained by field application of vis-NIR (visible-near infrared) spectroscopy reported in the literature for a similar peatland setting. However, the EMI approach offers the potential to derive information from greater depths and allows non-invasive mapping of BD variability, which is not possible with vis-NIR. Therefore, this new approach potentially provides a more useful tool for total carbon stock assessment in peatlands.
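
The fit-then-validate workflow described above (an MLR calibrated on 26 cores and scored on 8 held-out cores) can be sketched as follows; the predictors and targets are hypothetical, and the noise-free linear relationship is chosen only to make the arithmetic transparent:

```python
# Sketch of MLR calibration plus held-out validation. Data are hypothetical
# stand-ins for (ECa integral depth, DEM attribute) -> SOC.
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_mlr(X, y):
    """Least-squares coefficients [intercept, b1, b2, ...] via normal equations."""
    Xd = [[1.0] + row for row in X]
    p = len(Xd[0])
    XtX = [[sum(r[i] * r[j] for r in Xd) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xd, y)) for i in range(p)]
    return solve(XtX, Xty)

def predict(coef, row):
    return coef[0] + sum(c * v for c, v in zip(coef[1:], row))

def rmse(coef, X, y):
    return (sum((predict(coef, r) - yi) ** 2 for r, yi in zip(X, y)) / len(y)) ** 0.5

# Hypothetical calibration data following y = 2 + 0.5*x1 - 1.0*x2 exactly
X_train = [[1, 0], [0, 1], [2, 1], [3, 2], [1, 3], [4, 1]]
y_train = [2 + 0.5 * x1 - 1.0 * x2 for x1, x2 in X_train]
coef = fit_mlr(X_train, y_train)

# Held-out validation samples, scored with RMSE as in the abstract
X_val = [[2, 2], [5, 0]]
y_val = [2 + 0.5 * x1 - 1.0 * x2 for x1, x2 in X_val]
val_rmse = rmse(coef, X_val, y_val)  # ~0 for this noise-free example
```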

  18. Analysis of energy-based algorithms for RNA secondary structure prediction

    PubMed Central

    2012-01-01

    Background RNA molecules play critical roles in the cells of organisms, including roles in gene regulation, catalysis, and synthesis of proteins. Since RNA function depends in large part on its folded structures, much effort has been invested in developing accurate methods for prediction of RNA secondary structure from the base sequence. Minimum free energy (MFE) predictions are widely used, based on nearest neighbor thermodynamic parameters of Mathews, Turner et al. or those of Andronescu et al. Some recently proposed alternatives that leverage partition function calculations find the structure with maximum expected accuracy (MEA) or pseudo-expected accuracy (pseudo-MEA) methods. Advances in prediction methods are typically benchmarked using sensitivity, positive predictive value and their harmonic mean, namely F-measure, on datasets of known reference structures. Since such benchmarks document progress in improving accuracy of computational prediction methods, it is important to understand how measures of accuracy vary as a function of the reference datasets and whether advances in algorithms or thermodynamic parameters yield statistically significant improvements. Our work advances such understanding for the MFE and (pseudo-)MEA-based methods, with respect to the latest datasets and energy parameters. Results We present three main findings. First, using the bootstrap percentile method, we show that the average F-measure accuracy of the MFE and (pseudo-)MEA-based algorithms, as measured on our largest datasets with over 2000 RNAs from diverse families, is a reliable estimate (within a 2% range with high confidence) of the accuracy of a population of RNA molecules represented by this set. However, average accuracy on smaller classes of RNAs such as a class of 89 Group I introns used previously in benchmarking algorithm accuracy is not reliable enough to draw meaningful conclusions about the relative merits of the MFE and MEA-based algorithms. 
Second, on our large datasets, the algorithm with best overall accuracy is a pseudo-MEA-based algorithm of Hamada et al. that uses a generalized centroid estimator of base pairs. However, between MFE and other MEA-based methods, there is no clear winner in the sense that the relative accuracy of the MFE versus MEA-based algorithms changes depending on the underlying energy parameters. Third, of the four parameter sets we considered, the best accuracy for the MFE-, MEA-based, and pseudo-MEA-based methods is 0.686, 0.680, and 0.711, respectively (on a scale from 0 to 1 with 1 meaning perfect structure predictions) and is obtained with a thermodynamic parameter set obtained by Andronescu et al. called BL* (named after the Boltzmann likelihood method by which the parameters were derived). Conclusions Large datasets should be used to obtain reliable measures of the accuracy of RNA structure prediction algorithms, and average accuracies on specific classes (such as Group I introns and transfer RNAs) should be interpreted with caution, considering the relatively small size of currently available datasets for such classes. The accuracy of the MEA-based methods is significantly higher when using the BL* parameter set of Andronescu et al. than when using the parameters of Mathews and Turner, and there is no significant difference between the accuracy of MEA-based methods and MFE when using the BL* parameters. The pseudo-MEA-based method of Hamada et al. with the BL* parameter set significantly outperforms all other MFE and MEA-based algorithms on our large datasets. PMID:22296803
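
The bootstrap percentile method used for the accuracy confidence statements above can be sketched as follows (the per-molecule F-measures are hypothetical, not the paper's data):

```python
# Bootstrap percentile confidence interval for a mean F-measure: resample
# the dataset with replacement and take percentiles of the resampled means.
import random

def bootstrap_percentile_ci(values, n_boot=2000, alpha=0.05, seed=0):
    rng = random.Random(seed)           # fixed seed for reproducibility
    n = len(values)
    means = sorted(
        sum(rng.choice(values) for _ in range(n)) / n for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical per-RNA F-measures (sample mean 0.685)
f_measures = [0.62, 0.71, 0.68, 0.74, 0.66, 0.70, 0.69, 0.65, 0.73, 0.67]
lo, hi = bootstrap_percentile_ci(f_measures)
```

With realistically large datasets (2000+ RNAs, as in the paper), the resulting interval tightens roughly with the square root of the sample size, which is why small classes such as the 89 Group I introns give unreliable averages.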

  19. Analysis of energy-based algorithms for RNA secondary structure prediction.

    PubMed

    Hajiaghayi, Monir; Condon, Anne; Hoos, Holger H

    2012-02-01

    RNA molecules play critical roles in the cells of organisms, including roles in gene regulation, catalysis, and synthesis of proteins. Since RNA function depends in large part on its folded structures, much effort has been invested in developing accurate methods for prediction of RNA secondary structure from the base sequence. Minimum free energy (MFE) predictions are widely used, based on nearest neighbor thermodynamic parameters of Mathews, Turner et al. or those of Andronescu et al. Some recently proposed alternatives that leverage partition function calculations find the structure with maximum expected accuracy (MEA) or pseudo-expected accuracy (pseudo-MEA) methods. Advances in prediction methods are typically benchmarked using sensitivity, positive predictive value and their harmonic mean, namely F-measure, on datasets of known reference structures. Since such benchmarks document progress in improving accuracy of computational prediction methods, it is important to understand how measures of accuracy vary as a function of the reference datasets and whether advances in algorithms or thermodynamic parameters yield statistically significant improvements. Our work advances such understanding for the MFE and (pseudo-)MEA-based methods, with respect to the latest datasets and energy parameters. We present three main findings. First, using the bootstrap percentile method, we show that the average F-measure accuracy of the MFE and (pseudo-)MEA-based algorithms, as measured on our largest datasets with over 2000 RNAs from diverse families, is a reliable estimate (within a 2% range with high confidence) of the accuracy of a population of RNA molecules represented by this set. However, average accuracy on smaller classes of RNAs such as a class of 89 Group I introns used previously in benchmarking algorithm accuracy is not reliable enough to draw meaningful conclusions about the relative merits of the MFE and MEA-based algorithms. 
Second, on our large datasets, the algorithm with best overall accuracy is a pseudo-MEA-based algorithm of Hamada et al. that uses a generalized centroid estimator of base pairs. However, between MFE and other MEA-based methods, there is no clear winner in the sense that the relative accuracy of the MFE versus MEA-based algorithms changes depending on the underlying energy parameters. Third, of the four parameter sets we considered, the best accuracy for the MFE-, MEA-based, and pseudo-MEA-based methods is 0.686, 0.680, and 0.711, respectively (on a scale from 0 to 1 with 1 meaning perfect structure predictions) and is obtained with a thermodynamic parameter set obtained by Andronescu et al. called BL* (named after the Boltzmann likelihood method by which the parameters were derived). Large datasets should be used to obtain reliable measures of the accuracy of RNA structure prediction algorithms, and average accuracies on specific classes (such as Group I introns and transfer RNAs) should be interpreted with caution, considering the relatively small size of currently available datasets for such classes. The accuracy of the MEA-based methods is significantly higher when using the BL* parameter set of Andronescu et al. than when using the parameters of Mathews and Turner, and there is no significant difference between the accuracy of MEA-based methods and MFE when using the BL* parameters. The pseudo-MEA-based method of Hamada et al. with the BL* parameter set significantly outperforms all other MFE and MEA-based algorithms on our large datasets.

  20. Examining parents' ratings of middle-school students' academic self-regulation using principal axis factoring analysis.

    PubMed

    Chen, Peggy P; Cleary, Timothy J; Lui, Angela M

    2015-09-01

    This study examined the reliability and validity of a parent rating scale, the Self-Regulation Strategy Inventory: Parent Rating Scale (SRSI-PRS), using a sample of 451 parents of sixth- and seventh-grade middle-school students. Principal axis factoring (PAF) analysis revealed a 3-factor structure for the 23-item SRSI-PRS: (a) Managing Behavior and Learning (α = .92), (b) Maladaptive Regulatory Behaviors (α = .76), and (c) Managing Environment (α = .84). The majority of the observed relations between these 3 subscales and the SRSI-SR, student motivation beliefs, and student mathematics grades were statistically significant and in the small to medium range. After controlling for various student variables and motivation indices of parental involvement, 2 SRSI-PRS factors (Managing Behavior and Learning, Maladaptive Regulatory Behaviors) reliably predicted students' achievement in their mathematics course. This study provides initial support for the validity and reliability of the SRSI-PRS and underscores the advantages of obtaining parental ratings of students' SRL behaviors. (c) 2015 APA, all rights reserved.

  1. Ternary isocratic mobile phase optimization utilizing resolution Design Space based on retention time and peak width modeling.

    PubMed

    Kawabe, Takefumi; Tomitsuka, Toshiaki; Kajiro, Toshi; Kishi, Naoyuki; Toyo'oka, Toshimasa

    2013-01-18

    An optimization procedure for ternary isocratic mobile phase composition in HPLC methods, using a statistical prediction model and a visualization technique, is described. Two prediction models were first evaluated to obtain reliable prediction results. The retention time prediction model was built by modifying established retention models for ternary solvent strength changes. An excellent correlation between observed and predicted retention times was obtained for various kinds of pharmaceutical compounds by multiple regression modeling of the solvent strength parameters. The prediction model for peak width at half height employed polynomial fitting of the retention time, because a linear relationship between the peak width at half height and the retention time was not obtained even after taking into account the contribution of the extra-column effect based on a moment method. Accurate predictions could be obtained with this model, with correlation coefficients between observed and predicted peak widths at half height mostly over 0.99. A procedure to visualize a resolution Design Space was then developed. An artificial neural network method was used to link the ternary solvent strength parameters directly to the predicted resolution, determined from the accurate predictions of retention time and peak width at half height, and to visualize appropriate ternary mobile phase compositions as the region with resolution over 1.5 on a contour profile. Using mixtures of similar pharmaceutical compounds in case studies, we verified that the optimal range of conditions could be predicted. Observed chromatographic results under the optimal conditions mostly matched the predictions, and the average difference between observed and predicted resolution was approximately 0.3. This means that sufficient predictive accuracy could be achieved by the proposed procedure. Consequently, a procedure for finding the optimal range of ternary solvent strength that achieves an appropriate separation is provided, using a resolution Design Space based on accurate prediction. Copyright © 2012 Elsevier B.V. All rights reserved.
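
The resolution-over-1.5 acceptance criterion can be evaluated from predicted retention times and half-height peak widths with the standard half-height resolution formula; the retention times and widths below are hypothetical, not values from the paper:

```python
# Standard chromatographic resolution from widths at half height:
# Rs = 1.18 * (t2 - t1) / (w1 + w2). Numbers below are hypothetical.
def resolution_half_height(t1, w1, t2, w2):
    """Rs for two adjacent peaks, widths measured at half height."""
    return 1.18 * (t2 - t1) / (w1 + w2)

rs = resolution_half_height(t1=4.20, w1=0.10, t2=4.55, w2=0.12)
adequate = rs >= 1.5  # meets the Design Space acceptance threshold
```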

  2. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    NASA Astrophysics Data System (ADS)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct (acquisition) cost, while bearing reliability determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper describes a bearing evaluation method based on total cost of ownership analysis, which takes both price and maintenance cost as decision criteria. Furthermore, since failure data are scarce at the bearing evaluation phase, a reliability prediction method is used to predict bearing reliability from the dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper but less reliable one is preferable. This context dependence can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed upon by all stakeholders before a procurement decision is made.
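
A life prediction from the dynamic load rating of the kind described can be sketched with the standard ISO 281 basic rating life, L10 = (C/P)^p million revolutions, with p = 3 for ball bearings; the loads, speeds, and ratings below are hypothetical:

```python
# ISO 281 basic rating life from dynamic load rating C and equivalent
# dynamic load P. All numbers below are hypothetical.
def l10_life_hours(c_newton, p_newton, rpm, exponent=3.0):
    """L10 life in operating hours (10% failure probability)."""
    revolutions = (c_newton / p_newton) ** exponent * 1e6
    return revolutions / (rpm * 60.0)

# Two candidate bearings carrying the same 5 kN equivalent load at 1500 rpm
cheap_l10 = l10_life_hours(c_newton=25_000, p_newton=5_000, rpm=1500)
premium_l10 = l10_life_hours(c_newton=35_000, p_newton=5_000, rpm=1500)
# The higher load rating buys roughly 2.7x the predicted life; whether that
# justifies its price depends on the planning horizon, as the abstract argues.
```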

  3. Statistical evaluation for stability studies under stress storage conditions.

    PubMed

    Gil-Alegre, M E; Bernabeu, J A; Camacho, M A; Torres-Suarez, A I

    2001-11-01

    During the pharmaceutical development of a new drug, it is necessary to select as early as possible the formulation with the best stability characteristics. The current International Conference on Harmonisation (ICH) regulations on stability testing requirements for a Registration Application specify stress testing conditions aimed at assessing the effect of severe conditions on the drug product. In practice, the well-known Arrhenius theory is still used to make rapid stability predictions, estimating a drug product's shelf life during the early stages of its pharmaceutical development. In this work, both the planning of a stress stability study to obtain a correct stability prediction from temperature extrapolation and the data treatment suitable for judging the reliability of the stability results are discussed. The study focused on the early formulation step of a very stable drug, Mitonafide (an antineoplastic agent), formulated in a parenteral solution and in tablets. For the solid system, it was observed that the results extrapolated using Arrhenius theory may be statistically good yet far from the real situation if the stability study is not designed correctly. The statistical data treatment and the stress-stability test proposed in this work are suitable for making a reliable stability prediction of different formulations of the same drug during its pharmaceutical development.
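
The Arrhenius extrapolation discussed above can be sketched as follows; all rate constants are hypothetical, and first-order degradation is assumed for the shelf-life conversion:

```python
# Arrhenius extrapolation sketch: fit ln k against 1/T at stress
# temperatures, extrapolate to 25 degrees C, and convert to a t90 shelf
# life assuming first-order degradation. Rate constants are hypothetical.
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_fit(temps_k, rate_consts):
    """Least-squares fit of ln k = ln A - Ea/(R*T); returns (A, Ea)."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(k) for k in rate_consts]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
        (x - xbar) ** 2 for x in xs
    )
    return math.exp(ybar - slope * xbar), -slope * R

def t90_days(a_factor, ea, temp_k):
    """Time to 90% of label claim for first-order loss: t90 = ln(10/9)/k."""
    k = a_factor * math.exp(-ea / (R * temp_k))  # per day
    return math.log(10.0 / 9.0) / k

# Hypothetical first-order rate constants (1/day) at 40, 50, and 60 degrees C
a_factor, ea = arrhenius_fit([313.15, 323.15, 333.15], [2.0e-4, 5.1e-4, 1.2e-3])
shelf_life = t90_days(a_factor, ea, 298.15)  # extrapolated to 25 degrees C
```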

  4. Fast and reliable prediction of domain-peptide binding affinity using coarse-grained structure models.

    PubMed

    Tian, Feifei; Tan, Rui; Guo, Tailin; Zhou, Peng; Yang, Li

    2013-07-01

    Domain-peptide recognition and interaction are fundamentally important for eukaryotic signaling and regulatory networks. It is thus essential to quantitatively infer the binding stability and specificity of such interactions from large-scale but low-accuracy complex structure models, which can be readily obtained from sophisticated molecular modeling procedures. In the present study, a new method is described for the fast and reliable prediction of domain-peptide binding affinity from coarse-grained structure models. This method is designed to tolerate the strong random noise present in domain-peptide complex structures and uses a statistical modeling approach to eliminate systematic bias associated with a group of investigated samples. As a paradigm, the method was employed to model and predict the binding behavior of various peptides to four evolutionarily unrelated peptide-recognition domains (PRDs), i.e. human amph SH3, human nherf PDZ, yeast syh GYF and yeast bmh 14-3-3; moreover, we explored the molecular mechanism and biological implications underlying the binding of cognate and noncognate peptide ligands to their domain receptors. It is expected that the newly proposed method could be further used to perform genome-wide inference of domain-peptide binding at the three-dimensional structure level. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Rapid biochemical methane potential prediction of urban organic waste with near-infrared reflectance spectroscopy.

    PubMed

    Fitamo, T; Triolo, J M; Boldrin, A; Scheutz, C

    2017-08-01

    The anaerobic digestibility of various biomass feedstocks in biogas plants is determined with biochemical methane potential (BMP) assays. However, experimental BMP analysis is time-consuming and costly, which makes it challenging to optimise stock management and feeding to achieve improved biogas production. The aim of the present study was to develop a fast and reliable model based on near-infrared reflectance spectroscopy (NIRS) for BMP prediction of urban organic waste (UOW). The model was built on 87 UOW samples. Additionally, 88 plant biomass samples were included to develop a combined model for predicting BMP. The coefficient of determination (R²) and root mean square error in prediction (RMSEP) of the UOW model were 0.88 and 44 mL CH4/g VS, while those of the combined model were 0.89 and 50 mL CH4/g VS. The two individual models performed better than the combined version. BMP prediction with NIRS was satisfactory and moderately successful. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Uncertainty Quantification in Remaining Useful Life of Aerospace Components using State Space Models and Inverse FORM

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis and directly supports online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and it is therefore necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include: (1) future loading uncertainty; (2) process noise; and (3) uncertainty in the state estimate. The inverse-FORM method is used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction and (2) calculate the entire probability distribution of the remaining useful life prediction, and the results are verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.

  7. [Validation of the abbreviated Zarit scales for measuring burden syndrome in the primary caregiver of an elderly patient].

    PubMed

    Vélez Lopera, Johana María; Berbesí Fernández, Dedsy; Cardona Arango, Doris; Segura Cardona, Angela; Ordóñez Molina, Jaime

    2012-07-01

    To determine which abbreviated Zarit Scale (ZS) best evaluates the burden of the caregiver of an elderly patient in Medellin, Colombia. Validation study. Primary Care setting in the city of Medellin. Primary caregivers of dependent elderly patients over 65 years old. Sensitivity, specificity, positive predictive value, and negative predictive value for the different abbreviated Zarit scales, plus a reliability analysis using Cronbach's alpha coefficient. The abbreviated scales obtained sensitivities between 36.84% and 81.58%, specificities between 95.99% and 100%, positive predictive values between 71.05% and 100%, and negative predictive values between 91.64% and 97.42%. The scale that best determined caregiver burden in Primary Care was the Bedard Screening scale, with a sensitivity of 81.58%, a specificity of 96.35%, and positive and negative predictive values of 75.61% and 97.42%, respectively. Copyright © 2010 Elsevier España, S.L. All rights reserved.
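
The four screening metrics reported above follow directly from a 2x2 confusion table; the counts in this sketch are hypothetical but chosen so that they reproduce the Bedard screening figures quoted in the abstract:

```python
# Screening metrics from a 2x2 confusion table. Counts are hypothetical,
# chosen to reproduce the Bedard screening figures reported above.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # burdened caregivers detected
        "specificity": tn / (tn + fp),   # non-burdened correctly ruled out
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

metrics = screening_metrics(tp=31, fp=10, fn=7, tn=264)
```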

  8. Predicting age from cortical structure across the lifespan.

    PubMed

    Madan, Christopher R; Kensinger, Elizabeth A

    2018-03-01

    Despite interindividual differences in cortical structure, cross-sectional and longitudinal studies have demonstrated a large degree of population-level consistency in age-related differences in brain morphology. This study assessed how accurately an individual's age could be predicted by estimates of cortical morphology, comparing a variety of structural measures, including thickness, gyrification and fractal dimensionality. Structural measures were calculated across up to seven different parcellation approaches, ranging from one region to 1000 regions. The age prediction framework was trained using morphological measures obtained from T1-weighted MRI volumes collected from multiple sites, yielding a training dataset of 1056 healthy adults, aged 18-97. Age predictions were calculated using a machine-learning approach that incorporated nonlinear differences over the lifespan. In two independent, held-out test samples, age predictions had a median error of 6-7 years. Age predictions were best when using a combination of cortical metrics, both thickness and fractal dimensionality. Overall, the results reveal that age-related differences in brain structure are systematic enough to enable reliable age prediction based on metrics of cortical morphology. © 2018 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  9. Science and Technology Review June 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Pruneda, J.H.

    2000-06-01

    This issue contains the following articles: (1) ''Accelerating on the ASCI Challenge''. (2) ''New Day Dawns in Supercomputing'': When the ASCI White supercomputer comes online this summer, DOE's Stockpile Stewardship Program will make another significant advance toward helping to ensure the safety, reliability, and performance of the nation's nuclear weapons. (3) ''Uncovering the Secrets of Actinides'': Researchers are obtaining fundamental information about the actinides, a group of elements with a key role in nuclear weapons and fuels. (4) ''A Predictable Structure for Aerogels''. (5) ''Tibet--Where Continents Collide''.

  10. Catastrophe optics of sharp-edge diffraction.

    PubMed

    Borghi, Riccardo

    2016-07-01

    A classical problem of diffraction theory, namely, plane-wave diffraction by sharp-edge apertures, is reformulated here from the viewpoint of the fairly new subject of catastrophe optics. Using purely geometrical arguments, properly embedded in a wave-optics context, uniform analytical estimates of the diffracted wavefield at points close to fold caustics are obtained, within the paraxial approximation, in terms of the Airy function and its first derivative. Diffraction from parabolic apertures is proposed to test the reliability and accuracy of our theoretical predictions.
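    For context, the fold-caustic estimate mentioned above is built on the Airy function; a minimal sketch of the canonical Airy intensity profile across a fold (using SciPy; the grid and units are illustrative, not taken from the paper):

```python
# Illustrative sketch: the canonical Airy profile behind uniform
# fold-caustic estimates. Ai(u) oscillates on the lit side (u < 0)
# and decays on the dark side (u > 0); grid and units are arbitrary.
import numpy as np
from scipy.special import airy

u = np.linspace(-10.0, 5.0, 1501)
ai, aip, bi, bip = airy(u)      # Ai, Ai', Bi, Bi'
intensity = ai ** 2             # fold-caustic diffraction profile

# The brightest fringe sits just on the lit side of the geometric
# caustic at u = 0, not on the caustic itself.
u_peak = u[np.argmax(intensity)]
print(f"brightest fringe at u = {u_peak:.2f}")
```

    This shift of the brightest fringe off the geometric caustic is the characteristic signature of fold diffraction.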

  11. A non-invasive experimental approach for surface temperature measurements on semi-crystalline thermoplastics

    NASA Astrophysics Data System (ADS)

    Boztepe, Sinan; Gilblas, Remi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2017-10-01

    Most thermoforming processes for thermoplastic polymers and their composites combine heating and forming stages, in which a precursor is heated prior to forming in order to improve formability by softening the thermoplastic polymer. Due to the low thermal conductivity and semi-transparency of polymers, infrared (IR) heating is widely used for thermoforming of such materials. Predictive radiation heat transfer models for temperature distributions are therefore critical for the optimization of thermoforming processes. One of the key challenges is to build a predictive model that includes the physical background of the radiation heat transfer phenomenon in semi-crystalline thermoplastics, as their microcrystalline structure introduces an optically heterogeneous medium. In addition, the accuracy of a predictive model must be validated experimentally, and IR thermography is one of the suitable methods for such validation because it provides a non-invasive, full-field surface temperature measurement. Although IR cameras provide a non-invasive measurement, obtaining a reliable measurement depends on the optical characteristics of the heated material and the operating spectral band of the IR camera: ideally, the material has a spectral band in which its surface is opaque, and the camera operates in that band. In this study, the optical characteristics of the PO-based polymer are discussed and an experimental approach is proposed to measure the surface temperature of the PO-based polymer via IR thermography. Preliminary analyses showed that IR thermographic measurements cannot simply be performed on PO-based polymers and require a correction method, as their semi-transparent medium makes it challenging to obtain reliable surface temperature measurements.

  12. GalaxyTBM: template-based modeling by building a reliable core and refining unreliable local regions.

    PubMed

    Ko, Junsu; Park, Hahnbeom; Seok, Chaok

    2012-08-10

    Protein structures can be reliably predicted by template-based modeling (TBM) when experimental structures of homologous proteins are available. However, it is challenging to obtain structures more accurate than the single best templates by either combining information from multiple templates or by modeling regions that vary among templates or are not covered by any templates. We introduce GalaxyTBM, a new TBM method in which the more reliable core region is modeled first from multiple templates and less reliable, variable local regions, such as loops or termini, are then detected and re-modeled by an ab initio method. This TBM method is based on "Seok-server," which was tested in CASP9 and assessed to be amongst the top TBM servers. The accuracy of the initial core modeling is enhanced by focusing on more conserved regions in the multiple-template selection and multiple sequence alignment stages. Additional improvement is achieved by ab initio modeling of up to 3 unreliable local regions in the fixed framework of the core structure. Overall, GalaxyTBM reproduced the performance of Seok-server, with GalaxyTBM and Seok-server resulting in average GDT-TS of 68.1 and 68.4, respectively, when tested on 68 single-domain CASP9 TBM targets. For application to multi-domain proteins, GalaxyTBM must be combined with domain-splitting methods. Application of GalaxyTBM to CASP9 targets demonstrates that accurate protein structure prediction is possible by use of a multiple-template-based approach, and ab initio modeling of variable regions can further enhance the model quality.

  13. Prediction of thoracic dimensions and spine length on the basis of individual pelvic dimensions: validation of the use of pelvic inlet width obtained by radiographs compared with computed tomography.

    PubMed

    Gold, Meryl; Dombek, Michael; Miller, Patricia E; Emans, John B; Glotzbecker, Michael P

    2014-01-01

    Retrospective review. To validate the pelvic inlet width (PIW) measurement obtained on radiograph as an independent standard used to correlate with thoracic dimensions (TDs) in treated and untreated patients with early-onset scoliosis. In children with early-onset scoliosis, the change in TD and spine length is a key treatment goal. Quantifying this change is confounded by varied growth rates and differing diagnoses. PIW measured on computed tomographic (CT) scan in patients without scoliosis has been shown to correlate with TD in an age-independent manner. The first arm included 49 patients with scoliosis who had both a CT scan and a pelvic radiograph. Agreement between PIW measurements on CT scan and radiograph was analyzed. The second arm consisted of 163 patients (age, 0.2-18.7 yr) with minimal spinal deformity (mean Cobb, 9.0°) and radiographs in which PIW was measurable. PIW was compared with previously published CT-based TD measurements: maximal chest width, T1-T12 height, and T1-S1 height. Linear regression analysis was used to develop and validate sex-specific predictive equations for each TD measurement on the basis of PIW. Interobserver reliability was evaluated for all measurements. Bland-Altman analysis indicated agreement with no dependence on observed value, but a consistent 8.5-mm (95% CI: 7.2-9.9 mm) difference in CT scan measurement compared with radiographical PIW measurement. Sex and PIW were significantly correlated with each TD measurement (P < 0.01). Predictive models were validated and may be used to estimate TD measurements on the basis of sex and radiographical PIW. Intraclass correlation coefficients for all measurements were between 0.978 and 0.997. PIW on radiographs and CT scan correlate in patients with deformity, and with spine and TD in patients with minimal deformity. It is a fast, reliable method of assessing growth while lowering the patient's radiation exposure. It can be reliably used to assess patients with early-onset scoliosis and the impact of surgical treatment on chest and spinal growth. Level of Evidence: 3.

  14. Approximation of reliabilities for multiple-trait model with maternal effects.

    PubMed

    Strabel, T; Misztal, I; Bertrand, J K

    2001-04-01

    Reliabilities for a multiple-trait maternal model were obtained by combining reliabilities obtained from single-trait models. Single-trait reliabilities were obtained using an approximation that supported models with additive and permanent environmental effects. For the direct effect, the maternal and permanent environmental variances were assigned to the residual. For the maternal effect, variance of the direct effect was assigned to the residual. Data included 10,550 birth weight, 11,819 weaning weight, and 3,617 postweaning gain records of Senepol cattle. Reliabilities were obtained by generalized inversion and by using single-trait and multiple-trait approximation methods. Some reliabilities obtained by inversion were negative because inbreeding was ignored in calculating the inverse of the relationship matrix. The multiple-trait approximation method reduced the bias of approximation when compared with the single-trait method. The correlations between reliabilities obtained by inversion and by multiple-trait procedures for the direct effect were 0.85 for birth weight, 0.94 for weaning weight, and 0.96 for postweaning gain. Correlations for maternal effects for birth weight and weaning weight were 0.96 to 0.98 for both approximations. Further improvements can be achieved by refining the single-trait procedures.

  15. The ventriloquist in periphery: impact of eccentricity-related reliability on audio-visual localization.

    PubMed

    Charbonneau, Geneviève; Véronneau, Marie; Boudrias-Fournier, Colin; Lepore, Franco; Collignon, Olivier

    2013-10-28

    The relative reliability of separate sensory estimates influences the way they are merged into a unified percept. We investigated how eccentricity-related changes in reliability of auditory and visual stimuli influence their integration across the entire frontal space. First, we surprisingly found that despite a strong decrease in auditory and visual unisensory localization abilities in periphery, the redundancy gain resulting from the congruent presentation of audio-visual targets was not affected by stimuli eccentricity. This result therefore contrasts with the common prediction that a reduction in sensory reliability necessarily induces an enhanced integrative gain. Second, we demonstrate that the visual capture of sounds observed with spatially incongruent audio-visual targets (ventriloquist effect) steadily decreases with eccentricity, paralleling a lowering of the relative reliability of unimodal visual over unimodal auditory stimuli in periphery. Moreover, at all eccentricities, the ventriloquist effect positively correlated with a weighted combination of the spatial resolution obtained in unisensory conditions. These findings support and extend the view that the localization of audio-visual stimuli relies on an optimal combination of auditory and visual information according to their respective spatial reliability. All together, these results evidence that the external spatial coordinates of multisensory events relative to an observer's body (e.g., eyes' or head's position) influence how this information is merged, and therefore determine the perceptual outcome.

  16. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
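    The Weibull-based fast-fracture piece of such an analysis reduces, in its simplest unit-volume form, to a survival probability P_s = exp(-V (σ/σ0)^m). A minimal sketch under that assumption (illustrative parameter values; this is not the CARES/Life implementation):

```python
# Minimal sketch of the fast-fracture survival probability under a
# two-parameter Weibull model, in its simplest unit-volume form:
#     P_s = exp(-V * (sigma / sigma_0)^m)
# Parameter values are illustrative; this is not the CARES/Life code.
import math

def weibull_survival(sigma, sigma_0, m, volume=1.0):
    """Probability that a component of given stressed volume survives
    a uniform applied stress sigma (sigma_0: scale, m: Weibull modulus)."""
    return math.exp(-volume * (sigma / sigma_0) ** m)

# Survival drops as stress rises, and larger stressed volumes are weaker.
for sigma in (100.0, 200.0, 300.0):
    print(f"sigma = {sigma:5.1f}  P_s = {weibull_survival(sigma, 400.0, 10.0):.6f}")
```

    The volume term captures the size effect central to ceramic reliability: more stressed material means more chances of sampling a critical flaw.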

  17. Identification of Some Zeolite Group Minerals by Application of Artificial Neural Network and Decision Tree Algorithm Based on SEM-EDS Data

    NASA Astrophysics Data System (ADS)

    Akkaş, Efe; Evren Çubukçu, H.; Akin, Lutfiye; Erkut, Volkan; Yurdakul, Yasin; Karayigit, Ali Ihsan

    2016-04-01

    Identification of zeolite group minerals is complicated due to their similar chemical formulas and habits. Although the morphologies of various zeolite crystals can be recognized under the Scanning Electron Microscope (SEM), identifying zeolites from their mineral chemical data is a considerably more challenging process. SEMs integrated with energy dispersive X-ray spectrometers (EDS) provide fast and reliable chemical data of minerals. However, considering the elemental similarities of the characteristic chemical formulae of zeolite species (e.g., Clinoptilolite, (Na,K,Ca)2-3Al3(Al,Si)2Si13O36·12H2O, and Erionite, (Na2,K2,Ca)2Al4Si14O36·15H2O), EDS data alone do not seem sufficient for correct identification. Furthermore, the physical properties of the specimen (e.g., roughness, electrical conductivity) and the applied analytical conditions of the SEM-EDS (e.g., accelerating voltage, beam current, spot size) should be uniform in order to obtain reliable elemental results from minerals having high alkali (Na, K) and H2O (approx. 14-18%) contents. This study, which was funded by The Scientific and Technological Research Council of Turkey (TUBITAK Project No: 113Y439), aims to construct a database as large as possible for various zeolite minerals and to develop a general prediction model for the identification of zeolite minerals using SEM-EDS data. For this purpose, an artificial neural network and a rule-based decision tree algorithm were employed. Throughout the analyses, a total of 1850 chemical data were collected from four distinct zeolite species (Clinoptilolite-Heulandite, Erionite, Analcime and Mordenite) observed in various rocks (e.g., coals, pyroclastics). In order to obtain a representative training data set for each mineral, a selection procedure for reference mineral analyses was applied. During the selection procedure, SEM-based crystal morphology data, XRD spectra and the re-calculated cationic distribution obtained by EDS were used to characterize the training set. Consequently, for each zeolite species 250 EDS data (as elemental intensities) were used for training and 200±50 analyses were tested. Finally, two prediction models were developed. The constructed models, with various cross-correlation values (r), yielded an average accuracy of >91% for the best predictions using the C5.0 decision tree algorithm and a back-propagation artificial neural network. Despite having similar accuracies, the developed models exhibit different prediction behaviors for some zeolite minerals. The results demonstrate that an artificial neural network, as a nonlinear tool, and a decision tree algorithm, as a rule-based prediction model, can be employed to provide considerably efficient and reliable identification/classification of some zeolite minerals regardless of their similar elemental compositions. Keywords: mineral identification; zeolites; energy dispersive spectrometry; artificial neural networks; decision tree.
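    A decision-tree classifier of the kind used above can be sketched with scikit-learn on synthetic elemental intensities. The class centers, noise level, and the DecisionTreeClassifier standing in for C5.0 are all assumptions, not the study's data or model:

```python
# Hypothetical sketch: classifying four zeolite species from elemental
# intensities with a decision tree. Synthetic class centers and sklearn's
# DecisionTreeClassifier stand in for the study's real EDS data and C5.0.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
species = ["Clinoptilolite-Heulandite", "Erionite", "Analcime", "Mordenite"]
# Assumed mean (Si, Al, Na, K, Ca) intensities per species -- illustrative.
centers = np.array([
    [13.0, 3.0, 1.0, 1.0, 1.0],
    [14.0, 4.0, 2.0, 2.0, 1.0],
    [16.0, 8.0, 8.0, 0.2, 0.2],
    [20.0, 4.0, 2.0, 0.5, 0.5],
])

# 250 noisy analyses per species, mirroring the training-set size above.
X = np.vstack([c + rng.normal(0.0, 0.3, (250, 5)) for c in centers])
y = np.repeat(species, 250)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

    With well-separated class centers the tree does well; the hard cases in practice are species pairs whose intensity distributions overlap, which is exactly where the study's uniform analytical conditions matter.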

  18. Uncertainties in obtaining high reliability from stress-strength models

    NASA Technical Reports Server (NTRS)

    Neal, Donald M.; Matthews, William T.; Vangel, Mark G.

    1992-01-01

    There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences are identified of incorrectly assuming a particular statistical distribution for stress or strength data used in obtaining the high reliability values. The computation of the reliability is defined as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
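    The stress-strength computation itself is straightforward to sketch by Monte Carlo, and doing so illustrates the abstract's point: a small change in the assumed strength distribution changes the estimated failure probability substantially. All parameter values below are illustrative assumptions.

```python
# Minimal Monte Carlo sketch of the stress-strength model: reliability
# R = P(strength > stress). Swapping in a slightly heavier-tailed strength
# distribution (same mean, same spread) visibly changes the estimate,
# which is the sensitivity discussed above. All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

stress = rng.normal(70.0, 5.0, n)
strength_normal = rng.normal(100.0, 5.0, n)
# Student-t with 4 dof, rescaled to unit variance, as a heavy-tailed stand-in.
strength_heavy = 100.0 + 5.0 * rng.standard_t(4, n) / np.sqrt(2.0)

r_normal = np.mean(strength_normal > stress)
r_heavy = np.mean(strength_heavy > stress)
print(f"normal strength model:    R = {r_normal:.6f}")
print(f"heavier-tailed strength:  R = {r_heavy:.6f}")
```

    Both distributions would look nearly identical on a histogram, yet the implied failure probabilities differ by more than an order of magnitude, which is why the abstract argues for conservative lower bounds.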

  19. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.

  20. Reliability of unaided naked-eye examination as a screening test for cervical lesions in a developing country setup.

    PubMed

    Darwish, Atef M; Abdulla, Sayed A; Zahran, Kamal M; Abdel-Fattah, Nermat A

    2013-04-01

    This study aimed to test the reliability of unaided naked-eye examination (UNEE) of the cervix as a sole cervical cancer screening test in a developing country setup compared with the standard cervical cytology. A total of 3,500 nonpregnant women aged between 25 and 55 years were included. An unlubricated bivalve speculum was inserted into the vagina under good light to visualize the cervix. A thorough UNEE of the cervix was done to detect any apparent lesions. Cervical smears were obtained using the long tip of an Ayre spatula. An additional endocervical sample was obtained by cytobrush. Women with abnormal Pap smears or visible cervical lesions by UNEE were scheduled for colposcopic examination. A biopsy specimen was obtained in every abnormal colposcopic examination. Of 3,500 cases, there were 9 (2.57%) preinvasive cervical lesions (cervical intraepithelial neoplasia 1-3) diagnosed with various diagnostic tools used in the study and confirmed by histopathologic examination. Of 3,500 cases, invasive cervical lesions were diagnosed in 6 (1.71%). The sensitivity of UNEE is much better than that of Pap smear (80% vs 60%) but less than that of colposcopy (86.7%). However, the specificity of UNEE (100%) is lower than that of Pap smear (91.16%) and better than that of colposcopy (83.12%). The UNEE has a poor positive predictive value (3.75%) when compared with Pap smear (100%) and colposcopy (20%). The negative predictive values of the 3 tests were nearly comparable. Whenever access to Pap smear is limited, UNEE performed by general gynecologists and well-trained nurses is an acceptable alternative for detecting cervical premalignant or malignant lesions, especially in low-resource settings.
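    The screening metrics quoted throughout these abstracts all derive from a 2x2 confusion table; a minimal sketch (the counts below are hypothetical, not the study's data):

```python
# Minimal sketch: the screening metrics used above, computed from a 2x2
# confusion table. The counts here are hypothetical, not the study's data.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # P(test+ | disease)
        "specificity": tn / (tn + fp),   # P(test- | no disease)
        "ppv": tp / (tp + fp),           # P(disease | test+)
        "npv": tn / (tn + fn),           # P(no disease | test-)
    }

# In a mostly healthy screening population, even a specific test can
# have a low PPV because false positives swamp the few true positives.
m = screening_metrics(tp=12, fp=40, fn=3, tn=3445)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```

    Note how PPV depends on prevalence: with few diseased cases in the sample, a handful of false positives is enough to drag PPV far below sensitivity and specificity, which is why screening-test PPVs in low-prevalence populations are often strikingly low.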

  1. Influential factors in thermographic analysis in substations

    NASA Astrophysics Data System (ADS)

    Zarco-Periñán, Pedro J.; Martínez-Ramos, José L.

    2018-05-01

    Thermography is one of the best predictive maintenance tools available due to its low cost, fast implementation and the effectiveness of the results obtained. The detected hot spots enable serious incidents to be prevented, both in the facilities and in the equipment where they have been located. In accordance with the criticality of such points, the repair is carried out with greater or lesser urgency. However, for detection to remain reliable, the facility must meet a set of requirements that are normally assumed; otherwise, hot spots cannot be detected correctly and will later develop into unwanted defects. This paper analyses three aspects that influence the reliability of the results obtained: the minimum percentage of load that a circuit must carry in order for all of its hot spots to be located; the minimum waiting time from when an item of equipment or a facility is energized until a thermographic inspection can be carried out with a complete guarantee of hot spot detection; and the influence on the generation of hot spots exerted by the tightening torque applied in the assembly process.

  2. Determination of the coronal magnetic field from vector magnetograph data

    NASA Technical Reports Server (NTRS)

    Mikic, Zoran

    1991-01-01

    A new algorithm was developed, tested, and applied to determine coronal magnetic fields above solar active regions. The coronal field above NOAA active region AR5747 was successfully estimated on 20 Oct. 1989 from data taken at the Mees Solar Observatory of the Univ. of Hawaii. It was shown that observational data can be used to obtain realistic estimates of coronal magnetic fields. The model has significantly extended the realism with which the coronal magnetic field can be inferred from observations. The understanding of coronal phenomena will be greatly advanced by a reliable technique, such as the one presented, for deducing the detailed spatial structure of the coronal field. The payoff from major current and proposed NASA observational efforts is heavily dependent on the success with which the coronal field can be inferred from vector magnetograms. In particular, the present inability to reliably obtain the coronal field has been a major obstacle to the theoretical advancement of solar flare theory and prediction. The results have shown that the evolutionary algorithm can be used to estimate coronal magnetic fields.

  3. NATO IST 124 Experimentation Instructions

    DTIC Science & Technology

    2016-11-10

    more reliable and predictable network performance through adaptive and efficient control schemes. This report provides guidance and instructions for...tactical heterogeneous networks for more reliable and predictable network performance through adaptive and efficient control schemes. This report

  4. How useful are ARFI elastography cut-off values proposed by meta-analysis for predicting the significant fibrosis and compensated liver cirrhosis?

    PubMed

    Bota, Simona; Sporea, Ioan; Sirli, Roxana; Popescu, Alina; Gradinaru-Tascau, Oana

    2015-06-01

    To evaluate how often we "miss" chronic hepatitis C patients with at least significant fibrosis (F≥2) and those with compensated cirrhosis, by using Acoustic Radiation Force Impulse (ARFI) elastography cut-off values proposed by meta-analysis. Our study included 132 patients with chronic hepatitis C, evaluated by means of ARFI and liver biopsy (LB) in the same session. Reliable measurements were defined as the median value of 10 liver stiffness (LS) measurements with a success rate ≥60% and an interquartile range interval <30%. For predicting F≥2 and F=4 we used the LS cut-offs proposed in the last published meta-analysis: 1.35 m/s and 1.87 m/s, respectively. Reliable LS measurements by means of ARFI were obtained in 117 patients (87.9%). In our study, 58 patients (49.6%) had LS values <1.35 m/s; of these, 75.8% had F≥2 on LB. Of the 59 patients (50.4%) with LS values ≥1.35 m/s, only 6.8% had F0 or F1 on LB. Also, in our study, 88 patients (75.3%) had LS values <1.87 m/s; of these, only 2.2% had F4 on LB. Of the 29 patients (24.7%) with LS values ≥1.87 m/s, 41.3% had F4 on LB. Both for the prediction of at least significant fibrosis and for liver cirrhosis, higher aminotransferase levels were associated with wrongly classified patients, in univariate and multivariate analysis. ARFI elastography had a very good positive predictive value (93.2%) for predicting the presence of significant fibrosis and an excellent negative predictive value (97.8%) for excluding the presence of compensated liver cirrhosis.

  5. Functional classification of protein structures by local structure matching in graph representation.

    PubMed

    Mills, Caitlyn L; Garg, Rohan; Lee, Joslynn S; Tian, Liang; Suciu, Alexandru; Cooperman, Gene; Beuning, Penny J; Ondrechen, Mary Jo

    2018-03-31

    As a result of high-throughput protein structure initiatives, over 14,400 protein structures have been solved by structural genomics (SG) centers and participating research groups. While the totality of SG data represents a tremendous contribution to genomics and structural biology, reliable functional information for these proteins is generally lacking. Better functional predictions for SG proteins will add substantial value to the structural information already obtained. Our method described herein, Graph Representation of Active Sites for Prediction of Function (GRASP-Func), predicts quickly and accurately the biochemical function of proteins by representing residues at the predicted local active site as graphs rather than in Cartesian coordinates. We compare the GRASP-Func method to our previously reported method, structurally aligned local sites of activity (SALSA), using the ribulose phosphate binding barrel (RPBB), 6-hairpin glycosidase (6-HG), and Concanavalin A-like Lectins/Glucanase (CAL/G) superfamilies as test cases. In each of the superfamilies, SALSA and the much faster method GRASP-Func yield similar correct classification of previously characterized proteins, providing a validated benchmark for the new method. In addition, we analyzed SG proteins using our SALSA and GRASP-Func methods to predict function. Forty-one SG proteins in the RPBB superfamily, nine SG proteins in the 6-HG superfamily, and one SG protein in the CAL/G superfamily were successfully classified into one of the functional families in their respective superfamily by both methods. This improved, faster, validated computational method can yield more reliable predictions of function that can be used for a wide variety of applications by the community. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  6. Spatial epidemiology of bovine tuberculosis in Mexico.

    PubMed

    Martínez, Horacio Zendejas; Suazo, Feliciano Milián; Cuador Gil, José Quintín; Bello, Gustavo Cruz; Anaya Escalera, Ana María; Márquez, Gabriel Huitrón; Casanova, Leticia García

    2007-01-01

    The purpose of this study was to use geographic information systems (GIS) and the geostatistical method of ordinary kriging to predict the prevalence and distribution of bovine tuberculosis (TB) in Jalisco, Mexico. A random sample of 2,287 herds selected from a set of 48,766 was used for the analysis. The spatial location of herds was obtained from either a personal global positioning system (GPS), a database from the Instituto Nacional de Estadística, Geografía e Informática (INEGI), or Google Earth. Information on TB prevalence was provided by the Jalisco Commission for the Control and Eradication of Tuberculosis (COEETB). Prediction of TB was obtained using ordinary kriging in the geostatistical analyst module of ArcView8. A predicted high-prevalence area of TB matching the distribution of dairy cattle was observed. This prediction was in agreement with the prevalence calculated on the total of 48,766 herds. Validation was performed by taking estimated values of TB prevalence at each municipality, extracted from the kriging surface, and comparing them with the real prevalence values using a correlation test, giving a value of 0.78, indicating that GIS and kriging are reliable tools for the estimation of TB distribution based on a random sample. This resulted in a significant saving of resources.
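    Ordinary kriging itself can be sketched compactly: solve the kriging system built from a semivariogram and take the weighted sum of observations. The sketch below assumes an exponential semivariogram and made-up coordinates and prevalence values; the study used ArcView's geostatistical module, not this code.

```python
# Minimal ordinary-kriging sketch: build the kriging system from a
# semivariogram and predict as a weighted sum of observations. The
# exponential variogram, coordinates, and prevalence values below are
# made up; the study used ArcView's geostatistical module, not this code.
import numpy as np

def semivariogram(h, sill=1.0, rng_=1.0, nugget=0.0):
    """Exponential semivariogram model."""
    return nugget + sill * (1.0 - np.exp(-h / rng_))

def ordinary_krige(xy, z, x0, **vario):
    """Ordinary-kriging prediction of z at location x0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))          # last row/col: sum-to-one constraint
    A[:n, :n] = semivariogram(d, **vario)
    A[n, n] = 0.0
    b = np.append(semivariogram(np.linalg.norm(xy - x0, axis=1), **vario), 1.0)
    w = np.linalg.solve(A, b)[:n]        # kriging weights
    return w @ z

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([0.10, 0.30, 0.20, 0.40])   # e.g. municipal TB prevalence

# With zero nugget, kriging interpolates exactly at the data locations.
print(ordinary_krige(xy, z, np.array([0.5, 0.5]), sill=1.0, rng_=1.0))
```

    The sum-to-one constraint on the weights is what makes the predictor unbiased without assuming a known mean, which is the defining feature of ordinary (as opposed to simple) kriging.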

  7. Time-Variant Reliability Analysis for Rubber O-Ring Seal Considering Both Material Degradation and Random Load

    PubMed Central

    Liao, Baopeng; Yan, Meichen; Zhang, Weifang; Zhou, Kun

    2017-01-01

    Due to the increase in working hours, the reliability of rubber O-ring seals used in the hydraulic systems of transfer machines will change. While traditional methods can analyze only one of the material properties or seal properties, the failure of the O-ring is caused by these two factors together. In this paper, two factors are analyzed: the degradation of material properties and load randomization introduced by the processing technology. Firstly, the two factors are defined in terms of material failure and seal failure, and experimental methods for the rubber material are established. The time-variant material properties were then obtained through experiments, and the load distribution was obtained by monitoring the process. Thirdly, compressive stress and contact stress were calculated and combined with the reliability model to acquire the time-variant reliability of the O-ring. Finally, the life prediction and the effect of oil pressure were discussed and compared with the actual situation. The results show a lifetime of 12 months for the O-ring calculated in this paper; compared with the replacement records from the maintenance workshop, this result is credible. PMID:29053597

  8. Time Dependent Dielectric Breakdown in Copper Low-k Interconnects: Mechanisms and Reliability Models

    PubMed Central

    Wong, Terence K.S.

    2012-01-01

    The time dependent dielectric breakdown phenomenon in copper low-k damascene interconnects for ultra large-scale integration is reviewed. The loss of insulation between neighboring interconnects represents an emerging back end-of-the-line reliability issue that is not fully understood. After describing the main dielectric leakage mechanisms in low-k materials (Poole-Frenkel and Schottky emission), the major dielectric reliability models that have appeared in the literature are discussed, namely: the Lloyd model, 1/E model, thermochemical E model, E^1/2 models, E^2 model and the Haase model. These models can be broadly categorized into those that consider only intrinsic breakdown (Lloyd, 1/E, E and Haase) and those that take into account copper migration in low-k materials (E^1/2, E^2). For each model, the physical assumptions and the proposed breakdown mechanism are discussed, together with the quantitative relationship predicting the time to breakdown and supporting experimental data. Experimental attempts at validation of dielectric reliability models using data obtained from low-field stressing are briefly discussed. The phenomenon of soft breakdown, which often precedes hard breakdown in porous ultra low-k materials, is highlighted for future research.

  9. Prediction of the characteristics of two types of pressure waves in the cochlea: Theoretical considerations

    NASA Astrophysics Data System (ADS)

    Andoh, Masayoshi; Wada, Hiroshi

    2004-07-01

    The aim of this study was to predict the characteristics of two types of cochlear pressure waves, the so-called fast and slow waves. A two-dimensional finite-element model of the organ of Corti (OC), including fluid-structure interaction with the surrounding lymph fluid, was constructed. The geometry of the OC at the basal turn was determined from morphological measurements by others in the gerbil hemicochlea. For the mechanical properties of the materials within the OC, previously determined values were adopted where available, and unknown properties were determined from published measurements of static stiffness. Time advance of the fluid-structure scheme was achieved by a staggered approach. Using the model, the magnitude and phase of the fast and slow waves were predicted so as to fit the numerically obtained pressure distribution in the scala tympani to published intracochlear pressure measurements. When the predicted pressure waves were applied to the model, the numerical result for the velocity of the basilar membrane showed good agreement with the experimentally obtained basilar membrane velocity documented by others. Thus, the predicted pressure waves appeared to be reliable. Moreover, it was found that the fluid-structure interaction considerably influences the dynamic behavior of the OC at frequencies near the characteristic frequency.

  10. Prediction of brain tissue temperature using near-infrared spectroscopy.

    PubMed

    Holper, Lisa; Mitra, Subhabrata; Bale, Gemma; Robertson, Nicola; Tachtsidis, Ilias

    2017-04-01

    Broadband near-infrared spectroscopy (NIRS) can provide an endogenous indicator of tissue temperature based on the temperature dependence of the water absorption spectrum. We describe a first evaluation of the calibration and prediction of brain tissue temperature obtained during hypothermia in newborn piglets (animal dataset) and rewarming in newborn infants (human dataset) based on measured body (rectal) temperature. The calibration using partial least squares regression proved to be a reliable method to predict brain tissue temperature with respect to core body temperature in the wavelength interval of 720 to 880 nm with a strong mean predictive power of [Formula: see text] (animal dataset) and [Formula: see text] (human dataset). In addition, we applied regression receiver operating characteristic curves for the first time to evaluate the temperature prediction, which provided an overall mean error bias between NIRS predicted brain temperature and body temperature of [Formula: see text] (animal dataset) and [Formula: see text] (human dataset). We discuss main methodological aspects, particularly the well-known aspect of over- versus underestimation between brain and body temperature, which is relevant for potential clinical applications.
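
    The calibrate-then-predict pattern described above can be illustrated with a much-simplified stand-in. The study uses partial least squares regression over the 720 to 880 nm water absorption band; the sketch below replaces that with a single-feature ordinary least-squares fit on hypothetical data, purely to show the shape of the calibration step.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept (a simplified
    stand-in for the multi-wavelength PLS calibration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration points: a water-band absorbance feature
# versus measured rectal (body) temperature in degrees C.
feature = [0.10, 0.12, 0.14, 0.16, 0.18]
body_temp = [33.5, 34.5, 35.5, 36.5, 37.5]
slope, intercept = fit_line(feature, body_temp)

def predict_temp(f):
    """Predict tissue temperature from the spectral feature."""
    return slope * f + intercept
```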

  11. Predicted reliability of aerospace electronics: Application of two advanced probabilistic concepts

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    Two advanced probabilistic design-for-reliability (PDfR) concepts are addressed and discussed in application to the prediction, quantification and assurance of the aerospace electronics reliability: 1) Boltzmann-Arrhenius-Zhurkov (BAZ) model, which is an extension of the currently widely used Arrhenius model and, in combination with the exponential law of reliability, enables one to obtain a simple, easy-to-use and physically meaningful formula for the evaluation of the probability of failure (PoF) of a material or a device after the given time in operation at the given temperature and under the given stress (not necessarily mechanical), and 2) Extreme Value Distribution (EVD) technique that can be used to assess the number of repetitive loadings that result in the material/device degradation and eventually lead to its failure by closing, in a step-wise fashion, the gap between the bearing capacity (stress-free activation energy) of the material or the device and the demand (loading). It is shown that the material degradation (aging, damage accumulation, flaw propagation, etc.) can be viewed, when BAZ model is considered, as a Markovian process, and that the BAZ model can be obtained as the ultimate steady-state solution to the well-known Fokker-Planck equation in the theory of Markovian processes. It is shown also that the BAZ model addresses the worst, but a reasonably conservative, situation. It is suggested therefore that the transient period preceding the condition addressed by the steady-state BAZ model need not be accounted for in engineering evaluations. However, when there is an interest in understanding the transient degradation process, the obtained solution to the Fokker-Planck equation can be used for this purpose. 
As to the EVD concept, it attributes the degradation process to the accumulation of damages caused by a train of repetitive high-level loadings, while loadings of levels that are considerably lower than their extreme values do not contribute appreciably to the finite lifetime of a material or a device. In our probabilistic risk management (PRM)-based analysis we treat the stress-free activation energy (capacity) as a normally distributed random variable, and choose, for the sake of simplicity, the (single-parametric) Rayleigh law as the basic distribution underlying the EVD. The general concepts addressed and discussed are illustrated by numerical examples. It is concluded that the application of the PDfR approach and particularly the above two advanced models should be considered a natural, physically meaningful, informative, comprehensive, and insightful technique that reflects well the physics underlying the degradation processes in materials, devices and systems. It is the author's belief that they will be widely used in engineering practice, when high reliability is imperative, and the ability to quantify it is highly desirable.
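
    A minimal sketch of the BAZ probability-of-failure formula described above, combining the stress-lowered activation energy with the exponential law of reliability (the time-scale prefactor tau0 is a hypothetical placeholder, and the example parameter values are illustrative):

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def baz_pof(t_hours, u0_eV, gamma_sigma_eV, temp_K, tau0_hours=1.0):
    """Probability of failure after time t under the BAZ model combined
    with the exponential law of reliability.  The applied stress lowers
    the effective activation energy to U0 - gamma*sigma."""
    tau = tau0_hours * math.exp((u0_eV - gamma_sigma_eV) / (K_B * temp_K))
    return 1.0 - math.exp(-t_hours / tau)
```

    Raising either the stress term or the temperature shortens the characteristic time tau and drives the probability of failure up, which is the physically meaningful behavior the concept is built around.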

  12. Managing Reliability in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  13. Reliability of an ordinal rating system for assessing the amount of mud and feces (tag) on cattle hides at slaughter.

    PubMed

    Jordan, D; McEwen, S A; Wilson, J B; McNab, W B; Lammerding, A M

    1999-05-01

    A study was conducted to provide a quantitative description of the amount of tag (mud, soil, and bedding) adhered to the hides of feedlot beef cattle and to appraise the statistical reliability of a subjective rating system for assessing this trait. Initially, a single rater obtained baseline data by assessing 2,417 cattle for 1 month at an Ontario beef processing plant. Analysis revealed that there was a strong tendency for animals within sale-lots to have a similar total tag score (intralot correlation = 0.42). Baseline data were summarized by fitting a linear model describing an individual's total tag score as the sum of their lot mean tag score (LMTS) plus an amount representing normal variation within the lot. LMTSs predicted by the linear model were adequately described by a beta distribution with parameters nu = 3.12 and omega = 5.82 scaled to fit on the 0-to-9 interval. Five raters, trained in use of the tag scoring system, made 1,334 tag score observations in a commercial abattoir, allowing reliability to be assessed at the individual level and at the lot level. High values for reliability were obtained for individual total tag score (0.84) and lot total tag score (0.83); these values suggest that the tag scoring system could be used in the marketing and slaughter of Ontario beef cattle to improve the cleanliness of animals presented for slaughter in an effort to control the entry of microbial contamination into abattoirs. Implications for the use of the tag scoring system in research are discussed.
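
    The fitted lot-level distribution can be reproduced directly from the reported parameters: a beta(3.12, 5.82) variate rescaled to the 0-to-9 scoring interval. A quick sanity check of its mean:

```python
import random

NU, OMEGA = 3.12, 5.82  # beta shape parameters fitted to the baseline data

def sample_lot_mean_tag_score(rng):
    """Draw one lot mean tag score: beta(NU, OMEGA) scaled to 0-9."""
    return 9.0 * rng.betavariate(NU, OMEGA)

rng = random.Random(42)
draws = [sample_lot_mean_tag_score(rng) for _ in range(100_000)]
sample_mean = sum(draws) / len(draws)
```

    The sample mean converges to 9·ν/(ν+ω) ≈ 3.14 on the 0-to-9 scale, consistent with the fitted baseline distribution.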

  14. Feasibility and reliability of pocket-size ultrasound examinations of the pleural cavities and vena cava inferior performed by nurses in an outpatient heart failure clinic.

    PubMed

    Dalen, Havard; Gundersen, Guri H; Skjetne, Kyrre; Haug, Hilde H; Kleinau, Jens O; Norekval, Tone M; Graven, Torbjorn

    2015-08-01

    Routine assessment of volume state by ultrasound may improve follow-up of heart failure patients. We aimed to study the feasibility and reliability of focused pocket-size ultrasound examinations of the pleural cavities and the inferior vena cava performed by nurses to assess volume state at an outpatient heart failure clinic. Ultrasound examinations were performed in 62 included heart failure patients by specialized nurses with a pocket-size imaging device (PSID). Patients were then re-examined by a cardiologist with a high-end scanner for reference within 1 h. Specialized nurses were able to obtain and interpret images from both pleural cavities and the inferior vena cava and estimate the volume status in all patients. Time consumption for focused ultrasound examination was median 5 min. In total 26 patients had any kind of pleural effusion (in 39 pleural cavities) by reference. The sensitivity, specificity, positive and negative predictive values were high, all ≥ 92%. The correlations with reference were high for all measurements, all r ≥ 0.79. Coefficients of variation for end-expiratory dimension of inferior vena cava and quantification of pleural effusion were 10.8% and 12.7%, respectively. Specialized nurses were, after a dedicated training protocol, able to obtain reliable recordings of both pleural cavities and the inferior vena cava by PSID and interpret the images in a reliable way. Implementing focused ultrasound examinations to assess volume status by nurses in an outpatient heart failure clinic may improve diagnostics, and thus improve therapy. © The European Society of Cardiology 2014.
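
    The coefficients of variation reported for the repeated measurements follow the standard definition; a minimal sketch (the example diameters below are hypothetical, not study data):

```python
import math

def coefficient_of_variation(values):
    """CV (%) = sample standard deviation / mean * 100."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical repeated end-expiratory IVC diameters (mm):
cv = coefficient_of_variation([20.0, 18.0, 22.0, 21.0])  # about 8.4%
```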

  15. HitPredict version 4: comprehensive reliability scoring of physical protein-protein interactions from more than 100 species.

    PubMed

    López, Yosvany; Nakai, Kenta; Patil, Ashwini

    2015-01-01

    HitPredict is a consolidated resource of experimentally identified, physical protein-protein interactions with confidence scores to indicate their reliability. The study of genes and their inter-relationships using methods such as network and pathway analysis requires high quality protein-protein interaction information. Extracting reliable interactions from most of the existing databases is challenging because they either contain only a subset of the available interactions, or a mixture of physical, genetic and predicted interactions. Automated integration of interactions is further complicated by varying levels of accuracy of database content and lack of adherence to standard formats. To address these issues, the latest version of HitPredict provides a manually curated dataset of 398 696 physical associations between 70 808 proteins from 105 species. Manual confirmation was used to resolve all issues encountered during data integration. For improved reliability assessment, this version combines a new score derived from the experimental information of the interactions with the original score based on the features of the interacting proteins. The combined interaction score performs better than either of the individual scores in HitPredict as well as the reliability score of another similar database. HitPredict provides a web interface to search proteins and visualize their interactions, and the data can be downloaded for offline analysis. Data usability has been enhanced by mapping protein identifiers across multiple reference databases. Thus, the latest version of HitPredict provides a significantly larger, more reliable and usable dataset of protein-protein interactions from several species for the study of gene groups. Database URL: http://hintdb.hgc.jp/htp. © The Author(s) 2015. Published by Oxford University Press.

  16. A systematic review of studies on forecasting the dynamics of influenza outbreaks

    PubMed Central

    Nsoesie, Elaine O; Brownstein, John S; Ramakrishnan, Naren; Marathe, Madhav V

    2014-01-01

    Forecasting the dynamics of influenza outbreaks could be useful for decision-making regarding the allocation of public health resources. Reliable forecasts could also aid in the selection and implementation of interventions to reduce morbidity and mortality due to influenza illness. This paper reviews methods for influenza forecasting proposed during previous influenza outbreaks and those evaluated in hindsight. We discuss the various approaches, in addition to the variability in measures of accuracy and precision of predicted measures. PubMed and Google Scholar searches for articles on influenza forecasting retrieved sixteen studies that matched the study criteria. We focused on studies that aimed at forecasting influenza outbreaks at the local, regional, national, or global level. The selected studies spanned a wide range of regions including USA, Sweden, Hong Kong, Japan, Singapore, United Kingdom, Canada, France, and Cuba. The methods were also applied to forecast a single measure or multiple measures. Typical measures predicted included peak timing, peak height, daily/weekly case counts, and outbreak magnitude. Due to differences in measures used to assess accuracy, a single estimate of predictive error for each of the measures was difficult to obtain. However, collectively, the results suggest that these diverse approaches to influenza forecasting are capable of capturing specific outbreak measures with some degree of accuracy given reliable data and correct disease assumptions. Nonetheless, several of these approaches need to be evaluated and their performance quantified in real-time predictions. PMID:24373466

  17. A systematic review of studies on forecasting the dynamics of influenza outbreaks.

    PubMed

    Nsoesie, Elaine O; Brownstein, John S; Ramakrishnan, Naren; Marathe, Madhav V

    2014-05-01

    Forecasting the dynamics of influenza outbreaks could be useful for decision-making regarding the allocation of public health resources. Reliable forecasts could also aid in the selection and implementation of interventions to reduce morbidity and mortality due to influenza illness. This paper reviews methods for influenza forecasting proposed during previous influenza outbreaks and those evaluated in hindsight. We discuss the various approaches, in addition to the variability in measures of accuracy and precision of predicted measures. PubMed and Google Scholar searches for articles on influenza forecasting retrieved sixteen studies that matched the study criteria. We focused on studies that aimed at forecasting influenza outbreaks at the local, regional, national, or global level. The selected studies spanned a wide range of regions including USA, Sweden, Hong Kong, Japan, Singapore, United Kingdom, Canada, France, and Cuba. The methods were also applied to forecast a single measure or multiple measures. Typical measures predicted included peak timing, peak height, daily/weekly case counts, and outbreak magnitude. Due to differences in measures used to assess accuracy, a single estimate of predictive error for each of the measures was difficult to obtain. However, collectively, the results suggest that these diverse approaches to influenza forecasting are capable of capturing specific outbreak measures with some degree of accuracy given reliable data and correct disease assumptions. Nonetheless, several of these approaches need to be evaluated and their performance quantified in real-time predictions. © 2013 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.

  18. Revealing chemophoric sites in organophosphorus insecticides through the MIA-QSPR modeling of soil sorption data.

    PubMed

    Daré, Joyce K; Silva, Cristina F; Freitas, Matheus P

    2017-10-01

    Soil sorption of insecticides employed in agriculture is an important parameter to probe the environmental fate of organic chemicals. Therefore, methods for the prediction of soil sorption of new agrochemical candidates, as well as for the rationalization of the molecular characteristics responsible for a given sorption profile, are extremely beneficial for the environment. A quantitative structure-property relationship method based on chemical structure images as molecular descriptors provided a reliable model for the soil sorption prediction of 24 widely used organophosphorus insecticides. By means of contour maps obtained from the partial least squares regression coefficients and the variable importance in projection scores, key molecular moieties were targeted for possible structural modification, in order to obtain novel and more environmentally friendly insecticide candidates. The image-based descriptors applied encode molecular arrangement, atoms connectivity, groups size, and polarity; consequently, the findings in this work cannot be achieved by a simple relationship with hydrophobicity, usually described by the octanol-water partition coefficient. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    PubMed

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
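
    The modified Arrhenius form whose parameters the scheme supplies in ChemKin format is the standard three-parameter expression; a minimal sketch:

```python
import math

R_GAS = 8.314462618  # universal gas constant, J/(mol*K)

def modified_arrhenius(temp_K, a_factor, n_exp, ea_J_mol):
    """Rate coefficient k(T) = A * T**n * exp(-Ea / (R*T)).
    Units of the pre-exponential factor A depend on reaction order."""
    return a_factor * temp_K**n_exp * math.exp(-ea_J_mol / (R_GAS * temp_K))
```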

  20. Prediction of Geomagnetic Activity and Key Parameters in High-Latitude Ionosphere-Basic Elements

    NASA Technical Reports Server (NTRS)

    Lyatsky, W.; Khazanov, G. V.

    2007-01-01

    Prediction of geomagnetic activity and related events in the Earth's magnetosphere and ionosphere is an important task of the Space Weather program. Prediction reliability is dependent on the prediction method and elements included in the prediction scheme. Two main elements are a suitable geomagnetic activity index and coupling function -- the combination of solar wind parameters providing the best correlation between upstream solar wind data and geomagnetic activity. The appropriate choice of these two elements is imperative for any reliable prediction model. The purpose of this work was to elaborate on these two elements -- the appropriate geomagnetic activity index and the coupling function -- and investigate the opportunity to improve the reliability of the prediction of geomagnetic activity and other events in the Earth's magnetosphere. The new polar magnetic index of geomagnetic activity and the new version of the coupling function lead to a significant increase in the reliability of predicting the geomagnetic activity and some key parameters, such as cross-polar cap voltage and total Joule heating in high-latitude ionosphere, which play a very important role in the development of geomagnetic and other activity in the Earth's magnetosphere, and are widely used as key input parameters in modeling magnetospheric, ionospheric, and thermospheric processes.
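
    For context, a classic example of a solar-wind coupling function of the kind discussed is the well-known Kan-Akasofu function. The sketch below implements that standard form, not the new coupling function proposed in this work:

```python
import math

def kan_akasofu(v_km_s, by_nT, bz_nT):
    """Kan-Akasofu coupling function E = v * B_T * sin(theta/2)**2,
    where B_T is the transverse IMF magnitude and theta is the IMF
    clock angle (theta = 0 for purely northward IMF)."""
    b_t = math.hypot(by_nT, bz_nT)
    theta = math.atan2(by_nT, bz_nT)
    return v_km_s * b_t * math.sin(theta / 2.0) ** 2
```

    Purely northward IMF gives zero coupling, while strongly southward IMF maximizes it, matching the expected dependence of geomagnetic activity on the solar wind.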

  1. Validity and Reliability Study of the Korean Tinetti Mobility Test for Parkinson's Disease.

    PubMed

    Park, Jinse; Koh, Seong-Beom; Kim, Hee Jin; Oh, Eungseok; Kim, Joong-Seok; Yun, Ji Young; Kwon, Do-Young; Kim, Younsoo; Kim, Ji Seon; Kwon, Kyum-Yil; Park, Jeong-Ho; Youn, Jinyoung; Jang, Wooyoung

    2018-01-01

    Postural instability and gait disturbance are the cardinal symptoms associated with falling among patients with Parkinson's disease (PD). The Tinetti mobility test (TMT) is a well-established measurement tool used to predict falls among elderly people. However, the TMT has not been established or widely used among PD patients in Korea. The purpose of this study was to evaluate the reliability and validity of the Korean version of the TMT for PD patients. Twenty-four patients diagnosed with PD were enrolled in this study. For the interrater reliability test, thirteen clinicians scored the TMT after watching a video clip. We also used the test-retest method to determine intrarater reliability. For concurrent validation, the unified Parkinson's disease rating scale, Hoehn and Yahr staging, Berg Balance Scale, Timed-Up and Go test, 10-m walk test, and gait analysis by three-dimensional motion capture were also used. We analyzed receiver operating characteristic curve to predict falling. The interrater reliability and intrarater reliability of the Korean Tinetti balance scale were 0.97 and 0.98, respectively. The interrater reliability and intra-rater reliability of the Korean Tinetti gait scale were 0.94 and 0.96, respectively. The Korean TMT scores were significantly correlated with the other clinical scales and three-dimensional motion capture. The cutoff values for predicting falling were 14 points (balance subscale) and 10 points (gait subscale). We found that the Korean version of the TMT showed excellent validity and reliability for gait and balance and had high sensitivity and specificity for predicting falls among patients with PD.

  2. Reliability analysis and initial requirements for FC systems and stacks

    NASA Astrophysics Data System (ADS)

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system with respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks), is analysed with respect to stack reliability requirements as a function of predictability of critical failures and the Weibull shape factor of the failure rate distributions.
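
    The simplest closed-form view of the 5 × 5 example is a series system of parallel sets. The sketch below assumes full redundancy within each set (a set works if at least one of its stacks works), which is a deliberate simplification: the paper's model also includes partially failed states and dynamic effects that this two-state formula cannot capture.

```python
def system_reliability(r_stack, parallel=5, series=5):
    """Reliability of `series` sets of `parallel` stacks in series,
    assuming a set survives if at least one of its stacks works
    (simplified two-state reading of the 5 x 5 configuration)."""
    r_set = 1.0 - (1.0 - r_stack) ** parallel
    return r_set ** series
```

    Even this crude formula shows why stack-level reliability requirements can be relaxed by redundancy: with r_stack = 0.9 the system reliability exceeds 0.999.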

  3. Numerical study of the defect adamantine compound CuGaGeSe4

    NASA Astrophysics Data System (ADS)

    Shen, Kesheng; Zhang, Xianzhou; Lu, Hai; Jiao, Zhaoyong

    2018-06-01

    The electronic structure, elastic and optical properties of the defect adamantine compound CuGaGeSe4 in ? structure are systematically investigated using first-principles calculations. Through detailed calculation and comparison, we obtain three independent atomic arrangements and predict the most stable atomic arrangement according to the lattice constants and enthalpy formation energies. The elastic constants are calculated, which can be used to predict the axial thermal expansion coefficients accurately. The optical properties of compound CuGaGeSe4, including the dielectric function, refractive index and absorption spectrum, are depicted for a more intuitive understanding. Our calculated zero-frequency limits ɛ1(0) and n(0) are very close to the other theoretical values, which proves that our calculations are reliable.

  4. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  5. Prediction of true test scores from observed item scores and ancillary data.

    PubMed

    Haberman, Shelby J; Yao, Lili; Sinharay, Sandip

    2015-05-01

    In many educational tests which involve constructed responses, a traditional test score is obtained by adding together item scores obtained through holistic scoring by trained human raters. For example, this practice was used until 2008 in the case of GRE(®) General Analytical Writing and until 2009 in the case of TOEFL(®) iBT Writing. With use of natural language processing, it is possible to obtain additional information concerning item responses from computer programs such as e-rater(®). In addition, available information relevant to examinee performance may include scores on related tests. We suggest application of standard results from classical test theory to the available data to obtain best linear predictors of true traditional test scores. In performing such analysis, we require estimation of variances and covariances of measurement errors, a task which can be quite difficult in the case of tests with limited numbers of items and with multiple measurements per item. As a consequence, a new estimation method is suggested based on samples of examinees who have taken an assessment more than once. Such samples are typically not random samples of the general population of examinees, so that we apply statistical adjustment methods to obtain the needed estimated variances and covariances of measurement errors. To examine practical implications of the suggested methods of analysis, applications are made to GRE General Analytical Writing and TOEFL iBT Writing. Results obtained indicate that substantial improvements are possible both in terms of reliability of scoring and in terms of assessment reliability. © 2015 The British Psychological Society.
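
    The classical-test-theory starting point for such prediction is Kelley's formula, which shrinks an observed score toward the group mean in proportion to the test's reliability; the paper generalizes this idea to best linear predictors that also use additional measurements. A minimal sketch of the classical formula:

```python
def kelley_true_score(observed, reliability, population_mean):
    """Kelley's classical-test-theory estimate of the true score:
    true_hat = rho * observed + (1 - rho) * population_mean."""
    return reliability * observed + (1.0 - reliability) * population_mean

# An examinee scoring 5.0 on a test with reliability 0.8 in a
# population with mean 4.0 gets a shrunken estimate of 4.8.
est = kelley_true_score(5.0, 0.8, 4.0)
```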

  6. A Compact Forearm Crutch Based on Force Sensors for Aided Gait: Reliability and Validity.

    PubMed

    Chamorro-Moriana, Gema; Sevillano, José Luis; Ridao-Fernández, Carmen

    2016-06-21

    Frequently, patients who suffer injuries in some lower member require forearm crutches in order to partially unload weight-bearing. These lesions cause pain in lower limb unloading and their progression should be controlled objectively to avoid significant errors in accuracy and, consequently, complications and after effects in lesions. The design of a new and feasible tool that allows us to control and improve the accuracy of loads exerted on crutches during aided gait is necessary, so as to unburden the lower limbs. In this paper, we describe such a system based on a force sensor, which we have named the GCH System 2.0. Furthermore, we determine the validity and reliability of measurements obtained using this tool via a comparison with the validated AMTI (Advanced Mechanical Technology, Inc., Watertown, MA, USA) OR6-7-2000 Platform. An intra-class correlation coefficient demonstrated excellent agreement between the AMTI Platform and the GCH System. A regression line to determine the predictive ability of the GCH system towards the AMTI Platform was found, which obtained a precision of 99.3%. A detailed statistical analysis is presented for all the measurements and also segregated for several requested loads on the crutches (10%, 25% and 50% of body weight). Our results show that our system, designed for assessing loads exerted by patients on forearm crutches during assisted gait, provides valid and reliable measurements of loads.


  7. A Compact Forearm Crutch Based on Force Sensors for Aided Gait: Reliability and Validity

    PubMed Central

    Chamorro-Moriana, Gema; Sevillano, José Luis; Ridao-Fernández, Carmen

    2016-01-01

    Frequently, patients who suffer injuries in some lower member require forearm crutches in order to partially unload weight-bearing. These lesions cause pain in lower limb unloading and their progression should be controlled objectively to avoid significant errors in accuracy and, consequently, complications and after effects in lesions. The design of a new and feasible tool that allows us to control and improve the accuracy of loads exerted on crutches during aided gait is necessary, so as to unburden the lower limbs. In this paper, we describe such a system based on a force sensor, which we have named the GCH System 2.0. Furthermore, we determine the validity and reliability of measurements obtained using this tool via a comparison with the validated AMTI (Advanced Mechanical Technology, Inc., Watertown, MA, USA) OR6-7-2000 Platform. An intra-class correlation coefficient demonstrated excellent agreement between the AMTI Platform and the GCH System. A regression line to determine the predictive ability of the GCH system towards the AMTI Platform was found, which obtained a precision of 99.3%. A detailed statistical analysis is presented for all the measurements and also segregated for several requested loads on the crutches (10%, 25% and 50% of body weight). Our results show that our system, designed for assessing loads exerted by patients on forearm crutches during assisted gait, provides valid and reliable measurements of loads. PMID:27338396

  8. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The objective of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study came from SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. A first run modeling the cumulative number of failures versus execution time gave fairly good results for our data. Cumulative software failures were plotted against calendar weeks, and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in its future predictions for those data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of failure fixes, and CPU execution times, the models should prove extremely helpful in predicting the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases; it is for this reason that this project tested several models. One recommendation resulting from this study is that great care must be taken in the collection of data: when using a model, the data should satisfy the model's assumptions.

  9. Voluntarily Reported Immunization Registry Data: Reliability and Feasibility to Predict Immunization Rates, San Diego, California, 2013.

    PubMed

    Madewell, Zachary J; Wester, Robert B; Wang, Wendy W; Smith, Tyler C; Peddecord, K Michael; Morris, Jessica; DeGuzman, Heidi; Sawyer, Mark H; McDonald, Eric C

    Accurate data on immunization coverage levels are essential to public health program planning. Reliability of coverage estimates derived from immunization information systems (IISs) in states where immunization reporting by medical providers is not mandated by the state may be compromised by low rates of participation. To overcome this problem, data on coverage rates are often acquired through random-digit-dial telephone surveys, which require substantial time and resources. This project tested both the reliability of voluntarily reported IIS data and the feasibility of using these data to estimate regional immunization rates. We matched telephone survey records for 553 patients aged 19-35 months obtained in 2013 to 430 records in the San Diego County IIS. We assessed concordance between survey data and IIS data using κ to measure the degree of nonrandom agreement. We used multivariable logistic regression models to investigate differences among demographic variables between the 2 data sets. These models were used to construct weights that enabled us to predict immunization rates in areas where reporting is not mandated. We found moderate agreement between the telephone survey and the IIS for the diphtheria, tetanus, and acellular pertussis (κ = 0.49), pneumococcal conjugate (κ = 0.49), and Haemophilus influenzae type b (κ = 0.46) vaccines; fair agreement for the varicella (κ = 0.39), polio (κ = 0.39), and measles, mumps, and rubella (κ = 0.35) vaccines; and slight agreement for the hepatitis B vaccine (κ = 0.17). Consistency in factors predicting immunization coverage levels in a telephone survey and IIS data confirmed the feasibility of using voluntarily reported IIS data to assess immunization rates in children aged 19-35 months.

  10. Voluntarily Reported Immunization Registry Data: Reliability and Feasibility to Predict Immunization Rates, San Diego, California, 2013

    PubMed Central

    Wester, Robert B.; Wang, Wendy W.; Smith, Tyler C.; Peddecord, K. Michael; Morris, Jessica; DeGuzman, Heidi; Sawyer, Mark H.; McDonald, Eric C.

    2017-01-01

    Objectives: Accurate data on immunization coverage levels are essential to public health program planning. Reliability of coverage estimates derived from immunization information systems (IISs) in states where immunization reporting by medical providers is not mandated by the state may be compromised by low rates of participation. To overcome this problem, data on coverage rates are often acquired through random-digit-dial telephone surveys, which require substantial time and resources. This project tested both the reliability of voluntarily reported IIS data and the feasibility of using these data to estimate regional immunization rates. Methods: We matched telephone survey records for 553 patients aged 19-35 months obtained in 2013 to 430 records in the San Diego County IIS. We assessed concordance between survey data and IIS data using κ to measure the degree of nonrandom agreement. We used multivariable logistic regression models to investigate differences among demographic variables between the 2 data sets. These models were used to construct weights that enabled us to predict immunization rates in areas where reporting is not mandated. Results: We found moderate agreement between the telephone survey and the IIS for the diphtheria, tetanus, and acellular pertussis (κ = 0.49), pneumococcal conjugate (κ = 0.49), and Haemophilus influenzae type b (κ = 0.46) vaccines; fair agreement for the varicella (κ = 0.39), polio (κ = 0.39), and measles, mumps, and rubella (κ = 0.35) vaccines; and slight agreement for the hepatitis B vaccine (κ = 0.17). Conclusions: Consistency in factors predicting immunization coverage levels in a telephone survey and IIS data confirmed the feasibility of using voluntarily reported IIS data to assess immunization rates in children aged 19-35 months. PMID:28379785
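The κ statistics reported above measure nonrandom agreement between two paired binary records (survey vs. registry) per child. A minimal sketch of Cohen's kappa with hypothetical up-to-date/not-up-to-date ratings, not the study's counts:

```python
# Cohen's kappa for two paired binary ratings (1 = up to date, 0 = not).
# Agreement expected by chance is subtracted from observed agreement.
def cohens_kappa(a, b):
    assert len(a) == len(b)
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    p_a1 = sum(a) / n                      # rater A marginal
    p_b1 = sum(b) / n                      # rater B marginal
    p_chance = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (p_observed - p_chance) / (1 - p_chance)

# hypothetical records for 10 children
survey   = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
registry = [1, 1, 0, 0, 1, 0, 1, 1, 1, 1]
kappa = cohens_kappa(survey, registry)
```

With 8/10 observed agreement and these marginals, kappa lands in the "moderate agreement" band (roughly 0.41-0.60), the same band as the DTaP and PCV results above.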

  11. Improving quality in healthcare: What makes a satisfied patient?

    PubMed

    Más, A; Parra, P; Bermejo, R M; Hidalgo, M D; Calle, J E

    2016-01-01

    To update the metric properties of a perceived quality questionnaire for patients admitted to hospital medical departments, to determine the level of patient satisfaction achieved, and to identify the variables that predict satisfaction. A self-administered questionnaire prepared by the authors was completed at home following discharge by a sample of 7207 users of the medical departments of 9 public hospitals during the years 2006-2009. A principal component analysis with varimax rotation was performed. Reliability was assessed using an internal consistency coefficient. An analysis was made of the compliance with each indicator reported by respondents. A logistic regression analysis was performed to determine the perceived quality dimensions that predicted overall patient satisfaction. The reliability analysis indicated good coefficients for the interpersonal manner (0.94) and professional competence (0.85) dimensions, and moderate values for the other dimensions (comfort 0.55, information 0.38, and organisation 0.37). Factor analyses showed a single factor in each of the perceived quality dimensions, with a percentage of explained variance greater than 35% for information, interpersonal manner, professional competence, and comfort, and less than 30% for organisation. The dimensions that predicted satisfaction were the interpersonal manner of healthcare staff, professional competence, and information. The metric properties of the questionnaire have been updated, yielding a valid and reliable instrument for assessing patient satisfaction in quality management programmes, both for internal purposes and for external comparisons. A positive relationship was obtained between the level of patient satisfaction and the level of professional competence, the interpersonal manner of healthcare staff, and the information received. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  12. Dissociating Word Frequency and Predictability Effects in Reading: Evidence from Coregistration of Eye Movements and EEG

    ERIC Educational Resources Information Center

    Kretzschmar, Franziska; Schlesewsky, Matthias; Staub, Adrian

    2015-01-01

    Two very reliable influences on eye fixation durations in reading are word frequency, as measured by corpus counts, and word predictability, as measured by cloze norming. Several studies have reported strictly additive effects of these 2 variables. Predictability also reliably influences the amplitude of the N400 component in event-related…

  13. Technique for Early Reliability Prediction of Software Components Using Behaviour Models

    PubMed Central

    Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad

    2016-01-01

    Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
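As a rough illustration of the kind of computation a component probabilistic dependency graph supports, the sketch below aggregates, with an explicit stack, the path products of component reliabilities and transition probabilities from an entry node to an exit node. This follows the classic architecture-based formulation; the node reliabilities, edges and traversal are illustrative assumptions, not the authors' exact CPDG algorithm:

```python
# System reliability from a component dependency graph: each node has a
# reliability, each edge a transition probability, and the system reliability
# sums the products along every entry-to-exit path (acyclic example).
node_reliability = {"A": 0.99, "B": 0.97, "C": 0.98, "EXIT": 1.0}
edges = {  # node -> list of (next_node, transition_probability)
    "A": [("B", 0.6), ("C", 0.4)],
    "B": [("EXIT", 1.0)],
    "C": [("B", 0.5), ("EXIT", 0.5)],
    "EXIT": [],
}

def system_reliability(entry="A"):
    total = 0.0
    stack = [(entry, 1.0)]              # (node, accumulated path probability)
    while stack:
        node, acc = stack.pop()
        acc *= node_reliability[node]   # component must work when visited
        if node == "EXIT":
            total += acc                # one complete path accounted for
            continue
        for nxt, p in edges[node]:
            stack.append((nxt, acc * p))
    return total

rel = system_reliability()
```

Handling loop entry and exit points, as the abstract emphasizes, requires extra machinery (e.g. bounded unrolling or geometric-series aggregation) beyond this acyclic sketch.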

  14. [Phenotypic trends and breeding values for canine congenital sensorineural deafness in Dalmatian dogs].

    PubMed

    Blum, Meike; Distl, Ottmar

    2014-01-01

    In the present study, breeding values for canine congenital sensorineural deafness, the presence of blue eyes, and patches were predicted using multivariate animal models in order to test the reliability of the breeding values for planned matings. The dataset consisted of 6669 German Dalmatian dogs born between 1988 and 2009. Data were provided by the Dalmatian kennel clubs that are members of the German Association for Dog Breeding and Husbandry (VDH). The hearing status of all dogs was evaluated using brainstem auditory evoked potentials. Reliabilities derived from the prediction error variance of the breeding values, together with the realized reliability of predicting the phenotypes of future progeny born in each of the years 2006 to 2009, were used to evaluate the goodness of prediction through breeding values. All animals from the previous birth years were used to predict the breeding values of the progeny in each of the upcoming birth years. The breeding values based on pedigree records achieved an average reliability of 0.19 for the 1951 future progeny. The predictive accuracy (R2) for the hearing status of a single future progeny was 1.3%. Combining breeding values for littermates increased the predictive accuracy to 3.5%; the corresponding values for maternal and paternal half-sib groups were 3.2% and 7.3%. The use of breeding values for planned matings increases the phenotypic selection response over mass selection. The breeding values of sires may be used for planned matings because reliabilities and predictive accuracies were highest for future paternal progeny groups.
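The reliability of an estimated breeding value is conventionally derived from its prediction error variance (PEV) relative to the additive genetic variance. A minimal sketch of that standard definition, with variances chosen only to reproduce a value near the 0.19 average reported above, not taken from the study:

```python
# Standard animal-model definition: reliability r^2 = 1 - PEV / sigma_a^2,
# where PEV is the prediction error variance of the estimated breeding value
# and sigma_a^2 is the additive genetic variance of the trait.
def ebv_reliability(pev, additive_genetic_variance):
    return 1.0 - pev / additive_genetic_variance

# hypothetical variances illustrating a low-information pedigree prediction
r2 = ebv_reliability(pev=0.81, additive_genetic_variance=1.0)
```

A reliability near zero means the PEV is almost as large as the genetic variance itself, which is why single-progeny predictive accuracy in the study stayed so low until relatives' breeding values were combined.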

  15. Evaluation of ceramics for stator application: Gas turbine engine report

    NASA Technical Reports Server (NTRS)

    Trela, W.; Havstad, P. H.

    1978-01-01

    Current ceramic materials, component fabrication processes, and reliability prediction capability for ceramic stators in an automotive gas turbine engine environment are assessed. Simulated engine duty-cycle testing of stators conducted at temperatures up to 1093 C is discussed. The materials evaluated are SiC and Si3N4 fabricated by two near-net-shape processes: slip casting and injection molding. Stators for durability-cycle evaluation, test specimens for material property characterization, and a reliability prediction model prepared to predict stator performance in the simulated engine environment are considered. The status and description of the work performed on the reliability prediction modeling, stator fabrication, material property characterization, and ceramic stator evaluation efforts are reported.

  16. Machine learning classification with confidence: application of transductive conformal predictors to MRI-based diagnostic and prognostic markers in depression.

    PubMed

    Nouretdinov, Ilia; Costafreda, Sergi G; Gammerman, Alexander; Chervonenkis, Alexey; Vovk, Vladimir; Vapnik, Vladimir; Fu, Cynthia H Y

    2011-05-15

    There is rapidly accumulating evidence that the application of machine learning classification to neuroimaging measurements may be valuable for the development of diagnostic and prognostic prediction tools in psychiatry. However, current methods do not produce a measure of the reliability of the predictions. Knowing the risk of the error associated with a given prediction is essential for the development of neuroimaging-based clinical tools. We propose a general probabilistic classification method to produce measures of confidence for magnetic resonance imaging (MRI) data. We describe the application of transductive conformal predictor (TCP) to MRI images. TCP generates the most likely prediction and a valid measure of confidence, as well as the set of all possible predictions for a given confidence level. We present the theoretical motivation for TCP, and we have applied TCP to structural and functional MRI data in patients and healthy controls to investigate diagnostic and prognostic prediction in depression. We verify that TCP predictions are as accurate as those obtained with more standard machine learning methods, such as support vector machine, while providing the additional benefit of a valid measure of confidence for each prediction. Copyright © 2010 Elsevier Inc. All rights reserved.
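The central idea above, a prediction set that contains every label not deemed too nonconforming at a chosen confidence level, can be conveyed with a tiny split-conformal classifier. The paper's TCP is transductive (it re-scores with the test point included per candidate label), but the p-value machinery is the same idea; the 1-D data and distance-to-class-mean nonconformity score below are illustrative assumptions:

```python
# Minimal conformal classification sketch: the prediction set at confidence
# 1 - epsilon keeps every label whose conformal p-value exceeds epsilon.
# Nonconformity score = distance to the class mean (toy choice).
calibration = {0: [1.0, 1.2, 0.8, 1.1], 1: [3.0, 2.9, 3.2, 3.1]}
class_mean = {c: sum(v) / len(v) for c, v in calibration.items()}

def p_value(x, label):
    scores = [abs(v - class_mean[label]) for v in calibration[label]]
    test_score = abs(x - class_mean[label])
    # fraction of examples (test point included) at least as nonconforming as x
    return sum(s >= test_score for s in scores + [test_score]) / (len(scores) + 1)

def prediction_set(x, epsilon=0.2):
    """All labels that cannot be rejected at significance epsilon."""
    return {c for c in calibration if p_value(x, c) > epsilon}

pred = prediction_set(1.05)   # confidently a member of class 0 only
```

A singleton set is a confident prediction; an empty or multi-label set is the method's honest signal that the data do not support a single answer at that confidence level.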

  17. The accuracy of body mass prediction for elderly specimens: Implications for paleoanthropology and legal medicine.

    PubMed

    Chevalier, Tony; Lefèvre, Philippe; Clarys, Jan Pieter; Beauthier, Jean-Pol

    2016-10-01

    Different practices in paleoanthropology and legal medicine raise questions concerning the robustness of body mass (BM) prediction. Integrating personal identification with body mass estimation from the skeleton is not a classic approach in legal medicine. The originality of our study is the use of an elderly sample in order to push prediction methods to their limits and to discuss the implications for paleoanthropology and legal medicine. The aim is to assess the accuracy of BM prediction in relation to the body mass index (BMI, used as the index of classification) using five femoral head (FH) methods and one shaft (FSH) method. The sample comprises 41 dry femurs obtained from dissection, for which age (c. 82 years) and gender are known, and weight (c. 59.5 kg) and height were measured upon admission to the body leg service. We show that the estimated mean BM of the elderly sample is not significantly different from the real mean BM when the appropriate formula is used for the femoral head diameter. In fact, the best prediction is obtained with the McHenry (1992) formula, which was based on a sample with an average mass equivalent to that of ours. In comparison, external shaft diameters, which are known to be more influenced by mechanical stimuli than femoral head diameters, yield less satisfactory results with the McHenry (1992) formula for shaft diameters. Across all the methods used and this distinctive sample, the overestimation (always observed with the different femoral head methods) can be restricted to 1.1%; the overestimation observed with the shaft method can be restricted to 7%. However, the estimation of individual BM is much less reliable. The BMI has a strong impact on the accuracy of individual BM prediction, which is unquestionably more reliable for individuals with normal BMI (a best prediction error of 9.6% vs 16.7%). For these individuals the FH method is also the better predictor, but not when the total sample is considered (i.e., the FSH method is better across more varied BMIs). Finally, the estimated mean BM of a sample can be used with more confidence than individual BM estimates. The former is very useful in an evolutionary perspective, whereas the latter should be used together with the information gathered on the studied specimen in order to reduce prediction errors. BM estimation can thus be a parameter to consider for personal identification. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  18. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Other possible failure modes, such as the initiation of the pyrotechnic energy source, are not included. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for reliability predictions, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine the mechanical functional margin. Finally, the data collected in establishing functional margin are analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost savings and improved understanding over go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
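The margin concept above, energy delivered versus energy required, can be cast as a standard stress-strength interference calculation: if both quantities are treated as normally distributed, reliability is the probability that delivered energy exceeds required energy. This is a generic textbook model, not necessarily the paper's small-sample procedure, and the distributions below are hypothetical:

```python
# Stress-strength interference sketch: reliability = Phi(z), where z is the
# functional margin expressed in combined standard deviations.
from math import erf, sqrt

def functional_reliability(mu_delivered, sd_delivered, mu_required, sd_required):
    z = (mu_delivered - mu_required) / sqrt(sd_delivered**2 + sd_required**2)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF Phi(z)

# hypothetical: energy source delivers 100 +/- 5 J, function needs 70 +/- 4 J
rel = functional_reliability(100.0, 5.0, 70.0, 4.0)
```

The margin here is about 4.7 combined standard deviations, so predicted reliability is very high even though only the means and scatters of the two energies, measurable on a handful of units, enter the calculation.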

  19. The transparency, reliability and utility of tropical rainforest land-use and land-cover change models.

    PubMed

    Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M

    2014-06-01

    Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models and the modelling approaches and paradigms of other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers can emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and to affect the development of land-use policies based on plausible future scenarios, but doing so reliably may require further improvements in the discipline. © 2014 John Wiley & Sons Ltd.

  20. Robust artificial neural network for reliability and sensitivity analyses of complex non-linear systems.

    PubMed

    Oparaji, Uchenna; Sheu, Rong-Jiun; Bankhead, Mark; Austin, Jonathan; Patelli, Edoardo

    2017-12-01

    Artificial Neural Networks (ANNs) are commonly used in place of expensive models to reduce the computational burden required for uncertainty quantification, reliability and sensitivity analyses. An ANN with a selected architecture is trained with the back-propagation algorithm from a few data points representative of the input/output relationship of the underlying model of interest. However, different-performing ANNs might be obtained from the same training data as a result of the random initialization of the weight parameters in each network, leading to uncertainty in selecting the best performing ANN. On the other hand, using cross-validation to select the ANN with the highest R² value can lead to bias in the prediction, because R² cannot determine whether a prediction made by an ANN is biased. Additionally, R² does not indicate whether a model is adequate, as it is possible to have a low R² for a good model and a high R² for a bad model. Hence, in this paper, we propose an approach to improve the robustness of predictions made by ANNs. The approach is based on a systematic combination of identically trained ANNs, coupling the Bayesian framework and model averaging. Additionally, the uncertainties of the robust prediction derived from the approach are quantified in terms of confidence intervals. To demonstrate the applicability of the proposed approach, two synthetic numerical examples are presented. Finally, the proposed approach is used to perform reliability and sensitivity analyses on a process simulation model of a UK nuclear effluent treatment plant developed by the National Nuclear Laboratory (NNL), treated in this study as a black box and employing a set of training data as a test case. This model has been extensively validated against plant and experimental data and used to support the UK effluent discharge strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.
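The robustness idea above, averaging an ensemble of identically trained networks and reporting a confidence interval rather than trusting one randomly initialized ANN, can be sketched without any real training. Here tiny linear models with randomly perturbed weights stand in for trained ANNs; the simple mean and normal-theory interval below are illustrative, not the paper's Bayesian model-averaging weights:

```python
# Ensemble-averaging sketch: many "identically trained" surrogates differ only
# through random initialization scatter; the robust prediction is the ensemble
# mean with a confidence interval quantifying the initialization uncertainty.
import numpy as np

rng = np.random.default_rng(0)

def make_member():
    """Stand-in for one trained ANN: same target function, perturbed weights."""
    w = 2.0 + rng.normal(0, 0.05)
    b = 1.0 + rng.normal(0, 0.05)
    return lambda x: w * x + b

ensemble = [make_member() for _ in range(50)]

def robust_predict(x, alpha=1.96):
    preds = np.array([m(x) for m in ensemble])
    mean = preds.mean()
    half = alpha * preds.std(ddof=1) / np.sqrt(len(preds))  # CI on the mean
    return mean, (mean - half, mean + half)

mean, (lo, hi) = robust_predict(3.0)   # true underlying value is 2*3 + 1 = 7
```

The width of the interval directly exposes how much the random initialization matters, which a single cross-validated R² cannot do.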

  1. Comparison of Mean Climate Trends in the Northern Hemisphere Between N.C.E.P. and Two Atmosphere-Ocean Model Forced Runs

    NASA Technical Reports Server (NTRS)

    Lucarini, Valerio; Russell, Gary L.; Hansen, James E. (Technical Monitor)

    2002-01-01

    Results are presented for two greenhouse gas experiments of the Goddard Institute for Space Studies Atmosphere-Ocean Model (AOM). The computed trends of surface pressure, surface temperature, 850, 500 and 200 mb geopotential heights and related temperatures of the model for the time frame 1960-2000 are compared to those obtained from the National Centers for Environmental Prediction observations. A spatial correlation analysis and mean value comparison are performed, showing good agreement. A brief general discussion about the statistics of trend detection is presented. The domain of interest is the Northern Hemisphere (NH) because of the higher reliability of both the model results and the observations. The accuracy that this AOM has in describing the observed regional and NH climate trends makes it reliable in forecasting future climate changes.

  2. 3D planning in orthognathic surgery: CAD/CAM surgical splints and prediction of the soft and hard tissues results - our experience in 16 cases.

    PubMed

    Aboul-Hosn Centenero, Samir; Hernández-Alfaro, Federico

    2012-02-01

    The aim of this article is to determine the advantages of 3D planning in orthognathic surgery for predicting postoperative results and for manufacturing surgical splints using CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) technology. The software program Simplant OMS 10.1 (Materialise(®), Leuven, Belgium) was used for this study, which was carried out on 16 patients. A conventional preoperative treatment plan was devised for each patient following our Centre's standard protocol, and surgical splints were manufactured; these splints served as study controls. The preoperative treatment plans were then transferred to a 3D virtual environment on a personal computer (PC). Surgery was simulated, predictions of the soft and hard tissue results were produced, and surgical splints were manufactured using CAD/CAM technology. In the operating room, the two types of surgical splints were compared and the degree of similarity of the results was calculated in three planes, taking the maxillary osteotomy line as the point of reference. The level of concordance was used to compare the surgical splints. Three months after surgery, a second set of 3D images was obtained and used to take linear and angular measurements on screen. Using the intraclass correlation coefficient, these postoperative measurements were compared with the measurements obtained when predicting the postoperative results. Results showed a high degree of correlation in 15 of the 16 cases. A high coefficient of correlation was obtained in the majority of predictions of results in hard tissue, although less precise results were obtained for measurements of soft tissue in the labial area. The study shows that the software program used is reliable for 3D planning and for the manufacture of surgical splints using CAD/CAM technology. Nevertheless, further progress in the development of technologies for the acquisition of 3D images, new versions of software programs, and further studies of objective data are necessary to increase the precision of computerised 3D planning. Copyright © 2011 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  3. Spectroscopic investigations of microwave generated plasmas

    NASA Technical Reports Server (NTRS)

    Hawley, Martin C.; Haraburda, Scott S.; Dinkel, Duane W.

    1991-01-01

    The study deals with the plasma behavior as applied to spacecraft propulsion from the perspective of obtaining better design and modeling capabilities. The general theory of spectroscopy is reviewed, and existing methods for converting emission-line intensities into such quantities as temperatures and densities are outlined. Attention is focused on the single-atomic-line and two-line radiance ratio methods, atomic Boltzmann plot, and species concentration. Electronic temperatures for a helium plasma are determined as a function of pressure and a gas-flow rate using these methods, and the concentrations of ions and electrons are predicted from the Saha-Eggert equations using the sets of temperatures obtained as a function of the gas-flow rate. It is observed that the atomic Boltzmann method produces more reliable results for the electronic temperature, while the results obtained from the single-line method reflect the electron temperatures accurately.
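The atomic Boltzmann-plot method mentioned above extracts a temperature from several emission lines: plotting ln(Iλ/(gA)) against the upper-level energy E gives a straight line of slope −1/(kT). A minimal sketch with synthetic line data generated for kT = 1 eV, not measured helium intensities:

```python
# Boltzmann-plot sketch: least-squares slope of ln(I*lam/(g*A)) vs. upper-level
# energy E (in eV) gives -1/(kT); the electronic temperature follows directly.
K_B_EV = 8.617e-5   # Boltzmann constant in eV/K

def boltzmann_temperature(energies_ev, y_values):
    """Fit y = ln(I*lam/(g*A)) vs E by least squares; return T in kelvin."""
    n = len(energies_ev)
    ex = sum(energies_ev) / n
    ey = sum(y_values) / n
    sxx = sum((e - ex) ** 2 for e in energies_ev)
    sxy = sum((e - ex) * (y - ey) for e, y in zip(energies_ev, y_values))
    slope = sxy / sxx
    return -1.0 / (K_B_EV * slope)

# synthetic points lying exactly on a line of slope -1/(1 eV)
energies = [21.2, 22.7, 23.0, 23.7]            # upper-level energies (eV)
y = [-e / 1.0 + 5.0 for e in energies]         # ln(I*lam/(g*A)) values
t_kelvin = boltzmann_temperature(energies, y)  # ~11605 K for kT = 1 eV
```

Using several lines this way averages out single-line intensity errors, which is consistent with the abstract's observation that the Boltzmann method is the more reliable of the approaches compared.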

  4. The reliability, validity, sensitivity, specificity and predictive values of the Chinese version of the Rowland Universal Dementia Assessment Scale.

    PubMed

    Chen, Chia-Wei; Chu, Hsin; Tsai, Chia-Fen; Yang, Hui-Ling; Tsai, Jui-Chen; Chung, Min-Huey; Liao, Yuan-Mei; Chi, Mei-Ju; Chou, Kuei-Ru

    2015-11-01

    The purpose of this study was to translate the Rowland Universal Dementia Assessment Scale into Chinese and to evaluate the psychometric properties (reliability and validity) and the diagnostic properties (sensitivity, specificity and predictive values) of the Chinese version of the Rowland Universal Dementia Assessment Scale. The accurate detection of early dementia requires screening tools with favourable cross-cultural linguistic properties and appropriate sensitivity, specificity and predictive values, particularly for Chinese-speaking populations. This was a cross-sectional, descriptive study. Overall, 130 participants suspected of having cognitive impairment were enrolled in the study. A test-retest for determining reliability was scheduled four weeks after the initial test. Content validity was determined by five experts, whereas construct validity was established using the contrasted-groups technique. The participants' clinical diagnoses were used as the standard in calculating the sensitivity, specificity, positive predictive value and negative predictive value. The study revealed that the Chinese version of the Rowland Universal Dementia Assessment Scale exhibited a test-retest reliability of 0.90, an internal consistency reliability of 0.71, an inter-rater reliability (kappa value) of 0.88 and a content validity index of 0.97. The patient and healthy contrast groups differed significantly in cognitive ability. The optimal cut-off points for the Chinese version of the Rowland Universal Dementia Assessment Scale in the test for mild cognitive impairment and dementia were 24 and 22, respectively; moreover, for these two conditions, the sensitivities of the scale were 0.79 and 0.76, the specificities were 0.91 and 0.81, the areas under the curve were 0.85 and 0.78, the positive predictive values were 0.99 and 0.83 and the negative predictive values were 0.96 and 0.91, respectively.
The Chinese version of the Rowland Universal Dementia Assessment Scale exhibited sound reliability, validity, sensitivity, specificity and predictive values. This scale can help clinical staff members to quickly and accurately diagnose cognitive impairment and provide appropriate treatment as early as possible. © 2015 John Wiley & Sons Ltd.
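The diagnostic properties reported above all derive from a 2x2 confusion table against the clinical-diagnosis standard. A minimal sketch of the four definitions with hypothetical screening counts, not the study's table:

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion table:
# tp/fp/fn/tn = true positive, false positive, false negative, true negative.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # detected among the truly impaired
        "specificity": tn / (tn + fp),   # cleared among the truly healthy
        "ppv": tp / (tp + fp),           # positives that are truly impaired
        "npv": tn / (tn + fn),           # negatives that are truly healthy
    }

m = diagnostic_metrics(tp=38, fp=7, fn=12, tn=73)   # hypothetical counts
```

Note that sensitivity and specificity are properties of the test at a given cut-off, while PPV and NPV also depend on how common impairment is in the screened sample.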

  5. Predicting geomagnetic reversals via data assimilation: a feasibility study

    NASA Astrophysics Data System (ADS)

    Morzfeld, Matthias; Fournier, Alexandre; Hulot, Gauthier

    2014-05-01

    The system of three ordinary differential equations (ODE) presented by Gissinger in [1] was shown to exhibit chaotic reversals whose statistics compared well with those from the paleomagnetic record. We explore the geophysical relevance of this low-dimensional model via data assimilation, i.e. we update the solution of the ODE with information from data of the dipole variable. The data set we use is 'SINT' (Valet et al. [2]), and it provides the signed virtual axial dipole moment over the past 2 millions years. We can obtain an accurate reconstruction of these dipole data using implicit sampling (a fully nonlinear Monte Carlo sampling strategy) and assimilating 5 kyr of data per sweep. We confirm our calibration of the model using the PADM2M dipole data set of Ziegler et al. [3]. The Monte Carlo sampling strategy provides us with quantitative information about the uncertainty of our estimates, and -in principal- we can use this information for making (robust) predictions under uncertainty. We perform synthetic data experiments to explore the predictive capability of the ODE model updated by data assimilation. For each experiment, we produce 2 Myr of synthetic data (with error levels similar to the ones found in the SINT data), calibrate the model to this record, and then check if this calibrated model can reliably predict a reversal within the next 5 kyr. By performing a large number of such experiments, we can estimate the statistics that describe how reliably our calibrated model can predict a reversal of the geomagnetic field. It is found that the 1 kyr-ahead predictions of reversals produced by the model appear to be accurate and reliable. These encouraging results prompted us to also test predictions of the five reversals of the SINT (and PADM2M) data set, using a similarly calibrated model. Results will be presented and discussed. 
    References: [1] Gissinger, C., 2012, A new deterministic model for chaotic reversals, European Physical Journal B, 85:137. [2] Valet, J.P., Meynadier, L. and Guyodo, Y., 2005, Geomagnetic field strength and reversal rate over the past 2 million years, Nature, 435, 802-805. [3] Ziegler, L.B., Constable, C.G., Johnson, C.L. and Tauxe, L., 2011, PADM2M: a penalized maximum likelihood model of the 0-2 Ma paleomagnetic axial dipole moment, Geophysical Journal International, 184, 1069-1089.
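
    The assimilation step can be sketched in code. Below is an illustrative bootstrap particle filter, a simpler fully nonlinear relative of the implicit sampling used in the study, assimilating a noisy dipole observation into a Gissinger-type three-variable model. The ODE form and all parameter values are assumptions for illustration, not the calibrated values from the abstract.

```python
import numpy as np

# Gissinger-type three-variable reversal model (assumed form; parameters
# are illustrative, chosen to suggest a chaotic-reversal regime).
MU, NU, GAMMA = 0.119, 0.1, 0.9

def step(state, dt=0.01):
    """One forward-Euler step of the three-ODE model with state (Q, D, V)."""
    q, d, v = state[..., 0], state[..., 1], state[..., 2]
    dq = MU * q - v * d
    dd = -NU * d + v * q
    dv = GAMMA - v + q * d
    return state + dt * np.stack([dq, dd, dv], axis=-1)

def assimilate(particles, obs_d, obs_sigma=0.1, rng=None):
    """Weight particles by the likelihood of a noisy dipole (D) observation,
    then resample to obtain an equally weighted posterior ensemble."""
    rng = rng or np.random.default_rng(0)
    log_w = -0.5 * ((particles[:, 1] - obs_d) / obs_sigma) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], w

rng = np.random.default_rng(42)
particles = rng.normal([0.5, 1.0, 1.0], 0.2, size=(500, 3))
for _ in range(50):                      # forecast sweep
    particles = step(particles)
particles, weights = assimilate(particles, obs_d=1.0, rng=rng)
posterior_mean_dipole = particles[:, 1].mean()
```

    In the study's setting, each 5 kyr sweep would repeat this forecast-reweight-resample cycle with the SINT dipole values as observations.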

  6. Comparison of Predicted Thermoelectric Energy Conversion Efficiency by Cumulative Properties and Reduced Variables Approaches

    NASA Astrophysics Data System (ADS)

    Linker, Thomas M.; Lee, Glenn S.; Beekman, Matt

    2018-06-01

    The semi-analytical methods of thermoelectric energy conversion efficiency calculation based on the cumulative properties approach and the reduced variables approach are compared for 21 high-performance thermoelectric materials. Both approaches account for the temperature dependence of the material properties as well as the Thomson effect; thus the predicted conversion efficiencies are generally lower than those based on the conventional thermoelectric figure of merit ZT for nearly all of the materials evaluated. The two methods also predict material energy conversion efficiencies that are in very good agreement with each other, even for large temperature differences (average percent difference of 4%, with a maximum observed deviation of 11%). The tradeoff between obtaining a reliable assessment of a material's potential for thermoelectric applications and the complexity of implementing the three models, as well as the advantages of using more accurate modeling approaches when evaluating new thermoelectric materials, are highlighted.
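
    For reference, the conventional estimate that both semi-analytical methods refine can be written down directly: the maximum leg efficiency from a single temperature-averaged figure of merit. A minimal sketch (the temperatures and ZT value are illustrative, not from the paper):

```python
import math

def efficiency_from_zt(zt_avg, t_hot, t_cold):
    """Classic constant-property maximum efficiency: the Carnot factor
    times a ZT-dependent reduction factor."""
    carnot = (t_hot - t_cold) / t_hot
    root = math.sqrt(1.0 + zt_avg)
    return carnot * (root - 1.0) / (root + t_cold / t_hot)

eta = efficiency_from_zt(zt_avg=1.0, t_hot=800.0, t_cold=300.0)
carnot_limit = (800.0 - 300.0) / 800.0
```

    The temperature-dependent methods compared in the abstract generally predict values below this constant-property estimate.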

  7. DFT and 3D-QSAR Studies of Anti-Cancer Agents m-(4-Morpholinoquinazolin-2-yl) Benzamide Derivatives for Novel Compounds Design

    NASA Astrophysics Data System (ADS)

    Zhao, Siqi; Zhang, Guanglong; Xia, Shuwei; Yu, Liangmin

    2018-06-01

    As a group of diversified frameworks, quinazoline derivatives display a broad range of biological functions, especially as anticancer agents. To investigate the quantitative structure-activity relationship, 3D-QSAR models were generated with 24 quinazoline-scaffold molecules. The experimental and predicted pIC50 values for both training and test set compounds showed good correlation, which demonstrated the robustness and reliability of the generated QSAR models. The most effective CoMFA and CoMSIA models were obtained with non-cross-validated correlation coefficients (r^2_ncv) of 1.00 (both) and leave-one-out coefficients (q^2) of 0.61 and 0.59, respectively. The predictive abilities of CoMFA and CoMSIA were quite good, with predictive correlation coefficients (r^2_pred) of 0.97 and 0.91. In addition, the statistical results of CoMFA and CoMSIA were used to design new quinazoline molecules.
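
    The quoted statistics are standard QSAR validation quantities. A minimal sketch of how q^2 (leave-one-out) and r^2_pred are computed, here with an ordinary least-squares model on synthetic descriptors; the real CoMFA/CoMSIA models use 3D field descriptors and partial least squares, so everything below is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(24, 4))                 # 24 "training molecules"
true_w = np.array([1.5, -0.7, 0.3, 0.9])
y = X @ true_w + rng.normal(0, 0.1, 24)      # synthetic pIC50 values

def fit_predict(X_tr, y_tr, X_te):
    """Ordinary least squares with an intercept column."""
    A = np.c_[np.ones(len(X_tr)), X_tr]
    coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
    return np.c_[np.ones(len(X_te)), X_te] @ coef

# Leave-one-out q^2 = 1 - PRESS / total sum of squares
press = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    yhat = fit_predict(X[mask], y[mask], X[[i]])[0]
    press += (y[i] - yhat) ** 2
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)

# External r^2_pred on a held-out test set, referenced to the training mean
X_test = rng.normal(size=(8, 4))
y_test = X_test @ true_w + rng.normal(0, 0.1, 8)
y_pred = fit_predict(X, y, X_test)
r2_pred = 1.0 - np.sum((y_test - y_pred) ** 2) / np.sum((y_test - y.mean()) ** 2)
```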

  8. Preliminary study of soil permeability properties using principal component analysis

    NASA Astrophysics Data System (ADS)

    Yulianti, M.; Sudriani, Y.; Rustini, H. A.

    2018-02-01

    Soil permeability measurement is undoubtedly important in carrying out soil-water research such as rainfall-runoff modelling, irrigation water distribution systems, etc. It is also known that acquiring reliable soil permeability data is laborious, time-consuming, and costly; it is therefore desirable to develop a prediction model. Several empirical equations for predicting permeability have been proposed, but these were derived from areas whose soil characteristics differ from Indonesian soils, suggesting that the models may be site-specific. The purpose of this study is to identify which soil parameters correspond most strongly to soil permeability and to propose a preliminary model for permeability prediction. Principal component analysis (PCA) was applied to 16 parameters analysed at 37 sites, comprising 91 samples obtained from the Batanghari Watershed. Findings indicated five variables that correlate strongly with soil permeability, and we recommend a preliminary permeability model with potential for further development.
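
    The PCA workflow in the abstract can be sketched as follows: standardize the parameters, eigendecompose the correlation matrix, and read off variable-component relationships from the loadings. The data here are synthetic stand-ins for the 16 measured parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_params = 91, 6                 # 91 samples as in the study; 6 toy parameters
latent = rng.normal(size=(n_samples, 2))    # two underlying soil factors (assumed)
mixing = rng.normal(size=(2, n_params))
data = latent @ mixing + 0.3 * rng.normal(size=(n_samples, n_params))

# Standardize, then eigendecompose the correlation matrix
z = (data - data.mean(0)) / data.std(0)
corr = z.T @ z / (n_samples - 1)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]           # eigh returns ascending order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()         # variance explained per component
loadings = eigvecs * np.sqrt(eigvals)       # correlation of each variable with each PC
```

    Variables whose loadings on the leading components align with the permeability loading are the candidates for a prediction model.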

  9. Artificial neural network modeling of the water quality index using land use areas as predictors.

    PubMed

    Gazzaz, Nabeel M; Yusoff, Mohd Kamil; Ramli, Mohammad Firuz; Juahir, Hafizan; Aris, Ahmad Zaharin

    2015-02-01

    This paper describes the design of an artificial neural network (ANN) model to predict the water quality index (WQI) using land use areas as predictors. Ten-year records of land use statistics and water quality data for the Kinta River (Malaysia) were employed in the modeling process. The most accurate WQI predictions were obtained with the network architecture 7-23-1, the back-propagation training algorithm, and a learning rate of 0.02. The WQI forecasts of this model had a significant (p < 0.01), positive, very high correlation (ρs = 0.882) with the measured WQI values. Sensitivity analysis revealed that the relative importance of the land use classes to WQI predictions followed the order: mining > rubber > forest > logging > urban areas > agriculture > oil palm. These findings show that ANNs are a highly reliable means of relating water quality to land use, thus integrating land use development with river water quality management.
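
    A minimal sketch of the 7-23-1 architecture: one hidden layer of 23 tanh units trained by plain back-propagation with the stated learning rate of 0.02. The land-use inputs and WQI targets below are synthetic; the original model was trained on ten years of Kinta River records:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 7))                 # 7 land-use fractions (synthetic)
y = (X @ rng.uniform(size=7))[:, None]         # synthetic WQI target

W1 = rng.normal(0, 0.5, (7, 23)); b1 = np.zeros(23)
W2 = rng.normal(0, 0.5, (23, 1)); b2 = np.zeros(1)
lr = 0.02                                      # learning rate from the abstract

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

mse_start = float(np.mean((forward(X)[1] - y) ** 2))
for _ in range(500):
    h, yhat = forward(X)
    err = yhat - y                             # gradient of (1/2) * mean squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)           # back-propagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
mse_end = float(np.mean((forward(X)[1] - y) ** 2))
```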

  10. Detrended cross-correlation coefficient: Application to predict apoptosis protein subcellular localization.

    PubMed

    Liang, Yunyun; Liu, Sanyang; Zhang, Shengli

    2016-12-01

    Apoptosis, or programmed cell death, plays a central role in the development and homeostasis of an organism. Information on the subcellular location of apoptosis proteins is very helpful for understanding the apoptosis mechanism. Predicting the subcellular localization of an apoptosis protein remains a challenging task, and existing methods are mainly based on protein primary sequences. In this paper, we introduce a new position-specific scoring matrix (PSSM)-based method that uses the detrended cross-correlation (DCCA) coefficient of non-overlapping windows. A 190-dimensional (190D) feature vector is constructed on two widely used datasets, CL317 and ZD98, and a support vector machine is adopted as the classifier. To evaluate the proposed method, objective and rigorous jackknife cross-validation tests are performed on the two datasets. The results show that our approach offers a novel and reliable PSSM-based tool for predicting apoptosis protein subcellular localization. Copyright © 2016 Elsevier Inc. All rights reserved.
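
    The windowed DCCA coefficient itself is straightforward to compute. A sketch for plain 1-D series (in the paper the series are derived from PSSM profiles; the window size and data here are illustrative):

```python
import numpy as np

def dcca_coefficient(x, y, n=10):
    """rho_DCCA at window size n: the detrended covariance normalized by
    the two detrended variances, over non-overlapping boxes."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    px = np.cumsum(x - x.mean())       # integrated profiles
    py = np.cumsum(y - y.mean())
    t = np.arange(n)
    f_xy = f_xx = f_yy = 0.0
    for b in range(len(x) // n):
        sx = px[b * n:(b + 1) * n]
        sy = py[b * n:(b + 1) * n]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)   # remove local linear trend
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx ** 2)
        f_yy += np.mean(ry ** 2)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(1)
a = rng.normal(size=400)
rho_self = dcca_coefficient(a, a)                       # identical series
rho_noise = dcca_coefficient(a, rng.normal(size=400))   # independent series
```

    By the Cauchy-Schwarz inequality the coefficient always lies in [-1, 1], with identical series giving exactly 1.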

  11. Reliability analysis of structural ceramics subjected to biaxial flexure

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1991-01-01

    The reliability of alumina disks subjected to biaxial flexure is predicted on the basis of statistical fracture theory using a critical strain energy release rate fracture criterion. Results on a sintered silicon nitride are consistent with reliability predictions based on pore-initiated penny-shaped cracks with preferred orientation normal to the maximum principal stress. Assumptions with regard to flaw types and their orientations in each ceramic can be justified by fractography. It is shown that there are no universal guidelines for selecting fracture criteria or assuming flaw orientations in reliability analyses.

  12. Reliability Prediction Approaches For Domestic Intelligent Electric Energy Meter Based on IEC62380

    NASA Astrophysics Data System (ADS)

    Li, Ning; Tong, Guanghua; Yang, Jincheng; Sun, Guodong; Han, Dongjun; Wang, Guixian

    2018-01-01

    The reliability of the intelligent electric energy meter is a crucial issue considering its large-scale application and the safety of the national smart grid. This paper develops a reliability prediction procedure for domestic intelligent electric energy meters according to IEC62380, in particular determining the model parameters for domestic working conditions. A case study is provided to demonstrate the effectiveness and validity of the approach.

  13. Understanding Interrater Reliability and Validity of Risk Assessment Tools Used to Predict Adverse Clinical Events.

    PubMed

    Siedlecki, Sandra L; Albert, Nancy M

    This article describes how to assess the interrater reliability and validity of risk assessment tools, using easy-to-follow formulas, and provides calculations that demonstrate the principles discussed. Clinical nurse specialists should be able to identify risk assessment tools that provide high interrater reliability and the highest validity for predicting true events of importance to clinical settings. Making best-practice recommendations for assessment tool use is critical to high-quality patient care and to safe practices that affect patient outcomes and nursing resources. Optimal risk assessment tool selection requires knowledge of interrater reliability and tool validity. The clinical nurse specialist will understand the reliability and validity issues associated with risk assessment tools and be able to evaluate tools using basic calculations. Risk assessment tools are developed to objectively predict quality and safety events and, ultimately, to reduce the risk of event occurrence through preventive interventions. To ensure high-quality tool use, clinical nurse specialists must critically assess tool properties. The better a tool's ability to predict adverse events, the more likely that event risk is mitigated. Interrater reliability and validity assessment is a relatively easy skill to master and will result in better decisions when selecting or recommending risk assessment tools.
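
    The formulas the article walks through can be captured in a few lines: Cohen's kappa for interrater agreement on a dichotomous tool, plus the standard validity indices against observed events. The 2x2 counts below are invented for illustration:

```python
def cohens_kappa(a, b, c, d):
    """Kappa from a 2x2 agreement table:
    a = both raters 'at risk', d = both 'not at risk', b/c = disagreements."""
    n = a + b + c + d
    p_obs = (a + d) / n                                   # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def validity_indices(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV of tool-flagged risk
    against observed adverse events."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

kappa = cohens_kappa(40, 5, 7, 48)
indices = validity_indices(tp=30, fp=10, fn=6, tn=54)
```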

  14. A recursive Bayesian approach for fatigue damage prognosis: An experimental validation at the reliability component level

    NASA Astrophysics Data System (ADS)

    Gobbato, Maurizio; Kosmatka, John B.; Conte, Joel P.

    2014-04-01

    Fatigue-induced damage is one of the most uncertain and highly unpredictable failure mechanisms for a large variety of mechanical and structural systems subjected to cyclic and random loads during their service life. A health monitoring system capable of (i) monitoring the critical components of these systems through non-destructive evaluation (NDE) techniques, (ii) assessing their structural integrity, (iii) recursively predicting their remaining fatigue life (RFL), and (iv) providing a cost-efficient reliability-based inspection and maintenance plan (RBIM) is therefore ultimately needed. Toward these objectives, the first part of the paper provides an overview and extension of a comprehensive reliability-based fatigue damage prognosis methodology — previously developed by the authors — for recursively predicting and updating the RFL of critical structural components and/or sub-components in aerospace structures. In the second part of the paper, a set of experimental fatigue test data, available in the literature, is used to provide a numerical verification and an experimental validation of the proposed framework at the reliability component level (i.e., a single damage mechanism evolving at a single damage location). The results obtained from this study demonstrate (i) the importance and the benefits of a nearly continuous NDE monitoring system, (ii) the efficiency of the recursive Bayesian updating scheme, and (iii) the robustness of the proposed framework in recursively updating and improving the RFL estimates. This study also demonstrates that the proposed methodology can lead either to an extension of the RFL (with a consequent economic gain and without compromising the minimum safety requirements) or to an increase in safety by detecting a premature fault and thereby avoiding a very costly catastrophic failure.

  15. Bayes Analysis and Reliability Implications of Stress-Rupture Testing a Kevlar/Epoxy COPV using Temperature and Pressure Acceleration

    NASA Technical Reports Server (NTRS)

    Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.

    2009-01-01

    Composite overwrapped pressure vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Sometimes lifetime testing is performed on an actual COPV from service in an effort to validate the reliability model that is the basis for certifying the continued flightworthiness of its sister vessels. Currently, testing of such a Kevlar 49 (registered trademark)/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyzing the possible failure time results of this test and assessing the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test, the latter due to major differences between the COPVs in the database and the actual COPVs in service. Any information that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely it is that the more optimistic stress ratio is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one 'nine', that is, reducing the probability of failure by an order of magnitude. However, testing one vessel does not reduce the uncertainty in the Weibull shape parameter for lifetime, since testing several vessels would be necessary.
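
    The key inference, that continued survival favors the optimistic stress ratio, can be illustrated with a toy Bayesian update: each candidate stress ratio implies a Weibull lifetime distribution, and the likelihood of surviving to time t is the corresponding survival function. All parameter values below are invented for illustration:

```python
import math

def weibull_survival(t, scale, shape):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def posterior_optimistic(t_survived, prior=0.5,
                         scale_opt=50.0, scale_pess=10.0, shape=1.5):
    """P(optimistic stress ratio | vessel survived to t_survived), via
    Bayes' rule with survival likelihoods for the two hypotheses."""
    like_opt = weibull_survival(t_survived, scale_opt, shape)
    like_pess = weibull_survival(t_survived, scale_pess, shape)
    num = prior * like_opt
    return num / (num + (1 - prior) * like_pess)

p_early = posterior_optimistic(t_survived=1.0)    # short survival: little evidence
p_late = posterior_optimistic(t_survived=20.0)    # long survival: strong evidence
```

    The posterior for the optimistic hypothesis grows monotonically with survival time, mirroring the abstract's key result.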

  16. Is the necrosis/wall ADC ratio useful for the differentiation of benign and malignant breast lesions?

    PubMed

    Durur-Subasi, Irmak; Durur-Karakaya, Afak; Karaman, Adem; Seker, Mehmet; Demirci, Elif; Alper, Fatih

    2017-05-01

    To determine whether the necrosis/wall apparent diffusion coefficient (ADC) ratio is useful for the malignant-benign differentiation of necrotic breast lesions. Breast MRI was performed using a 3-T system. In this retrospective study, calculation of the necrosis/wall ADC ratio was based on ADC values measured from the necrosis and from the wall of malignant and benign breast lesions by diffusion-weighted imaging (DWI). By synchronizing with post-contrast T1-weighted images, the wall and the necrotic portion were delineated separately. All diagnoses were pathologically confirmed. Statistical analyses were conducted using an independent-samples t-test and receiver operating characteristic analysis. Intraclass and interclass correlations were evaluated. A total of 66 female patients were enrolled, 38 with necrotic breast carcinomas and 28 with breast abscesses. ADC values were obtained from both the wall and the necrosis. The mean necrosis/wall ADC ratio (± standard deviation) was 1.61 ± 0.51 in carcinomas and 0.65 ± 0.33 in abscesses. The area under the curve values for necrosis ADC, wall ADC and the necrosis/wall ADC ratio were 0.680, 0.068 and 0.942, respectively. A necrosis/wall ADC ratio cut-off value of 1.18 demonstrated a sensitivity of 97%, a specificity of 93%, a positive predictive value of 95%, a negative predictive value of 96% and an accuracy of 95% in determining the malignant nature of necrotic breast lesions. There was good intra- and interclass reliability for the ADC values of both necrosis and wall. The necrosis/wall ADC ratio appears to be a reliable and promising tool for discriminating breast carcinomas from abscesses using DWI. Advances in knowledge: ADC values of the necrosis obtained by DWI are valuable for malignant-benign differentiation in necrotic breast lesions. The necrosis/wall ADC ratio appears to be a reliable and promising tool in the breast imaging field.

  17. Conformal Prediction Based on K-Nearest Neighbors for Discrimination of Ginsengs by a Home-Made Electronic Nose

    PubMed Central

    Sun, Xiyang; Miao, Jiacheng; Wang, You; Luo, Zhiyuan; Li, Guang

    2017-01-01

    An estimate on the reliability of prediction in the applications of electronic nose is essential, which has not been paid enough attention. An algorithm framework called conformal prediction is introduced in this work for discriminating different kinds of ginsengs with a home-made electronic nose instrument. Nonconformity measure based on k-nearest neighbors (KNN) is implemented separately as underlying algorithm of conformal prediction. In offline mode, the conformal predictor achieves a classification rate of 84.44% based on 1NN and 80.63% based on 3NN, which is better than that of simple KNN. In addition, it provides an estimate of reliability for each prediction. In online mode, the validity of predictions is guaranteed, which means that the error rate of region predictions never exceeds the significance level set by a user. The potential of this framework for detecting borderline examples and outliers in the application of E-nose is also investigated. The result shows that conformal prediction is a promising framework for the application of electronic nose to make predictions with reliability and validity. PMID:28805721
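
    A sketch of the framework with the 1NN nonconformity score (distance to the nearest same-class example divided by distance to the nearest other-class example), in the split/inductive variant for brevity. The synthetic two-class data below stand in for e-nose ginseng measurements:

```python
import numpy as np

rng = np.random.default_rng(5)
X_cal = np.r_[rng.normal(0, 0.5, (30, 2)), rng.normal(3, 0.5, (30, 2))]
y_cal = np.r_[np.zeros(30), np.ones(30)]

def nonconformity(x, label, X, y):
    """1NN nonconformity: larger score = stranger example for that label."""
    d = np.linalg.norm(X - x, axis=1)
    same = d[y == label].min()
    other = d[y != label].min()
    return same / other

# Calibration scores, computed leave-one-out over the calibration set
cal_scores = np.array([
    nonconformity(X_cal[i], y_cal[i],
                  np.delete(X_cal, i, 0), np.delete(y_cal, i, 0))
    for i in range(len(y_cal))])

def region_predict(x, epsilon=0.05):
    """All labels whose conformal p-value exceeds the significance level;
    validity means the true label is excluded at most epsilon of the time."""
    region = []
    for label in (0.0, 1.0):
        score = nonconformity(x, label, X_cal, y_cal)
        p_value = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
        if p_value > epsilon:
            region.append(label)
    return region

pred_clear = region_predict(np.array([0.1, -0.2]))   # deep inside class 0
```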

  18. HomPPI: a class of sequence homology based protein-protein interface prediction methods

    PubMed Central

    2011-01-01

    Background Although homology-based methods are among the most widely used methods for predicting the structure and function of proteins, the question as to whether interface sequence conservation can be effectively exploited in predicting protein-protein interfaces has been a subject of debate. Results We studied more than 300,000 pair-wise alignments of protein sequences from structurally characterized protein complexes, including both obligate and transient complexes. We identified sequence similarity criteria required for accurate homology-based inference of interface residues in a query protein sequence. Based on these analyses, we developed HomPPI, a class of sequence homology-based methods for predicting protein-protein interface residues. We present two variants of HomPPI: (i) NPS-HomPPI (Non partner-specific HomPPI), which can be used to predict interface residues of a query protein in the absence of knowledge of the interaction partner; and (ii) PS-HomPPI (Partner-specific HomPPI), which can be used to predict the interface residues of a query protein with a specific target protein. Our experiments on a benchmark dataset of obligate homodimeric complexes show that NPS-HomPPI can reliably predict protein-protein interface residues in a given protein, with an average correlation coefficient (CC) of 0.76, sensitivity of 0.83, and specificity of 0.78, when sequence homologs of the query protein can be reliably identified. NPS-HomPPI also reliably predicts the interface residues of intrinsically disordered proteins. Our experiments suggest that NPS-HomPPI is competitive with several state-of-the-art interface prediction servers including those that exploit the structure of the query proteins. 
The partner-specific classifier, PS-HomPPI can, on a large dataset of transient complexes, predict the interface residues of a query protein with a specific target, with a CC of 0.65, sensitivity of 0.69, and specificity of 0.70, when homologs of both the query and the target can be reliably identified. The HomPPI web server is available at http://homppi.cs.iastate.edu/. Conclusions Sequence homology-based methods offer a class of computationally efficient and reliable approaches for predicting the protein-protein interface residues that participate in either obligate or transient interactions. For query proteins involved in transient interactions, the reliability of interface residue prediction can be improved by exploiting knowledge of putative interaction partners. PMID:21682895

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perret, Gregory

    The critical decay constant (B/A), delayed neutron fraction (B) and generation time (A) of the Minerve reactor were measured by the Paul Scherrer Institut (PSI) and the Commissariat a l'Energie Atomique (CEA) in September 2014 using the Feynman-alpha and Power Spectral Density neutron noise measurement techniques. Three slightly subcritical configurations were measured using two 1-g 235U fission chambers. This paper reports on the results obtained by PSI in the near critical configuration (-2g). The most reliable and precise results were obtained with the Cross-Power Spectral Density technique: B = 708.4±9.2 pcm, B/A = 79.0±0.6 s^-1 and A = 89.7±1.4 μs. Predictions of the same kinetic parameters were obtained with MCNP5-v1.6 and the JEFF-3.1 and ENDF/B-VII.1 nuclear data libraries. On average, the predictions for B and B/A overestimate the experimental results by 5% and 11%, respectively. The discrepancy is suspected to come either from a corruption of the data or from the inadequacy of the point kinetic equations for interpreting the measurements in the Minerve driven system. (authors)

  20. Nonlinear Viscoelastic Characterization of the Porcine Spinal Cord

    PubMed Central

    Shetye, Snehal; Troyer, Kevin; Streijger, Femke; Lee, Jae H. T.; Kwon, Brian K.; Cripton, Peter; Puttlitz, Christian M.

    2014-01-01

    Although quasi-static and quasi-linear viscoelastic properties of the spinal cord have been reported previously, no published studies have investigated the fully (strain-dependent) nonlinear viscoelastic properties of the spinal cord. In this study, stress relaxation experiments and dynamic cycling were performed on six fresh porcine lumbar cord specimens to examine their viscoelastic mechanical properties. The stress relaxation data were fitted to a modified superposition formulation, and a novel finite ramp time correction technique was applied. The parameters obtained from this fitting methodology were used to predict the average dynamic cyclic viscoelastic behavior of the porcine cord. The data indicate that the porcine spinal cord exhibits fully nonlinear viscoelastic behavior. The average weighted RMSE for a Heaviside ramp fit was 2.8 kPa, which was significantly greater (p < 0.001) than that of the nonlinear (comprehensive viscoelastic characterization, CVC) fit (0.365 kPa). Further, the nonlinear mechanical parameters obtained were able to accurately predict the dynamic behavior, exemplifying the reliability of the obtained nonlinear parameters. These parameters will be important for future studies investigating damage mechanisms of the spinal cord and for developing high-resolution finite element models of the spine. PMID:24211612

  1. Coronary artery calcification score by multislice computed tomography predicts the outcome of dobutamine cardiovascular magnetic resonance imaging.

    PubMed

    Janssen, Caroline H C; Kuijpers, Dirkjan; Vliegenthart, Rozemarijn; Overbosch, Jelle; van Dijkman, Paul R M; Zijlstra, Felix; Oudkerk, Matthijs

    2005-06-01

    The aim of this study was to determine whether a coronary artery calcium (CAC) score of less than 11 can reliably rule out myocardial ischemia detected by dobutamine cardiovascular magnetic resonance imaging (CMR) in patients suspected of having myocardial ischemia. In 114 of 136 consecutive patients clinically suspected of myocardial ischemia with an inconclusive diagnosis, dobutamine CMR was performed and the CAC score was determined. The CAC score was obtained by 16-row multidetector computed tomography (MDCT) and was calculated according to the method of Agatston. The CAC score and the results of the dobutamine CMR were correlated, and the positive predictive value (PPV) and negative predictive value (NPV) of the CAC score for dobutamine CMR were calculated. A total of 114 (87%) of the patients were eligible for this study. There was a significant correlation between the CAC score and dobutamine CMR (p<0.001). Patients with a CAC score of less than 11 showed no signs of inducible ischemia during dobutamine CMR. For a CAC score of less than 101, the NPV and PPV of the CAC score for the outcome of dobutamine CMR were 0.96 and 0.29, respectively. In patients with an inconclusive diagnosis of myocardial ischemia, an MDCT CAC score of less than 11 reliably rules out myocardial ischemia detected by dobutamine CMR.

  2. Evaluating and comparing methods of sinkhole susceptibility mapping in the Ebro Valley evaporite karst (NE Spain)

    NASA Astrophysics Data System (ADS)

    Galve, J. P.; Gutiérrez, F.; Remondo, J.; Bonachea, J.; Lucha, P.; Cendrero, A.

    2009-10-01

    Multiple sinkhole susceptibility models have been generated in three study areas of the Ebro Valley evaporite karst (NE Spain) applying different methods (nearest neighbour distance, sinkhole density, heuristic scoring system and probabilistic analysis) for each sinkhole type separately (cover collapse sinkholes, cover and bedrock collapse sinkholes and cover and bedrock sagging sinkholes). The quantitative and independent evaluation of the predictive capability of the models reveals that: (1) The most reliable susceptibility models are those derived from the nearest neighbour distance and sinkhole density. These models can be generated in a simple and rapid way from detailed geomorphological maps. (2) The reliability of the nearest neighbour distance and density models is conditioned by the degree of clustering of the sinkholes. Consequently, the karst areas in which sinkholes show a higher clustering are a priori more favourable for predicting new occurrences. (3) The predictive capability of the best models obtained in this research is significantly higher (12.5-82.5%) than that of the heuristic sinkhole susceptibility model incorporated into the General Urban Plan for the municipality of Zaragoza. Although the probabilistic approach provides lower quality results than the methods based on sinkhole proximity and density, it helps to identify the most significant factors and select the most effective mitigation strategies and may be applied to model susceptibility in different future scenarios.
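
    The nearest-neighbour-distance method that the evaluation favoured can be sketched very simply: a cell's susceptibility decays with its distance to the closest mapped sinkhole, so clustered sinkholes concentrate the predicted hazard. The coordinates and decay scale below are invented for illustration:

```python
import numpy as np

# Mapped sinkhole locations (arbitrary map units): a tight cluster
# plus one isolated occurrence.
sinkholes = np.array([[1.0, 1.0], [1.2, 0.9], [1.1, 1.3],   # a cluster
                      [8.0, 7.5]])                           # an isolated one

def susceptibility(points, scale=1.0):
    """Score in (0, 1]: exp(-d/scale), with d the nearest-sinkhole distance."""
    d = np.linalg.norm(points[:, None, :] - sinkholes[None, :, :], axis=2)
    return np.exp(-d.min(axis=1) / scale)

grid = np.array([[1.1, 1.0],    # inside the cluster
                 [5.0, 5.0],    # far from any sinkhole
                 [7.8, 7.4]])   # near the isolated sinkhole
scores = susceptibility(grid)
```

    This also illustrates the abstract's second finding: the more clustered the sinkholes, the sharper the contrast between high- and low-susceptibility cells.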

  3. Sum-over-states density functional perturbation theory: Prediction of reliable 13C, 15N, and 17O nuclear magnetic resonance chemical shifts

    NASA Astrophysics Data System (ADS)

    Olsson, Lars; Cremer, Dieter

    1996-11-01

    Sum-over-states density functional perturbation theory (SOS-DFPT) has been used to calculate 13C, 15N, and 17O NMR chemical shifts of 20 molecules, for which accurate experimental gas-phase values are available. Compared to Hartree-Fock (HF), SOS-DFPT leads to improved chemical shift values and approaches the degree of accuracy obtained with second order Møller-Plesset perturbation theory (MP2). This is particularly true in the case of 15N chemical shifts where SOS-DFPT performs even better than MP2. Additional improvements of SOS-DFPT chemical shifts can be obtained by empirically correcting diamagnetic and paramagnetic contributions to compensate for deficiencies which are typical of DFT.

  4. Evaluation of tactual displays for flight control

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Tanner, R. B.; Triggs, T. J.

    1973-01-01

    Manual tracking experiments were conducted to determine the suitability of tactual displays for presenting flight-control information in multitask situations. Although tracking error scores are considerably greater than scores obtained with a continuous visual display, preliminary results indicate that inter-task interference effects are substantially less with the tactual display in situations that impose high visual scanning workloads. The single-task performance degradation found with the tactual display appears to be a result of the coding scheme rather than the use of the tactual sensory mode per se. Analysis with the state-variable pilot/vehicle model shows that reliable predictions of tracking errors can be obtained for wide-band tracking systems once the pilot-related model parameters have been adjusted to reflect the pilot-display interaction.

  5. Molecular Dynamics Simulations and Kinetic Measurements to Estimate and Predict Protein-Ligand Residence Times.

    PubMed

    Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea

    2016-08-11

    Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying it to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions with experimental measurements. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.

  6. Biochemical methane potential prediction of plant biomasses: Comparing chemical composition versus near infrared methods and linear versus non-linear models.

    PubMed

    Godin, Bruno; Mayer, Frédéric; Agneessens, Richard; Gerin, Patrick; Dardenne, Pierre; Delfosse, Philippe; Delcarte, Jérôme

    2015-01-01

    The reliability of different models to predict the biochemical methane potential (BMP) of various plant biomasses was compared using a multispecies dataset. The most reliable BMP prediction models were those based on the near infrared (NIR) spectrum rather than on chemical composition. The NIR predictions of local (specific regression and non-linear) models were able to estimate the BMP quantitatively, rapidly, cheaply and easily. Such a model could be further used for biomethanation plant management and optimization. The predictions of non-linear models were more reliable than those of linear models. The presentation form (green-dried, silage-dried and silage-wet) of biomasses to the NIR spectrometer did not influence the performance of the NIR prediction models. The accuracy of the BMP method should be improved to further enhance the BMP prediction models. Copyright © 2014 Elsevier Ltd. All rights reserved.
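
    A linear chemometric baseline of the kind compared in the abstract can be sketched with principal component regression, a simpler cousin of the PLS models commonly used on NIR spectra. The spectra and BMP values below are synthetic stand-ins for the multispecies dataset:

```python
import numpy as np

rng = np.random.default_rng(11)
n_samples, n_wavelengths, k = 60, 200, 3
basis = rng.normal(size=(3, n_wavelengths))        # latent spectral signatures
conc = rng.uniform(0.1, 1.0, (n_samples, 3))       # latent composition
spectra = conc @ basis + 0.01 * rng.normal(size=(n_samples, n_wavelengths))
bmp = conc @ np.array([300.0, 150.0, -80.0])       # synthetic methane potential

# Center, project onto the top-k principal components, regress BMP on scores
Xc = spectra - spectra.mean(0)
yc = bmp - bmp.mean()
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:k].T
coef, *_ = np.linalg.lstsq(scores, yc, rcond=None)

def predict(new_spectra):
    t = (new_spectra - spectra.mean(0)) @ Vt[:k].T
    return t @ coef + bmp.mean()

r2 = 1 - np.sum((predict(spectra) - bmp) ** 2) / np.sum((bmp - bmp.mean()) ** 2)
```

    The non-linear and local models favored in the study would replace the final linear regression with, e.g., a locally weighted or kernel fit on the same scores.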

  7. Using temperature-switching approach to evaluate the ELDRS of bipolar devices

    NASA Astrophysics Data System (ADS)

    Li, Xiaolong; Lu, Wu; Wang, Xin; Guo, Qi; Yu, Xin; He, Chengfa; Sun, Jing; Liu, Mohan; Yao, Shuai; Wei, Xinyu

    2017-12-01

    Enhanced low-dose-rate sensitivity (ELDRS), exhibited at low dose rates (LDRs) by most bipolar devices, is considered one of the main concerns for spacecraft reliability. In this work, a time-saving and conservative approach to simulating the ELDRS of bipolar devices, the temperature-switching approach (TSA), is presented. Good agreement is observed between the predictive curve obtained with the TSA and the LDR data, and the TSA provides new insight into test techniques for ELDRS. Additionally, the mechanisms underlying the TSA are analyzed in this paper.

  8. Adaptive finite element method for turbulent flow near a propeller

    NASA Astrophysics Data System (ADS)

    Pelletier, Dominique; Ilinca, Florin; Hetu, Jean-Francois

    1994-11-01

    This paper presents an adaptive finite element method based on remeshing to solve incompressible turbulent free shear flow near a propeller. Solutions are obtained in primitive variables using a highly accurate finite element approximation on unstructured grids. Turbulence is modeled by a mixing length formulation. Two general purpose error estimators, which take into account swirl and the variation of the eddy viscosity, are presented and applied to the turbulent wake of a propeller. Predictions compare well with experimental measurements. The proposed adaptive scheme is robust, reliable and cost effective.

  9. Working papers: applicability of Box Jenkins techniques to gasoline consumption forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Reliable consumption forecasts are needed; however, traditional linear time-series techniques do not adequately account for an environment so subject to change. This report evaluates the use of Box-Jenkins techniques for gasoline consumption forecasting. Box-Jenkins methods were applied to data obtained from the Colorado Petroleum Association and the Colorado Highway Users Fund to "predict" 1978 and 1979 consumption. The results showed the Box-Jenkins techniques to be quite effective. Forecasts for 1980-81 are included, along with suggestions for continuous use of the technique to monitor consumption.
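    The Box-Jenkins approach the report evaluates can be illustrated in miniature with the simplest member of the family, an AR(1) model fit by least squares. This is a generic sketch, not the report's actual model, and the consumption figures below are invented:

```python
# Minimal AR(1) sketch: estimate y[t] = c + phi * y[t-1] by ordinary
# least squares, then iterate the recurrence to forecast ahead.
# (Illustrative only; data values are made up.)

def fit_ar1(series):
    """OLS estimates of c and phi for y[t] = c + phi * y[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    return my - phi * mx, phi

def forecast(series, c, phi, steps):
    """Iterate the fitted recurrence for multi-step forecasts."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Hypothetical monthly gasoline consumption (index units)
data = [100, 102, 101, 104, 106, 105, 108, 110, 109, 112]
c, phi = fit_ar1(data)
print(forecast(data, c, phi, 2))
```

    Real Box-Jenkins practice adds model identification (ACF/PACF inspection), differencing, and diagnostic checking of the residuals before a model is accepted.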

  10. Development and validation of the brief esophageal dysphagia questionnaire.

    PubMed

    Taft, T H; Riehl, M; Sodikoff, J B; Kahrilas, P J; Keefer, L; Doerfler, B; Pandolfino, J E

    2016-12-01

    Esophageal dysphagia is common in gastroenterology practice and has multiple etiologies. A complication for some patients with dysphagia is food impaction. A valid and reliable questionnaire to rapidly evaluate esophageal dysphagia and impaction symptoms can aid the gastroenterologist in gathering information to inform treatment approach and further evaluation, including endoscopy. 1638 patients participated over two study phases. 744 participants completed the Brief Esophageal Dysphagia Questionnaire (BEDQ) for phase 1; 869 completed the BEDQ, Visceral Sensitivity Index, Gastroesophageal Reflux Disease Questionnaire, and Hospital Anxiety and Depression Scale for phase 2. Demographic and clinical data were obtained via the electronic medical record. The BEDQ was evaluated for internal consistency, split-half reliability, ceiling and floor effects, and construct validity. The BEDQ demonstrated excellent internal consistency, reliability, and construct validity. The symptom frequency and severity scales scored above the standard acceptable cutoffs for reliability while the impaction subscale yielded poor internal consistency and split-half reliability; thus the impaction items were deemed qualifiers only and removed from the total score. No significant ceiling or floor effects were found with the exception of 1 item, and inter-item correlations fell within accepted ranges. Construct validity was supported by moderate yet significant correlations with other measures. The predictive ability of the BEDQ was small but significant. The BEDQ represents a rapid, reliable, and valid assessment tool for esophageal dysphagia with food impaction for clinical practice that differentiates between patients with major motor dysfunction and mechanical obstruction. © 2016 John Wiley & Sons Ltd.

  11. Development and Validation of the Brief Esophageal Dysphagia Questionnaire

    PubMed Central

    Taft, Tiffany H.; Riehl, Megan; Sodikoff, Jamie B.; Kahrilas, Peter J.; Keefer, Laurie; Doerfler, Bethany; Pandolfino, John E.

    2017-01-01

    Background Esophageal dysphagia is common in gastroenterology practice and has multiple etiologies. A complication for some patients with dysphagia is food impaction. A valid and reliable questionnaire to rapidly evaluate esophageal dysphagia and impaction symptoms can aid the gastroenterologist in gathering information to inform treatment approach and further evaluation, including endoscopy. Methods 1,638 patients participated over two study phases. 744 participants completed the Brief Esophageal Dysphagia Questionnaire (BEDQ) for phase 1; 869 completed the BEDQ, Visceral Sensitivity Index, Gastroesophageal Reflux Disease Questionnaire, and Hospital Anxiety and Depression Scale for phase 2. Demographic and clinical data were obtained via the electronic medical record. The BEDQ was evaluated for internal consistency, split-half reliability, ceiling and floor effects, and construct validity. Key Results The BEDQ demonstrated excellent internal consistency, reliability, and construct validity. The symptom frequency and severity scales scored above the standard acceptable cutoffs for reliability while the impaction subscale yielded poor internal consistency and split-half reliability; thus the impaction items were deemed qualifiers only and removed from the total score. No significant ceiling or floor effects were found with the exception of 1 item, and inter-item correlations fell within accepted ranges. Construct validity was supported by moderate yet significant correlations with other measures. The predictive ability of the BEDQ was small but significant. Conclusions & Inferences The BEDQ represents a rapid, reliable and valid assessment tool for esophageal dysphagia with food impaction for clinical practice that differentiates between patients with major motor dysfunction and mechanical obstruction. PMID:27380834

  12. Plain film measurement error in acute displaced midshaft clavicle fractures

    PubMed Central

    Archer, Lori Anne; Hunt, Stephen; Squire, Daniel; Moores, Carl; Stone, Craig; O’Dea, Frank; Furey, Andrew

    2016-01-01

    Background Clavicle fractures are common and optimal treatment remains controversial. Recent literature suggests operative fixation of acute displaced mid-shaft clavicle fractures (DMCFs) shortened more than 2 cm improves outcomes. We aimed to identify correlation between plain film and computed tomography (CT) measurement of displacement and the inter- and intraobserver reliability of repeated radiographic measurements. Methods We obtained radiographs and CT scans of patients with acute DMCFs. Three orthopedic staff and 3 residents measured radiographic displacement at time zero and 2 weeks later. The CT measurements identified absolute shortening in 3 dimensions (by subtracting the length of the fractured from the intact clavicle). We then compared shortening measured on radiographs and shortening measured in 3 dimensions on CT. Interobserver and intraobserver reliability were calculated. Results We reviewed the fractures of 22 patients. Bland–Altman repeatability coefficient calculations indicated that radiograph and CT measurements of shortening could not be correlated owing to an unacceptable amount of measurement error (6 cm). Interobserver reliability for plain radiograph measurements was excellent (Cronbach α = 0.90). Likewise, intraobserver reliabilities for plain radiograph measurements as calculated with paired t tests indicated excellent correlation (p > 0.05 in all but 1 observer [p = 0.04]). Conclusion To establish shortening as an indication for DMCF fixation, reliable measurement tools are required. The low correlation between plain film and CT measurements we observed suggests further research is necessary to establish what imaging modality reliably predicts shortening. Our results indicate weak correlation between radiograph and CT measurement of acute DMCF shortening. PMID:27438054
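    The Bland-Altman repeatability analysis mentioned above reduces to the spread of the differences between paired measurements: the repeatability coefficient is commonly taken as 1.96 times the standard deviation of those differences, the bound within which 95% of repeat differences are expected to fall. A minimal sketch, with invented measurements rather than the study's data:

```python
# Bland-Altman sketch: bias (systematic offset) and 95% repeatability
# bound for paired measurements. All numbers are invented.
import statistics

def repeatability_coefficient(first, second):
    diffs = [a - b for a, b in zip(first, second)]
    bias = statistics.mean(diffs)        # systematic offset
    rc = 1.96 * statistics.stdev(diffs)  # 95% repeatability bound
    return bias, rc

# Hypothetical radiograph vs CT shortening measurements (cm)
xray = [1.8, 2.4, 0.9, 3.1, 2.0]
ct   = [1.2, 2.9, 1.5, 2.2, 2.6]
bias, rc = repeatability_coefficient(xray, ct)
print(round(bias, 2), round(rc, 2))
```

    A bound as wide as the 6 cm reported in the study, relative to the clinically relevant 2 cm threshold, is what makes the two modalities non-interchangeable.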

  13. Heroic Reliability Improvement in Manned Space Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2017-01-01

    System reliability can be significantly improved by a strong continued effort to identify and remove all the causes of actual failures. Newly designed systems often have unexpectedly high failure rates, which can be reduced by successive design improvements until the final operational system has an acceptable failure rate. There are many causes of failures and many ways to remove them. New systems may have poor specifications, design errors, or mistaken operations concepts. Correcting unexpected problems as they occur can produce large early gains in reliability. Improved technology in materials, components, and design approaches can increase reliability. Reliability growth is achieved by repeatedly operating the system until it fails, identifying the failure cause, and fixing the problem. The failure rate reduction that can be obtained depends on the number and the failure rates of the correctable failures. Under the strong assumption that the failure causes can be removed, the decline in overall failure rate can be predicted. If a failure occurs at the rate of lambda per unit time, the expected time before the failure occurs and can be corrected is 1/lambda, the Mean Time Before Failure (MTBF). Finding and fixing a less frequent failure with the rate of lambda/2 per unit time requires twice as long, a time of 2/lambda. Cutting the failure rate in half thus requires doubling the test and redesign time spent finding and eliminating failure causes. Reducing the failure rate significantly requires a heroic reliability improvement effort.

  14. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    PubMed

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptors selection, model development and predictions for test set samples using applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cells libraries highlighting interesting dependencies of PV properties on MO compositions.
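    For readers unfamiliar with RANSAC, a toy version for a 1-D linear model shows the core loop: fit on random minimal samples, count the consensus (inlier) set, and refit on the best consensus. This is a generic sketch, not the paper's QSAR workflow:

```python
# Toy RANSAC for a 1-D line; robust to gross outliers that would
# distort a plain least-squares fit. Data are invented.
import random

def fit_line(pts):
    """Least-squares slope/intercept for (x, y) points."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, y in pts)
    slope = num / den
    return slope, my - slope * mx

def ransac(pts, n_iter=200, tol=0.5, seed=0):
    """Fit on random minimal samples; keep the largest consensus set."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        sample = rng.sample(pts, 2)
        if sample[0][0] == sample[1][0]:
            continue  # same x twice: slope undefined
        a, b = fit_line(sample)
        inliers = [(x, y) for x, y in pts if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return fit_line(best_inliers)  # refit on the consensus set

# y = 2x + 1 with two gross outliers
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -25)]
slope, intercept = ransac(pts)
print(slope, intercept)
```

    In the paper's setting the "model" is a QSAR regression over descriptor vectors rather than a line, but the sample-consensus-refit structure is the same.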

  15. The paradox of verbal autopsy in cause of death assignment: symptom question unreliability but predictive accuracy.

    PubMed

    Serina, Peter; Riley, Ian; Hernandez, Bernardo; Flaxman, Abraham D; Praveen, Devarsetty; Tallo, Veronica; Joshi, Rohina; Sanvictores, Diozele; Stewart, Andrea; Mooney, Meghan D; Murray, Christopher J L; Lopez, Alan D

    2016-01-01

    We believe that it is important that governments understand the reliability of the mortality data which they have at their disposal to guide policy debates. In many instances, verbal autopsy (VA) will be the only source of mortality data for populations, yet little is known about how the accuracy of VA diagnoses is affected by the reliability of the symptom responses. We previously described the effect of the duration of time between death and VA administration on VA validity. In this paper, using the same dataset, we assess the relationship between the reliability and completeness of symptom responses and the reliability and accuracy of cause of death (COD) prediction. The study was based on VAs in the Population Health Metrics Research Consortium (PHMRC) VA Validation Dataset from study sites in Bohol and Manila, Philippines and Andhra Pradesh, India. The initial interview was repeated within 3-52 months of death. Question responses were assessed for reliability and completeness between the two survey rounds. COD was predicted by the Tariff Method. A sample of 4226 VAs was collected for 2113 decedents, including 1394 adults, 349 children, and 370 neonates. Mean question reliability was unexpectedly low (κ = 0.447): 42.5% of responses positive at the first interview were negative at the second, and 47.9% of responses positive at the second had been negative at the first. Question reliability was greater for the short form of the PHMRC instrument (κ = 0.497) and when analyzed at the level of the individual decedent (κ = 0.610). Reliability at the level of the individual decedent was associated with COD predictive reliability and predictive accuracy. Families give coherent accounts of events leading to death, but the details vary from interview to interview for the same case. Accounts are accurate but inconsistent; different subsets of symptoms are identified on each occasion. However, there are sufficient accurate and consistent subsets of symptoms to enable the Tariff Method to assign a COD. The questions which contributed most to COD prediction were also the most reliable and consistent across repeat interviews; these have been included in the short-form VA questionnaire. Accuracy and reliability of diagnosis for an individual death depend on the quality of the interview. This has considerable implications for the progressive roll-out of VAs into civil registration and vital statistics (CRVS) systems.
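    The reliability statistic reported above is Cohen's kappa, chance-corrected agreement between the two interview rounds. A minimal sketch for binary symptom responses (the yes/no vectors below are invented):

```python
# Cohen's kappa for two binary response vectors: observed agreement
# corrected for the agreement expected by chance alone.

def cohens_kappa(a, b):
    """Kappa for two 0/1 vectors of equal length."""
    n = len(a)
    p_obs = sum(1 for x, y in zip(a, b) if x == y) / n
    pa1, pb1 = sum(a) / n, sum(b) / n
    p_chance = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (p_obs - p_chance) / (1 - p_chance)

# Invented first/second-interview answers to one symptom question
first  = [1, 1, 0, 0, 1, 0, 1, 0, 1, 1]
second = [1, 0, 0, 0, 1, 1, 1, 0, 0, 1]
print(cohens_kappa(first, second))  # 0.4: moderate agreement
```

    A kappa near 0.45, as in the study, means agreement only modestly better than chance, which is why the stability of COD assignment despite unstable individual answers is described as a paradox.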

  16. Predictive capacity of a non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: Comparison of a cutoff approach and inferential statistics.

    PubMed

    Kim, Da-Eun; Yang, Hyeri; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Choi, Jin Kyu; Jung, Mi-Sook; Jeon, Eun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Park, Jung Eun; Sohn, Soo Jung; Kim, Tae Sung; Ahn, Il Young; Jeong, Tae-Cheon; Lim, Kyung-Min; Bae, SeungJin

    2016-01-01

    In order for a novel test method to be applied for regulatory purposes, its reliability and relevance, i.e., reproducibility and predictive capacity, must be demonstrated. Here, we examine the predictive capacity of a novel non-radioisotopic local lymph node assay, LLNA:BrdU-FCM (5-bromo-2'-deoxyuridine-flow cytometry), with a cutoff approach and inferential statistics as a prediction model. 22 reference substances in OECD TG429 were tested with a concurrent positive control, hexylcinnamaldehyde 25% (PC), and the stimulation index (SI) representing the fold increase in lymph node cells over the vehicle control was obtained. The optimal cutoff SI (2.7 ≤ cutoff < 3.5), with respect to predictive capacity, was obtained by a receiver operating characteristic curve, which produced 90.9% accuracy for the 22 substances. To address the inter-test variability in responsiveness, SI values standardized with PC were employed to obtain the optimal percentage cutoff (42.6 ≤ cutoff < 57.3% of PC), which produced 86.4% accuracy. A test substance may be diagnosed as a sensitizer if a statistically significant increase in SI is elicited. The parametric one-sided t-test and non-parametric Wilcoxon rank-sum test produced 77.3% accuracy. Similarly, a test substance could be defined as a sensitizer if the SI means of the vehicle control and of the low, middle, and high concentrations were statistically significantly different, which was tested using ANOVA or Kruskal-Wallis with post hoc analysis (Dunnett or DSCF (Dwass-Steel-Critchlow-Fligner), respectively, depending on the equal variance test), producing 81.8% accuracy. The absolute SI-based cutoff approach produced the best predictive capacity; however, the discordant decisions between prediction models need to be examined further. Copyright © 2015 Elsevier Inc. All rights reserved.
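    The cutoff-optimization step described above can be sketched as a ROC-style sweep that scores each candidate SI cutoff by Youden's J (sensitivity + specificity - 1). The SI values and labels below are hypothetical, not the study's 22 reference substances:

```python
# Sweep candidate cutoffs over the observed scores and keep the one
# maximizing Youden's J. Data are invented for illustration.

def best_cutoff(scores, labels):
    """Return (cutoff, accuracy) maximizing Youden's J.
    labels: 1 = sensitizer, 0 = non-sensitizer; 'positive' if SI >= cutoff."""
    best = (None, -1.0, 0.0)  # (cutoff, J, accuracy)
    for c in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s >= c and l == 1)
        fn = sum(1 for s, l in zip(scores, labels) if s < c and l == 1)
        tn = sum(1 for s, l in zip(scores, labels) if s < c and l == 0)
        fp = sum(1 for s, l in zip(scores, labels) if s >= c and l == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best[1]:
            best = (c, j, (tp + tn) / len(scores))
    return best[0], best[2]

si    = [1.1, 1.8, 2.5, 3.0, 3.6, 4.2, 5.0, 0.9]  # hypothetical SI values
truth = [0,   0,   0,   1,   1,   1,   1,   0]     # hypothetical labels
print(best_cutoff(si, truth))
```

    In practice the cutoff is reported as a half-open interval (as in the abstract) because any value between the highest negative and lowest positive score yields the same classification.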

  17. Comparison of mean climate trends in the Northern Hemisphere between National Centers for Environmental Prediction and two atmosphere-ocean model forced runs

    NASA Astrophysics Data System (ADS)

    Lucarini, Valerio; Russell, Gary L.

    2002-08-01

    Results are presented for two greenhouse gas experiments of the Goddard Institute for Space Studies atmosphere-ocean model (AOM). The computed trends of surface pressure; surface temperature; 850, 500, and 200 mbar geopotential heights; and related temperatures of the model for the time frame 1960-2000 are compared with those obtained from the National Centers for Environmental Prediction (NCEP) observations. The domain of interest is the Northern Hemisphere because of the higher reliability of both the model results and the observations. A spatial correlation analysis and a mean value comparison are performed, showing good agreement in terms of statistical significance for most of the variables considered in the winter and annual means. However, the 850 mbar temperature trends do not show significant positive correlation, and the confidence intervals of the surface pressure and 850 mbar geopotential height mean trends do not overlap. A brief general discussion about the statistics of trend detection is presented. The accuracy that this AOM has in describing the regional and NH mean climate trends inferred from NCEP through the atmosphere suggests that it may be reliable in forecasting future climate changes.

  18. Critical evaluation of methods to incorporate entropy loss upon binding in high-throughput docking.

    PubMed

    Salaniwal, Sumeet; Manas, Eric S; Alvarez, Juan C; Unwalla, Rayomand J

    2007-02-01

    Proper accounting of the positional/orientational/conformational entropy loss associated with protein-ligand binding is important to obtain reliable predictions of binding affinity. Herein, we critically examine two simplified statistical mechanics-based approaches, namely a constant-penalty-per-rotor method and a more rigorous method, referred to here as the partition function-based scoring (PFS) method, to account for such entropy losses in high-throughput docking calculations. Our results on the estrogen receptor beta and dihydrofolate reductase proteins demonstrate that, while the constant penalty method over-penalizes molecules for their conformational flexibility, the PFS method behaves in a more "ΔG-like" manner by penalizing different rotors differently depending on their residual entropy in the bound state. Furthermore, in contrast to no entropic penalty or the constant penalty approximation, the PFS method does not exhibit any bias towards either rigid or flexible molecules in the hit list. Preliminary enrichment studies using a lead-like random molecular database suggest that an accurate representation of the "true" energy landscape of the protein-ligand complex is critical for reliable predictions of relative binding affinities by the PFS method. Copyright 2006 Wiley-Liss, Inc.
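    The entropy bookkeeping that distinguishes the two approaches can be illustrated with a toy rotor (this is not the PFS implementation, and the state energies are invented): a rotor's conformational entropy follows from the Boltzmann populations of its states, S = -R Σ p_i ln p_i, so a rotor that remains flexible in the bound state forfeits less entropy than one frozen into a single well.

```python
# Toy rotor-entropy sketch: entropy from Boltzmann-weighted populations
# of discrete rotamer states. Energies are invented, in kcal/mol.
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def boltzmann_entropy(energies_kcal, temp=298.15):
    """S = -R * sum(p_i * ln p_i) over Boltzmann populations."""
    weights = [math.exp(-e / (R * temp)) for e in energies_kcal]
    z = sum(weights)
    probs = [w / z for w in weights]
    return -R * sum(p * math.log(p) for p in probs)

# Free rotor: three equally accessible rotamers.
# Bound rotor: one well strongly favored (hypothetical energies).
s_free  = boltzmann_entropy([0.0, 0.0, 0.0])
s_bound = boltzmann_entropy([0.0, 2.5, 2.5])
penalty = 298.15 * (s_free - s_bound)  # -T*dS contribution, kcal/mol
print(round(penalty, 2))
```

    A constant-penalty scheme would charge this rotor the same as a fully frozen one; a population-based scheme, like the PFS method in spirit, charges only the entropy actually lost.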

  19. Reliable resonance assignments of selected residues of proteins with known structure based on empirical NMR chemical shift prediction

    NASA Astrophysics Data System (ADS)

    Li, Da-Wei; Meng, Dan; Brüschweiler, Rafael

    2015-05-01

    A robust NMR resonance assignment method is introduced for proteins whose 3D structure has previously been determined by X-ray crystallography. The goal of the method is to obtain a subset of correct assignments from a parsimonious set of 3D NMR experiments of 15N, 13C labeled proteins. Chemical shifts of sequential residue pairs are predicted from static protein structures using PPM_One, which are then compared with the corresponding experimental shifts. Globally optimized weighted matching identifies the assignments that are robust with respect to small changes in NMR cross-peak positions. The method, termed PASSPORT, is demonstrated for 4 proteins with 100-250 amino acids using 3D NHCA and 3D CBCA(CO)NH experiments as input, producing correct assignments with high reliability for 22% of the residues. The method, which works best for Gly, Ala, Ser, and Thr residues, provides assignments that serve as anchor points for additional assignments by both manual and semi-automated methods, or they can be directly used for further studies, e.g. on ligand binding, protein dynamics, or post-translational modification, such as phosphorylation.

  20. Reliable Resonance Assignments of Selected Residues of Proteins with Known Structure Based on Empirical NMR Chemical Shift Prediction

    PubMed Central

    Li, Da-Wei; Meng, Dan; Brüschweiler, Rafael

    2015-01-01

    A robust NMR resonance assignment method is introduced for proteins whose 3D structure has previously been determined by X-ray crystallography. The goal of the method is to obtain a subset of correct assignments from a parsimonious set of 3D NMR experiments of 15N, 13C labeled proteins. Chemical shifts of sequential residue pairs are predicted from static protein structures using PPM_One, which are then compared with the corresponding experimental shifts. Globally optimized weighted matching identifies the assignments that are robust with respect to small changes in NMR cross-peak positions. The method, termed PASSPORT, is demonstrated for 4 proteins with 100-250 amino acids using 3D NHCA and 3D CBCA(CO)NH experiments as input, producing correct assignments with high reliability for 22% of the residues. The method, which works best for Gly, Ala, Ser, and Thr residues, provides assignments that serve as anchor points for additional assignments by both manual and semi-automated methods, or they can be directly used for further studies, e.g. on ligand binding, protein dynamics, or post-translational modification, such as phosphorylation. PMID:25863893

  1. Salivary pH and Buffering Capacity as Risk Markers for Early Childhood Caries: A Clinical Study.

    PubMed

    Jayaraj, D; Ganesan, S

    2015-01-01

    The diagnostic utility of saliva is currently being explored in various branches of dentistry, remarkably in the field of caries research. This study aimed to determine whether assessment of salivary pH and buffering capacity would serve as a reliable tool for risk prediction of early childhood caries (ECC). Paraffin-stimulated salivary samples were collected from 50 children with ECC (group I) and 50 caries-free children (group II). Salivary pH and buffering capacity (by titration with 0.1 N hydrochloric acid) were assessed using a handheld digital pH meter in both groups. The data obtained were subjected to statistical analysis. Statistically, no significant difference was observed between the two groups for all salivary parameters assessed, except for the buffering capacity level at 150 μl titration of 0.1 N hydrochloric acid (p = 0.73; significant at 1% level). Salivary pH and buffering capacity may not serve as reliable markers for risk prediction of ECC. How to cite this article: Jayaraj D, Ganesan S. Salivary pH and Buffering Capacity as Risk Markers for Early Childhood Caries: A Clinical Study. Int J Clin Pediatr Dent 2015;8(3):167-171.

  2. A Systematic Review of the Reliability and Validity of Behavioural Tests Used to Assess Behavioural Characteristics Important in Working Dogs.

    PubMed

    Brady, Karen; Cracknell, Nina; Zulch, Helen; Mills, Daniel Simon

    2018-01-01

    Working dogs are selected based on predictions from tests that they will be able to perform specific tasks in often challenging environments. However, withdrawal from service in working dogs is still a big problem, bringing into question the reliability of the selection tests used to make these predictions. A systematic review was undertaken aimed at bringing together available information on the reliability and predictive validity of the assessment of behavioural characteristics used with working dogs to establish the quality of selection tests currently available for use to predict success in working dogs. The search procedures resulted in 16 papers meeting the criteria for inclusion. A large range of behaviour tests and parameters were used in the identified papers, and so behaviour tests and their underpinning constructs were grouped on the basis of their relationship with positive core affect (willingness to work, human-directed social behaviour, object-directed play tendencies) and negative core affect (human-directed aggression, approach withdrawal tendencies, sensitivity to aversives). We then examined the papers for reports of inter-rater reliability, within-session intra-rater reliability, test-retest validity and predictive validity. The review revealed a widespread lack of information relating to the reliability and validity of measures to assess behaviour and inconsistencies in terminologies, study parameters and indices of success. There is a need to standardise the reporting of these aspects of behavioural tests in order to improve the knowledge base of what characteristics are predictive of optimal performance in working dog roles, improving selection processes and reducing working dog redundancy. We suggest the use of a framework based on explaining the direct or indirect relationship of the test with core affect.

  3. Open EFTs, IR effects & late-time resummations: systematic corrections in stochastic inflation

    DOE PAGES

    Burgess, C. P.; Holman, R.; Tasinato, G.

    2016-01-26

    Though simple inflationary models describe the CMB well, their corrections are often plagued by infrared effects that obstruct a reliable calculation of late-time behaviour. Here we adapt to cosmology tools designed to address similar issues in other physical systems, with the goal of making reliable late-time inflationary predictions. The main such tool is Open EFTs, which reduce in the inflationary case to Stochastic Inflation plus calculable corrections. We apply this to a simple inflationary model that is complicated enough to have dangerous IR behaviour yet simple enough to allow the inference of late-time behaviour. We find corrections to standard Stochastic Inflationary predictions for the noise and drift, and we find these corrections ensure the IR finiteness of both these quantities. The late-time probability distribution, P(Φ), for super-Hubble field fluctuations is obtained as a function of the noise and drift, and so it too is IR finite. We compare our results to other methods (such as large-N models) and find they agree when these models are reliable. In all cases we can explore in detail, we find IR secular effects describe the slow accumulation of small perturbations to give a big effect: a significant distortion of the late-time probability distribution for the field. But the energy density associated with this is only of order H^4 at late times and so does not generate a dramatic gravitational back-reaction.

  4. Open EFTs, IR effects & late-time resummations: systematic corrections in stochastic inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgess, C. P.; Holman, R.; Tasinato, G.

    Though simple inflationary models describe the CMB well, their corrections are often plagued by infrared effects that obstruct a reliable calculation of late-time behaviour. Here we adapt to cosmology tools designed to address similar issues in other physical systems, with the goal of making reliable late-time inflationary predictions. The main such tool is Open EFTs, which reduce in the inflationary case to Stochastic Inflation plus calculable corrections. We apply this to a simple inflationary model that is complicated enough to have dangerous IR behaviour yet simple enough to allow the inference of late-time behaviour. We find corrections to standard Stochastic Inflationary predictions for the noise and drift, and we find these corrections ensure the IR finiteness of both these quantities. The late-time probability distribution, P(Φ), for super-Hubble field fluctuations is obtained as a function of the noise and drift, and so it too is IR finite. We compare our results to other methods (such as large-N models) and find they agree when these models are reliable. In all cases we can explore in detail, we find IR secular effects describe the slow accumulation of small perturbations to give a big effect: a significant distortion of the late-time probability distribution for the field. But the energy density associated with this is only of order H^4 at late times and so does not generate a dramatic gravitational back-reaction.

  5. Comparative Analysis of NOAA REFM and SNB3GEO Tools for the Forecast of the Fluxes of High-Energy Electrons at GEO

    NASA Technical Reports Server (NTRS)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
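    The Heidke skill score used above is computed from the 2x2 forecast/observation contingency table; it equals 1 for perfect forecasts and 0 for forecasts no better than chance, which makes it suitable for rare high-flux events where raw accuracy is misleading. A minimal sketch with invented counts:

```python
# Heidke skill score from a 2x2 contingency table:
# a = hits, b = false alarms, c = misses, d = correct rejections.
# Counts below are invented for illustration.

def heidke_skill_score(a, b, c, d):
    """HSS = 2(ad - bc) / [(a+c)(c+d) + (a+b)(b+d)]."""
    num = 2.0 * (a * d - b * c)
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return num / den

# Hypothetical daily 'high-flux event' forecasts vs observations
print(round(heidke_skill_score(a=12, b=5, c=4, d=79), 3))
```

    Note that with rare events, always forecasting "no event" can score high raw accuracy (here 83/100) yet has HSS = 0, which is why the abstract reaches for this metric.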

  6. Performance of 2-D shear wave elastography in liver fibrosis assessment compared with serologic tests and transient elastography in clinical routine.

    PubMed

    Bota, Simona; Paternostro, Rafael; Etschmaier, Alexandra; Schwarzer, Remy; Salzl, Petra; Mandorfer, Mattias; Kienbacher, Christian; Ferlitsch, Monika; Reiberger, Thomas; Trauner, Michael; Peck-Radosavljevic, Markus; Ferlitsch, Arnulf

    2015-09-01

    Liver stiffness values assessed with 2-D shear wave elastography (SWE), transient elastography (TE) and simple serologic tests were compared with respect to non-invasive assessment in a cohort of 127 consecutive patients with chronic liver diseases. The rate of reliable liver stiffness measurements was significantly higher with 2-D SWE than with TE: 99.2% versus 74.8%, p < 0.0001 (different reliability criteria used, according to current recommendations). In univariate analysis, liver stiffness measured with 2-D SWE correlated best with fibrosis stage estimated with TE (r = 0.699, p < 0.0001), followed by Forns score (r = 0.534, p < 0.0001) and King's score (r = 0.512, p < 0.0001). However, in multivariate analysis, only 2-D SWE-measured values remained correlated with fibrosis stage (p < 0.0001). The optimal 2-D SWE cutoff values for predicting significant fibrosis were 8.03 kPa for fibrosis stage ≥2 (area under the receiver operating characteristic curve = 0.832) and 13.1 kPa for fibrosis stage 4 (area under the receiver operating characteristic curve = 0.915), respectively. In conclusion, 2-D SWE can be used to obtain reliable liver stiffness measurements in almost all patients and performs very well in predicting the presence of liver cirrhosis. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  7. Comparative analysis of NOAA REFM and SNB3GEO tools for the forecast of the fluxes of high-energy electrons at GEO.

    PubMed

    Balikhin, M A; Rodriguez, J V; Boynton, R J; Walker, S N; Aryan, H; Sibeck, D G; Billings, S A

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
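
The Heidke skill score mentioned above measures categorical event forecasts against chance agreement. A minimal sketch of its standard 2x2 form; the verification counts below are hypothetical, not the paper's data:

```python
def heidke_skill_score(hits, misses, false_alarms, correct_negatives):
    """Heidke skill score for a 2x2 event/no-event contingency table.

    HSS = 2(ad - bc) / [(a + c)(c + d) + (a + b)(b + d)]
    with a = hits, b = false alarms, c = misses, d = correct negatives.
    HSS = 1 for a perfect forecast, 0 for chance-level skill.
    """
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    numerator = 2.0 * (a * d - b * c)
    denominator = (a + c) * (c + d) + (a + b) * (b + d)
    return numerator / denominator

# Hypothetical daily verification counts for rare high-flux events:
hss = heidke_skill_score(hits=20, misses=5, false_alarms=10, correct_negatives=330)
```

Because the denominator accounts for chance hits, HSS is better suited than plain accuracy when the event (here, very high fluence) is infrequent.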

  8. Semen molecular and cellular features: these parameters can reliably predict subsequent ART outcome in a goat model

    PubMed Central

    Berlinguer, Fiammetta; Madeddu, Manuela; Pasciu, Valeria; Succu, Sara; Spezzigu, Antonio; Satta, Valentina; Mereu, Paolo; Leoni, Giovanni G; Naitana, Salvatore

    2009-01-01

    Currently, the assessment of sperm function in a raw or processed semen sample is not able to reliably predict sperm ability to withstand freezing and thawing procedures and in vivo fertility and/or assisted reproductive biotechnologies (ART) outcome. The aim of the present study was to investigate which parameters among a battery of analyses could predict subsequent spermatozoa in vitro fertilization ability and hence blastocyst output in a goat model. Ejaculates were obtained by artificial vagina from 3 adult goats (Capra hircus) aged 2 years (A, B and C). In order to assess the predictive value of viability, computer assisted sperm analyzer (CASA) motility parameters and ATP intracellular concentration before and after thawing and of DNA integrity after thawing on subsequent embryo output after an in vitro fertility test, a logistic regression analysis was used. Individual differences in semen parameters were evident for semen viability after thawing and DNA integrity. Results of the IVF test showed that spermatozoa collected from A and B led to higher cleavage rates (p < 0.01) and blastocyst output (p < 0.05) compared with C. The logistic regression model explained a deviance of 72% (p < 0.0001), directly related with the mean percentage of rapid spermatozoa in fresh semen (p < 0.01), semen viability after thawing (p < 0.01), and with two of the three comet parameters considered, i.e., tail DNA percentage and comet length (p < 0.0001). DNA integrity alone had a high predictive value on IVF outcome with frozen/thawed semen (deviance explained: 57%). The model proposed here represents one of the many possible ways to explain differences found in embryo output following IVF with different semen donors and may represent a useful tool to select the most suitable donors for semen cryopreservation. PMID:19900288

  9. Lack of utility of arteriojugular venous differences of lactate as a reliable indicator of increased brain anaerobic metabolism in traumatic brain injury.

    PubMed

    Poca, Maria A; Sahuquillo, Juan; Vilalta, Anna; Garnacho, Angel

    2007-04-01

    Ischemic lesions are highly prevalent in patients with traumatic brain injuries (TBIs) and are the single most important cause of secondary brain damage. The prevention and early treatment of these lesions is the primary aim in the modern treatment of these patients. One of the most widely used monitoring techniques at the bedside is quantification of the brain extracellular level of lactate by using arteriojugular venous differences of lactate (AVDL). The purpose of this study was to determine the sensitivity, specificity, and predictive value of AVDL as an indicator of increases in brain lactate production in patients with TBIs. Arteriojugular venous differences of lactate were calculated every 6 hours using samples obtained through a catheter placed in the jugular bulb in 45 patients with diffuse head injuries (57.8%) or evacuated brain lesions (42.2%). Cerebral lactate concentration obtained with a 20-kD microdialysis catheter implanted in undamaged tissue was used as the de facto gold standard. Six hundred seventy-three AVDL determinations and cerebral microdialysis samples were obtained simultaneously; 543 microdialysis samples (81%) showed lactate values greater than 2 mmol/L, but only 21 AVDL determinations (3.1%) showed an increase in brain lactate. No correlation was found between AVDL and cerebral lactate concentration (ρ = 0.014, p = 0.719). Arteriojugular venous differences of lactate had a sensitivity and specificity of 3.3% and 97.7%, respectively, with a false-negative rate of 96.7% and a false-positive rate of 2.3%. Arteriojugular venous differences of lactate do not reliably reflect increased cerebral lactate production and consequently are not reliable in ruling out brain ischemia in patients with TBIs. The clinical use of this monitoring method in neurocritical care should be reconsidered.
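
The rates reported above follow from a standard 2x2 contingency table. A sketch of the arithmetic; the counts are a hypothetical reconstruction inferred from the abstract's percentages (543 elevated and 130 non-elevated microdialysis samples), not the paper's raw table:

```python
def diagnostic_rates(tp, fn, tn, fp):
    """Sensitivity, specificity, and false-negative/false-positive rates (%)
    from true/false positive and negative counts."""
    sensitivity = 100.0 * tp / (tp + fn)
    specificity = 100.0 * tn / (tn + fp)
    return {
        "sensitivity": round(sensitivity, 1),
        "specificity": round(specificity, 1),
        "false_negative_rate": round(100.0 - sensitivity, 1),
        "false_positive_rate": round(100.0 - specificity, 1),
    }

# Hypothetical counts consistent with the reported figures: roughly 18 of the
# 543 elevated samples detected by AVDL, and 3 false positives among the 130
# non-elevated samples.
rates = diagnostic_rates(tp=18, fn=525, tn=127, fp=3)
```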

  10. Fatigue criterion to system design, life and reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.

    1985-01-01

    A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed life of elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing for life prediction and component or structural renewal rates to be determined with reasonable statistical certainty.
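
The Lundberg-Palmgren/Weibull approach combines elemental lives into a system life; for a series system with a common Weibull slope e, the rule is 1/L_sys^e = sum(1/L_i^e). A minimal sketch of that combination, with hypothetical lives and the classical rolling-bearing slope:

```python
def system_life(component_lives, weibull_slope):
    """Combine elemental/component fatigue lives into a system life via the
    Lundberg-Palmgren series-system rule, assuming a common Weibull slope e:

        L_sys = ( sum(L_i ** -e) ) ** (-1/e)
    """
    e = weibull_slope
    return sum(L ** -e for L in component_lives) ** (-1.0 / e)

# Two identical elements, e = 10/9 (classical ball-bearing value);
# the system life is shorter than either element's life:
L_sys = system_life([1000.0, 1000.0], weibull_slope=10.0 / 9.0)
```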

  11. 78 FR 73424 - Retirement of Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-06

    ... predicated on the view that many violations of requirements currently included in Reliability Standards pose... for Bulk-Power System reliability or may be redundant. The Commission is interested in obtaining views... redundant. The Commission is interested in obtaining views on whether such requirements could be removed...

  12. The (un)reliability of item-level semantic priming effects.

    PubMed

    Heyman, Tom; Bruninx, Anke; Hutchison, Keith A; Storms, Gert

    2018-04-05

    Many researchers have tried to predict semantic priming effects using a myriad of variables (e.g., prime-target associative strength or co-occurrence frequency). The idea is that relatedness varies across prime-target pairs, which should be reflected in the size of the priming effect (e.g., cat should prime dog more than animal does). However, it is only insightful to predict item-level priming effects if they can be measured reliably. Thus, in the present study we examined the split-half and test-retest reliabilities of item-level priming effects under conditions that should discourage the use of strategies. The resulting priming effects proved extremely unreliable, and reanalyses of three published priming datasets revealed similar cases of low reliability. These results imply that previous attempts to predict semantic priming were unlikely to be successful. However, one study with an unusually large sample size yielded more favorable reliability estimates, suggesting that big data, in terms of items and participants, should be the future for semantic priming research.

  13. Two-dimensional echo-cardiographic estimation of left atrial volume and volume load in patients with congenital heart disease.

    PubMed

    Kawaguchi, A; Linde, L M; Imachi, T; Mizuno, H; Akutsu, H

    1983-12-01

    To estimate the left atrial volume (LAV) and pulmonary blood flow in patients with congenital heart disease (CHD), we employed two-dimensional echocardiography (TDE). The LAV was measured in dimensions other than those obtained in conventional M-mode echocardiography (M-mode echo). Mathematical and geometrical models for LAV calculation using the standard long-axis, short-axis and apical four-chamber planes were devised and found to be reliable in a preliminary study using porcine heart preparations, although length (10%), area (20%) and volume (38%) were significantly and consistently underestimated with echocardiography. Those models were then applied and correlated with angiocardiograms (ACG) in 25 consecutive patients with suspected CHD. In terms of the estimation of the absolute LAV, accuracy seemed commensurate with the number of the dimensions measured. The correlation between data obtained by TDE and ACG varied with changing hemodynamics such as cardiac cycle, absolute LAV and presence or absence of volume load. The left atrium was found to become spherical and progressively underestimated with TDE at ventricular end-systole, with larger LAV and with increased volume load. Since this tendency became less pronounced when additional dimensions were measured, reliable estimation of the absolute LAV and volume load was possible when 2 or 3 dimensions were measured. Among those calculation models depending on 2- or 3-dimensional measurements, there was only a small difference in terms of accuracy and predictability, although the algorithm used varied from one model to another. This suggests that accurate cross-sectional area measurement is critically important for volume estimation rather than any particular algorithm involved. Cross-sectional area measurement by TDE integrated into a three-dimensional equivalent allowed a reliable estimate of the LAV or volume load in a variety of hemodynamic situations where M-mode echo was not reliable.

  14. Earthquake Prediction in a Big Data World

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started just about 15 years ago has already pushed the global information storage capacity beyond 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provides unprecedented opportunities for enhancing studies of the Earth System. However, it also opens wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regrettably, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies is evident, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, can easily be compared to random guessing; this comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. The information obtained in such simple testing can supply realistic estimates of the confidence and accuracy of SHA predictions and, if these are reliable though not necessarily perfect, related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management. Examples of independent expertise of "seismic hazard maps", "precursors", and "forecast/prediction methods" are provided.
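
A point on Molchan's error diagram pairs the miss rate with the alerted space-time fraction; random guessing lies on the diagonal nu = 1 - tau. A minimal sketch with hypothetical, non-overlapping alert windows and event times:

```python
def molchan_point(alarm_windows, events, total_time):
    """One point on a Molchan error diagram.

    alarm_windows: list of non-overlapping (start, end) alert intervals.
    events: list of target-event times.
    total_time: length of the whole observation period.
    Returns (tau, nu): fraction of time on alert and fraction of events
    missed. Random guessing lies on the line nu = 1 - tau.
    """
    alerted = sum(end - start for start, end in alarm_windows)
    tau = alerted / total_time
    missed = sum(
        not any(start <= t < end for start, end in alarm_windows)
        for t in events
    )
    nu = missed / len(events)
    return tau, nu

# Hypothetical alerts covering 20% of the period and catching 3 of 4 events:
tau, nu = molchan_point([(10.0, 30.0)], [12.0, 15.0, 25.0, 80.0], 100.0)
```

A method outperforms random guessing exactly when its point falls below the nu = 1 - tau diagonal.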

  15. Obtaining Reliable Predictions of Terrestrial Energy Coupling From Real-Time Solar Wind Measurement

    NASA Technical Reports Server (NTRS)

    Weimer, Daniel R.

    2001-01-01

    The first draft of a manuscript titled "Variable time delays in the propagation of the interplanetary magnetic field" has been completed, for submission to the Journal of Geophysical Research. In the preparation of this manuscript all data and analysis programs had been updated to the highest temporal resolution possible, at 16 seconds or better. The program which computes the "measured" IMF propagation time delays from these data has also undergone another improvement. In another significant development, a technique has been developed in order to predict IMF phase plane orientations, and the resulting time delays, using only measurements from a single satellite at L1. The "minimum variance" method is used for this computation. Further work will be done on optimizing the choice of several parameters for the minimum variance calculation.

  16. Optimized model tuning in medical systems.

    PubMed

    Kléma, Jirí; Kubalík, Jirí; Lhotská, Lenka

    2005-12-01

    In medical systems it is often advantageous to utilize specific problem situations (cases) in addition to or instead of a general model. Decisions are then based on relevant past cases retrieved from a case memory. The reliability of such decisions depends directly on the ability to identify cases of practical relevance to the current situation. This paper discusses issues of automated tuning in order to obtain a proper definition of mutual case similarity in a specific medical domain. The main focus is on a reasonably time-consuming optimization of the parameters that determine case retrieval and further utilization in decision making/prediction. The two case studies - mortality prediction after cardiological intervention, and resource allocation at a spa - document that the optimization process is influenced by various characteristics of the problem domain.
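
Case retrieval of the kind described hinges on a tunable similarity measure; the feature weights are the parameters one would optimize. A minimal sketch, with all data, weights, and outcome labels hypothetical:

```python
import math

def weighted_distance(case_a, case_b, weights):
    """Weighted Euclidean distance between two feature vectors; the weights
    are the tunable parameters of the case-similarity definition."""
    return math.sqrt(sum(w * (x - y) ** 2
                         for w, x, y in zip(weights, case_a, case_b)))

def retrieve(query, case_memory, weights, k=3):
    """Return the k past cases most similar to the query situation."""
    ranked = sorted(case_memory,
                    key=lambda case: weighted_distance(query, case[0], weights))
    return ranked[:k]

# Hypothetical case memory: (feature vector, outcome) pairs.
memory = [((1.0, 0.2), "survived"), ((0.9, 0.3), "survived"),
          ((0.1, 0.9), "died"), ((0.2, 0.8), "died")]
nearest = retrieve((0.95, 0.25), memory, weights=(1.0, 1.0), k=2)
outcomes = [outcome for _, outcome in nearest]
```

An outer optimization loop would then adjust `weights` to maximize predictive accuracy of the retrieved cases on held-out data.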

  17. Life prediction for white OLED based on LSM under lognormal distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Jianping; Liu, Fang; Liu, Yu; Wu, Helen; Zhu, Wenqing; Wu, Wenli; Wu, Liang

    2012-09-01

    In order to acquire reliability information for white organic light-emitting diode (OLED) displays, three groups of OLED constant stress accelerated life tests (CSALTs) were carried out to obtain failure data for the samples. A lognormal distribution function was applied to describe the OLED life distribution, and the accelerated life equation was determined by the least squares method (LSM). The Kolmogorov-Smirnov test was performed to verify whether the white OLED life follows a lognormal distribution. Author-developed software was employed to predict the average life and the median life. The numerical results indicate that the white OLED life follows a lognormal distribution and that the accelerated life equation closely follows the inverse power law. The estimated life information for the white OLED provides manufacturers and customers with important guidelines.
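
A least-squares fit of an inverse-power accelerated life equation reduces to linear regression on logs: L = C * S^(-n) gives ln L = ln C - n * ln S. A sketch with synthetic (hypothetical) stress levels and median lives, not the paper's CSALT data:

```python
import math

def fit_inverse_power_law(stresses, median_lives):
    """Least-squares fit of the inverse power law L = C * S**(-n).

    Taking logarithms gives ln L = ln C - n * ln S, a straight line fitted
    here by ordinary least squares on (ln S, ln L). Under a lognormal life
    distribution, the median log-life is linear in log stress.
    """
    xs = [math.log(s) for s in stresses]
    ys = [math.log(L) for L in median_lives]
    x_bar = sum(xs) / len(xs)
    y_bar = sum(ys) / len(ys)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    n = -slope
    C = math.exp(y_bar + n * x_bar)
    return C, n

# Synthetic medians at three stress levels, generated from L = 800000 * S**-2:
C, n = fit_inverse_power_law([10.0, 20.0, 40.0], [8000.0, 2000.0, 500.0])
```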

  18. Optimization of marine waste-based growth media for microbial lipase production using mixture design methodology.

    PubMed

    Sellami, Mohamed; Kedachi, Samiha; Frikha, Fakher; Miled, Nabil; Ben Rebah, Faouzi

    2013-01-01

    Lipase production by Staphylococcus xylosus and Rhizopus oryzae was investigated using a culture medium based on a mixture of synthetic medium and supernatants generated from tuna by-products and Ulva rigida biomass. The proportion of the three medium components was optimized using the simplex-centroid mixture design method (SCMD). Results indicated that the experimental data were in good agreement with predicted values, indicating that SCMD was a reliable method for determining the optimum mixture proportion of the growth medium. Maximal lipase activities of 12.5 and 23.5 IU/mL were obtained with a 50:50 (v:v) mixture of synthetic medium and tuna by-product supernatant for Staphylococcus xylosus and Rhizopus oryzae, respectively. The predicted responses from these mixture proportions were also validated experimentally.
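
A simplex-centroid design for three components supports a special-cubic Scheffé polynomial in the mixture proportions. A sketch of evaluating such a model at a 50:50 blend; the coefficients are hypothetical, not the study's fitted values:

```python
def scheffe_special_cubic(x, b):
    """Special-cubic Scheffé polynomial for a 3-component mixture, the model
    a simplex-centroid design (SCMD) is built to fit:

        y = b1*x1 + b2*x2 + b3*x3
            + b12*x1*x2 + b13*x1*x3 + b23*x2*x3 + b123*x1*x2*x3

    with x1 + x2 + x3 = 1.
    """
    x1, x2, x3 = x
    b1, b2, b3, b12, b13, b23, b123 = b
    return (b1 * x1 + b2 * x2 + b3 * x3
            + b12 * x1 * x2 + b13 * x1 * x3 + b23 * x2 * x3
            + b123 * x1 * x2 * x3)

# Hypothetical coefficients; predicted lipase activity (IU/mL) for a 50:50
# blend of synthetic medium (x1) and tuna by-product supernatant (x2):
b = (5.0, 10.0, 2.0, 35.0, 0.0, 4.0, 0.0)
activity = scheffe_special_cubic((0.5, 0.5, 0.0), b)
```

A positive binary blending coefficient (b12 here) models the synergy that makes the 50:50 blend outperform either pure component.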

  19. Testing the reliability of ice-cream cone model

    NASA Astrophysics Data System (ADS)

    Pan, Z.; Shen, C.; Wang, Y.; Liu, K.

    2013-12-01

    The properties of coronal mass ejections (CMEs) are important both for the physics itself and for space weather prediction. Several models (such as the cone model, the GCS model, and so on) have been proposed to remove the projection effects from the properties observed by spacecraft. From SOHO/LASCO observations, we obtain the 'real' 3D parameters of 33 FFHCMEs (front-side full halo coronal mass ejections) within the 24th solar cycle using the ice-cream cone model. Because parameters obtained from multi-satellite, multi-angle CME observations have higher accuracy, we use the GCS model to obtain the real propagation parameters of these CMEs in 3D space and compare the results with those from the ice-cream cone model. The correlation coefficient for the speeds obtained by the two methods is 0.97.

  20. Isomorphic red blood cells using automated urine flow cytometry is a reliable method in diagnosis of bladder cancer.

    PubMed

    Muto, Satoru; Sugiura, Syo-Ichiro; Nakajima, Akiko; Horiuchi, Akira; Inoue, Masahiro; Saito, Keisuke; Isotani, Shuji; Yamaguchi, Raizo; Ide, Hisamitsu; Horie, Shigeo

    2014-10-01

    We aimed to identify patients with a chief complaint of hematuria who could safely avoid unnecessary radiation and instrumentation in the diagnosis of bladder cancer (BC), using automated urine flow cytometry to detect isomorphic red blood cells (RBCs) in urine. We acquired urine samples from 134 patients over the age of 35 years with a chief complaint of hematuria and a positive urine occult blood test or microhematuria. The data were analyzed using the UF-1000i® (Sysmex Co., Ltd., Kobe, Japan) automated urine flow cytometer to determine RBC morphology, which was classified as isomorphic or dysmorphic. The patients were divided into two groups (BC versus non-BC) for statistical analysis. Multivariate logistic regression analysis was used to determine the predictive value of flow cytometry versus urine cytology, the bladder tumor antigen test, occult blood in urine test, and microhematuria test. BC was confirmed in 26 of 134 patients (19.4 %). The area under the curve for RBC count using the automated urine flow cytometer was 0.94, representing the highest reference value obtained in this study. Isomorphic RBCs were detected in all patients in the BC group. On multivariate logistic regression analysis, only isomorphic RBC morphology was significantly predictive for BC (p < 0.001). Analytical parameters such as sensitivity, specificity, positive predictive value, and negative predictive value of isomorphic RBCs in urine were 100.0, 91.7, 74.3, and 100.0 %, respectively. Detection of urinary isomorphic RBCs using automated urine flow cytometry is a reliable method in the diagnosis of BC with hematuria.

  1. The mere exposure effect is differentially sensitive to different judgment tasks.

    PubMed

    Seamon, J G; McKenna, P A; Binder, N

    1998-03-01

    The mere exposure effect is the increase in positive affect that results from repeated exposure to previously novel stimuli. We sought to determine if judgments other than affective preference could reliably produce a mere exposure effect for two-dimensional random shapes. In two experiments, we found that brighter and darker judgments did not differentiate target from distracter shapes, liking judgments led to target selection greater than chance, and disliking judgments led to distracter selection greater than chance. These results for brighter, darker, and liking judgments were obtained regardless of whether shape recognition was greater (Experiment 1) or not greater (Experiment 2) than chance. Effects of prior exposure to novel shapes were reliably observed only for affective judgment tasks. These results are inconsistent with general predictions made by the nonspecific activation hypothesis, but not with the affective primacy or perceptual fluency hypotheses, which are discussed in terms of cognitive neuroscience research. Copyright 1998 Academic Press.

  2. Detection of antibodies against hepatitis A in blood spots dried on filter paper. Is this a reliable method for epidemiological studies?

    PubMed Central

    Gil, A.; González, A.; Dal-Ré, R.; Dominguez, V.; Astasio, P.; Aguilar, L.

    1997-01-01

    Diluted dried blood drops on filter paper were compared with serum samples as a specimen source for qualitative anti-HAV antibody determination by ELISA. A total of 298 serum samples and dried blood drops were collected from a population of healthy adolescents (15.3 +/- 1.2 years old). The prevalence of anti-HAV antibody obtained by testing serum samples was 7.7% (95% CI: 4.8-10.1). Compared with serum sampling, the sensitivity and specificity of diluted dried blood drops were 91.3% and 99.3%. The positive and negative predictive values were 91.3% and 99.3%, respectively, and the likelihood ratios of positive and negative results were 91 and 0.09. It is proposed that this test represents a reliable procedure for anti-HAV antibody testing. PMID:9129596
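
Likelihood ratios of the kind reported above follow directly from sensitivity and specificity: LR+ = sens/(1 - spec) and LR- = (1 - sens)/spec. A sketch with round hypothetical values (not the study's exact figures, which are sensitive to how the percentages are rounded):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from sensitivity and
    specificity, both given as fractions:

        LR+ = sensitivity / (1 - specificity)
        LR- = (1 - sensitivity) / specificity
    """
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Hypothetical test with 90% sensitivity and 99% specificity:
lr_pos, lr_neg = likelihood_ratios(0.90, 0.99)
```

Note that LR+ divides by (1 - specificity), so a small rounding change in a specificity near 100% moves LR+ substantially.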

  3. Reliability and Validity of the Work and Well-Being Inventory (WBI) for Employees.

    PubMed

    Vendrig, A A; Schaafsma, F G

    2018-06-01

    Purpose The purpose of this study is to measure the psychometric properties of the Work and Wellbeing Inventory (WBI) (in Dutch: VAR-2), a screening tool that is used within occupational health care and rehabilitation. Our research question focused on the reliability and validity of this inventory. Methods Over the years seven different samples of workers, patients and sick listed workers, varying in size between 89 and 912 participants (total: 2514), were used to measure the test-retest reliability, the internal consistency, the construct and concurrent validity, and the criterion and predictive validity. Results The 13 scales displayed good internal consistency and test-retest reliability. The construct validity of the WBI could clearly be demonstrated in both patients and healthy workers. Confirmative factor analyses revealed a CFI >.90 for all scales. The depression scale predicted future work absenteeism (>6 weeks) because of a common mental disorder in healthy workers. The job strain scale and the illness behavior scale predicted long-term absenteeism (>3 months) in workers with short-term absenteeism. The illness behavior scale moderately predicted return to work in rehab patients attending an intensive multidisciplinary program. Conclusions The WBI is a valid and reliable tool for occupational health practitioners to screen for risk factors for prolonged or future sickness absence. With this tool they will have reliable indications for further advice and interventions to restore the work ability.

  4. Motion resistant pulse oximetry in neonates

    PubMed Central

    Sahni, R; Gupta, A; Ohira-Kist, K; Rosen, T

    2003-01-01

    Background: Pulse oximetry is widely used in neonates. However, its reliability is often affected by motion artefact. Clinicians confronted with questionable oxygen saturation (SpO2) values often estimate the reliability by correlating heart rate (HR) obtained with the oximeter with that obtained by electrocardiogram. Objective: To compare the effects of motion on SpO2 and HR measurements made with Masimo signal extraction technology and those made with a Nellcor N-200. Design: Continuous pulse oximetry and HR monitoring were performed in 15 healthy, term infants (mean (SD) birth weight 3408 (458) g) undergoing circumcision, using Masimo and Nellcor pulse oximeters and a standard HR monitor (Hewlett-Packard). Simultaneous minute by minute behavioural activity codes were also assigned. Baseline data were collected for 10 minutes when the infant was quietly asleep and then continued during and after circumcision for a total duration of one hour. The oximeter HR and SpO2 values were compared and related to HR values obtained by ECG during all three periods. The effect of behavioural activity on SpO2 and HR was also evaluated. Results: When compared with results obtained with the Nellcor, the mean SpO2 and HR were higher and the incidence of artefact lower with the Masimo during all three periods. Masimo HR more accurately predicted HR obtained with a standard monitor, with lower residual error. SpO2 and HR values obtained with the Nellcor were lower and more variable during all behavioural states, especially crying, when excessive motion artefact was most likely. Conclusions: The data suggest that Masimo signal extraction technology may offer improvement in pulse oximetry performance, particularly in clinical situations in which extreme motion artefacts are likely. PMID:14602699

  5. The Effect of Cognitive Load and Outcome Congruency on the Learned Predictiveness Effect in Human Predictive Learning

    ERIC Educational Resources Information Center

    Jorge A. Pinto,; Vogel, Edgar H.; Núñez, Daniel E.

    2017-01-01

    The learned predictiveness effect or LPE is the finding that when people learn that certain cues are reliable predictors of an outcome in an initial stage of training (phase 1), they exhibit a learning bias in favor of these cues in a subsequent training involving new outcomes (phase 2) despite all cues being equally reliable in phase 2. In…

  6. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    PubMed Central

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used over long periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed that enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of the simulation results focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output as a function of wind speed. The experimental research consisted of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. The variation of the power produced at different wind speeds, the Weibull distribution of wind probability, and the power generated were also determined. The analysis of the experimental results indicates that this type of wind turbine is efficient at low wind speeds. PMID:26167524

  7. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    PubMed

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used over long periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed that enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of the simulation results focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output as a function of wind speed. The experimental research consisted of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. The variation of the power produced at different wind speeds, the Weibull distribution of wind probability, and the power generated were also determined. The analysis of the experimental results indicates that this type of wind turbine is efficient at low wind speeds.
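
A Monte Carlo estimate of daily energy output from a triangular wind-speed distribution can be sketched as follows; the power curve and every parameter here are hypothetical, not the turbine described in the study:

```python
import random

def simulate_daily_energy(n_draws, v_min, v_mode, v_max, power_curve):
    """Monte Carlo sketch of daily energy output (kWh): draw wind speeds from
    a triangular distribution and push them through a power curve (kW)."""
    random.seed(42)  # reproducible demo
    # random.triangular takes (low, high, mode).
    return [24.0 * power_curve(random.triangular(v_min, v_max, v_mode))
            for _ in range(n_draws)]

def toy_power_curve(v):
    """Hypothetical kW output of a small VAWT: zero below cut-in, cubic in
    wind speed, capped at a rated power of 5 kW."""
    if v < 2.0:
        return 0.0
    return min(0.1 * v ** 3, 5.0)

energies = simulate_daily_energy(10_000, 1.0, 4.0, 12.0, toy_power_curve)
# Empirical probability of exceeding a daily output threshold (ogive tail):
p_above_20 = sum(e > 20.0 for e in energies) / len(energies)
```

Sorting `energies` and plotting the cumulative fractions would reproduce the ogive diagram described in the abstract.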

  8. Objective estimation of patient age through a new composite scale for facial aging assessment: The face - Objective assessment scale.

    PubMed

    La Padula, Simone; Hersant, Barbara; SidAhmed, Mounia; Niddam, Jeremy; Meningaud, Jean Paul

    2016-07-01

    Most patients requesting aesthetic rejuvenation treatment expect to look healthier and younger. Some scales for ageing assessment have been proposed, but none is focused on patient age prediction. The aim of this study was to develop and validate a new facial rating scale assessing facial ageing sign severity. One thousand Caucasian patients were included and assessed. The Rasch model was used as part of the validation process. A score was attributed to each patient, based on the scales we developed. The correlation between the real age and scores obtained, the inter-rater reliability and test-retest reliability were analysed. The objective was to develop a tool enabling the assigning of a patient to a specific age range based on the calculated score. All scales exceeded criteria for acceptability, reliability and validity. The real age strongly correlated with the total facial score in both sex groups. The test-retest reliability confirmed this strong correlation. We developed a facial ageing scale which could be a useful tool to assess patients before and after rejuvenation treatment and an important new metrics to be used in facial rejuvenation and regenerative clinical research. Copyright © 2016 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  9. How to quantify exposure to traumatic stress? Reliability and predictive validity of measures for cumulative trauma exposure in a post-conflict population.

    PubMed

    Wilker, Sarah; Pfeiffer, Anett; Kolassa, Stephan; Koslowski, Daniela; Elbert, Thomas; Kolassa, Iris-Tatjana

    2015-01-01

    While studies with survivors of single traumatic experiences highlight individual response variation following trauma, research from conflict regions shows that almost everyone develops posttraumatic stress disorder (PTSD) if trauma exposure reaches extreme levels. Therefore, evaluating the effects of cumulative trauma exposure is of utmost importance in studies investigating risk factors for PTSD. Yet, little research has been devoted to evaluate how this important environmental risk factor can be best quantified. We investigated the retest reliability and predictive validity of different trauma measures in a sample of 227 Ugandan rebel war survivors. Trauma exposure was modeled as the number of traumatic event types experienced or as a score considering traumatic event frequencies. In addition, we investigated whether age at trauma exposure can be reliably measured and improves PTSD risk prediction. All trauma measures showed good reliability. While prediction of lifetime PTSD was most accurate from the number of different traumatic event types experienced, inclusion of event frequencies slightly improved the prediction of current PTSD. As assessing the number of traumatic events experienced is the least stressful and time-consuming assessment and leads to the best prediction of lifetime PTSD, we recommend this measure for research on PTSD etiology.

  10. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold: we describe an experimental methodology based on a data structure called the debugging graph, and we apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further, we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  11. Calculating system reliability with SRFYDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
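The Bayesian component-to-system roll-up that SRFYDO automates can be illustrated with a minimal sketch: a beta-binomial posterior mean for each component's pass/fail test data, multiplied together for a series system. The prior parameters and test counts below are hypothetical, and a real SRFYDO model additionally conditions on age and usage covariates, which this sketch omits.

```python
def beta_posterior_mean(successes, trials, a=1.0, b=1.0):
    """Posterior mean of a component reliability under a Beta(a, b) prior
    with binomial test data: (a + s) / (a + b + n)."""
    return (a + successes) / (a + b + trials)

# Hypothetical component test data: (successes, trials)
components = [(48, 50), (29, 30), (95, 100)]
comp_rel = [beta_posterior_mean(s, n) for s, n in components]

# For a series system, reliability is the product of component reliabilities.
system_rel = 1.0
for r in comp_rel:
    system_rel *= r
```

Because every component must survive, the series-system estimate is always below the weakest component's estimate, which is why component-level test data sharpen the full-system prediction.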

  12. General inattentiveness is a long-term reliable trait independently predictive of psychological health: Danish validation studies of the Mindful Attention Awareness Scale.

    PubMed

    Jensen, Christian Gaden; Niclasen, Janni; Vangkilde, Signe Allerup; Petersen, Anders; Hasselbalch, Steen Gregers

    2016-05-01

    The Mindful Attention Awareness Scale (MAAS) measures perceived degree of inattentiveness in different contexts and is often used as a reversed indicator of mindfulness. MAAS is hypothesized to reflect a psychological trait or disposition when used outside attentional training contexts, but the long-term test-retest reliability of MAAS scores is virtually untested. It is unknown whether MAAS predicts psychological health after controlling for standardized socioeconomic status classifications. First, MAAS translated to Danish was validated psychometrically within a randomly invited healthy adult community sample (N = 490). Factor analysis confirmed that MAAS scores quantified a unifactorial construct of excellent composite reliability and consistent convergent validity. Structural equation modeling revealed that MAAS scores contributed independently to predicting psychological distress and mental health, after controlling for age, gender, income, socioeconomic occupational class, stressful life events, and social desirability (β = .32-.42, ps < .001). Second, MAAS scores showed satisfactory short-term test-retest reliability in 100 retested healthy university students. Finally, MAAS sample mean scores as well as individuals' scores demonstrated satisfactory test-retest reliability across a 6-month interval in the adult community (retested N = 407), with intraclass correlations ≥ .74. MAAS scores displayed significantly stronger long-term test-retest reliability than scores measuring psychological distress (z = 2.78, p = .005). Test-retest reliability estimates did not differ within demographic and socioeconomic strata. Scores on the Danish MAAS were psychometrically validated in healthy adults. MAAS's inattentiveness scores reflected a unidimensional construct, a long-term reliable disposition, and a factor of independent significance for predicting psychological health. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
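Test-retest reliability of the kind reported here is typically summarized with an intraclass correlation. As a sketch, a one-way random-effects ICC(1,1) can be computed from subjects-by-occasions scores via the standard mean-square formula; the MAAS totals below are hypothetical, and the study itself may have used a two-way ICC variant.

```python
from statistics import mean

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for n subjects x k occasions:
    (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n, k = len(scores), len(scores[0])
    grand = mean(v for row in scores for v in row)
    row_means = [mean(row) for row in scores]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((v - m) ** 2
              for row, m in zip(scores, row_means)
              for v in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical MAAS totals at baseline and 6-month retest (one pair per subject)
data = [(4.1, 4.0), (3.2, 3.4), (5.0, 4.8), (2.8, 2.9), (4.5, 4.4), (3.9, 3.7)]
icc = icc_oneway(data)
```

An ICC near 1 means between-subject differences dominate occasion-to-occasion noise, which is what a stable trait measure should show.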

  13. A damage mechanics based approach to structural deterioration and reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattcharya, B.; Ellingwood, B.

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure, and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw, and allows one to estimate residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamical conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  14. Strategies for potential age dating of fingerprints through the diffusion of sebum molecules on a nonporous surface analyzed using time-of-flight secondary ion mass spectrometry.

    PubMed

    Muramoto, Shin; Sisco, Edward

    2015-08-18

    Age dating of fingerprints could have a significant impact in forensic science, as it has the potential to facilitate the judicial process by assessing the relevance of a fingerprint found at a crime scene. However, no method currently exists that can reliably predict the age of a latent fingerprint. In this manuscript, time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging was used to measure the diffusivity of saturated fatty acid molecules from a fingerprint on a silicon wafer. It was found that their diffusion from relatively fresh fingerprints (t ≤ 96 h) could be modeled using an error function, with diffusivities (mm²/h) that followed a power function when plotted against molecular weight. The equation x = 0.02t^0.5 was obtained for palmitic acid, which can be used to find its position in millimeters (where the concentration is 50% of its initial value, or c0/2) as a function of time in hours. The results show that on a clean silicon substrate, the age of a fingerprint (t ≤ 96 h) could reliably be obtained from the extent of diffusion of palmitic acid.
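The reported front equation x = 0.02·t^0.5 inverts directly to an age estimate, and the error-function profile mentioned in the abstract can be sketched with the standard 1-D semi-infinite-source solution. The erfc profile form and the diffusivity value used below are illustrative assumptions, not taken from the paper; only the empirical front equation is.

```python
import math

def front_position_mm(t_hours):
    """Empirical half-concentration (c0/2) front for palmitic acid from the
    abstract: x = 0.02 * t**0.5, with x in mm and t in hours."""
    return 0.02 * math.sqrt(t_hours)

def age_from_front(x_mm):
    """Invert the front equation to estimate fingerprint age in hours."""
    return (x_mm / 0.02) ** 2

def concentration_ratio(x_mm, t_hours, D):
    """Illustrative 1-D diffusion profile c/c0 = erfc(x / (2*sqrt(D*t)))
    for a semi-infinite source; D in mm^2/h (assumed form and units)."""
    return math.erfc(x_mm / (2.0 * math.sqrt(D * t_hours)))

front_96h = front_position_mm(96.0)  # front position after 96 h, in mm
age_est = age_from_front(front_96h)  # recovers the 96 h age
```

Inverting the front equation is the whole age-dating idea: measure how far palmitic acid has spread, then solve for t.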

  15. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability-based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.
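The failure probabilities that the inner-loop (most-probable-point) optimizations target can be sanity-checked by crude Monte Carlo sampling of the constraint. The limit-state function and the normal design-variable statistics below are hypothetical, chosen only to show the mechanics.

```python
import random

def mc_failure_probability(limit_state, means, sigmas, n=100_000, seed=42):
    """Crude Monte Carlo estimate of P(g(x) < 0) for a limit-state function g
    with independent normal design variables. (The paper instead solves an
    inner-loop MPP optimization; MC is the brute-force cross-check.)"""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        x = [rng.gauss(m, s) for m, s in zip(means, sigmas)]
        if limit_state(x) < 0:
            failures += 1
    return failures / n

# Hypothetical limit state: capacity minus demand, g = x0 - x1
g = lambda x: x[0] - x[1]
pf = mc_failure_probability(g, means=[3.0, 1.0], sigmas=[1.0, 1.0])
```

For this linear g the answer is known in closed form (x0 - x1 is N(2, √2), so pf ≈ 0.079), which makes it a convenient check case.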

  16. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to reliance on prediction from one model simulation, using an ensemble of predictions that consider uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to get reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction has better performance than global-based BMA (BMA_G) prediction, which is superior to the ensemble mean prediction. Additionally, high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
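The core of BMA is weighting ensemble members by how well they reproduce training observations, then combining member predictions with those weights. A full BMA fit estimates the weights and the error variance jointly by EM; the one-pass likelihood-weighting sketch below, with invented member predictions and observations, only illustrates the weighting-and-averaging step.

```python
import math

def bma_weights(member_preds, obs, sigma):
    """Weights proportional to each member's Gaussian likelihood of the
    training observations (one-pass sketch; real BMA uses EM and also
    estimates sigma)."""
    logliks = []
    for preds in member_preds:
        ll = sum(-0.5 * ((o - p) / sigma) ** 2 for p, o in zip(preds, obs))
        logliks.append(ll)
    m = max(logliks)                      # stabilize the exponentials
    raw = [math.exp(ll - m) for ll in logliks]
    total = sum(raw)
    return [r / total for r in raw]

def bma_mean(member_preds, weights, t):
    """Deterministic BMA prediction: weighted mean of members at time t."""
    return sum(w * preds[t] for w, preds in zip(weights, member_preds))

# Hypothetical water stages (m) from three model configurations vs. observed
members = [[2.1, 2.9, 3.8], [1.8, 2.5, 3.2], [2.6, 3.5, 4.6]]
observed = [2.0, 3.0, 4.0]
w = bma_weights(members, observed, sigma=0.3)
stage = bma_mean(members, w, t=2)
```

The best-fitting member dominates the weights, so the BMA mean stays close to it while still being bounded by the ensemble spread.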

  17. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    NASA Astrophysics Data System (ADS)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
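Fitting a bivariate normal to storm-track positions amounts to estimating a 2-D mean and covariance, after which a containment probability follows from the chi-square(2) identity P(Mahalanobis² ≤ d²) = 1 − exp(−d²/2). The positions below are hypothetical lon/lat pairs, and this is only a sketch of the distribution fit, not the paper's full clustering pipeline.

```python
import math

def fit_bivariate_normal(points):
    """Sample mean and covariance of 2-D storm-track positions."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    return (mx, my), ((sxx, sxy), (sxy, syy))

def mahalanobis2(point, mean, cov):
    """Squared Mahalanobis distance via the explicit 2x2 covariance inverse."""
    (sxx, sxy), (_, syy) = cov
    det = sxx * syy - sxy * sxy
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

def prob_within(point, mean, cov):
    """Probability mass inside the normal's ellipse through `point`:
    1 - exp(-d2/2), the chi-square(2) tail identity."""
    return 1.0 - math.exp(-mahalanobis2(point, mean, cov) / 2.0)

# Hypothetical track positions (lon, lat) of one storm across members
positions = [(5.0, 54.0), (6.0, 55.0), (4.5, 53.5), (5.5, 54.5), (5.0, 55.0)]
mean_pos, cov = fit_bivariate_normal(positions)
p = prob_within((5.2, 54.3), mean_pos, cov)
```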

  18. Prediction of brain tissue temperature using near-infrared spectroscopy

    PubMed Central

    Holper, Lisa; Mitra, Subhabrata; Bale, Gemma; Robertson, Nicola; Tachtsidis, Ilias

    2017-01-01

    Broadband near-infrared spectroscopy (NIRS) can provide an endogenous indicator of tissue temperature based on the temperature dependence of the water absorption spectrum. We describe a first evaluation of the calibration and prediction of brain tissue temperature obtained during hypothermia in newborn piglets (animal dataset) and rewarming in newborn infants (human dataset) based on measured body (rectal) temperature. The calibration using partial least squares regression proved to be a reliable method to predict brain tissue temperature with respect to core body temperature in the wavelength interval of 720 to 880 nm with a strong mean predictive power of R2=0.713±0.157 (animal dataset) and R2=0.798±0.087 (human dataset). In addition, we applied regression receiver operating characteristic curves for the first time to evaluate the temperature prediction, which provided an overall mean error bias between NIRS predicted brain temperature and body temperature of 0.436±0.283°C (animal dataset) and 0.162±0.149°C (human dataset). We discuss main methodological aspects, particularly the well-known aspect of over- versus underestimation between brain and body temperature, which is relevant for potential clinical applications. PMID:28630878

  19. Suboptimal choice in rats: incentive salience attribution promotes maladaptive decision-making

    PubMed Central

    Chow, Jonathan J; Smith, Aaron P; Wilson, A George; Zentall, Thomas R; Beckmann, Joshua S

    2016-01-01

    Stimuli that are more predictive of subsequent reward also function as better conditioned reinforcers. Moreover, stimuli attributed with incentive salience function as more robust conditioned reinforcers. Some theories have suggested that conditioned reinforcement plays an important role in promoting suboptimal choice behavior, like gambling. The present experiments examined how different stimuli, those attributed with incentive salience versus those without, can function in tandem with stimulus-reward predictive utility to promote maladaptive decision-making in rats. One group of rats had lights associated with goal-tracking as the reward-predictive stimuli and another had levers associated with sign-tracking as the reward-predictive stimuli. All rats were first trained on a choice procedure in which the expected value across both alternatives was equivalent but differed in their stimulus-reward predictive utility. Next, the expected value across both alternatives was systematically changed so that the alternative with greater stimulus-reward predictive utility was suboptimal in regard to primary reinforcement. The results demonstrate that in order to obtain suboptimal choice behavior, incentive salience alongside strong stimulus-reward predictive utility may be necessary; thus, maladaptive decision-making can be driven more by the value attributed to stimuli imbued with incentive salience that reliably predict a reward rather than the reward itself. PMID:27993692

  20. Suboptimal choice in rats: Incentive salience attribution promotes maladaptive decision-making.

    PubMed

    Chow, Jonathan J; Smith, Aaron P; Wilson, A George; Zentall, Thomas R; Beckmann, Joshua S

    2017-03-01

    Stimuli that are more predictive of subsequent reward also function as better conditioned reinforcers. Moreover, stimuli attributed with incentive salience function as more robust conditioned reinforcers. Some theories have suggested that conditioned reinforcement plays an important role in promoting suboptimal choice behavior, like gambling. The present experiments examined how different stimuli, those attributed with incentive salience versus those without, can function in tandem with stimulus-reward predictive utility to promote maladaptive decision-making in rats. One group of rats had lights associated with goal-tracking as the reward-predictive stimuli and another had levers associated with sign-tracking as the reward-predictive stimuli. All rats were first trained on a choice procedure in which the expected value across both alternatives was equivalent but differed in their stimulus-reward predictive utility. Next, the expected value across both alternatives was systematically changed so that the alternative with greater stimulus-reward predictive utility was suboptimal in regard to primary reinforcement. The results demonstrate that in order to obtain suboptimal choice behavior, incentive salience alongside strong stimulus-reward predictive utility may be necessary; thus, maladaptive decision-making can be driven more by the value attributed to stimuli imbued with incentive salience that reliably predict a reward rather than the reward itself. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Comparison of macro and micro Raman measurement for reliable quantitative analysis of pharmaceutical polymorphs.

    PubMed

    Paiva, Eduardo M; da Silva, Vitor H; Poppi, Ronei J; Pereira, Claudete F; Rohwedder, Jarbas J R

    2018-05-12

    This work reports on the use of micro- and macro-Raman measurements for quantification of mebendazole (MBZ) polymorphs A, B, and C in mixtures. Three Raman spectrophotometers were studied, with laser spot sizes of 3, 80 and 100 μm and spectral resolutions of 3.9, 9 and 4 cm⁻¹, respectively. The samples studied were ternary mixtures varying the MBZ polymorphs A and C from 0 to 100% and polymorph B from 0 to 30%. Partial Least Squares (PLS) regression models were developed using the pre-processed spectra (2nd derivative) of the ternary mixtures. The best performance was obtained with the macro-Raman configuration, yielding RMSEP values of 1.68%, 1.24% and 2.03% w/w for polymorphs A, B, and C, respectively. In general, micro-Raman gave the worst results for MBZ polymorph prediction because the spectra obtained with this configuration do not represent the bulk proportions of the mixtures, whose components have different particle morphologies and sizes. In addition, the influence of these particle features on micro-Raman measurements was also studied. Finally, the results demonstrated that reliable analytical quantification of MBZ polymorphs can be achieved using a laser with a wider illuminated area, enabling acquisition of more reproducible and representative spectra of the mixtures. Copyright © 2018 Elsevier B.V. All rights reserved.
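The RMSEP figures quoted above are the standard root-mean-squared error of prediction on an external test set. As a minimal sketch (with hypothetical reference and predicted polymorph contents, not the paper's data):

```python
import math

def rmsep(y_true, y_pred):
    """Root mean squared error of prediction: sqrt(mean((y - yhat)^2)),
    the figure of merit reported for the PLS models, in % w/w here."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Hypothetical polymorph-A reference vs. PLS-predicted contents (% w/w)
ref = [0.0, 25.0, 50.0, 75.0, 100.0]
pred = [1.2, 23.5, 51.0, 76.8, 98.9]
err = rmsep(ref, pred)
```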

  2. Assessment of metal(loid)s phytoavailability in intensive agricultural soils by the application of single extractions to rhizosphere soil.

    PubMed

    Pinto, Edgar; Almeida, Agostinho A; Ferreira, Isabel M P L V O

    2015-03-01

    The influence of soil properties on the phytoavailability of metal(loid)s in a soil-plant system was evaluated. The content of extractable metal(loid)s obtained by using different extraction methods was also compared. To perform this study, a test plant (Lactuca sativa) and rhizosphere soil were sampled at 5 different time points (2, 4, 6, 8 and 10 weeks of plant growth). Four extraction methods (Mehlich 3, DTPA, NH4NO3 and CaCl2) were used. Significant positive correlations between the soil extractable content and lettuce shoot content were obtained for several metal(loid)s. The extraction with NH4NO3 showed the highest number of strong positive correlations, indicating the suitability of this method for estimating metal(loid) phytoavailability. The soil CEC, OM, pH, texture and oxides content significantly influenced the distribution of metal(loid)s between the phytoavailable and non-phytoavailable fractions. A reliable prediction model for Cr, V, Ni, As, Pb, Co, Cd, and Sb phytoavailability was obtained considering the amount of metal(loid) extracted by the NH4NO3 method and the main soil properties. This work shows that the analysis of rhizosphere soil by single extraction methods is a reliable approach to estimate metal(loid) phytoavailability. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Soil erosion assessment on hillslope of GCE using RUSLE model

    NASA Astrophysics Data System (ADS)

    Islam, Md. Rabiul; Jaafar, Wan Zurina Wan; Hin, Lai Sai; Osman, Normaniza; Din, Moktar Aziz Mohd; Zuki, Fathiah Mohamed; Srivastava, Prashant; Islam, Tanvir; Adham, Md. Ibrahim

    2018-06-01

    A new method for obtaining the C factor (i.e., the vegetation cover and management factor) of the RUSLE model is proposed. The method derives the C factor from vegetation density to obtain a more reliable erosion prediction. Soil erosion on hillslopes along highways is one of the major problems in Malaysia, which receives a relatively high amount of annual rainfall due to two different monsoon seasons. A hillslope near the Guthrie Corridor Expressway (GCE), Malaysia, is chosen as the experimental site, where eight square plots of 8 × 8 m and 5 × 5 m are set up. The vegetation density on these plots is measured by analyzing captured images, and the C factor is then linked to the measured vegetation density using several established formulas. Finally, erosion is predicted with the RUSLE model on a Geographical Information System (GIS) platform. The C factor obtained by the proposed method is compared with that of the Malaysian soil erosion guideline, and the predicted erosion is computed for both C values. Results show that the C value from the proposed method varies from 0.0162 to 0.125, lower than the guideline value of 0.8. The predicted erosion computed from the proposed C value is between 0.410 and 3.925 t ha⁻¹ yr⁻¹, compared with a range of 9.367 to 34.496 t ha⁻¹ yr⁻¹ based on the C value of 0.8. It can be concluded that the proposed method of obtaining the C value is acceptable, as the computed predicted erosion falls in the very low erosion zone (less than 10 t ha⁻¹ yr⁻¹), whereas the prediction based on the guideline classifies the study area as a low erosion zone (between 10 and 50 t ha⁻¹ yr⁻¹).
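RUSLE predicts annual soil loss as the product A = R·K·LS·C·P, so the C factor scales the estimate linearly. In the sketch below, only the two C values (the study's lower bound 0.0162 vs. the guideline 0.8) come from the abstract; the R, K, LS and P values are hypothetical placeholders.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """RUSLE: A = R * K * LS * C * P (soil loss in t/ha/yr with
    consistent SI factor units)."""
    return R * K * LS * C * P

# Hypothetical rainfall-erosivity, erodibility, slope and practice factors;
# only the C values below are taken from the study vs. the guideline.
base = dict(R=9000.0, K=0.035, LS=1.2, P=1.0)
A_proposed = rusle_soil_loss(C=0.0162, **base)
A_guideline = rusle_soil_loss(C=0.8, **base)
ratio = A_guideline / A_proposed  # the guideline C inflates A by ~49x
```

Because A is linear in C, the roughly 49-fold gap between the two C values carries straight through to the predicted erosion, which is exactly the discrepancy the abstract reports.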

  4. Estimation of Reliability Coefficients Using the Test Information Function and Its Modifications.

    ERIC Educational Resources Information Center

    Samejima, Fumiko

    1994-01-01

    The reliability coefficient is predicted from the test information function (TIF) or two modified TIF formulas and a specific trait distribution. Examples illustrate the variability of the reliability coefficient across different trait distributions, and results are compared with empirical reliability coefficients. (SLD)

  5. Methods to approximate reliabilities in single-step genomic evaluation

    USDA-ARS?s Scientific Manuscript database

    Reliability of predictions from single-step genomic BLUP (ssGBLUP) can be calculated by inversion, but that is not feasible for large data sets. Two methods of approximating reliability were developed based on decomposition of a function of reliability into contributions from records, pedigrees, and...

  6. Genomic selection in a commercial winter wheat population.

    PubMed

    He, Sang; Schulthess, Albert Wilhelm; Mirdita, Vilson; Zhao, Yusheng; Korzun, Viktor; Bothe, Reiner; Ebmeyer, Erhard; Reif, Jochen C; Jiang, Yong

    2016-03-01

    Genomic selection models can be trained using historical data, and filtering genotypes based on phenotyping intensity and a reliability criterion can increase prediction ability. We implemented genomic selection based on a large commercial population incorporating 2325 European winter wheat lines. Our objectives were (1) to study whether modeling epistasis besides additive genetic effects results in enhancement on prediction ability of genomic selection, (2) to assess prediction ability when training population comprised historical or less-intensively phenotyped lines, and (3) to explore the prediction ability in subpopulations selected based on the reliability criterion. We found a 5% increase in prediction ability when shifting from additive to additive plus epistatic effects models. In addition, only a marginal loss from 0.65 to 0.50 in accuracy was observed when using data collected in one year to predict genotypes of the following year, revealing that stable genomic selection models can be accurately calibrated to predict subsequent breeding stages. Moreover, prediction ability was maximized when the genotypes evaluated in a single location were excluded from the training set but subsequently decreased again when the phenotyping intensity was increased above two locations, suggesting that the update of the training population should be performed considering all the selected genotypes but excluding those evaluated in a single location. The genomic prediction ability was substantially higher in subpopulations selected based on the reliability criterion, indicating that phenotypic selection for highly reliable individuals could be directly replaced by applying genomic selection to them. We empirically conclude that there is a high potential to assist commercial wheat breeding programs employing genomic selection approaches.

  7. Development of confidence limits by pivotal functions for estimating software reliability

    NASA Technical Reports Server (NTRS)

    Dotson, Kelly J.

    1987-01-01

    The utility of pivotal functions is established for assessing software reliability. Based on the Moranda geometric de-eutrophication model of reliability growth, confidence limits for attained reliability and prediction limits for the time to the next failure are derived using a pivotal function approach. Asymptotic approximations to the confidence and prediction limits are considered and are shown to be inadequate in cases where only a few bugs are found in the software. Departures from the assumed exponentially distributed interfailure times in the model are also investigated. The effect of these departures is discussed relative to restricting the use of the Moranda model.

  8. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  9. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-11-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  10. Mathematical model to estimate risk of calcium-containing renal stones

    NASA Technical Reports Server (NTRS)

    Pietrzyk, R. A.; Feiveson, A. H.; Whitson, P. A.

    1999-01-01

    BACKGROUND/AIMS: Astronauts exposed to microgravity during the course of spaceflight undergo physiologic changes that alter the urinary environment so as to increase the risk of renal stone formation. This study was undertaken to identify a simple method with which to evaluate the potential risk of renal stone development during spaceflight. METHOD: We used a large database of urinary risk factors obtained from 323 astronauts before and after spaceflight to generate a mathematical model with which to predict the urinary supersaturation of calcium stone forming salts. RESULT: This model, which involves the fewest possible analytical variables (urinary calcium, citrate, oxalate, phosphorus, and total volume), reliably and accurately predicted the urinary supersaturation of the calcium stone forming salts when compared to results obtained from a group of 6 astronauts who collected urine during flight. CONCLUSIONS: The use of this model will simplify both routine medical monitoring during spaceflight as well as the evaluation of countermeasures designed to minimize renal stone development. This model also can be used for Earth-based applications in which access to analytical resources is limited.

  11. The anharmonic quartic force field infrared spectra of three polycyclic aromatic hydrocarbons: Naphthalene, anthracene, and tetracene.

    PubMed

    Mackie, Cameron J; Candian, Alessandra; Huang, Xinchuan; Maltseva, Elena; Petrignani, Annemieke; Oomens, Jos; Buma, Wybren Jan; Lee, Timothy J; Tielens, Alexander G G M

    2015-12-14

    Current efforts to characterize and study interstellar polycyclic aromatic hydrocarbons (PAHs) rely heavily on theoretically predicted infrared (IR) spectra. Generally, such studies use the scaled harmonic frequencies for band positions and double harmonic approximation for intensities of species, and then compare these calculated spectra with experimental spectra obtained under matrix isolation conditions. High-resolution gas-phase experimental spectroscopic studies have recently revealed that the double harmonic approximation is not sufficient for reliable spectra prediction. In this paper, we present the anharmonic theoretical spectra of three PAHs: naphthalene, anthracene, and tetracene, computed with a locally modified version of the SPECTRO program using Cartesian derivatives transformed from Gaussian 09 normal coordinate force constants. Proper treatments of Fermi resonances lead to an impressive improvement on the agreement between the observed and theoretical spectra, especially in the C-H stretching region. All major IR absorption features in the full-scale matrix-isolated spectra, the high-temperature gas-phase spectra, and the most recent high-resolution gas-phase spectra obtained under supersonically cooled molecular beam conditions in the CH-stretching region are assigned.

  12. The anharmonic quartic force field infrared spectra of three polycyclic aromatic hydrocarbons: Naphthalene, anthracene, and tetracene

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackie, Cameron J., E-mail: mackie@strw.leidenuniv.nl; Candian, Alessandra; Tielens, Alexander G. G. M.

    2015-12-14

Current efforts to characterize and study interstellar polycyclic aromatic hydrocarbons (PAHs) rely heavily on theoretically predicted infrared (IR) spectra. Generally, such studies use the scaled harmonic frequencies for band positions and double harmonic approximation for intensities of species, and then compare these calculated spectra with experimental spectra obtained under matrix isolation conditions. High-resolution gas-phase experimental spectroscopic studies have recently revealed that the double harmonic approximation is not sufficient for reliable spectra prediction. In this paper, we present the anharmonic theoretical spectra of three PAHs: naphthalene, anthracene, and tetracene, computed with a locally modified version of the SPECTRO program using Cartesian derivatives transformed from Gaussian 09 normal coordinate force constants. Proper treatments of Fermi resonances lead to an impressive improvement on the agreement between the observed and theoretical spectra, especially in the C–H stretching region. All major IR absorption features in the full-scale matrix-isolated spectra, the high-temperature gas-phase spectra, and the most recent high-resolution gas-phase spectra obtained under supersonically cooled molecular beam conditions in the CH-stretching region are assigned.

  13. Evaluation of ensemble precipitation forecasts generated through post-processing in a Canadian catchment

    NASA Astrophysics Data System (ADS)

    Jha, Sanjeev K.; Shrestha, Durga L.; Stadnyk, Tricia A.; Coulibaly, Paulin

    2018-03-01

Flooding in Canada is often caused by heavy rainfall during the snowmelt period. Hydrologic forecast centers rely on precipitation forecasts obtained from numerical weather prediction (NWP) models to force hydrological models for streamflow forecasting. The uncertainties in raw quantitative precipitation forecasts (QPFs) are enhanced by physiography and orography effects over a diverse landscape, particularly in the western catchments of Canada. A Bayesian post-processing approach called rainfall post-processing (RPP), developed in Australia (Robertson et al., 2013; Shrestha et al., 2015), has been applied to assess its forecast performance in a Canadian catchment. Raw QPFs obtained from two sources, the Global Ensemble Forecasting System (GEFS) Reforecast 2 project from the National Centers for Environmental Prediction, and the Global Deterministic Prediction System (GDPS) from Environment and Climate Change Canada, are used in this study. The study period from January 2013 to December 2015 covered a major flood event in Calgary, Alberta, Canada. Post-processed results show that the RPP is able to remove the bias and reduce the errors of both GEFS and GDPS forecasts. Ensembles generated from the RPP reliably quantify the forecast uncertainty.
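The RPP itself is a Bayesian joint-probability model (Robertson et al., 2013); as a heavily simplified, assumption-laden stand-in, the bias-removal and ensemble-generation steps can be sketched with a linear correction dressed with residual noise:

```python
import numpy as np

# Heavily simplified stand-in for the RPP idea: correct the bias of a raw
# QPF against observations, then dress the corrected value with an error
# distribution to produce an ensemble. The real RPP is a Bayesian
# joint-probability model; this linear version is illustrative only.
rng = np.random.default_rng(0)

def fit_correction(raw, obs):
    # least-squares linear bias correction: obs ~ a * raw + b
    a, b = np.polyfit(raw, obs, 1)
    resid_std = np.std(obs - (a * raw + b), ddof=2)
    return a, b, resid_std

def make_ensemble(raw_value, params, n_members=100):
    a, b, s = params
    members = a * raw_value + b + rng.normal(0.0, s, n_members)
    return np.clip(members, 0.0, None)  # precipitation is non-negative
```

The spread of the generated members is what lets downstream streamflow forecasts carry a quantified precipitation uncertainty rather than a single deterministic value.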

  14. Lightweight design and analysis of automobile wheel based on bending and radial loads

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Lyu, R.; Fukushima, Y.; Otake, M.; Ju, D. Y.

    2018-06-01

Lightweighting is a significant trend in automobile design, and the use of magnesium alloy wheels is a valuable approach to it. This article presents the design of a new automobile wheel model. Finite element models of the bending test and the radial test were then established. Considering three different materials, namely magnesium alloy, aluminium alloy and steel, the stress and strain performance of each material was obtained. By evaluating and analyzing the model under the bending and radial tests, reasonable and superior results were obtained for the magnesium alloy wheel. The equivalent stress and deformation results were compared, and the practicality of the magnesium alloy wheel was confirmed. This research predicts the reliability of the structural design and provides valuable references for the design and development of magnesium alloy wheels.

  15. Study on validation method for femur finite element model under multiple loading conditions

    NASA Astrophysics Data System (ADS)

    Guan, Fengjiao; Zhang, Guanjun; Liu, Jie; Wang, Shujing; Luo, Xu

    2018-03-01

Acquisition of accurate and reliable constitutive parameters for bio-tissue materials is beneficial for improving the biological fidelity of a Finite Element (FE) model and predicting impact damage more effectively. In this paper, a femur FE model was established under multiple loading conditions with diverse impact positions. Then, based on the sequential response surface method and genetic algorithms, material parameter identification was transformed into a multi-response optimization problem. Finally, the simulation results successfully coincided with the force-displacement curves obtained from numerous experiments. Thus, the computational accuracy and efficiency of the entire inverse calculation process were enhanced. This method effectively reduces the computation time of the inverse identification of material parameters, and the parameters obtained by the proposed method achieve higher accuracy.
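The paper couples sequential response surfaces with genetic algorithms; the sketch below swaps in a plain least-squares fit and replaces the femur FE simulation with a hypothetical analytic force-displacement response, purely to illustrate the inverse-identification loop:

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of inverse material-parameter identification. A stand-in analytic
# force-displacement response replaces the femur FE simulation, and the two
# parameters (stiffness k, hardening exponent n) are hypothetical, not the
# paper's constitutive constants.
def simulate(params, disp):
    k, n = params
    return k * disp ** n  # stand-in for one FE run

def identify(disp, force_exp, x0=(1.0, 1.0)):
    # least-squares fit of the simulated curve to the experimental one
    res = least_squares(lambda p: simulate(p, disp) - force_exp, x0,
                        bounds=([0.0, 0.1], [np.inf, 3.0]))
    return res.x
```

In the real workflow each residual evaluation would be a full FE solve, which is why the paper builds response surfaces first: they make the many evaluations demanded by a genetic algorithm affordable.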

  16. Scale Development for Measuring and Predicting Adolescents’ Leisure Time Physical Activity Behavior

    PubMed Central

    Ries, Francis; Romero Granados, Santiago; Arribas Galarraga, Silvia

    2009-01-01

The aim of this study was to develop a scale for assessing and predicting adolescents’ physical activity behavior in Spain and Luxembourg using the Theory of Planned Behavior as a framework. The sample comprised 613 Spanish (boys = 309, girls = 304; M age = 15.28, SD = 1.127) and 752 Luxembourgish adolescents (boys = 343, girls = 409; M age = 14.92, SD = 1.198), selected from students of two secondary schools in both countries with a similar socio-economic status. The initial 43 items were all scored on a 4-point response format using the structured alternative format and translated into Spanish, French and German. To ensure the accuracy of the translation, standardized parallel back-translation techniques were employed. Following two pilot tests and subsequent revisions, a second-order exploratory factor analysis with direct oblimin rotation was used for factor extraction. Internal consistency and test-retest reliabilities were also tested. The 4-week test-retest correlations confirmed the items’ time stability. The same five factors were obtained, explaining 63.76% and 63.64% of the total variance in the two samples. Internal consistency for the five factors ranged from α = 0.759 to α = 0.949 in the Spanish sample and from α = 0.735 to α = 0.952 in the Luxembourgish sample. For both samples, inter-factor correlations were all significant and positive, except for Factor 5, where they were significant but negative. The high internal consistency of the subscales, the reported item test-retest reliabilities and the identical factor structure confirm the adequacy of the elaborated questionnaire for assessing the TPB-based constructs with a population of adolescents in Spain and Luxembourg. The results give some indication that it may have value in measuring the hypothesized TPB constructs for PA behavior in a cross-cultural context. Key points: When using the structured alternative format, weak internal consistency was obtained. 
Rephrasing the items and scoring them on a Likert-type scale greatly enhanced the subscales’ reliability. An identical factor structure was extracted for both culturally different samples. The obtained factors, namely perceived physical competence, parents’ physical activity, perceived resources support, attitude toward physical activity and perceived parental support, corresponded to the original TPB constructs as hypothesized. PMID:24149606

  17. Scale development for measuring and predicting adolescents' leisure time physical activity behavior.

    PubMed

    Ries, Francis; Romero Granados, Santiago; Arribas Galarraga, Silvia

    2009-01-01

The aim of this study was to develop a scale for assessing and predicting adolescents' physical activity behavior in Spain and Luxembourg using the Theory of Planned Behavior as a framework. The sample comprised 613 Spanish (boys = 309, girls = 304; M age = 15.28, SD = 1.127) and 752 Luxembourgish adolescents (boys = 343, girls = 409; M age = 14.92, SD = 1.198), selected from students of two secondary schools in both countries with a similar socio-economic status. The initial 43 items were all scored on a 4-point response format using the structured alternative format and translated into Spanish, French and German. To ensure the accuracy of the translation, standardized parallel back-translation techniques were employed. Following two pilot tests and subsequent revisions, a second-order exploratory factor analysis with direct oblimin rotation was used for factor extraction. Internal consistency and test-retest reliabilities were also tested. The 4-week test-retest correlations confirmed the items' time stability. The same five factors were obtained, explaining 63.76% and 63.64% of the total variance in the two samples. Internal consistency for the five factors ranged from α = 0.759 to α = 0.949 in the Spanish sample and from α = 0.735 to α = 0.952 in the Luxembourgish sample. For both samples, inter-factor correlations were all significant and positive, except for Factor 5, where they were significant but negative. The high internal consistency of the subscales, the reported item test-retest reliabilities and the identical factor structure confirm the adequacy of the elaborated questionnaire for assessing the TPB-based constructs with a population of adolescents in Spain and Luxembourg. The results give some indication that it may have value in measuring the hypothesized TPB constructs for PA behavior in a cross-cultural context. Key points: When using the structured alternative format, weak internal consistency was obtained. 
Rephrasing the items and scoring them on a Likert-type scale greatly enhanced the subscales' reliability. An identical factor structure was extracted for both culturally different samples. The obtained factors, namely perceived physical competence, parents' physical activity, perceived resources support, attitude toward physical activity and perceived parental support, corresponded to the original TPB constructs as hypothesized.

  18. Feasibility of developing LSI microcircuit reliability prediction models

    NASA Technical Reports Server (NTRS)

    Ryerson, C. M.

    1972-01-01

    In the proposed modeling approach, when any of the essential key factors are not known initially, they can be approximated in various ways with a known impact on the accuracy of the final predictions. For example, on any program where reliability predictions are started at interim states of project completion, a-priori approximate estimates of the key factors are established for making preliminary predictions. Later these are refined for greater accuracy as subsequent program information of a more definitive nature becomes available. Specific steps to develop, validate and verify these new models are described.

  19. Surface and downhole shear wave seismic methods for thick soil site investigations

    USGS Publications Warehouse

    Hunter, J.A.; Benjumea, B.; Harris, J.B.; Miller, R.D.; Pullan, S.E.; Burns, R.A.; Good, R.L.

    2002-01-01

Shear wave velocity-depth information is required for predicting the ground motion response to earthquakes in areas where significant soil cover exists over firm bedrock. Rather than estimating this critical parameter, it can be reliably measured using a suite of surface (non-invasive) and downhole (invasive) seismic methods. Shear wave velocities from surface measurements can be obtained using SH refraction techniques. Array lengths as large as 1000 m and depths of penetration to 250 m have been achieved in some areas. High-resolution shear wave reflection techniques utilizing the common midpoint method can delineate the overburden-bedrock surface as well as reflecting boundaries within the overburden. Reflection data can also be used to obtain direct estimates of fundamental site periods from shear wave reflections without the requirement of measuring average shear wave velocity and total thickness of unconsolidated overburden above the bedrock surface. Accurate measurements of vertical shear wave velocities can be obtained using a seismic cone penetrometer in soft sediments, or with a well-locked geophone array in a borehole. Examples from thick soil sites in Canada demonstrate the type of shear wave velocity information that can be obtained with these geophysical techniques, and show how these data can be used to provide a first look at predicted ground motion response for thick soil sites. © 2002 Published by Elsevier Science Ltd.

  20. On the use and the performance of software reliability growth models

    NASA Technical Reports Server (NTRS)

    Keiller, Peter A.; Miller, Douglas R.

    1991-01-01

We address the problem of predicting future failures for a piece of software. The number of failures occurring during a finite future time interval is predicted from the number of failures observed during an initial period of usage by using software reliability growth models. Two different methods for using the models are considered: straightforward use of individual models, and dynamic selection among models based on goodness-of-fit and quality-of-prediction criteria. Performance is judged by the error of the predicted number of failures over future finite time intervals relative to the number of failures eventually observed during those intervals. Six models of the former kind and eight of the latter are evaluated, based on their performance on twenty data sets. Many open questions remain regarding the use and the performance of software reliability growth models.
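As a concrete instance of the first method (straightforward use of one model), a single reliability growth model such as the Goel-Okumoto NHPP can be fit to observed failure times and then queried for the expected number of failures in a future interval; the model choice and data here are illustrative, not the paper's:

```python
import numpy as np
from scipy.optimize import minimize

# Goel-Okumoto NHPP growth model: m(t) = a * (1 - exp(-b t)).
# The model choice and the failure times used below are illustrative only.
def neg_log_lik(params, times, T):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    n = len(times)
    # log-likelihood of failure times t_1..t_n observed on [0, T]
    return -(n * np.log(a * b) - b * np.sum(times) - a * (1 - np.exp(-b * T)))

def fit_goel_okumoto(times, T):
    times = np.asarray(times, float)
    x0 = [1.5 * len(times), 1.0 / T]  # rough starting guess
    res = minimize(neg_log_lik, x0, args=(times, T), method="Nelder-Mead")
    return res.x

def predict_failures(a, b, t1, t2):
    # expected number of failures in the future interval (t1, t2]
    return a * (np.exp(-b * t1) - np.exp(-b * t2))
```

Comparing such a prediction against the count eventually observed in (t1, t2] gives exactly the relative-error performance measure the paper uses.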

  1. Reliability of smartphone-based gait measurements for quantification of physical activity/inactivity levels.

    PubMed

    Ebara, Takeshi; Azuma, Ryohei; Shoji, Naoto; Matsukawa, Tsuyoshi; Yamada, Yasuyuki; Akiyama, Tomohiro; Kurihara, Takahiro; Yamada, Shota

    2017-11-25

Objective measurements using built-in smartphone sensors that can measure physical activity/inactivity in daily working life have the potential to provide a new approach to assessing workers' health effects. The aim of this study was to elucidate the characteristics and reliability of built-in step counting sensors on smartphones for development of an easy-to-use objective measurement tool that can be applied in ergonomics or epidemiological research. To evaluate the reliability of step counting sensors embedded in seven major smartphone models, the 6-minute walk test was conducted and the following analyses of sensor precision and accuracy were performed: 1) relationship between actual step count and step count detected by sensors, 2) reliability between smartphones of the same model, and 3) false detection rates when sitting during office work, while riding the subway, and while driving. On five of the seven models, the intraclass correlation coefficient (ICC(3,1)) showed high reliability, with a range of 0.956-0.993. The other two models, however, had ranges of 0.443-0.504, and the relative error ratios of the sensor-detected step count to the actual step count were ±48.7%-49.4%. The level of agreement between devices of the same model was ICC(3,1): 0.992-0.998. The false detection rates differed between the sitting conditions. These results suggest the need for appropriate regulation of step counts measured by sensors, through means such as correction or calibration with a predictive model formula, in order to obtain the highly reliable measurement results that are sought in scientific investigation.
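The ICC(3,1) statistic used above follows from a two-way ANOVA decomposition of a subjects-by-devices rating matrix; a minimal sketch with hypothetical step-count data:

```python
import numpy as np

# ICC(3,1): two-way mixed model, consistency, single measurement.
# Any ratings matrix passed in below is hypothetical illustration data.
def icc_3_1(ratings):
    Y = np.asarray(ratings, dtype=float)  # n subjects x k devices
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between-subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between-devices
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

A perfectly consistent matrix (one device always reading a constant offset above the other) yields ICC(3,1) = 1, since the consistency form ignores systematic offsets between devices.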

  2. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    NASA Astrophysics Data System (ADS)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

The performance characterization of complex engineering systems often relies on accurate, but computationally intensive, numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvement in recent years, even the most advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
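The multi-fidelity idea, using a cheap correlated low-fidelity model as a control variate for a small number of high-fidelity samples, can be sketched with toy stand-ins for the two models (the functions and sample sizes below are illustrative, not the solar receiver simulations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the models: both depend on the same random input and are
# strongly correlated; the quadratic term is what the low-fidelity model misses.
def high_fidelity(x):
    return np.sin(x) + 0.1 * x ** 2    # "expensive" model

def low_fidelity(x):
    return np.sin(x)                   # cheap correlated surrogate

# Control-variate multi-fidelity estimate of E[f_HF(X)], X ~ N(0, 1)
n_hf, n_lf = 100, 10_000
x_hf = rng.standard_normal(n_hf)
x_lf = rng.standard_normal(n_lf)

f_hf = high_fidelity(x_hf)
f_lf_paired = low_fidelity(x_hf)       # LF at the same samples as HF
f_lf_extra = low_fidelity(x_lf)        # many extra cheap LF samples

# near-optimal control-variate weight: cov(HF, LF) / var(LF)
alpha = np.cov(f_hf, f_lf_paired)[0, 1] / np.var(f_lf_paired, ddof=1)
mf_estimate = f_hf.mean() - alpha * (f_lf_paired.mean() - f_lf_extra.mean())
```

The true value here is E[sin X] + 0.1 E[X²] = 0.1; the correction term cancels most of the sampling noise of the small HF batch without requiring any additional HF evaluations, which is the point of the multi-fidelity estimator.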

  3. A novel method for calculating the dynamic capillary force and correcting the pressure error in micro-tube experiment.

    PubMed

    Wang, Shuoliang; Liu, Pengcheng; Zhao, Hui; Zhang, Yuan

    2017-11-29

Micro-tube experiments have been implemented to understand the mechanisms governing microscopic fluid percolation and are extensively used in both micro-electromechanical engineering and petroleum engineering. The pressure difference measured across the entire setup is not equal to the actual pressure difference across the microtube. Taking into account the additional pressure losses between the outlet of the microtube and the outlet of the entire setup, we propose a new method for predicting the dynamic capillary pressure using the Level-set method. We first demonstrate that it is a reliable method for describing microscopic flow by comparing the micro-model flow-test results against the predicted results using the Level-set method. In the proposed approach, the Level-set method is applied to predict the pressure distribution along the microtube when the fluids flow along the microtube at a given flow rate; the microtube used in the calculation has the same size as the one used in the experiment. From the simulation results, the pressure difference across a curved interface (i.e., the dynamic capillary pressure) can be directly obtained. We also show that the dynamic capillary force should be properly evaluated in the micro-tube experiment in order to obtain the actual pressure difference across the microtube.
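For reference, the static Young-Laplace capillary pressure and the outlet-loss correction the abstract describes can be written down directly; the dynamic value in the paper comes from the Level-set simulation, so the static formula is only the baseline it is compared against (all inputs below are hypothetical):

```python
import math

# Static Young-Laplace capillary pressure for a cylindrical tube. The paper's
# *dynamic* capillary pressure comes from a Level-set simulation; this static
# value is only the classical reference. Inputs are hypothetical.
def capillary_pressure(sigma, theta_deg, radius):
    # p_c = 2 * sigma * cos(theta) / r
    return 2.0 * sigma * math.cos(math.radians(theta_deg)) / radius

def actual_dp(measured_dp, outlet_losses):
    # the abstract's point: subtract the pressure losses that occur between
    # the microtube outlet and the outlet of the entire setup
    return measured_dp - outlet_losses
```

For water (sigma about 0.072 N/m) fully wetting a 10-micron-radius tube, the static value is on the order of 14 kPa, large enough that neglecting either it or the outlet losses would badly distort the inferred pressure drop.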

  4. Role of physicochemical properties in the activation of peroxisome proliferator-activated receptor δ.

    PubMed

    Maltarollo, Vinícius G; Homem-de-Mello, Paula; Honorio, Káthia M

    2011-10-01

Current research on treatments for metabolic diseases involves a class of biological receptors called peroxisome proliferator-activated receptors (PPARs), which control the metabolism of carbohydrates and lipids. A subclass of these receptors, PPARδ, regulates several metabolic processes, and the substances that activate it are being studied as new drug candidates for the treatment of diabetes mellitus and metabolic syndrome. In this study, several PPARδ agonists with experimental biological activity were selected for a structural and chemical study. Electronic, stereochemical, lipophilic and topological descriptors were calculated for the selected compounds using various theoretical methods, such as density functional theory (DFT). Fisher's weight and principal components analysis (PCA) were employed to select the most relevant variables for this study. The partial least squares (PLS) method was used to construct the multivariate statistical model, and the best model obtained had 4 PCs, q² = 0.80 and r² = 0.90, indicating good internal consistency. The prediction residues calculated for the compounds in the test set had low values, indicating the good predictive capability of our PLS model. The model obtained in this study is reliable and can be used to predict the biological activity of new, untested compounds. Docking studies have also confirmed the importance of the molecular descriptors selected for this system.
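The PLS step can be sketched with a minimal NIPALS PLS1 implementation; the descriptor matrix and activity values below are synthetic stand-ins for the study's DFT-derived descriptors:

```python
import numpy as np

# Minimal NIPALS PLS1 sketch: X (samples x descriptors), y (activities).
# Synthetic data stand in for the molecular descriptors and biological
# activities of the study; this is not the authors' exact procedure.
def pls1_fit(X, y, n_components):
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w = w / np.linalg.norm(w)
        t = Xk @ w                   # scores for this latent variable
        p = Xk.T @ t / (t @ t)       # X loadings
        q = (yk @ t) / (t @ t)       # y loading
        Xk = Xk - np.outer(t, p)     # deflate X
        yk = yk - q * t              # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # coefficients in original X space
    return B, x_mean, y_mean

def pls1_predict(model, X):
    B, x_mean, y_mean = model
    return (X - x_mean) @ B + y_mean
```

In a QSAR setting the r² above would be computed on the training fit and q² by leave-one-out cross-validation, repeating this fit with each compound held out in turn.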

  5. Reliability Prediction for Aerospace Electronics

    DTIC Science & Technology

    2015-04-20

AFRL-AFOSR-UK-TR-2015-0028. Reliability Prediction for Aerospace Electronics. Joseph B. Bernstein, Ariel University Research Authority, 3 Kiryat Hamada, Ariel, Israel. EOARD Grant FA9550-14-1-0216. Final report for the period of performance 15 July 2014 to 14 April 2015; report date April 2015.

  6. Predictions of Crystal Structure Based on Radius Ratio: How Reliable Are They?

    ERIC Educational Resources Information Center

    Nathan, Lawrence C.

    1985-01-01

    Discussion of crystalline solids in undergraduate curricula often includes the use of radius ratio rules as a method for predicting which type of crystal structure is likely to be adopted by a given ionic compound. Examines this topic, establishing more definitive guidelines for the use and reliability of the rules. (JN)

  7. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.

  8. Wearable Lactate Threshold Predicting Device is Valid and Reliable in Runners.

    PubMed

    Borges, Nattai R; Driller, Matthew W

    2016-08-01

Borges, NR and Driller, MW. Wearable lactate threshold predicting device is valid and reliable in runners. J Strength Cond Res 30(8): 2212-2218, 2016. A commercially available device claiming to be the world's first wearable lactate threshold predicting device (WLT), using near-infrared LED technology, has entered the market. The aim of this study was to determine the levels of agreement between the WLT-derived lactate threshold workload and traditional methods of lactate threshold (LT) calculation, and the interdevice and intradevice reliability of the WLT. Fourteen (7 male, 7 female; mean ± SD; age: 18-45 years, height: 169 ± 9 cm, mass: 67 ± 13 kg, V̇O2max: 53 ± 9 ml·kg⁻¹·min⁻¹) subjects ranging from recreationally active to highly trained athletes completed an incremental exercise test to exhaustion on a treadmill. Blood lactate samples were taken at the end of each 3-minute stage during the test to determine lactate threshold using 5 traditional methods from blood lactate analysis, which were then compared against the WLT-predicted value. In a subset of the population (n = 12), repeat trials were performed to determine both the inter-reliability and intrareliability of the WLT device. Intraclass correlation coefficients (ICC) showed high to very high agreement between the WLT and traditional methods (ICC > 0.80), with TEMs and mean differences ranging between 3.9-10.2% and 1.3-9.4%. Both interdevice and intradevice reliability resulted in highly reproducible and comparable results (CV < 1.2%, TEM < 0.2 km·h⁻¹, ICC > 0.97). This study suggests that the WLT is a practical, reliable, and noninvasive tool for use in predicting LT in runners.

  9. Pocket Handbook on Reliability

    DTIC Science & Technology

    1975-09-01

Covers exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, and Bayesian analysis. A good introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. (Lewis Neri, Chief.) Reliability prediction includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future time.

  10. Prediction of seasonal climate-induced variations in global food production

    NASA Astrophysics Data System (ADS)

    Iizumi, Toshichika; Sakuma, Hirofumi; Yokozawa, Masayuki; Luo, Jing-Jia; Challinor, Andrew J.; Brown, Molly E.; Sakurai, Gen; Yamagata, Toshio

    2013-10-01

Consumers, including the poor in many countries, are increasingly dependent on food imports and are thus exposed to variations in yields, production and export prices in the major food-producing regions of the world. National governments and commercial entities are therefore paying increased attention to the cropping forecasts of important food-exporting countries as well as to their own domestic food production. Given the increased volatility of food markets and the rising incidence of climatic extremes affecting food production, food price spikes may increase in prevalence in future years. Here we present a global assessment of the reliability of crop failure hindcasts for major crops at two lead times, derived by linking ensemble seasonal climatic forecasts with statistical crop models. We found that moderate-to-marked yield loss over a substantial percentage (26-33%) of the harvested area of these crops is reliably predictable if climatic forecasts are near perfect. However, only rice and wheat production are reliably predictable at three months before the harvest using within-season hindcasts. The reliability of the estimates varied substantially by crop: rice and wheat yields were the most predictable, followed by soybean and maize. The reasons for variation in the reliability of the estimates included differences in crop sensitivity to the climate and the technology used by the crop-producing regions. Our findings reveal that the use of seasonal climatic forecasts to predict crop failures will be useful for monitoring global food production and will encourage the adaptation of food systems to climatic extremes.

  11. Assessment of the reliability of protein-protein interactions and protein function prediction.

    PubMed

    Deng, Minghua; Sun, Fengzhu; Chen, Ting

    2003-01-01

    As more and more high-throughput protein-protein interaction data are collected, the task of estimating the reliability of different data sets becomes increasingly important. In this paper, we present our study of two groups of protein-protein interaction data, the physical interaction data and the protein complex data, and estimate the reliability of these data sets using three different measurements: (1) the distribution of gene expression correlation coefficients, (2) the reliability based on gene expression correlation coefficients, and (3) the accuracy of protein function predictions. We develop a maximum likelihood method to estimate the reliability of protein interaction data sets according to the distribution of correlation coefficients of gene expression profiles of putative interacting protein pairs. The results of the three measurements are consistent with each other. The MIPS protein complex data have the highest mean gene expression correlation coefficients (0.256) and the highest accuracy in predicting protein functions (70% sensitivity and specificity), while Ito's Yeast two-hybrid data have the lowest mean (0.041) and the lowest accuracy (15% sensitivity and specificity). Uetz's data are more reliable than Ito's data in all three measurements, and the TAP protein complex data are more reliable than the HMS-PCI data in all three measurements as well. The complex data sets generally perform better in function predictions than do the physical interaction data sets. Proteins in complexes are shown to be more highly correlated in gene expression. The results confirm that the components of a protein complex can be assigned to functions that the complex carries out within a cell. There are three interaction data sets different from the above two groups: the genetic interaction data, the in-silico data and the syn-express data. 
Their capability of predicting protein functions generally falls between that of the Y2H data and that of the MIPS protein complex data. The supplementary information is available at the following Web site: http://www-hto.usc.edu/-msms/AssessInteraction/.
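A stripped-down sketch of the maximum-likelihood idea, estimating the fraction of reliable interactions from the distribution of expression correlation coefficients, assuming Gaussian component densities (the paper derives its component distributions from the data rather than assuming a parametric form):

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

# Assumed (hypothetical) component densities for expression correlations of
# genuinely interacting pairs vs. random pairs; the paper estimates these
# distributions empirically instead of assuming Gaussians.
f_true = norm(loc=0.3, scale=0.15).pdf
f_rand = norm(loc=0.0, scale=0.15).pdf

def estimate_reliability(corrs):
    # MLE of alpha in the mixture alpha*f_true(r) + (1-alpha)*f_rand(r)
    corrs = np.asarray(corrs, dtype=float)
    def nll(alpha):
        mix = alpha * f_true(corrs) + (1.0 - alpha) * f_rand(corrs)
        return -np.sum(np.log(mix + 1e-300))
    return minimize_scalar(nll, bounds=(0.0, 1.0), method="bounded").x
```

The estimated alpha plays the role of the data set's reliability: a data set whose correlation distribution looks like the random background yields alpha near zero, while one dominated by co-expressed pairs yields alpha near one.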

  12. External Validation and Evaluation of Reliability and Validity of the Modified Seoul National University Renal Stone Complexity Scoring System to Predict Stone-Free Status After Retrograde Intrarenal Surgery.

    PubMed

    Park, Juhyun; Kang, Minyong; Jeong, Chang Wook; Oh, Sohee; Lee, Jeong Woo; Lee, Seung Bae; Son, Hwancheol; Jeong, Hyeon; Cho, Sung Yong

    2015-08-01

The modified Seoul National University Renal Stone Complexity scoring system (S-ReSC-R) for retrograde intrarenal surgery (RIRS) was developed as a tool to predict the stone-free rate (SFR) after RIRS. We externally validated the S-ReSC-R. We retrospectively reviewed 159 patients who underwent RIRS. The S-ReSC-R score was assigned from 1 to 12 according to the location and number of sites involved. Stone-free status was defined as no evidence of a stone, or only clinically insignificant residual fragments less than 2 mm. Interobserver and test-retest reliabilities were evaluated. Statistical performance of the prediction model was assessed by its predictive accuracy, predictive probability, and clinical usefulness. The overall SFR was 73.0%. The SFRs were 86.7%, 70.2%, and 48.6% in the low-score (1-2), intermediate-score (3-4), and high-score (5-12) groups, respectively (p<0.001). External validation of the S-ReSC-R revealed an area under the curve (AUC) of 0.731 (95% CI 0.650-0.813). The AUC of the three-tiered S-ReSC-R was 0.701 (95% CI 0.609-0.794). The calibration plot showed that the predicted probability of SFR had a concordance comparable to the observed frequency. The Hosmer-Lemeshow goodness-of-fit test revealed a p-value of 0.01 for the S-ReSC-R and 0.90 for the three-tiered S-ReSC-R. Interobserver and test-retest reliabilities revealed an almost perfect level of agreement. The present study proved the predictive value of the S-ReSC-R for SFR following RIRS in an independent cohort. Interobserver and test-retest reliabilities confirmed that the S-ReSC-R is reliable and valid.
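The AUC values reported above can be reproduced for any discrete scoring system with a rank-based (Mann-Whitney) computation; a minimal sketch with hypothetical scores and outcome labels:

```python
import numpy as np

# AUC via the Mann-Whitney U statistic: the probability that a randomly chosen
# positive case receives a higher score than a randomly chosen negative one.
# Any scores/labels passed in below are hypothetical illustration data.
def auc(scores, labels):
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):          # average the ranks of tied scores
        mask = scores == s
        ranks[mask] = ranks[mask].mean()
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

The tie-averaging step matters for an integer score like the S-ReSC-R, where many patients share the same score value.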

  13. Digital-computer model of ground-water flow in Tooele Valley, Utah

    USGS Publications Warehouse

    Razem, Allan C.; Bartholoma, Scott D.

    1980-01-01

    A two-dimensional, finite-difference digital-computer model was used to simulate the ground-water flow in the principal artesian aquifer in Tooele Valley, Utah. The parameters used in the model were obtained through field measurements and tests, from historical records, and by trial-and-error adjustments. The model was calibrated against observed water-level changes that occurred during 1941-50, 1951-60, 1961-66, 1967-73, and 1974-78. The reliability of the predictions is good in most parts of the valley, as is shown by the ability of the model to match historical water-level changes.

  14. Experimental study and constitutive modeling of the viscoelastic mechanical properties of the human prolapsed vaginal tissue.

    PubMed

    Peña, Estefania; Calvo, B; Martínez, M A; Martins, P; Mascarenhas, T; Jorge, R M N; Ferreira, A; Doblaré, M

    2010-02-01

    In this paper, the viscoelastic mechanical properties of vaginal tissue are investigated. Using the authors' previous results on the mechanical properties of biological soft tissues and new experimental data from uniaxial tension tests, a new model for the viscoelastic mechanical properties of human vaginal tissue is proposed. The structural model appears to be sufficiently accurate to guarantee its application to the prediction of reliable stress distributions, and is suitable for finite element computations. The obtained results may be helpful in the design of surgical procedures with autologous tissue or prostheses.

  15. Finite element simulation of cutting grey iron HT250 by self-prepared Si3N4 ceramic insert

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Wang, Li; Zhang, Enguang

    2017-04-01

    The finite element method can now simulate and solve practical machining problems with the required accuracy and high reliability. In this paper, simulation models based on the material properties of the self-prepared Si3N4 insert and HT250 were created. Using these models, results for cutting force, cutting temperature and tool wear rate were obtained, and the tool wear mode was predicted after the cutting simulation. These approaches may develop into a new method for testing new cutting-tool materials, shortening the development cycle and reducing cost.

  16. Thermal design verification testing for the ATS-F and -G spacecraft.

    NASA Technical Reports Server (NTRS)

    Coyle, M.; Greenwell, J.

    1972-01-01

    There is a wide fluctuation in the internal power dissipation from the components within the earth viewing module (EVM). The electronic component functional reliability required for a two-to-five year mission is the most significant factor for the thermal design criteria. A mathematical thermal model of the EVM and the orbital environment is used to predict the performance of the thermal control system. Comparisons of the results obtained in chamber thermal balance tests with the data computed on the basis of the theoretical model provide the means for validating the thermal design.

  17. Accuracy of genomic predictions in Gyr (Bos indicus) dairy cattle.

    PubMed

    Boison, S A; Utsunomiya, A T H; Santos, D J A; Neves, H H R; Carvalheiro, R; Mészáros, G; Utsunomiya, Y T; do Carmo, A S; Verneque, R S; Machado, M A; Panetto, J C C; Garcia, J F; Sölkner, J; da Silva, M V G B

    2017-07-01

    Genomic selection may accelerate genetic progress in breeding programs of indicine breeds when compared with traditional selection methods. We present results of genomic predictions in Gyr (Bos indicus) dairy cattle of Brazil for milk yield (MY), fat yield (FY), protein yield (PY), and age at first calving using information from bulls and cows. Four different single nucleotide polymorphism (SNP) chips were studied. Additionally, the effect of the use of imputed data on genomic prediction accuracy was studied. A total of 474 bulls and 1,688 cows were genotyped with the Illumina BovineHD (HD; San Diego, CA) and BovineSNP50 (50K) chip, respectively. Genotypes of cows were imputed to HD using FImpute v2.2. After quality check of data, 496,606 markers remained. The HD markers present on the GeneSeek SGGP-20Ki (15,727; Lincoln, NE), 50K (22,152), and GeneSeek GGP-75Ki (65,018) were subset and used to assess the effect of lower SNP density on accuracy of prediction. Deregressed breeding values were used as pseudophenotypes for model training. Data were split into reference and validation sets to mimic a forward prediction scheme. The reference population consisted of animals whose birth year was ≤2004 and comprised either only bulls (TR1) or a combination of bulls and dams (TR2), whereas the validation set consisted of younger bulls (born after 2004). Genomic BLUP was used to estimate genomic breeding values (GEBV), and the reliability of GEBV (R²PEV) was based on the prediction error variance approach. Reliability of GEBV ranged from ∼0.46 (FY and PY) to 0.56 (MY) with TR1 and from 0.51 (PY) to 0.65 (MY) with TR2. When averaged across all traits, R²PEV was substantially higher (0.50 for TR1 and 0.57 for TR2) than the reliabilities of parent averages (0.35) computed from pedigree data and based on diagonals of the coefficient matrix (prediction error variance approach). Reliability was similar for all four marker panels using either TR1 or TR2, except that the imputed HD cow data set led to an inflation of reliability. Reliability of GEBV could be increased by enlarging the limited bull reference population with cow information, and a reduced panel of ∼15K markers resulted in reliabilities similar to using HD markers. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
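
    The prediction-error-variance approach mentioned above converts a PEV into a reliability through the standard relation R² = 1 − PEV/σ²a, where σ²a is the additive genetic variance. A minimal sketch with invented numbers (neither the PEV nor the variance is taken from the study):

```python
def reliability_from_pev(pev, genetic_variance):
    """Reliability of a (G)EBV from its prediction error variance:
    R^2 = 1 - PEV / sigma_a^2. Values near 1 mean the breeding value
    is estimated with little error; 0 means no information."""
    return 1.0 - pev / genetic_variance

# Illustrative numbers only: additive genetic variance 2.0, PEV 0.9.
r2 = reliability_from_pev(pev=0.9, genetic_variance=2.0)
```

    A shrinking PEV (more daughters, more markers, a larger reference population) drives the reliability toward 1, which is why adding cow information raised the reliabilities reported above.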

  18. Comparing niche- and process-based models to reduce prediction uncertainty in species range shifts under climate change.

    PubMed

    Morin, Xavier; Thuiller, Wilfried

    2009-05-01

    Obtaining reliable predictions of species range shifts under climate change is a crucial challenge for ecologists and stakeholders. At the continental scale, niche-based models have been widely used in the last 10 years to predict the potential impacts of climate change on species distributions all over the world, although these models do not include any mechanistic relationships. In contrast, species-specific, process-based predictions remain scarce at the continental scale. This is regrettable because to secure relevant and accurate predictions it is always desirable to compare predictions derived from different kinds of models applied independently to the same set of species and using the same raw data. Here we compare predictions of range shifts under climate change scenarios for 2100 derived from niche-based models with those of a process-based model for 15 North American boreal and temperate tree species. A general pattern emerged from our comparisons: niche-based models tend to predict a stronger level of extinction and a greater proportion of colonization than the process-based model. This result likely arises because niche-based models do not take phenotypic plasticity and local adaptation into account. Nevertheless, as the two kinds of models rely on different assumptions, their complementarity is revealed by common findings. Both modeling approaches highlight a major potential limitation on species tracking their climatic niche because of migration constraints and identify similar zones where species extirpation is likely. Such convergent predictions from models built on very different principles provide a useful way to offset uncertainties at the continental scale. This study shows that the use in concert of both approaches with their own caveats and advantages is crucial to obtain more robust results and that comparisons among models are needed in the near future to gain accuracy regarding predictions of range shifts under climate change.

  19. Efficient finite element modelling for the investigation of the dynamic behaviour of a structure with bolted joints

    NASA Astrophysics Data System (ADS)

    Omar, R.; Rani, M. N. Abdul; Yunus, M. A.; Mirza, W. I. I. Wan Iskandar; Zin, M. S. Mohd

    2018-04-01

    A simple structure with bolted joints consists of the structural components, bolts and nuts. Several methods exist to model structures with bolted joints; however, there is no reliable, efficient and economical modelling method that can accurately predict their dynamic behaviour. This paper describes an investigation conducted to obtain an appropriate modelling method for bolted joints. This was carried out by evaluating four different finite element (FE) models of the assembled plates and bolts, namely the solid plates-bolts model, the plates-without-bolts model, the hybrid plates-bolts model and the simplified plates-bolts model. FE modal analysis was conducted for all four initial FE models of the bolted joints, and the results were compared with experimental modal analysis (EMA) results. EMA was performed to extract the natural frequencies and mode shapes of the physical test structure with bolted joints. The evaluation compared the number of nodes, number of elements, elapsed central processing unit (CPU) time, and the total percentage of errors of each initial FE model against the EMA result. The evaluation showed that the simplified plates-bolts model could most accurately predict the dynamic behaviour of the structure with bolted joints. This study proved that reliable, efficient and economical modelling of bolted joints, mainly the representation of the bolting, plays a crucial role in ensuring the accuracy of the dynamic behaviour prediction.

  20. A systematic review of quantitative burn wound microbiology in the management of burns patients.

    PubMed

    Halstead, Fenella D; Lee, Kwang Chear; Kwei, Johnny; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S

    2018-02-01

    The early diagnosis of infection or sepsis in burns is important for patient care. Globally, a large number of burn centres advocate quantitative cultures of wound biopsies for patient management, since there is assumed to be a direct link between the bioburden of a burn wound and the risk of microbial invasion. Given the conflicting study findings in this area, a systematic review was warranted. Bibliographic databases were searched with no language restrictions to August 2015. Study selection, data extraction and risk of bias assessment were performed in duplicate using pre-defined criteria. Substantial heterogeneity precluded quantitative synthesis, and findings were described narratively, sub-grouped by clinical question. Twenty-six laboratory and/or clinical studies were included. Substantial heterogeneity hampered comparisons across studies and interpretation of findings. Limited evidence suggests that (i) more than one quantitative microbiology sample is required to obtain reliable estimates of bacterial load; (ii) biopsies are more sensitive than swabs in diagnosing or predicting sepsis; (iii) high bacterial loads may predict worse clinical outcomes, and (iv) both quantitative and semi-quantitative culture reports need to be interpreted with caution and in the context of other clinical risk factors. The evidence base for the utility and reliability of quantitative microbiology for diagnosing or predicting clinical outcomes in burns patients is limited and often poorly reported. Consequently future research is warranted. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  1. Direct measurements of forces between different charged colloidal particles and their prediction by the theory of Derjaguin, Landau, Verwey, and Overbeek (DLVO)

    NASA Astrophysics Data System (ADS)

    Ruiz-Cabello, F. Javier Montes; Maroni, Plinio; Borkovec, Michal

    2013-06-01

    Force measurements between three types of latex particles of diameters down to 1 μm with sulfate and carboxyl surface functionalities were carried out with the multi-particle colloidal probe technique. The experiments were performed in monovalent electrolyte up to concentrations of about 5 mM. The force profiles could be quantified with the theory of Derjaguin, Landau, Verwey, and Overbeek (DLVO) by invoking non-retarded van der Waals forces and the Poisson-Boltzmann description of double layer forces within the constant regulation approximation. The forces measured in the symmetric systems were used to extract particle and surface properties, namely, the Hamaker constant, surface potentials, and regulation parameters. The regulation parameter is found to be independent of solution composition. With these values at hand, the DLVO theory is capable of accurately predicting the measured forces in the asymmetric systems down to distances of 2-3 nm without adjustable parameters. This success indicates that DLVO theory is highly reliable for quantifying interaction forces in such systems. However, charge regulation effects are found to be important, and they must be considered to obtain a correct description of the forces. The use of the classical constant charge or constant potential boundary conditions may lead to erroneous results. To make reliable predictions of the force profiles, the surface potentials must be extracted from direct force measurements too. For highly charged surfaces, the commonly used electrophoresis techniques are found to yield incorrect estimates of this quantity.

  2. Direct measurements of forces between different charged colloidal particles and their prediction by the theory of Derjaguin, Landau, Verwey, and Overbeek (DLVO).

    PubMed

    Montes Ruiz-Cabello, F Javier; Maroni, Plinio; Borkovec, Michal

    2013-06-21

    Force measurements between three types of latex particles of diameters down to 1 μm with sulfate and carboxyl surface functionalities were carried out with the multi-particle colloidal probe technique. The experiments were performed in monovalent electrolyte up to concentrations of about 5 mM. The force profiles could be quantified with the theory of Derjaguin, Landau, Verwey, and Overbeek (DLVO) by invoking non-retarded van der Waals forces and the Poisson-Boltzmann description of double layer forces within the constant regulation approximation. The forces measured in the symmetric systems were used to extract particle and surface properties, namely, the Hamaker constant, surface potentials, and regulation parameters. The regulation parameter is found to be independent of solution composition. With these values at hand, the DLVO theory is capable of accurately predicting the measured forces in the asymmetric systems down to distances of 2-3 nm without adjustable parameters. This success indicates that DLVO theory is highly reliable for quantifying interaction forces in such systems. However, charge regulation effects are found to be important, and they must be considered to obtain a correct description of the forces. The use of the classical constant charge or constant potential boundary conditions may lead to erroneous results. To make reliable predictions of the force profiles, the surface potentials must be extracted from direct force measurements too. For highly charged surfaces, the commonly used electrophoresis techniques are found to yield incorrect estimates of this quantity.
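
    To illustrate the kind of force profile these records describe, the textbook DLVO expression in the weak-overlap, constant-potential limit can be sketched. Note that this is a simplification, not the constant-regulation Poisson-Boltzmann model the authors actually fit, and every parameter value below is illustrative rather than taken from the study:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 78.5       # relative permittivity of water at room temperature

def dlvo_force_over_radius(h, psi, debye_length, hamaker):
    """Normalized force F/R_eff (N/m) between two identical charged spheres
    at surface separation h (m), in the Derjaguin approximation: an
    exponentially decaying double-layer repulsion minus a non-retarded
    van der Waals attraction A/(6 h^2)."""
    kappa = 1.0 / debye_length
    repulsion = 2.0 * math.pi * EPS_R * EPS0 * kappa * psi**2 * math.exp(-kappa * h)
    attraction = hamaker / (6.0 * h**2)
    return repulsion - attraction

# Illustrative parameters: 25 mV surface potential, 10 nm Debye length,
# Hamaker constant 3e-21 J (a typical order of magnitude for latex in water).
f_far = dlvo_force_over_radius(h=5e-9, psi=0.025, debye_length=1e-8, hamaker=3e-21)
f_near = dlvo_force_over_radius(h=2e-10, psi=0.025, debye_length=1e-8, hamaker=3e-21)
```

    With these numbers the force is repulsive (positive) at 5 nm separation and attractive (negative) at 0.2 nm, reproducing the qualitative DLVO barrier-plus-primary-minimum picture.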

  3. A new theory for X-ray diffraction.

    PubMed

    Fewster, Paul F

    2014-05-01

    This article proposes a new theory of X-ray scattering that has particular relevance to powder diffraction. The underlying concept of this theory is that the scattering from a crystal or crystallite is distributed throughout space: this leads to the effect that enhanced scatter can be observed at the `Bragg position' even if the `Bragg condition' is not satisfied. The scatter from a single crystal or crystallite, in any fixed orientation, has the fascinating property of contributing simultaneously to many `Bragg positions'. It also explains why diffraction peaks are obtained from samples with very few crystallites, which cannot be explained by the conventional theory. The intensity ratios for an Si powder sample are predicted with greater accuracy and the temperature factors are more realistic. Another consequence is that this new theory predicts a reliability in the intensity measurements that agrees much more closely with experimental observations than conventional theory based on `Bragg-type' scatter. The role of dynamical effects (extinction etc.), and how they are suppressed by diffuse scattering, is discussed. An alternative explanation for the Lorentz factor is presented that is more general and based on the capture volume in diffraction space. This theory, when applied to the scattering from powders, will evaluate the full scattering profile, including peak widths and the `background'. The theory should provide an increased understanding of the reliability of powder diffraction measurements, and may also have wider implications for the analysis of powder diffraction data, by increasing the accuracy of intensities predicted from structural models.

  4. Contribution of domestic production records, Interbull estimated breeding values, and single nucleotide polymorphism genetic markers to the single-step genomic evaluation of milk production.

    PubMed

    Přibyl, J; Madsen, P; Bauer, J; Přibylová, J; Simečková, M; Vostrý, L; Zavadilová, L

    2013-03-01

    Estimated breeding values (EBV) for first-lactation milk production of Holstein cattle in the Czech Republic were calculated using a conventional animal model and by single-step prediction of the genomic enhanced breeding value. Two overlapping data sets of milk production data were evaluated: (1) calving years 1991 to 2006, with 861,429 lactations and 1,918,901 animals in the pedigree and (2) calving years 1991 to 2010, with 1,097,319 lactations and 1,906,576 animals in the pedigree. Global Interbull (Uppsala, Sweden) deregressed proofs of 114,189 bulls were used in the analyses. Reliabilities of Interbull values were equivalent to an average of 8.53 effective records, which were used in a weighted analysis. A total of 1,341 bulls were genotyped using the Illumina BovineSNP50 BeadChip V2 (Illumina Inc., San Diego, CA). Among the genotyped bulls were 332 young bulls with no daughters in the first data set but more than 50 daughters (88.41, on average) with performance records in the second data set. For young bulls, correlations of EBV and genomic enhanced breeding value before and after progeny testing, corresponding average expected reliabilities, and effective daughter contributions (EDC) were calculated. The reliability of prediction pedigree EBV of young bulls was 0.41, corresponding to EDC=10.6. Including Interbull deregressed proofs improved the reliability of prediction by EDC=13.4 and including genotyping improved prediction reliability by EDC=6.2. Total average expected reliability of prediction reached 0.67, corresponding to EDC=30.2. The combination of domestic and Interbull sources for both genotyped and nongenotyped animals is valuable for improving the accuracy of genetic prediction in small populations of dairy cattle. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  5. The Yale-Brown Obsessive Compulsive Scale: A Reliability Generalization Meta-Analysis.

    PubMed

    López-Pina, José Antonio; Sánchez-Meca, Julio; López-López, José Antonio; Marín-Martínez, Fulgencio; Núñez-Núñez, Rosa Maria; Rosa-Alcázar, Ana I; Gómez-Conesa, Antonia; Ferrer-Requena, Josefa

    2015-10-01

    The Yale-Brown Obsessive Compulsive Scale (Y-BOCS) is the most frequently applied test to assess obsessive compulsive symptoms. We conducted a reliability generalization meta-analysis on the Y-BOCS to estimate the average reliability, examine the variability among the reliability estimates, search for moderators, and propose a predictive model that researchers and clinicians can use to estimate the expected reliability of the Y-BOCS. We included studies where the Y-BOCS was applied to a sample of adults and a reliability estimate was reported. Out of the 11,490 references located, 144 studies met the selection criteria. For the total scale, the mean reliability was 0.866 for coefficients alpha, 0.848 for test-retest correlations, and 0.922 for intraclass correlations. The moderator analyses led to a predictive model where the standard deviation of the total test and the target population (clinical vs. nonclinical) explained 38.6% of the total variability among coefficients alpha. Finally, clinical implications of the results are discussed. © The Author(s) 2014.

  6. Common carotid artery intima-media thickness is as good as carotid intima-media thickness of all carotid artery segments in improving prediction of coronary heart disease risk in the Atherosclerosis Risk in Communities (ARIC) study.

    PubMed

    Nambi, Vijay; Chambless, Lloyd; He, Max; Folsom, Aaron R; Mosley, Tom; Boerwinkle, Eric; Ballantyne, Christie M

    2012-01-01

    Carotid intima-media thickness (CIMT) and plaque information can improve coronary heart disease (CHD) risk prediction when added to traditional risk factors (TRF). However, obtaining adequate images of all carotid artery segments (A-CIMT) may be difficult. Of A-CIMT, the common carotid artery intima-media thickness (CCA-IMT) is relatively more reliable and easier to measure. We evaluated whether CCA-IMT is comparable to A-CIMT when added to TRF and plaque information in improving CHD risk prediction in the Atherosclerosis Risk in Communities (ARIC) study. Ten-year CHD risk prediction models using TRF alone, TRF + A-CIMT + plaque, and TRF + CCA-IMT + plaque were developed for the overall cohort, men, and women. The area under the receiver operator characteristic curve (AUC), percentage of individuals reclassified, net reclassification index (NRI), and model calibration by the Grønnesby-Borgan test were estimated. There were 1722 incident CHD events in 12 576 individuals over a mean follow-up of 15.2 years. The AUC for TRF only, TRF + A-CIMT + plaque, and TRF + CCA-IMT + plaque models were 0.741, 0.754, and 0.753, respectively. Although there was some discordance when the CCA-IMT + plaque- and A-CIMT + plaque-based risk estimation was compared, the NRI and clinical NRI (NRI in the intermediate-risk group) when comparing the CIMT models with the TRF-only model, the percentage reclassified, and the test for model calibration were not significantly different. Coronary heart disease risk prediction can be improved by adding A-CIMT + plaque or CCA-IMT + plaque information to TRF. Therefore, evaluating the carotid artery for plaque presence and measuring CCA-IMT, which is easier and more reliable than measuring A-CIMT, provide a good alternative to measuring A-CIMT for CHD risk prediction.
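
    The net reclassification index used above can be computed directly from risk-category movements between the old and new models. A minimal sketch with invented reclassification data (not ARIC results):

```python
def net_reclassification_index(events, nonevents):
    """Net reclassification index from risk-category movements when a new
    model (e.g. TRF + CCA-IMT + plaque) replaces an old one (TRF only).
    `events` and `nonevents` are lists of (old_category, new_category)
    pairs, with categories ordered so a higher value means higher risk."""
    def net_up(pairs):
        up = sum(new > old for old, new in pairs)
        down = sum(new < old for old, new in pairs)
        return (up - down) / len(pairs)
    # Events should move up in risk; nonevents should move down.
    return net_up(events) - net_up(nonevents)

# Hypothetical reclassification data, for illustration only.
events = [(1, 2), (1, 1), (2, 1), (1, 2)]      # 2 up, 1 down among 4
nonevents = [(2, 1), (1, 1), (2, 2), (1, 2)]   # 1 up, 1 down among 4
print(net_reclassification_index(events, nonevents))  # 0.25
```

    A positive NRI means the new model, on balance, moves events to higher risk categories and nonevents to lower ones.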

  7. Decreasing inventory of a cement factory roller mill parts using reliability centered maintenance method

    NASA Astrophysics Data System (ADS)

    Witantyo; Rindiyah, Anita

    2018-03-01

    Data from maintenance planning and control showed that non-routine components account for the highest inventory value. Maintenance components are components procured for maintenance activities; the problem arises because there is no synchronization between maintenance activities and the components they require. The Reliability Centered Maintenance method is used to overcome this problem by re-evaluating the components required by maintenance activities. The roller mill system was chosen as the case because it has the highest recorded unscheduled downtime. The components required for each maintenance activity are determined from their failure distributions, so the number of components needed can be predicted. Moreover, those components can be reclassified from non-routine to routine components, so that procurement can be carried out regularly. Based on the analysis, the failures addressed by almost every maintenance task were classified into scheduled on-condition tasks, scheduled discard tasks, scheduled restoration tasks and no scheduled maintenance. Of the 87 components used in maintenance activities, 19 were reclassified from non-routine to routine components. The reliability and demand for those components were then calculated for a one-year operation period. Based on these findings, it is suggested to replace all of these components during overhaul to increase the reliability of the roller mill system. In addition, the inventory system should follow the maintenance schedule and the number of components required by maintenance activities, so that procurement value decreases and system reliability increases.

  8. Modification of Hazen's equation in coarse grained soils by soft computing techniques

    NASA Astrophysics Data System (ADS)

    Kaynar, Oguz; Yilmaz, Isik; Marschalko, Marian; Bednarik, Martin; Fojtova, Lucie

    2013-04-01

    A relationship between the coefficient of permeability (k) and effective grain size (d10) was first proposed by Hazen and later extended by other researchers. However, although many attempts have been made to estimate k, the correlation coefficients (R2) of the models were generally lower than ~0.80, and whole grain size distribution curves were not included in the assessments. Soft computing techniques such as artificial neural networks, fuzzy inference systems, genetic algorithms, etc., and their hybrids are now being used successfully as alternative tools. In this study, the use of soft computing techniques such as Artificial Neural Networks (ANNs) (MLP, RBF, etc.) and the Adaptive Neuro-Fuzzy Inference System (ANFIS) for prediction of the permeability of coarse grained soils is described, and Hazen's equation is then modified. The soft computing models exhibited high performance in predicting the permeability coefficient. Although the four kinds of ANN algorithms showed similar prediction performance, the MLP results were found to be relatively more accurate than those of the RBF models. The most reliable prediction was obtained from the ANFIS model.
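
    Hazen's original empirical relation referenced above has the form k = C·d10². A minimal sketch, with the usual caveats that C is an empirical constant (values around 1.0 are commonly quoted for k in cm/s and d10 in mm, with a range reported in the literature) and that the formula applies only to clean coarse-grained soils:

```python
def hazen_permeability(d10_mm, c=1.0):
    """Hazen's empirical formula k = C * d10^2, with d10 the effective
    grain size (the sieve size passing 10% of the sample) in mm and k
    the permeability coefficient in cm/s. C is an empirical constant."""
    return c * d10_mm ** 2

# A clean medium sand with d10 = 0.3 mm (illustrative value):
k = hazen_permeability(0.3)  # ≈ 0.09 cm/s
```

    The quadratic dependence on a single grain size is exactly the limitation the soft computing models above address by using the whole grain size distribution curve.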

  9. Semi-supervised prediction of gene regulatory networks using machine learning algorithms.

    PubMed

    Patel, Nihir; Wang, Jason T L

    2015-10-01

    Use of computational methods to predict gene regulatory networks (GRNs) from gene expression data is a challenging task. Many studies have been conducted using unsupervised methods to fulfill the task; however, such methods usually yield low prediction accuracies due to the lack of training data. In this article, we propose semi-supervised methods for GRN prediction by utilizing two machine learning algorithms, namely, support vector machines (SVM) and random forests (RF). The semi-supervised methods make use of unlabelled data for training. We investigated inductive and transductive learning approaches, both of which adopt an iterative procedure to obtain reliable negative training data from the unlabelled data. We then applied our semi-supervised methods to gene expression data of Escherichia coli and Saccharomyces cerevisiae, and evaluated the performance of our methods using the expression data. Our analysis indicated that the transductive learning approach outperformed the inductive learning approach for both organisms. However, there was no conclusive difference identified in the performance of SVM and RF. Experimental results also showed that the proposed semi-supervised methods performed better than existing supervised methods for both organisms.
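
    The iterative procedure for harvesting reliable negatives from unlabelled data, described above, can be sketched in miniature. Here a toy nearest-centroid scorer stands in for the SVM and RF learners of the paper, and all feature vectors and pair counts are invented for illustration:

```python
# Toy sketch of iterative "reliable negative" selection for semi-supervised
# GRN prediction. A nearest-centroid scorer replaces SVM/RF; features for
# regulator/target gene pairs are invented two-dimensional vectors.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def select_reliable_negatives(positives, unlabelled, rounds=3):
    """Each round, score unlabelled pairs by squared distance to the
    centroid of known positive (regulatory) pairs and move the farthest
    pair into the reliable-negative training set."""
    negatives, pool = [], list(unlabelled)
    for _ in range(rounds):
        c = centroid(positives)
        pool.sort(key=lambda v: dist2(v, c))
        negatives.append(pool.pop())  # farthest from the positives
    return negatives, pool

positives = [[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]]
unlabelled = [[0.85, 0.8], [0.1, 0.2], [0.0, 0.1], [0.9, 0.75], [0.2, 0.1]]
negs, remaining = select_reliable_negatives(positives, unlabelled)
```

    In the actual methods, the classifier retrained on positives plus the accumulated negatives would then label the remaining pool (transductively or inductively); this sketch only shows the negative-harvesting step.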

  10. Separating predictable and unpredictable work to manage interruptions and promote safe and effective work flow.

    PubMed

    Kowinsky, Amy M; Shovel, Judith; McLaughlin, Maribeth; Vertacnik, Lisa; Greenhouse, Pamela K; Martin, Susan Christie; Minnier, Tamra E

    2012-01-01

    Predictable and unpredictable patient care tasks compete for caregiver time and attention, making it difficult for patient care staff to reliably and consistently meet patient needs. We have piloted a redesigned care model that separates the work of patient care technicians based on task predictability and creates role specificity. This care model shows promise in improving the ability of staff to reliably complete tasks in a more consistent and timely manner.

  11. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing of drug dosing, but requires proper validation of the used pharmacokinetic procedures. However, simple and reliable statistical methods suitable for evaluation of the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data, obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
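
    The quantification proposed above, the fraction of concentration ratios falling within a specified range, is straightforward to compute. A minimal sketch with hypothetical observed and predicted drug concentrations:

```python
def fraction_within(observed, predicted, lo=0.8, hi=1.2):
    """Fraction of predicted/observed concentration ratios falling inside
    an arbitrarily specified range (default 0.8-1.2, as suggested above).
    A fraction near 1 indicates good predictive performance."""
    ratios = [p / o for o, p in zip(observed, predicted)]
    return sum(lo <= r <= hi for r in ratios) / len(ratios)

# Hypothetical observed vs model-predicted drug concentrations (mg/L).
obs  = [10.0, 8.0, 5.0, 2.0, 1.0]
pred = [ 9.0, 9.5, 4.0, 2.1, 0.7]
print(fraction_within(obs, pred))  # 0.8
```

    Plotting the ratios against time or concentration, as the graphic procedure does, additionally reveals where in the profile the inaccuracies occur.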

  12. Occultation Predictions Using CCD Strip-Scanning Astrometry

    NASA Technical Reports Server (NTRS)

    Dunham, Edward W.; Ford, C. H.; Stone, R. P. S.; McDonald, S. W.; Olkin, C. B.; Elliot, J. L.; Witteborn, Fred C. (Technical Monitor)

    1994-01-01

    We are developing the method of CCD strip-scanning astrometry for the purpose of deriving reliable advance predictions for occultations involving small objects in the outer solar system. We are using a camera system based on a Ford/Loral 2Kx2K CCD with the Crossley telescope at Lick Observatory for this work. The columns of the CCD are aligned East-West, the telescope drive is stopped, and the CCD is clocked at the same rate that the stars drift across it. In this way we obtain arbitrary-length strip images 20 arcmin wide with 0.58" pixels. Since planets move mainly in RA, it is possible to obtain images of the planet and the star to be occulted on the same strip well before the occultation occurs. The strip-to-strip precision (i.e., reproducibility) of positions is limited by atmospheric image motion to about 0.1" rms per strip. However, for objects that are nearby in RA, the image motion is highly correlated and their relative positions are good to 0.02" rms per strip. We will show that the effects of atmospheric image motion on a given strip can be removed if a sufficient number of strips of a given area have been obtained. Thus, it is possible to reach an rms precision of 0.02" per strip, corresponding to about 0.3 of Pluto's or Triton's angular radius. The ultimate accuracy of a prediction based on strip-scanning astrometry is currently limited by the accuracy of the positions of the stars in the astrometric network used and by systematic errors most likely due to the optical system. We will show the results of the prediction of some recent occultations as examples of the current capabilities and limitations of this technique.

  13. Fourier transform infrared spectroscopy for the prediction of fatty acid profiles in Mucor fungi grown in media with different carbon sources.

    PubMed

    Shapaval, Volha; Afseth, Nils Kristian; Vogt, Gjermund; Kohler, Achim

    2014-09-11

    Fungal production of polyunsaturated fatty acids (PUFAs) is a promising approach in biotechnology. Currently, the main focus is on screening hundreds of strains in order to select a few promising ones; thus, a reliable method for screening a high number of strains within a short period of time is needed. Here, we present a novel method for screening PUFA-producing fungi by high-throughput microcultivation and FTIR spectroscopy. In this study, selected Mucor fungi were grown in media with different carbon sources, and fatty acid profiles were predicted on the basis of the obtained spectral data. FTIR spectra were calibrated against fatty acid analysis by GC-FD. The calibration models were cross-validated, and correlation coefficients (R2) from 0.71 to 0.78 with RMSECV (root mean square error of cross-validation) from 2.86% to 6.96% (percentage of total fat) were obtained. The FTIR results show a strong correlation with the results obtained by GC analysis, where high total contents of unsaturated fatty acids (both PUFA and MUFA) were achieved for Mucor plumbeus VI02019 cultivated in canola, olive and sunflower oil and Mucor hiemalis VI01993 cultivated in canola and olive oil.
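The cross-validated calibration reported above (RMSECV) can be illustrated with a minimal leave-one-out computation. The real models were multivariate (FTIR spectra calibrated against GC fatty acid profiles); the univariate least-squares sketch below, with made-up data, only shows the mechanics of RMSECV:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def rmsecv(x, y):
    """Leave-one-out root mean square error of cross-validation: refit the
    calibration with each sample held out, predict it, and pool the errors."""
    errs = []
    for i in range(len(x)):
        xt, yt = x[:i] + x[i + 1:], y[:i] + y[i + 1:]
        a, b = fit_line(xt, yt)
        errs.append((y[i] - (a * x[i] + b)) ** 2)
    return (sum(errs) / len(errs)) ** 0.5

# Invented predictor (e.g. a band intensity) vs reference values:
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.1, 1.9, 3.2, 3.9, 5.1]
print(round(rmsecv(x, y), 3))
```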

  14. Geometrical modeling of complete dental shapes by using panoramic X-ray, digital mouth data and anatomical templates.

    PubMed

    Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano

    2015-07-01

    In the field of orthodontic planning, the creation of a complete digital dental model to simulate and predict treatments is of utmost importance. Nowadays, orthodontists use panoramic radiographs (PAN) and dental crown representations obtained by optical scanning. However, these data do not contain any 3D information regarding tooth root geometries. A reliable orthodontic treatment should instead take into account entire geometrical models of dental shapes in order to better predict tooth movements. This paper presents a methodology to create complete 3D patient dental anatomies by combining digital mouth models and panoramic radiographs. The modeling process is based on using crown surfaces, reconstructed by optical scanning, and root geometries, obtained by adapting anatomical CAD templates over patient specific information extracted from radiographic data. The radiographic process is virtually replicated on crown digital geometries through the Discrete Radon Transform (DRT). The resulting virtual PAN image is used to integrate the actual radiographic data and the digital mouth model. This procedure provides the root references on the 3D digital crown models, which guide a shape adjustment of the dental CAD templates. The entire geometrical models are finally created by merging dental crowns, captured by optical scanning, and root geometries, obtained from the CAD templates. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Extraction Optimization for Obtaining Artemisia capillaris Extract with High Anti-Inflammatory Activity in RAW 264.7 Macrophage Cells

    PubMed Central

    Jang, Mi; Jeong, Seung-Weon; Kim, Bum-Keun; Kim, Jong-Chan

    2015-01-01

    Plant extracts have been used as herbal medicines to treat a wide variety of human diseases. We used response surface methodology (RSM) to optimize the Artemisia capillaris Thunb. extraction parameters (extraction temperature, extraction time, and ethanol concentration) for obtaining an extract with high anti-inflammatory activity at the cellular level. The optimum ranges for the extraction parameters were predicted by superimposing 4-dimensional response surface plots of the lipopolysaccharide- (LPS-) induced PGE2 and NO production and the cytotoxicity of A. capillaris Thunb. extracts. The ranges of extraction conditions used for determining the optimal conditions were extraction temperatures of 57–65°C, ethanol concentrations of 45–57%, and extraction times of 5.5–6.8 h. On the basis of the results, a model with a central composite design was considered to be accurate and reliable for predicting the anti-inflammatory activity of extracts at the cellular level. These approaches can provide a logical starting point for developing novel anti-inflammatory substances from natural products and will be helpful for the full utilization of A. capillaris Thunb. The crude extract obtained can be used in some A. capillaris Thunb.-related health care products. PMID:26075271
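A central composite design of the kind mentioned above can be generated in coded units. The sketch below is generic: the axial distance alpha = 1.682 is the usual rotatable choice for three factors, and it does not reproduce the authors' actual run settings:

```python
from itertools import product

def central_composite(k, alpha=1.682, n_center=1):
    """Coded design points of a central composite design for k factors:
    2**k factorial corners, 2*k axial (star) points at +/-alpha, plus
    center-point replicates."""
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center

# Three factors (temperature, time, ethanol concentration in coded units):
design = central_composite(3)
print(len(design))  # 8 corners + 6 axial + 1 center = 15 runs
```

Each coded point is then mapped back to physical units (e.g. -1/+1 spanning the temperature range) before running the extractions.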

  16. Assessing the impact of land use change on hydrology by ensemble modeling (LUCHEM) III: Scenario analysis

    USGS Publications Warehouse

    Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.

    2009-01-01

    An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was evident. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single, somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
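The deterministic trimmed-mean ensemble method mentioned above is straightforward to sketch; the member values below are invented:

```python
def trimmed_mean(values, trim=1):
    """Deterministic ensemble estimate: drop the `trim` lowest and
    `trim` highest member predictions, then average the rest."""
    s = sorted(values)
    kept = s[trim:len(s) - trim]
    return sum(kept) / len(kept)

# Ten-member ensemble of predicted change in mean annual discharge (%):
members = [4.0, 5.5, 5.8, 6.1, 6.3, 6.4, 6.8, 7.0, 7.5, 12.0]
print(trimmed_mean(members))  # the outliers 4.0 and 12.0 no longer dominate
```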

  17. Puppy Temperament Assessments Predict Breed and American Kennel Club Group but Not Adult Temperament.

    PubMed

    Robinson, Lauren M; Skiver Thompson, Rebekah; Ha, James C

    2016-01-01

    Puppy assessments for companion dogs have shown mixed long-term reliability. Temperament is cited among the reasons for surrendering dogs to shelters. A puppy temperament test that reliably predicts adult behavior is one potential way to lower the number of dogs given to shelters. This study used a longitudinal design to assess temperament in puppies from 8 different breeds at 7 weeks old (n = 52) and 6 years old (n = 34) using modified temperament tests, physiological measures, and a follow-up questionnaire. For 7-week-old puppies, results revealed (a) puppy breed was predictable using 3 variables, (b) 4 American Kennel Club breed groups had some validity based on temperament, (c) temperament was variable within litters of puppies, and (d) certain measures of temperament were related to physiological measures (heart rate). Finally, puppy temperament assessments were reliable in predicting the scores of 2 of the 8 adult dog temperament measures. However, overall, the puppy temperament scores were unreliable in predicting adult temperament.

  18. An improved grey model for the prediction of real-time GPS satellite clock bias

    NASA Astrophysics Data System (ADS)

    Zheng, Z. Y.; Chen, Y. Q.; Lu, X. S.

    2008-07-01

    In real-time GPS precise point positioning (PPP), timely and reliable satellite clock bias (SCB) prediction is essential. The behavior of space-borne GPS atomic clocks is difficult to model deterministically because of their high frequency and their sensitivity to environmental disturbances; the variation of SCB can therefore be treated as a grey system, which matches the premise of grey model (GM) theory. Firstly, starting from the limitations of quadratic polynomial (QP) fitting and the traditional GM for SCB prediction, a modified GM(1,1) model is put forward in this paper to predict GPS SCB. Then, taking GPS SCB data as an example, we analyze clock bias prediction with different sampling intervals, the relationship between the GM exponent and prediction accuracy, and the precision of the GM compared with QP, and derive general rules relating the type of SCB to the GM exponent. Finally, to test the reliability and validity of the proposed modified GM, we analyze its prediction precision using the IGS clock bias ephemeris product as a reference. The results show that the modified GM is reliable and valid for predicting GPS SCB and can provide high-precision SCB predictions for real-time GPS PPP.
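For reference, the traditional GM(1,1) that the paper modifies can be sketched as follows. This is the textbook grey model, not the authors' improved version, and the sample series stands in for any short positive sequence (a clock-bias series would first be shifted to be positive):

```python
import math

def gm11_forecast(x0, steps=1):
    """Textbook GM(1,1) forecast: fit the whitened equation
    dx1/dt + a*x1 = b to the accumulated (1-AGO) series x1 by least
    squares, extrapolate x1, then difference back to the original scale."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # 1-AGO
    z = [-(x1[k - 1] + x1[k]) / 2.0 for k in range(1, n)]    # background values
    y = x0[1:]
    m = n - 1
    # least squares for y = a*z + b via 2x2 normal equations
    szz = sum(v * v for v in z)
    sz = sum(z)
    szy = sum(v * w for v, w in zip(z, y))
    sy = sum(y)
    det = szz * m - sz * sz
    a = (m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):  # fitted accumulated series at 0-based index k
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]

# A near-exponential sample series; the true next term is 1.4641:
print(gm11_forecast([1.0, 1.1, 1.21, 1.331], steps=1))  # ~1.46
```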

  19. Comparison of the performance and reliability of 18 lumped hydrological models driven by ECMWF rainfall ensemble forecasts: a case study on 29 French catchments

    NASA Astrophysics Data System (ADS)

    Velázquez, Juan Alberto; Anctil, François; Ramos, Maria-Helena; Perrin, Charles

    2010-05-01

    An ensemble forecasting system seeks to assess and to communicate the uncertainty of hydrological predictions by proposing, at each time step, an ensemble of forecasts from which one can estimate the probability distribution of the predictand (the probabilistic forecast), in contrast with a single estimate of the flow, for which no distribution is obtainable (the deterministic forecast). In recent years, efforts towards the development of probabilistic hydrological prediction systems were made with the adoption of ensembles of numerical weather predictions (NWPs). The additional information provided by the different available Ensemble Prediction Systems (EPS) was evaluated in a hydrological context in various case studies (see the review by Cloke and Pappenberger, 2009). For example, the European ECMWF-EPS was explored in case studies by Roulin et al. (2005), Bartholmes et al. (2005), Jaun et al. (2008), and Renner et al. (2009). The Canadian EC-EPS was also evaluated by Velázquez et al. (2009). Most of these case studies investigate the ensemble predictions of a given hydrological model, set up over a limited number of catchments. Uncertainty from weather predictions is assessed through the use of meteorological ensembles. However, uncertainty from the tested hydrological model and the statistical robustness of the forecasting system when coping with different hydro-meteorological conditions are less frequently evaluated. The aim of this study is to evaluate and compare the performance and the reliability of 18 lumped hydrological models applied to a large number of catchments in an operational ensemble forecasting context. Some of these models were evaluated in a previous study (Perrin et al. 2001) for their ability to simulate streamflow. Results demonstrated that very simple models can achieve a level of performance almost as high (sometimes higher) as models with more parameters. 
In the present study, we focus on the ability of the hydrological models to provide reliable probabilistic forecasts of streamflow, based on ensemble weather predictions. The models were therefore adapted to run in a forecasting mode, i.e., to update initial conditions according to the last observed discharge at the time of the forecast, and to cope with ensemble weather scenarios. All models are lumped, i.e., the hydrological behavior is integrated over the spatial scale of the catchment, and run at daily time steps. The complexity of the tested models varies between 3 and 13 parameters. The models are tested on 29 French catchments. Daily streamflow time series extend over 17 months, from March 2005 to July 2006. Catchment areas range between 1470 km2 and 9390 km2, and represent a variety of hydrological and meteorological conditions. The 12 UTC 10-day ECMWF rainfall ensemble (51 members) was used, which led to daily streamflow forecasts for a 9-day lead time. In order to assess the performance and reliability of the hydrological ensemble predictions, we computed the Continuous Ranked Probability Score (CRPS) (Matheson and Winkler, 1976), as well as the reliability diagram (e.g. Wilks, 1995) and the rank histogram (Talagrand et al., 1999). Since the ECMWF deterministic forecasts are also available, the performance of the hydrological forecasting systems was also evaluated by comparing the deterministic score (MAE) with the probabilistic score (CRPS). The results obtained for the 18 hydrological models and the 29 studied catchments are discussed in the perspective of improving the operational use of ensemble forecasting in hydrology. References Bartholmes, J. and Todini, E.: Coupling meteorological and hydrological models for flood forecasting, Hydrol. Earth Syst. Sci., 9, 333-346, 2005. Cloke, H. and Pappenberger, F.: Ensemble Flood Forecasting: A Review. Journal of Hydrology 375 (3-4): 613-626, 2009. 
Jaun, S., Ahrens, B., Walser, A., Ewen, T., and Schär, C.: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Nat. Hazards Earth Syst. Sci., 8, 281-291, 2008. Matheson, J. E. and Winkler, R. L.: Scoring rules for continuous probability distributions, Manage Sci., 22, 1087-1096, 1976. Perrin, C., Michel C. and Andréassian,V. Does a large number of parameters enhance model performance? Comparative assessment of common catchment model structures on 429 catchments, J. Hydrol., 242, 275-301, 2001. Renner, M., Werner, M. G. F., Rademacher, S., and Sprokkereef, E.: Verification of ensemble flow forecast for the River Rhine, J. Hydrol., 376, 463-475, 2009. Roulin, E. and Vannitsem, S.: Skill of medium-range hydrological ensemble predictions, J. Hydrometeorol., 6, 729-744, 2005. Talagrand, O., Vautard, R., and Strauss, B.: Evaluation of the probabilistic prediction systems, in: Proceedings, ECMWF Workshop on Predictability, Shinfield Park, Reading, Berkshire, ECMWF, 1-25, 1999. Velázquez, J.A., Petit, T., Lavoie, A., Boucher M.-A., Turcotte R., Fortin V., and Anctil, F. : An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrol. Earth Syst. Sci., 13, 2221-2231, 2009. Wilks, D. S.: Statistical Methods in the Atmospheric Sciences, Academic Press, San Diego, CA, 465 pp., 1995.
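The CRPS used for verification above has a simple empirical form for an m-member ensemble. The sketch below (with invented flow values) follows the standard estimator CRPS = E|X − y| − ½·E|X − X′|:

```python
def crps_ensemble(members, obs):
    """Empirical Continuous Ranked Probability Score for one forecast:
    CRPS = mean|x_i - y| - (1 / 2m^2) * sum_ij |x_i - x_j|.
    Lower is better; for a one-member ensemble it reduces to the
    absolute error of a deterministic forecast."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return term1 - term2

# Deterministic case: CRPS equals the absolute error.
print(crps_ensemble([12.0], 10.0))            # 2.0
# A spread ensemble bracketing the observation scores better:
print(crps_ensemble([8.0, 10.0, 12.0], 10.0))
```

Averaging this score over all forecast days and lead times gives the CRPS values that are compared against the MAE of the deterministic forecasts.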

  20. Empirical comparison between different methods for genomic prediction of number of piglets born alive in moderate sized breeding populations.

    PubMed

    Fangmann, A; Sharifi, R A; Heinkel, J; Danowski, K; Schrade, H; Erbe, M; Simianer, H

    2017-04-01

    Currently used multi-step methods to incorporate genomic information in the prediction of breeding values (BV) implicitly involve many assumptions which, if violated, may result in loss of information, inaccuracies and bias. To overcome this, single-step genomic best linear unbiased prediction (ssGBLUP) was proposed combining pedigree, phenotype and genotype of all individuals for genetic evaluation. Our objective was to implement ssGBLUP for genomic predictions in pigs and to compare the accuracy of ssGBLUP with that of multi-step methods with empirical data of moderately sized pig breeding populations. Different predictions were performed: conventional parent average (PA), direct genomic value (DGV) calculated with genomic BLUP (GBLUP), a GEBV obtained by blending the DGV with PA, and ssGBLUP. Data comprised individuals from a German Landrace (LR) and Large White (LW) population. The trait 'number of piglets born alive' (NBA) was available for 182,054 litters of 41,090 LR sows and 15,750 litters from 4534 LW sows. The pedigree contained 174,021 animals, of which 147,461 (26,560) animals were LR (LW) animals. In total, 526 LR and 455 LW animals were genotyped with the Illumina PorcineSNP60 BeadChip. After quality control and imputation, 495 LR (424 LW) animals with 44,368 (43,678) SNP on 18 autosomes remained for the analysis. Predictive abilities, i.e., correlations between de-regressed proofs and genomic BV, were calculated with a five-fold cross validation and with a forward prediction for young genotyped validation animals born after 2011. Generally, predictive abilities for LR were rather small (0.08 for GBLUP, 0.19 for GEBV and 0.18 for ssGBLUP). For LW, ssGBLUP had the greatest predictive ability (0.45). For both breeds, assessment of reliabilities for young genotyped animals indicated that genomic prediction outperforms PA with ssGBLUP providing greater reliabilities (0.40 for LR and 0.32 for LW) than GEBV (0.35 for LR and 0.29 for LW). 
Grouping of animals according to information sources revealed that genomic prediction had the highest potential benefit for genotyped animals without their own phenotype. Although ssGBLUP did not generally outperform GBLUP or GEBV, the results suggest that ssGBLUP can be a useful and conceptually convincing approach for practical genomic prediction of NBA in moderately sized LR and LW populations.
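Predictive ability as defined above is a Pearson correlation between de-regressed proofs and genomic breeding values. A minimal sketch (all values invented):

```python
def predictive_ability(drp, gebv):
    """Pearson correlation between de-regressed proofs and genomic
    breeding values, the 'predictive ability' reported above."""
    n = len(drp)
    mx = sum(drp) / n
    my = sum(gebv) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(drp, gebv))
    vx = sum((x - mx) ** 2 for x in drp)
    vy = sum((y - my) ** 2 for y in gebv)
    return cov / (vx * vy) ** 0.5

drp = [0.4, -0.2, 1.1, 0.3, -0.6]   # de-regressed proofs for NBA
gebv = [0.2, 0.1, 0.9, 0.2, -0.4]   # genomic breeding values
print(round(predictive_ability(drp, gebv), 2))
```

In the study this correlation was computed within five-fold cross-validation and within a forward-prediction set of young genotyped animals.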

  1. Evaluation of modeling as a tool to determine the potential impacts related to drilling wastes in the Brazilian offshore.

    PubMed

    Pivel, María Alejandra Gómez; Dal Sasso Freitas, Carla Maria

    2010-08-01

    Numerical models that predict the fate of drilling discharges at sea constitute a valuable tool for both the oil industry and regulatory agencies. In order to provide reliable estimates, models must be validated through the comparison of predictions with field or laboratory observations. In this paper, we used the Offshore Operators Committee Model to simulate the discharges from two wells drilled at Campos Basin, offshore SE Brazil, and compared the results with field observations obtained 3 months after drilling. The comparison showed that the model provided reasonable predictions, considering that data about currents were reconstructed and theoretical data were used to characterize the classes of solids. The model proved to be a valuable tool to determine the degree of potential impact associated with drilling activities. However, since the accuracy of the model is directly dependent on the quality of input data, different possible scenarios should be considered when it is used for forecast modeling.

  2. Thermomechanical Modeling of Sintered Silver - A Fracture Mechanics-based Approach: Extended Abstract: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paret, Paul P; DeVoto, Douglas J; Narumanchi, Sreekant V

    Sintered silver has proven to be a promising candidate for use as a die-attach and substrate-attach material in automotive power electronics components. It holds promise of greater reliability than lead-based and lead-free solders, especially at higher temperatures (less than 200 degrees Celsius). Accurate predictive lifetime models of sintered silver need to be developed and its failure mechanisms thoroughly characterized before it can be deployed as a die-attach or substrate-attach material in wide-bandgap device-based packages. We present a finite element method (FEM) modeling methodology that can offer greater accuracy in predicting the failure of sintered silver under accelerated thermal cycling. A fracture mechanics-based approach is adopted in the FEM model, and J-integral/thermal cycle values are computed. In this paper, we outline the procedures for obtaining the J-integral/thermal cycle values in a computational model and report on the possible advantage of using these values as modeling parameters in a predictive lifetime model.

  3. Coupling model of aerobic waste degradation considering temperature, initial moisture content and air injection volume.

    PubMed

    Ma, Jun; Liu, Lei; Ge, Sai; Xue, Qiang; Li, Jiangshan; Wan, Yong; Hui, Xinminnan

    2018-03-01

    A quantitative description of aerobic waste degradation is important in evaluating landfill waste stability and economic management. This research aimed to develop a coupling model to predict the degree of aerobic waste degradation. On the basis of the first-order kinetic equation and the law of conservation of mass, we first developed a coupling model of aerobic waste degradation that considers temperature, initial moisture content and air injection volume to simulate and predict the chemical oxygen demand in the leachate. Three different laboratory experiments on aerobic waste degradation were simulated to test the model's applicability. Parameter sensitivity analyses were conducted to evaluate the reliability of the parameters. The coupling model can simulate aerobic waste degradation, and its simulations agreed with the corresponding experimental results. The comparison of experiment and simulation demonstrated that the coupling model is a new approach to predicting aerobic waste degradation and can serve as a basis for selecting an economical air injection volume and appropriate management in the future.
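The first-order kinetic core of such a model can be sketched in a few lines. The temperature-correction form and the parameter values below are illustrative assumptions, not the paper's fitted coupling model (which also incorporates moisture content and air injection volume):

```python
import math

def cod_remaining(c0, k_ref, t, temp=30.0, temp_ref=30.0, theta=1.07):
    """First-order decay of leachate COD with a simple Arrhenius-style
    temperature correction (theta and the correction form are assumed
    for illustration):
        C(t) = C0 * exp(-k * t),   k = k_ref * theta**(temp - temp_ref)
    """
    k = k_ref * theta ** (temp - temp_ref)
    return c0 * math.exp(-k * t)

# Invented numbers: 5000 mg/L initial COD, k = 0.05 / day at 30 C:
print(cod_remaining(5000.0, 0.05, t=10))  # ~3033 mg/L after 10 days
```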

  4. Evaluation and prediction of solar radiation for energy management based on neural networks

    NASA Astrophysics Data System (ADS)

    Aldoshina, O. V.; Van Tai, Dinh

    2017-08-01

    Currently, there is rapid growth of renewable energy sources and distributed power generation based on intelligent networks; therefore, meteorological forecasts are particularly useful for planning and managing the energy system in order to increase its overall efficiency and productivity. The application of artificial neural networks (ANNs) in the field of photovoltaic energy is presented in this article. Two recurrent dynamic ANNs, a time-delay neural network (CTDNN) and a nonlinear autoregressive network with exogenous inputs (NAEI), are implemented in this study and used to develop a model for estimating and forecasting daily solar radiation. The ANNs show good performance, as reliable and accurate models of daily solar radiation are obtained. This makes it possible to predict the photovoltaic output power of the installation. The potential of the proposed method for managing the energy of the electrical network is demonstrated by applying the NAEI network to the prediction of the electric load.

  5. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal the statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) with only statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of the reduced PCEs is verified by comparing them against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture the hydrologic behaviors of the Xiangxi River watershed and are efficient functional representations for propagating uncertainties in hydrologic predictions.
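One of the verification measures above, the Nash-Sutcliffe efficiency, has a compact definition that can be sketched directly (flow values invented):

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means the model is no better than predicting
    the observed mean; negative values are worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    svar = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / svar

observed = [1.0, 2.0, 3.0, 4.0]    # e.g. daily discharge
simulated = [1.1, 1.9, 3.2, 3.8]
print(nse(observed, simulated))    # 0.98
```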

  6. First Principle Predictions of Isotopic Shifts in H2O

    NASA Technical Reports Server (NTRS)

    Schwenke, David W.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    We compute isotope independent first and second order corrections to the Born-Oppenheimer approximation for water and use them to predict isotopic shifts. For the diagonal correction, we use icMRCI wavefunctions and derivatives with respect to mass dependent, internal coordinates to generate the mass independent correction functions. For the non-adiabatic correction, we use scaled SCF/CIS wave functions and a generalization of the Handy method to obtain mass independent correction functions. We find that including the non-adiabatic correction gives significantly improved results compared to just including the diagonal correction when the Born-Oppenheimer potential energy surface is optimized for H2O-16. The agreement with experimental results for deuterium and tritium containing isotopes is nearly as good as our best empirical correction, however, the present correction is expected to be more reliable for higher, uncharacterized levels.

  7. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for the practical usefulness of model results. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising means to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the interval between the 5%- and 95%-quantile predictions. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
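The reliability measure described above, the share of observations inside the 5%-to-95% predictive interval, reduces to a coverage count. The sketch below uses invented relative-damage values:

```python
def interval_coverage(lower, upper, observed):
    """Fraction of observations falling inside their per-case predictive
    intervals (e.g. the 5%- and 95%-quantile predictions). For a
    well-calibrated 90% interval this should be close to 0.9."""
    hits = sum(1 for lo, hi, y in zip(lower, upper, observed) if lo <= y <= hi)
    return hits / len(observed)

lower = [0.05, 0.10, 0.00, 0.20]   # 5%-quantile predictions
upper = [0.40, 0.60, 0.30, 0.70]   # 95%-quantile predictions
obs = [0.20, 0.65, 0.10, 0.50]     # observed relative building damage
print(interval_coverage(lower, upper, obs))  # 0.75: one observation outside
```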

  8. Measurement of body temperature in adult patients: comparative study of accuracy, reliability and validity of different devices.

    PubMed

    Rubia-Rubia, J; Arias, A; Sierra, A; Aguirre-Jaime, A

    2011-07-01

    We compared a range of alternative devices with core body temperature measured at the pulmonary artery to identify the most valid and reliable instrument for measuring temperature in routine conditions in health services. 201 patients from the intensive care unit of the Candelaria University Hospital, Canary Islands, admitted to hospital between April 2006 and July 2007. All patients (or their families) gave informed consent. Readings from gallium-in-glass, reactive strip and digital in axilla, infra-red ear and frontal thermometers were compared simultaneously with the pulmonary artery core temperature. External factors suspected of having an influence on the differences were explored. The cut-off point readings for each thermometer were fixed for the maximum negative predictive value in comparison with the core temperature. The validity, reliability, accuracy, external influence, the waste they generated, ease of use, speed, durability, security, comfort and cost of each thermometer were evaluated. An ad hoc overall valuation score was obtained from these parameters for each instrument. For an error of ± 0.2°C and concordance with respect to fever, the gallium-in-glass thermometer gave the best results. The largest area under the receiver operating characteristic (ROC) curve was obtained by the digital axillar thermometer with probe (0.988 ± 0.007). The minimum difference between readings, in comparison with the core temperature, was given by the infrared ear thermometer (-0.1 ± 0.3°C). Age, weight, level of consciousness, male sex, environmental temperature and vaso-constrictor medication increase the difference in the readings, and fever treatment reduces it, although this is not the same for all thermometers. The compact digital axillar thermometer and the digital thermometer with probe obtained the highest overall valuation score. 
If we only evaluate the aspects of validity, reliability, accuracy and external influence, the best thermometer would be the gallium-in-glass after 12 min. The gallium-in-glass thermometer is less accurate after only 5 min in comparison with the reading taken after being placed for 12 min. If we add the evaluation of waste production, ease-of-use, speed, durability, security, patient comfort and costs, the thermometers that obtain the highest score are the compact digital and digital with probe in right axilla. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Obtaining Reliable Estimates of Ambulatory Physical Activity in People with Parkinson's Disease.

    PubMed

    Paul, Serene S; Ellis, Terry D; Dibble, Leland E; Earhart, Gammon M; Ford, Matthew P; Foreman, K Bo; Cavanaugh, James T

    2016-05-05

    We determined the number of days required, and whether to include weekdays and/or weekends, to obtain reliable measures of ambulatory physical activity in people with Parkinson's disease (PD). Ninety-two persons with PD wore a step activity monitor for seven days. The number of days required to obtain a reliable estimate of daily activity was determined from the mean intraclass correlation (ICC2,1) for all possible combinations of 1-6 consecutive days of monitoring. Two days of monitoring were sufficient to obtain reliable daily activity estimates (ICC2,1 > 0.9). Amount (p = 0.03) but not intensity (p = 0.13) of ambulatory activity was greater on weekdays than weekends. Activity prescription based on amount rather than intensity may be more appropriate for people with PD.
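The ICC(2,1) used above can be computed from a subjects × days matrix of activity measures via two-way ANOVA mean squares (Shrout-Fleiss two-way random effects, single measures). The step-count data below are invented:

```python
def icc_2_1(data):
    """ICC(2,1) from a subjects x measurements matrix via two-way ANOVA
    mean squares: (MSR - MSE) / (MSR + (k-1)*MSE + k*(MSC - MSE)/n)."""
    n = len(data)       # subjects
    k = len(data[0])    # repeated measurements (e.g. monitoring days)
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # between days
    sst = sum((data[i][j] - grand) ** 2 for i in range(n) for j in range(k))
    sse = sst - ssr - ssc                                # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Daily step counts for 4 subjects over 2 days: large between-subject
# differences and small day-to-day differences give a high ICC.
steps = [[4000, 4200], [8000, 7800], [6000, 6100], [10000, 9900]]
print(round(icc_2_1(steps), 3))
```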

  10. Evaluation of white blood cell count, neutrophil percentage, and elevated temperature as predictors of bloodstream infection in burn patients.

    PubMed

    Murray, Clinton K; Hoffmaster, Roselle M; Schmit, David R; Hospenthal, Duane R; Ward, John A; Cancio, Leopoldo C; Wolf, Steven E

    2007-07-01

    To investigate whether specific values of or changes in temperature, white blood cell count, or neutrophil percentage were predictive of bloodstream infection in burn patients. Retrospective review of electronic records. Intensive care center at the US Army Institute of Surgical Research Burn Center. Burn patients with blood cultures obtained from 2001 to 2004. Temperature recorded at the time blood cultures were obtained; highest temperature in each 6-hour interval during the 24 hours prior to this; white blood cell count and neutrophil percentage at the time of obtaining the blood culture and during the 24 hours preceding the blood culture; demographic data; and total body surface area burned. A total of 1063 blood cultures were obtained from 223 patients. Seventy-three people had 140 blood cultures from which microorganisms were recovered. Organisms that were recovered from blood cultures included 80 that were gram negative, 54 that were gram positive, 3 that were mixed gram positive/gram negative, and 3 yeasts. Although white blood cell count and neutrophil percentage at the time of the culture were statistically different between patients with and patients without bloodstream infection, receiver operating characteristic curve analysis revealed these values to be poor discriminators (receiver operating characteristic curve area = 0.624). Temperature or alterations in temperature in the preceding 24-hour period did not predict presence, absence, or type of bloodstream infection. Temperature, white blood cell count, neutrophil percentage, or changes in these values were not clinically reliable in predicting bloodstream infection. Further work is needed to identify alternative clinical parameters, which should prompt blood culture evaluations in this population.
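For a single marker, the ROC analysis above reduces to the probability that a randomly chosen infected patient's value exceeds a randomly chosen uninfected patient's value (the Mann-Whitney interpretation of the AUC). The values below are invented:

```python
def roc_auc(pos, neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive case scores above a random
    negative case, with ties counted as one-half."""
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# WBC counts (10^3/uL) at culture time: an AUC near 0.5 means the marker
# barely discriminates bacteremic from non-bacteremic patients.
infected = [12.0, 15.0, 9.0, 11.0]
clean = [10.0, 14.0, 8.0, 13.0]
print(roc_auc(infected, clean))  # 0.5625
```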

  11. Simulating Physiological Response with a Passive Sensor Manikin and an Adaptive Thermal Manikin to Predict Thermal Sensation and Comfort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rugh, John P; Chaney, Larry; Hepokoski, Mark

    2015-04-14

    Reliable assessment of occupant thermal comfort can be difficult to obtain within automotive environments, especially under transient and asymmetric heating and cooling scenarios. Evaluation of HVAC system performance in terms of comfort commonly requires human subject testing, which may involve multiple repetitions, as well as multiple test subjects. Instrumentation (typically composed of an array of temperature sensors) is usually only sparsely applied across the human body, significantly reducing the spatial resolution of available test data. Further, since comfort is highly subjective in nature, a single test protocol can yield a wide variation in results, which can only be overcome by increasing the number of test replications and subjects. In light of these difficulties, various types of manikins are finding use in automotive testing scenarios. These manikins can act as human surrogates from which local skin and core temperatures can be obtained, which are necessary for accurately predicting local and whole body thermal sensation and comfort using a physiology-based comfort model (e.g., the Berkeley Comfort Model). This paper evaluates two different types of manikins: i) an adaptive sweating thermal manikin, which is coupled with a human thermoregulation model running in real time to obtain realistic skin temperatures; and ii) a passive sensor manikin, which is used to measure boundary conditions as they would act on a human, from which skin and core temperatures can be predicted using a thermophysiological model. The simulated physiological responses and comfort obtained from both of these manikin-model coupling schemes are compared to those of a human subject during a transient heat-up scenario in a vehicle cabin compartment.

  12. Validated Questionnaire of Maternal Attitude and Knowledge for Predicting Caries Risk in Children: Epidemiological Study in North Jakarta, Indonesia.

    PubMed

    Laksmiastuti, Sri Ratna; Budiardjo, Sarworini Bagio; Sutadi, Heriandi

    2017-06-01

    Predicting caries risk in children can be done by identifying caries risk factors, an important measure that contributes to a better understanding of the patient's cariogenic profile. Identification can be done by clinical examination and by questionnaire. We conducted this study to verify the validity of a questionnaire for predicting caries risk in children. The study was conducted on 62 pairs of mothers and their children, aged between 3 and 5 years. The questionnaire consists of 10 questions concerning mothers' attitudes toward and knowledge about oral health. Reliability and validity were tested using Cronbach's alpha and corrected item-total correlation values. All questions are reliable (Cronbach's alpha = 0.873) and valid (corrected item-total correlation >0.4). Five questions on mothers' attitude about oral health and five questions on mothers' knowledge about oral health are reliable and valid for predicting caries risk in children.
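
    Cronbach's alpha, the reliability statistic reported above (0.873), compares the sum of per-question variances with the variance of respondents' total scores. A minimal sketch with hypothetical questionnaire responses (1-5 scale), not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is one list of scores per question,
    aligned across respondents. Uses sample variance (ddof=1)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical responses from five mothers to three questions
q1 = [4, 5, 3, 4, 2]
q2 = [4, 4, 3, 5, 2]
q3 = [5, 5, 2, 4, 3]
print(round(cronbach_alpha([q1, q2, q3]), 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is why the 0.873 above supports the questionnaire's reliability.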

  13. A Step Made Toward Designing Microelectromechanical System (MEMS) Structures With High Reliability

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2003-01-01

    The mechanical design of microelectromechanical systems, particularly for micropower generation applications, requires the ability to predict the strength capacity of load-carrying components over the service life of the device. These microdevices, which typically are made of brittle materials such as polysilicon, show wide scatter (stochastic behavior) in strength as well as a different average strength for different sized structures (size effect). These behaviors necessitate either costly and time-consuming trial-and-error designs or, more efficiently, the development of a probabilistic design methodology for MEMS. Over the years, the NASA Glenn Research Center's Life Prediction Branch has developed the CARES/Life probabilistic design methodology to predict the reliability of advanced ceramic components. In this study, done in collaboration with Johns Hopkins University, the ability of the CARES/Life code to predict the reliability of polysilicon microsized structures with stress concentrations is successfully demonstrated.

  14. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    PubMed

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
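
    The AMSAA (Crow) model underlying such methods treats failures during development testing as a power-law Poisson process with intensity lambda*beta*t^(beta-1). A sketch of the standard maximum-likelihood estimators for a time-truncated test; the failure times are hypothetical, and the paper's test-data conversion step is not reproduced here:

```python
import math

def crow_amsaa(failure_times, T):
    """MLE for the Crow/AMSAA (power-law NHPP) model on a test
    truncated at time T. Returns (beta, lambda, instantaneous MTBF)."""
    n = len(failure_times)
    beta = n / sum(math.log(T / t) for t in failure_times)
    lam = n / T ** beta
    # Achieved (instantaneous) MTBF at the end of the test
    inst_mtbf = 1.0 / (lam * beta * T ** (beta - 1))
    return beta, lam, inst_mtbf

# Hypothetical failure times (hours), test truncated at 500 h
times = [40, 110, 200, 320, 460]
beta, lam, mtbf = crow_amsaa(times, 500.0)
print(beta, lam, mtbf)  # beta < 1 indicates reliability growth (decreasing intensity)
```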

  15. Application of a new laser Doppler imaging system in planning and monitoring of surgical flaps

    NASA Astrophysics Data System (ADS)

    Schlosser, Stefan; Wirth, Raphael; Plock, Jan A.; Serov, Alexandre; Banic, Andrej; Erni, Dominique

    2010-05-01

    There is a demand for technologies able to assess the perfusion of surgical flaps quantitatively and reliably to avoid ischemic complications. The aim of this study is to test a new high-speed, high-definition laser Doppler imaging (LDI) system (FluxEXPLORER, Microvascular Imaging, Lausanne, Switzerland) in terms of preoperative mapping of the vascular supply (perforator vessels) and postoperative flow monitoring. The FluxEXPLORER performs perfusion mapping of a 9×9 cm area with a resolution of 256×256 pixels within 6 s in high-definition imaging mode. The sensitivity and predictability of perforator localization are expressed by the coincidence of preoperatively assessed LDI high-flow spots with intraoperatively verified perforators in nine patients. Eighteen free flaps are monitored before, during, and after total ischemia. Of all verified perforators, 63% correspond to a high-flow spot, and 38% of all high-flow spots correspond to a verified perforator (positive predictive value). All perfused flaps reveal values above 221 perfusion units (PUs), and all values obtained in the ischemic flaps are beneath 187 PU. In summary, we conclude that the present LDI system can serve as a reliable, fast, and easy-to-handle tool to detect ischemia in free flaps, whereas perforator vessels cannot be detected appropriately.

  16. Predicting performance of axial pump inducer of LOX booster turbo-pump of staged combustion cycle based rocket engine using CFD

    NASA Astrophysics Data System (ADS)

    Mishra, Arpit; Ghosh, Parthasarathi

    2015-12-01

    For low-cost, high-thrust space missions with high specific impulse and high reliability, inert weight must be minimized, thereby increasing the delivered payload. The turbopump feed system of a liquid propellant rocket engine (LPRE) has the highest power-to-weight ratio. Turbopumps are primarily equipped with an axial flow inducer to achieve high angular velocity and low suction pressure in combination with increased system reliability. Performance of the turbopump strongly depends on the performance of the inducer. Designing an LPRE turbopump therefore demands optimization of the inducer geometry based on its performance in different off-design operating regimes. In this paper, a steady-state CFD analysis of the inducer of a liquid oxygen (LOX) axial pump used as a booster pump for an oxygen-rich staged combustion cycle rocket engine is presented using ANSYS® CFX. Attempts have been made to obtain the performance characteristic curves for the LOX pump inducer. The formalism has been used to predict the performance of the inducer for a throttling range varying from 80% to 113% of nominal thrust and for rotational velocities from 4500 to 7500 rpm. The results have been analysed to determine the region of cavitation inception for different inlet pressures.

  17. Ensemble variant interpretation methods to predict enzyme activity and assign pathogenicity in the CAGI4 NAGLU (Human N-acetyl-glucosaminidase) and UBE2I (Human SUMO-ligase) challenges.

    PubMed

    Yin, Yizhou; Kundu, Kunal; Pal, Lipika R; Moult, John

    2017-09-01

    CAGI (Critical Assessment of Genome Interpretation) conducts community experiments to determine the state of the art in relating genotype to phenotype. Here, we report results obtained using newly developed ensemble methods to address two CAGI4 challenges: enzyme activity for population missense variants found in NAGLU (Human N-acetyl-glucosaminidase) and random missense mutations in Human UBE2I (Human SUMO E2 ligase), assayed in a high-throughput competitive yeast complementation procedure. The ensemble methods are effective, ranked second for SUMO-ligase and third for NAGLU according to the CAGI independent assessors. However, in common with other methods used in CAGI, there are large discrepancies between predicted and experimental activities for a subset of variants; analysis of the structural context provides some insight into these. Post-challenge analysis shows that the ensemble methods are also effective at assigning pathogenicity for the NAGLU variants. In the clinic, providing an estimate of the reliability of pathogenicity assignments is key. We have also used the NAGLU dataset to show that ensemble methods have considerable potential for this task and are already reliable enough for use with a subset of mutations.

  18. Computation of heats of transport in crystalline solids: II

    NASA Astrophysics Data System (ADS)

    Grout, P. J.; Lidiard, A. B.

    2008-10-01

    This paper explores the application of classical molecular dynamics to the computation of the heat of transport of Au atoms in a model of solid gold at several elevated temperatures above the Debye temperature. It is assumed that the solid shows vacancy disorder. The work shows that to obtain consistent and reliable results it is necessary (a) to use very small time steps (≈1 fs) in the molecular dynamics integration routine and (b) to take averages over a very large number of vacancy displacements, a number which varies with temperature but which is of the order of 10^5. The results for the reduced heat of transport for the Au atoms show that: (1) it is positive in sign, i.e. the diffusion of Au atoms in a temperature gradient is biased towards the cold region, or equivalently, the vacancies tend to migrate towards the hotter region; (2) it is predicted to fall as the average temperature increases, with a variation that is closely linear in (1/T); (3) its value at high T relative to the energy of activation for vacancy movement is close to the corresponding ratio of experimental quantities. Analysis of these results indicates that the method and model may allow reliable predictions for other metals having the face-centred cubic structure.

  19. Geoid undulations and gravity anomalies over the Aral Sea, the Black Sea and the Caspian Sea from a combined GEOS-3/SEASAT/GEOSAT altimeter data set

    NASA Technical Reports Server (NTRS)

    Au, Andrew Y.; Brown, Richard D.; Welker, Jean E.

    1991-01-01

    Satellite-based altimetric data taken by GEOS-3, SEASAT, and GEOSAT over the Aral Sea, the Black Sea, and the Caspian Sea are analyzed and a least squares collocation technique is used to predict the geoid undulations on a 0.25x0.25 deg. grid and to transform these geoid undulations to free air gravity anomalies. Rapp's 180x180 geopotential model is used as the reference surface for the collocation procedure. The result of the geoid-to-gravity transformation is, however, sensitive to the information content of the reference geopotential model used. For example, considerable detailed surface gravity data were incorporated into the reference model over the Black Sea, resulting in a reference model with significant information content at short wavelengths. Thus, estimation of short wavelength gravity anomalies from gridded geoid heights is generally reliable over regions such as the Black Sea, using the conventional collocation technique with local empirical covariance functions. Over regions such as the Caspian Sea, where detailed surface data are generally not incorporated into the reference model, unconventional techniques are needed to obtain reliable gravity anomalies. Based on the predicted gravity anomalies over these inland seas, speculative tectonic structures are identified and geophysical processes are inferred.

  20. Predictors of competitive achievement among pubescent synchronized swimmers: an analysis of the solo-figure competition.

    PubMed

    Peric, M; Cavar, M; Zenic, N; Sekulic, D; Sajber, D

    2014-02-01

    This study examined the applicability of sport-specific fitness tests (SSTs), anthropometrics, and respiratory parameters in predicting competitive results among pubescent synchronized swimmers. A total of 25 synchronized swimmers (16-17 years; 166.2 ± 5.4 cm; 58.4 ± 4.3 kg) volunteered for this study. The independent variables were body mass, body height, body mass index (BMI), body fat percentage (BF%), lean body mass percentage, respiratory variables, and four SSTs (two specific power tests plus one aerobic- and one anaerobic-endurance test). The dependent variable was competitive achievement in the solo figure competition. Reliability analyses, Pearson's correlation coefficients, and forward stepwise regression were calculated. The SSTs were reliable for testing fitness status among pubescent synchronized swimmers. The forward stepwise regression retained two SSTs, BF%, and forced vital capacity (FVC, relative to age and stature) in the set of predictors of competitive achievement. Significant beta coefficients were found for the aerobic-endurance SST and FVC. The sport-specific measure of aerobic endurance and FVC appropriately predicted competitive achievement with regard to the figures used in the competition from which competitive results (the dependent variable) were obtained. Athletes and coaches should be aware of the probable negative influence of very low body fat levels on competitive achievement.

  1. PREDICTING CLINICALLY DIAGNOSED DYSENTERY INCIDENCE OBTAINED FROM MONTHLY CASE REPORTING BASED ON METEOROLOGICAL VARIABLES IN DALIAN, LIAONING PROVINCE, CHINA, 2005-2011 USING A DEVELOPED MODEL.

    PubMed

    An, Qingyu; Yao, Wei; Wu, Jun

    2015-03-01

    This study describes our development of a model to predict the incidence of clinically diagnosed dysentery in Dalian, Liaoning Province, China, using time series analysis. The model was developed using the seasonal autoregressive integrated moving average (SARIMA). Spearman correlation analysis was conducted to explore the relationship between meteorological variables and the incidence of clinically diagnosed dysentery. The meteorological variables which significantly correlated with the incidence of clinically diagnosed dysentery were then used as covariables in the model, which incorporated the monthly incidence of clinically diagnosed dysentery from 2005 to 2010 in Dalian. After model development, a simulation was conducted for the year 2011 and the results of this prediction were compared with the real observed values. The model performed best when the temperature data for the preceding month was used to predict clinically diagnosed dysentery during the following month. The developed model was effective and reliable in predicting the incidence of clinically diagnosed dysentery for most but not all months, and may be a useful tool for dysentery disease control and prevention, but further studies are needed to fine tune the model.
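
    The Spearman correlation used above to screen meteorological covariables is simply the Pearson correlation computed on ranks (with average ranks for ties). A minimal pure-Python sketch; the monthly temperature and case figures are hypothetical, not the Dalian data:

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank across a tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical monthly mean temperature (deg C) vs dysentery case counts
temp = [2, 8, 15, 22, 26, 24]
cases = [30, 35, 60, 95, 120, 110]
print(spearman(temp, cases))
```

Covariables with a significant rank correlation of this kind would then be candidates for inclusion in the SARIMA model.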

  2. Applicability of a panel method, which includes nonlinear effects, to a forward-swept-wing aircraft

    NASA Technical Reports Server (NTRS)

    Ross, J. C.

    1984-01-01

    The ability of a lower-order panel method, VSAERO, to accurately predict the lift and pitching moment of a complete forward-swept-wing/canard configuration was investigated. The program can simulate nonlinear effects, including boundary-layer displacement thickness, wake roll-up, and, to a limited extent, separated wakes. The predictions were compared with experimental data obtained using a small-scale model in the 7- by 10-Foot Wind Tunnel at NASA Ames Research Center. For the particular configuration under investigation, wake roll-up had only a small effect on the force and moment predictions. The effect of the displacement-thickness modeling was to reduce the lift-curve slope slightly, thus bringing the predicted lift into good agreement with the measured value. Pitching moment predictions were also improved by the boundary-layer simulation. The separation modeling was found to be sensitive to user inputs, but appears to give a reasonable representation of a separated wake. In general, the nonlinear capabilities of the code were found to improve the agreement with experimental data. The usefulness of the code would be enhanced by improving the reliability of the separated-wake modeling and by the addition of a leading-edge separation model.

  3. Predicting bioconcentration of chemicals into vegetation from soil or air using the molecular connectivity index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowdy, D.L.; McKone, T.E.; Hsieh, D.P.H.

    1995-12-31

    Bioconcentration factors (BCFs) are the ratio of the chemical concentration found in an exposed organism (in this case a plant) to the concentration in an air or soil exposure medium. The authors examine here the use of molecular connectivity indices (MCIs) as quantitative structure-activity relationships (QSARs) for predicting BCFs for organic chemicals between plants and air or soil. The authors compare the reliability of the octanol-air partition coefficient (K_oa) to the MCI-based prediction method for predicting plant/air partition coefficients. The authors also compare the reliability of the octanol/water partition coefficient (K_ow) to the MCI-based prediction method for predicting plant/soil partition coefficients. The results indicate that, relative to the use of K_ow or K_oa as predictors of BCFs, the MCI can substantially increase the reliability with which BCFs can be estimated. The authors find that the MCI provides a relatively precise and accurate method for predicting the potential biotransfer of a chemical from environmental media into plants. In addition, the MCI method is much faster and more cost effective than direct measurements.

  4. Accuracy and reliability of observational gait analysis data: judgments of push-off in gait after stroke.

    PubMed

    McGinley, Jennifer L; Goldie, Patricia A; Greenwood, Kenneth M; Olney, Sandra J

    2003-02-01

    Physical therapists routinely observe gait in clinical practice. The purpose of this study was to determine the accuracy and reliability of observational assessments of push-off in gait after stroke. Eighteen physical therapists and 11 subjects with hemiplegia following a stroke participated in the study. Measurements of ankle power generation were obtained from subjects following stroke using a gait analysis system. Concurrent videotaped gait performances were observed by the physical therapists on 2 occasions. Ankle power generation at push-off was scored as either normal or abnormal using two 11-point rating scales. These observational ratings were correlated with the measurements of peak ankle power generation. A high correlation was obtained between the observational ratings and the measurements of ankle power generation (mean Pearson r=.84). Interobserver reliability was moderately high (mean intraclass correlation coefficient [ICC (2,1)]=.76). Intraobserver reliability also was high, with a mean ICC (2,1) of .89 obtained. Physical therapists were able to make accurate and reliable judgments of push-off in videotaped gait of subjects following stroke using observational assessment. Further research is indicated to explore the accuracy and reliability of data obtained with observational gait analysis as it occurs in clinical practice.
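
    The ICC (2,1) reported above comes from a two-way random-effects ANOVA decomposition of the ratings (single-rater form, in the Shrout and Fleiss convention). A sketch with hypothetical observer ratings, not the study's data:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings holds one inner list of k observer scores per subject."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    rater_means = [sum(col) / n for col in zip(*ratings)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in subj_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in rater_means)  # raters
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical push-off ratings (0-10) by three observers for four gait videos
scores = [[7, 8, 7], [3, 4, 4], [9, 9, 8], [5, 6, 5]]
print(round(icc_2_1(scores), 2))
```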

  5. Is It Reliable to Take the Molecular Docking Top Scoring Position as the Best Solution without Considering Available Structural Data?

    PubMed

    Ramírez, David; Caballero, Julio

    2018-04-28

    Molecular docking is the most frequently used computational method for studying the interactions between organic molecules and biological macromolecules. In this context, docking allows predicting the preferred pose of a ligand inside a receptor binding site. However, the selection of the “best” solution is not a trivial task, despite the widely accepted selection criterion that the best pose corresponds to the best energy score. Here, several rigid-target docking methods were evaluated on the same dataset with respect to their ability to reproduce crystallographic binding orientations, to test whether the best energy score is a reliable criterion for selecting the best solution. Two experiments were performed: (A) reconstructing the ligand-receptor complex by docking the ligand into its own crystal-structure receptor (self-docking), and (B) reconstructing the ligand-receptor complex by docking the ligand into a crystal-structure receptor that contains another ligand (cross-docking). Root-mean-square deviation (RMSD) was used to evaluate how different the obtained docking orientation is from the corresponding co-crystallized pose of the same ligand molecule. We found that the docking scoring function is capable of predicting crystallographic binding orientations, but the best-ranked solution according to the docking energy is not always the pose that reproduces the experimental binding orientation. This occurred even in self-docking, but it was critical in cross-docking. Taking into account that docking is typically used with predictive purposes, our results indicate that, in cross-docking experiments, the best energy score is not a reliable criterion for selecting the best solution in common docking applications.
It is strongly recommended to choose the best docking solution according to the scoring function together with additional structural criteria described for analogous ligands, to ensure the selection of a correct docking solution.
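
    The RMSD criterion used above is the root-mean-square distance between matched atoms of the docked and co-crystallized poses. A minimal sketch with hypothetical atom coordinates (real evaluations match atoms by identity and often account for symmetry, which is omitted here):

```python
import math

def rmsd(pose_a, pose_b):
    """RMSD between two poses given matched (x, y, z) atom coordinates."""
    assert len(pose_a) == len(pose_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(pose_a, pose_b))
    return math.sqrt(sq / len(pose_a))

# Hypothetical coordinates (angstroms) for a 3-atom fragment
crystal = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]
docked = [(0.3, 0.1, 0.0), (1.7, -0.2, 0.1), (1.4, 1.8, -0.2)]
print(round(rmsd(crystal, docked), 2))  # values below ~2.0 A are commonly counted as a reproduced pose
```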

  6. Validity and reliability of the 6 minute walk in persons with fibromyalgia.

    PubMed

    King, S; Wessel, J; Bhambhani, Y; Maikala, R; Sholter, D; Maksymowych, W

    1999-10-01

    To assess the reliability and construct validity of the 6 minute walk (6MW) in persons with fibromyalgia (FM) and to determine an equation for predicting peak oxygen consumption (pVO2) from the distance covered in 6 minutes. Ninety-six women who met the American College of Rheumatology (ACR) criteria for FM were tested on the 6MW and the Fibromyalgia Impact Questionnaire (FIQ). A subset (n = 23) were tested on a separate day for pVO2 during a symptom-limited, incremental treadmill test. Twelve subjects repeated the 6MW five times over 10 days. Heart rate and rating of perceived exertion (RPE) were recorded for each walk. Intraclass correlations were used to determine the reliability of the 6MW. Validity was examined by correlating the 6MW with pVO2 and the FIQ. Body mass index (BMI) and 6MW were independent variables in a stepwise regression to predict pVO2. A significant increase in distance occurred from Walk 1 to Walk 2 (p < 0.001), with the distance maintained on the remaining walks (p = 0.148). The correlations of the 6MW with the FIQ and pVO2 were -0.325 and 0.657, respectively. The regression equation to predict pVO2 from 6MW distance and BMI was: pVO2 (ml/kg/min) = 21.48 + (-0.4316 x BMI) + [0.0304 x distance(m)] (R = 0.76, R² = 0.66). When using the 6MW it is necessary to conduct a practice walk, with the second walk taken as the baseline measure. It was determined from the correlations that the 6MW cannot replace the FIQ as a measure of function. The 6MW may be used as an indicator of aerobic fitness, although obtaining VO2 by means of a graded exercise test is preferable.
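
    The published regression equation can be applied directly; the example inputs below are illustrative, not study data:

```python
def predict_pvo2(bmi, distance_m):
    """Peak VO2 (ml/kg/min) from the study's regression equation:
    pVO2 = 21.48 + (-0.4316 x BMI) + (0.0304 x distance in metres)."""
    return 21.48 + (-0.4316 * bmi) + (0.0304 * distance_m)

# e.g. a subject with BMI 25 who covers 500 m in 6 minutes
print(round(predict_pvo2(25, 500), 2))  # -> 25.89
```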

  7. Response surface models for effects of temperature and previous growth sodium chloride on growth kinetics of Salmonella typhimurium on cooked chicken breast.

    PubMed

    Oscar, T P

    1999-12-01

    Response surface models were developed and validated for effects of temperature (10 to 40 degrees C) and previous growth NaCl (0.5 to 4.5%) on lag time (lambda) and specific growth rate (mu) of Salmonella Typhimurium on cooked chicken breast. Growth curves for model development (n = 55) and model validation (n = 16) were fit to a two-phase linear growth model to obtain lambda and mu of Salmonella Typhimurium on cooked chicken breast. Response surface models for natural logarithm transformations of lambda and mu as a function of temperature and previous growth NaCl were obtained by regression analysis. Both lambda and mu of Salmonella Typhimurium were affected (P < 0.0001) by temperature but not by previous growth NaCl. Models were validated against data not used in their development. Mean absolute relative error of predictions (model accuracy) was 26.6% for lambda and 15.4% for mu. Median relative error of predictions (model bias) was 0.9% for lambda and 5.2% for mu. Results indicated that the models developed provided reliable predictions of lambda and mu of Salmonella Typhimurium on cooked chicken breast within the matrix of conditions modeled. In addition, results indicated that previous growth NaCl (0.5 to 4.5%) was not a major factor affecting subsequent growth kinetics of Salmonella Typhimurium on cooked chicken breast. Thus, inclusion of previous growth NaCl in predictive models may not significantly improve our ability to predict growth of Salmonella spp. on food subjected to temperature abuse.
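
    The two-phase linear growth model used above holds the log count flat during the lag phase and then grows linearly at the specific rate mu. A sketch with hypothetical parameter values (not fitted values from the study):

```python
def two_phase_log_count(t, log_n0, lag, mu):
    """Two-phase linear growth model: log count stays at log_n0 until the
    lag time lambda elapses, then increases linearly at rate mu."""
    return log_n0 if t < lag else log_n0 + mu * (t - lag)

# Hypothetical parameters: initial level 3.0 log10 CFU, lag = 2 h, mu = 0.5 log10/h
for t in [0, 1, 2, 3, 4]:
    print(t, two_phase_log_count(t, 3.0, 2.0, 0.5))
```

Fitting this model to a growth curve yields the lambda and mu values that the response surface models then relate to temperature and NaCl.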

  8. Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.

    NASA Astrophysics Data System (ADS)

    Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan

    2017-09-01

    Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated to these interventions in the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs with specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
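
    One common way to fold RAMS parameters into an LCC analysis is to price the expected number of failures per year (from reliability data) into the discounted cost stream. A deliberately simplified sketch assuming a constant failure rate and hypothetical figures; INFRALERT's actual models are more elaborate:

```python
def lcc(acquisition, failure_rate, repair_cost, annual_opex, years, discount):
    """Net-present-value life-cycle cost: acquisition cost plus the
    discounted expected annual maintenance and operation costs, where
    failure_rate is the expected failures per year from RAMS data."""
    total = acquisition
    for t in range(1, years + 1):
        annual = failure_rate * repair_cost + annual_opex
        total += annual / (1 + discount) ** t
    return total

# Hypothetical figures for one asset: 0.8 failures/yr, 12 kEUR per repair
print(round(lcc(acquisition=500.0, failure_rate=0.8, repair_cost=12.0,
                annual_opex=5.0, years=30, discount=0.04), 1))
```

Sensitivity analysis then amounts to re-running this with perturbed failure rates or costs to find the dominant cost drivers.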

  9. Radio-wave propagation for space communications systems

    NASA Technical Reports Server (NTRS)

    Ippolito, L. J.

    1981-01-01

    The most recent information on the effects of Earth's atmosphere on space communications systems is reviewed. The design and reliable operation of satellite systems that provide the many applications in space which rely on the transmission of radio waves for communications and scientific purposes are dependent on the propagation characteristics of the transmission path. The presence of atmospheric gases, clouds, fog, precipitation, and turbulence causes uncontrolled variations in the signal characteristics. These variations can result in a reduction of the quality and reliability of the transmitted information. Models and other techniques are used in the prediction of atmospheric effects as influenced by frequency, geography, elevation angle, and type of transmission. Recent data on performance characteristics obtained from direct measurements on satellite links operating to above 30 GHz have been reviewed. Particular emphasis has been placed on the effects of precipitation on the Earth/space path, including rain attenuation, and ice particle depolarization. Other factors are sky noise, antenna gain degradation, scintillations, and bandwidth coherence. Each of the various propagation factors has an effect on design criteria for communications systems. These criteria include link reliability, power margins, noise contribution, modulation and polarization factors, channel cross talk, error rate, and bandwidth limitations.

  10. Reliability of astronomical records in the Nihongi

    NASA Astrophysics Data System (ADS)

    Kawabata, Kin-Aki; Tanikawa, Kiyotaka; Soma, Mitsuru

    2002-03-01

    Records of solar and lunar eclipses and occultations of stars in the Nihongi have been investigated to show their usefulness in answering questions about the long-term variability of the Earth's rate of rotation. Results show that the reliability of these records depends on the volume of the Nihongi, and that records in the β group of volumes in the classification by H. Mori (based on Chinese characters employed as phonetic letters), i.e. Vol. 22 (Empress Suiko), Vol. 23 (Emperor Jomei), and Vol. 29 (Emperor Tenmu), are highly reliable for these studies. Studies of solar eclipses recorded as total eclipses in the Nihongi and the Suishu and an occultation of Mars recorded in the Nihongi show that good agreement can be obtained between descriptions in these Japanese and Chinese historical books and calculations when we adopt TT-UT = 3000 sec with a correction for the tidal term of -2.0"/cy² in the 7th century. Descriptions of solar and lunar eclipses recorded in Vol. 24 (Empress Kogyoku) and Vol. 30 (Empress Jito) are not based on observations but on theoretical predictions. All records of comets, aurorae, volcanic explosions, earthquakes, and tsunami in the Nihongi are described in β group volumes.

  11. Evaluating and improving count-based population inference: A case study from 31 years of monitoring Sandhill Cranes

    USGS Publications Warehouse

    Gerber, Brian D.; Kendall, William L.

    2017-01-01

    Monitoring animal populations can be difficult. Limited resources often force monitoring programs to rely on unadjusted or smoothed counts as an index of abundance. Smoothing counts is commonly done using a moving-average estimator to dampen sampling variation. These indices are commonly used to inform management decisions, although their reliability is often unknown. We outline a process to evaluate the biological plausibility of annual changes in population counts and indices from a typical monitoring scenario and compare results with a hierarchical Bayesian time series (HBTS) model. We evaluated spring and fall counts, fall indices, and model-based predictions for the Rocky Mountain population (RMP) of Sandhill Cranes (Antigone canadensis) by integrating juvenile recruitment, harvest, and survival into a stochastic stage-based population model. We used simulation to evaluate population indices from the HBTS model and the commonly used 3-yr moving average estimator. We found counts of the RMP to exhibit biologically unrealistic annual change, while the fall population index was largely biologically realistic. HBTS model predictions suggested that the RMP changed little over 31 yr of monitoring, but the pattern depended on assumptions about the observational process. The HBTS model fall population predictions were biologically plausible if observed crane harvest mortality was compensatory up to natural mortality, as empirical evidence suggests. Simulations indicated that the predicted mean of the HBTS model was generally a more reliable estimate of the true population than population indices derived using a moving 3-yr average estimator. Practitioners could gain considerable advantages from modeling population counts using a hierarchical Bayesian autoregressive approach. 
Advantages would include: (1) obtaining measures of uncertainty; (2) incorporating direct knowledge of the observational and population processes; (3) accommodating missing years of data; and (4) forecasting population size.
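
    As a toy illustration of the 3-yr moving-average index that the HBTS model is compared against, a centered moving average can be sketched as follows; the counts and the edge handling are hypothetical, not the RMP monitoring data:

```python
import numpy as np

def moving_average_index(counts, window=3):
    """Centered moving-average index of annual counts.

    Edge years keep the raw count, since a full window is unavailable
    there; real monitoring programs may treat edges and missing years
    differently."""
    counts = np.asarray(counts, dtype=float)
    smoothed = counts.copy()
    half = window // 2
    for i in range(half, len(counts) - half):
        smoothed[i] = counts[i - half:i + half + 1].mean()
    return smoothed

raw = [100, 104, 180, 101, 99]   # one year with a biologically implausible spike
idx = moving_average_index(raw)  # the spike is damped in the smoothed index
```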

  12. HIGH PREVALENCE OF VOLUNTARY STERILIZATION AMONG AMERICAN WOMEN EXPLAINED BY TRADE-OFFS RESULTING FROM MALE PARENTAL COMMITMENT.

    PubMed

    Anderson, Kermyt G

    2018-07-01

    Tubal ligation is the modal form of family planning among American women aged 30 and older. As the preference for tubal ligation over cheaper, lower-risk and more reliable methods, such as vasectomy, has puzzled experts, a theoretical approach that explains this preference would be useful. The present study investigates the high prevalence of voluntary sterilization among American women from the perspective of life history theory, arguing that the trade-offs between investing in current and future offspring will favour tubal ligation when women cannot obtain reliable male commitment to future parental investment. Data came from the National Survey of Fertility Barriers (NSFB), a nationally representative survey of 4712 American women aged 25-45 conducted between 2004 and 2007. Four novel predictions of the prevalence of tubal ligation, drawn from life history theory, were developed and tested: 1) it is most common among unpartnered women with children, and least common among married women with children; 2) it is negatively correlated with age at first birth; 3) it is least common among highly educated women without children, and most common among less educated women with children; and 4) among women with two or more children, it is positively correlated with lifetime number of long-term partners. These predictions were tested using multivariate regression analysis. The first prediction was not supported: women with children were more likely to be sterilized, regardless of their marital status. The other three predictions were all supported by the data. The results suggest that trade-offs influence women's decisions to undergo voluntary sterilization. Women are most likely to opt for tubal ligation when the costs of an additional child will impinge on their ability to invest in existing offspring, especially in the context of reduced male commitment.

  13. Validation of High Frequency (HF) Propagation Prediction Models in the Arctic region

    NASA Astrophysics Data System (ADS)

    Athieno, R.; Jayachandran, P. T.

    2014-12-01

    Despite the emergence of modern techniques for long-distance communication, ionospheric communication in the high frequency (HF) band (3-30 MHz) remains significant to both civilian and military users. However, efficient use of the ever-varying ionosphere as a propagation medium depends on the reliability of ionospheric and HF propagation prediction models. Most available models are empirical, implying that the underlying data collection must be sufficiently large to give good results. The models considered here were developed with little data from the high latitudes, which necessitates their validation. This paper presents the validation of three long-term HF propagation prediction models over a path within the Arctic region. Measurements of the Maximum Usable Frequency for a 3000 km range (MUF(3000)F2) for Resolute, Canada (74.75° N, 265.00° E), are obtained from hand-scaled ionograms generated by the Canadian Advanced Digital Ionosonde (CADI). The observations have been compared with predictions obtained from the Ionospheric Communication Enhanced Profile Analysis Program (ICEPAC), the Voice of America Coverage Analysis Program (VOACAP) and International Telecommunication Union Recommendation 533 (ITU-REC533) for 2009, 2011, 2012 and 2013. A statistical analysis shows that the monthly predictions reproduce the general features of the observations throughout the year, most evidently in the winter and equinox months. Both predictions and observations show diurnal and seasonal variation. The analysed models did not show large differences in their performance. However, there are noticeable differences across seasons for the entire period analysed: REC533 performs better in winter months, while VOACAP performs better in both equinox and summer months.
    VOACAP also outperforms ICEPAC in the daily predictions, though, in general, the monthly predictions agree more closely with the observations than the daily predictions do.

  14. Predicting grief intensity after recent perinatal loss.

    PubMed

    Hutti, Marianne H; Myers, John; Hall, Lynne A; Polivka, Barbara J; White, Susan; Hill, Janice; Kloenne, Elizabeth; Hayden, Jaclyn; Grisanti, Meredith McGrew

    2017-10-01

    The Perinatal Grief Intensity Scale (PGIS) was developed for clinical use to identify and predict intense grief and need for follow-up after perinatal loss. This study evaluates the validity of the PGIS via its ability to predict future intense grief based on a PGIS score obtained early after a loss. A prospective observational study was conducted with 103 international, English-speaking women recruited at hospital discharge or via the internet who experienced a miscarriage, stillbirth, or neonatal death within the previous 8 weeks. Survey data were collected at baseline using the PGIS and the Perinatal Grief Scale (PGS). Follow-up data on the PGS were obtained 3 months later. Data analysis included descriptive statistics, Cronbach's alpha, receiver operating characteristic curve analysis, and confirmatory factor analysis. Cronbach's alphas were ≥ 0.70 for both instruments. PGIS factor analysis yielded three factors as predicted, explaining 57.7% of the variance. The optimal cutoff identified for the PGIS was 3.535. No difference was found when the ability of the PGIS to identify intense grief was compared to the PGS (p = 0.754). The PGIS was not inferior to the PGS (AUC = 0.78, 95% CI 0.68-0.88, p < 0.001) in predicting intense grief at the follow-up. A PGIS score ≥ 3.53 at baseline was associated with increased grief intensity at Time 2 (PGS: OR = 1.97, 95% CI 1.59-2.34, p < 0.001). The PGIS is comparable to the PGS, has a lower response burden, and can reliably and validly predict women who may experience future intense grief associated with perinatal loss. Copyright © 2017 Elsevier Inc. All rights reserved.
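
    The ROC-based choice of a cutoff such as the PGIS 3.535 can be sketched generically with a Youden-index search over candidate thresholds; the scores and outcomes below are invented, and the study's actual procedure may have differed:

```python
import numpy as np

def youden_optimal_cutoff(scores, outcomes):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    Scores at or above the cutoff are classified positive. This is one
    common way to pick an 'optimal' ROC cutoff, used here for illustration."""
    scores = np.asarray(scores, dtype=float)
    outcomes = np.asarray(outcomes, dtype=int)
    best_cut, best_j = None, -1.0
    for cut in np.unique(scores):
        pred = scores >= cut
        tp = np.sum(pred & (outcomes == 1))
        fn = np.sum(~pred & (outcomes == 1))
        tn = np.sum(~pred & (outcomes == 0))
        fp = np.sum(pred & (outcomes == 0))
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

scores = [1.0, 2.0, 3.0, 3.6, 4.0, 4.5]   # hypothetical baseline scale scores
outcomes = [0, 0, 0, 1, 1, 1]             # hypothetical intense-grief outcomes
cut, j = youden_optimal_cutoff(scores, outcomes)
```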

  15. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    PubMed

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-03-01

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree: the probability (Ps) of recovering a given subtree of the inferred tree. We then show that for a tree to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
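
    A bootstrap probability such as Pb can be illustrated as the fraction of bootstrap replicate trees that contain a given clade. In this toy sketch a "tree" is just a set of clades (frozensets of taxon names), standing in for real tree objects:

```python
def clade_support(bootstrap_trees, clade):
    """Fraction of bootstrap replicates whose tree contains `clade`.

    Pb for an interior branch is the support of the clade that branch
    defines; Ps would be computed analogously for a subtree together
    with its closest outgroup."""
    clade = frozenset(clade)
    hits = sum(1 for tree in bootstrap_trees if clade in tree)
    return hits / len(bootstrap_trees)

# Four bootstrap replicates; the clade {A, B} appears in three of them
trees = [
    {frozenset({"A", "B"}), frozenset({"C", "D"})},
    {frozenset({"A", "B"}), frozenset({"B", "C"})},
    {frozenset({"A", "C"})},
    {frozenset({"A", "B"})},
]
pb = clade_support(trees, {"A", "B"})
```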

  16. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
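
    The core idea of analyzing a component-based tool with a discrete-time Markov chain can be sketched with an absorbing chain: transient states represent components, absorbing states represent success and failure, and the absorption probabilities give the chance of a run finishing successfully. The transition probabilities below are illustrative, not taken from the case study:

```python
import numpy as np

# States: 0, 1 transient (components A, B); 2 = success; 3 = failure.
P = np.array([
    [0.0, 0.9, 0.0,  0.1],   # A hands off to B with probability 0.9
    [0.0, 0.0, 0.95, 0.05],  # B terminates successfully with probability 0.95
    [0.0, 0.0, 1.0,  0.0],   # success (absorbing)
    [0.0, 0.0, 0.0,  1.0],   # failure (absorbing)
])
Q = P[:2, :2]                      # transient-to-transient block
R = P[:2, 2:]                      # transient-to-absorbing block
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
B = N @ R                          # absorption probabilities
reliability = B[0, 0]              # P(reach success | start at component A)
```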

  17. Prediction of physical-chemical properties of crude oils by 1H NMR analysis of neat samples and chemometrics.

    PubMed

    Masili, Alice; Puligheddu, Sonia; Sassu, Lorenzo; Scano, Paola; Lai, Adolfo

    2012-11-01

    In this work, we report a feasibility study to predict the properties of neat crude oil samples from 300-MHz NMR spectral data and partial least squares (PLS) regression models. The study was carried out on 64 crude oil samples obtained from 28 different extraction fields and aims at developing a rapid, reliable and cost-effective method for characterizing crude oil. The main properties generally employed for evaluating crudes' quality and behavior during refining were measured and used for calibration and testing of the PLS models. Among these, the UOP characterization factor K (K(UOP)) used to classify crude oils in terms of composition, density (D), total acidity number (TAN), sulfur content (S), and true boiling point (TBP) distillation yields were investigated. Test set validation with an independent set of data was used to evaluate model performance on the basis of standard error of prediction (SEP) statistics. Model performance is particularly good for the K(UOP) factor, TAN, and TBP distillation yields, whose standard error of calibration and SEP values match the analytical method precision, while the results obtained for D and S are less accurate but still useful for predictions. Furthermore, a strategy that reduces spectral data preprocessing and sample preparation procedures has been adopted. The models developed with such an ample crude oil set demonstrate that this methodology can be applied with success to modern refining process requirements. Copyright © 2012 John Wiley & Sons, Ltd.
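
    The PLS idea can be sketched minimally with a one-component PLS1 fit via NIPALS, scored by the standard error of prediction (SEP). The "spectra" below are synthetic with a single latent direction; the study used full multi-component PLS models on real NMR data:

```python
import numpy as np

def pls1_fit_predict(X, y, Xnew):
    """One-component PLS1 via NIPALS: a bare-bones stand-in for the
    multi-component PLS models used in practice."""
    Xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - Xm, y - ym
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector
    t = Xc @ w                      # latent scores
    q = (yc @ t) / (t @ t)          # y-loading
    return (Xnew - Xm) @ w * q + ym

def sep(y_true, y_pred):
    """Standard error of prediction (RMSE over the test set)."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Synthetic 'spectra': one latent direction drives the property y exactly
t0 = np.array([1.0, 2.0, 3.0, 4.0])
p0 = np.array([0.6, 0.8])
X = np.outer(t0, p0)
y = t0.copy()
y_hat = pls1_fit_predict(X, y, X)
```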

  18. Design of novel quinazolinone derivatives as inhibitors for 5HT7 receptor.

    PubMed

    Chitta, Aparna; Jatavath, Mohan Babu; Fatima, Sabiha; Manga, Vijjulatha

    2012-02-01

    To study the pharmacophore properties of quinazolinone derivatives as 5HT(7) inhibitors, 3D QSAR methodologies, namely Comparative Molecular Field Analysis (CoMFA) and Comparative Molecular Similarity Indices Analysis (CoMSIA), were applied; partial least squares (PLS) analysis was performed and QSAR models were generated. The derived models showed good statistical reliability in predicting the 5HT(7) inhibitory activity of the quinazolinone derivatives, based on molecular property fields such as steric, electrostatic, hydrophobic, hydrogen bond donor and hydrogen bond acceptor fields. This is evident from statistical parameters such as q(2) (cross-validated correlation coefficient) of 0.642 and 0.602, and r(2) (conventional correlation coefficient) of 0.937 and 0.908, for CoMFA and CoMSIA respectively. The predictive ability of the models to determine 5HT(7) antagonistic activity was validated using a test set of 26 molecules that were not included in the training set; the predictive r(2) values obtained for the test set were 0.512 and 0.541. Further, the results of the derived models are illustrated by means of contour maps, which give insight into the interaction of the drug with the receptor. The molecular fields so obtained served as the basis for the design of twenty new ligands. In addition, ADME (Absorption, Distribution, Metabolism and Elimination) properties have been calculated in order to predict the relevant pharmaceutical properties, and the results conform with required drug-like properties.

  19. Reliable probabilities through statistical post-processing of ensemble predictions

    NASA Astrophysics Data System (ADS)

    Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2013-04-01

    We develop post-processing or calibration approaches based on linear regression that make ensemble forecasts more reliable. First, we enforce climatological reliability in the sense that the total variability of the prediction is equal to the variability of the observations. Second, we impose ensemble reliability such that the spread of the observations around the ensemble mean coincides with that of the ensemble members. In general the attractors of the model and reality are inhomogeneous; ensemble spread therefore displays a variability not taken into account in standard post-processing methods. We overcome this by weighting the ensemble by a variable error. The approaches are tested in the context of the Lorenz 96 model (Lorenz, 1996). The forecasts become more reliable at short lead times, as reflected by a flatter rank histogram. Our best method turns out to be superior to well-established methods such as EVMOS (Van Schaeybroeck and Vannitsem, 2011) and Nonhomogeneous Gaussian Regression (Gneiting et al., 2005). References [1] Gneiting, T., Raftery, A. E., Westveld, A., Goldman, T., 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Weather Rev. 133, 1098-1118. [2] Lorenz, E. N., 1996: Predictability - a problem partly solved. Proceedings, Seminar on Predictability, ECMWF. 1, 1-18. [3] Van Schaeybroeck, B., and S. Vannitsem, 2011: Post-processing through linear regression, Nonlin. Processes Geophys., 18, 147.
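
    The climatological-reliability constraint, rescaling forecasts so their mean and total variance match the observations, can be sketched as a simplified EVMOS-style correction. The training data are toy numbers; the authors' member-by-member and error-weighted schemes are more elaborate:

```python
import numpy as np

def climatological_calibration(train_fc, train_obs, new_fc):
    """Linear rescaling so calibrated forecasts share the observations'
    mean and total variance (climatological reliability)."""
    mu_f, sd_f = train_fc.mean(), train_fc.std()
    mu_o, sd_o = train_obs.mean(), train_obs.std()
    return mu_o + (sd_o / sd_f) * (np.asarray(new_fc) - mu_f)

fc = np.array([0.0, 1.0, 2.0, 3.0])     # raw, under-dispersed forecasts
obs = np.array([10.0, 12.0, 14.0, 16.0])  # matching observations
cal = climatological_calibration(fc, obs, fc)
```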

  20. Expanding Reliability Generalization Methods with KR-21 Estimates: An RG Study of the Coopersmith Self-Esteem Inventory.

    ERIC Educational Resources Information Center

    Lane, Ginny G.; White, Amy E.; Henson, Robin K.

    2002-01-01

    Conducted a reliability generalizability study on the Coopersmith Self-Esteem Inventory (CSEI; S. Coopersmith, 1967) to examine the variability of reliability estimates across studies and to identify study characteristics that may predict this variability. Results show that reliability for CSEI scores can vary considerably, especially at the…
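
    The KR-21 estimate named in the title needs only the number of dichotomous items and the mean and variance of total test scores; a sketch with hypothetical values:

```python
def kr21(k, mean, variance):
    """Kuder-Richardson formula 21: a reliability estimate for a test of
    k dichotomous items using only total-score mean and variance. It
    assumes all items are of equal difficulty."""
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * variance))

# Hypothetical 50-item test with mean total score 35 and score variance 60
r = kr21(50, 35.0, 60.0)
```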

  1. A Reliability Generalization Study of the Marlowe-Crowne Social Desirability Scale.

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Meyers, Jason L.; Leite, Walter L.

    2002-01-01

    Conducted a reliability generalization study of the Marlowe-Crowne Social Desirability Scale (D. Crowne and D. Marlowe, 1960). Analysis of 93 studies show that the predicted score reliability for male adolescents was 0.53, and reliability for men's responses was lower than for women's. Discusses the need for further analysis of the scale. (SLD)

  2. MultiMiTar: a novel multi objective optimization based miRNA-target prediction method.

    PubMed

    Mitra, Ramkrishna; Bandyopadhyay, Sanghamitra

    2011-01-01

    Machine learning based miRNA-target prediction algorithms often fail to obtain a balanced prediction accuracy in terms of both sensitivity and specificity, due to the lack of gold-standard negative examples, of miRNA-targeting site context specific relevant features, and of an efficient feature selection process. Moreover, all the sequence, structure and machine learning based algorithms are unable to distribute the true positive predictions preferentially at the top of the ranked list; hence the algorithms become unreliable to biologists. In addition, these algorithms fail to obtain a considerable combination of precision and recall for the target transcripts that are translationally repressed at the protein level. In this article, we introduce an efficient miRNA-target prediction system, MultiMiTar, a Support Vector Machine (SVM) based classifier integrated with a multiobjective metaheuristic based feature selection technique. The robust performance of the proposed method is mainly the result of using high quality negative examples and selection of biologically relevant miRNA-targeting site context specific features. The features are selected by using a novel feature selection technique, AMOSA-SVM, that integrates the multi objective optimization technique Archived Multi-Objective Simulated Annealing (AMOSA) and SVM. MultiMiTar is found to achieve a much higher Matthews correlation coefficient (MCC) of 0.583 and average class-wise accuracy (ACA) of 0.8 compared to the other target prediction methods on a completely independent test data set. The MCC and ACA values obtained with these other algorithms range from -0.269 to 0.155 and 0.321 to 0.582, respectively. Moreover, MultiMiTar shows a more balanced result in terms of precision and sensitivity (recall) for the translationally repressed data set as compared to all the other existing methods.
    An important aspect is that the true positive predictions are distributed preferentially at the top of the ranked list, which makes MultiMiTar reliable for biologists. MultiMiTar is available as an online tool at www.isical.ac.in/~bioinfo_miu/multimitar.htm, and the software can be downloaded from www.isical.ac.in/~bioinfo_miu/multimitar-download.htm.
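
    The two headline metrics, MCC and ACA, can be computed directly from a 2x2 confusion matrix; the counts below are invented for illustration, not the benchmark results:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from a 2x2 confusion matrix;
    ranges from -1 to 1, with 0 for chance-level prediction."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def aca(tp, tn, fp, fn):
    """Average class-wise accuracy: mean of sensitivity and specificity,
    robust to class imbalance."""
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

# Illustrative counts for a target/non-target classification
score = mcc(tp=80, tn=70, fp=30, fn=20)
balance = aca(tp=80, tn=70, fp=30, fn=20)
```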

  3. Maximizing the reliability of genomic selection by optimizing the calibration set of reference individuals: comparison of methods in two diverse groups of maize inbreds (Zea mays L.).

    PubMed

    Rincent, R; Laloë, D; Nicolas, S; Altmann, T; Brunel, D; Revilla, P; Rodríguez, V M; Moreno-Gonzalez, J; Melchinger, A; Bauer, E; Schoen, C-C; Meyer, N; Giauffret, C; Bauland, C; Jamin, P; Laborde, J; Monod, H; Flament, P; Charcosset, A; Moreau, L

    2012-10-01

    Genomic selection refers to the use of genotypic information for predicting breeding values of selection candidates. A prediction formula is calibrated with the genotypes and phenotypes of reference individuals constituting the calibration set. The size and the composition of this set are essential parameters affecting the prediction reliabilities. The objective of this study was to maximize reliabilities by optimizing the calibration set. Different criteria based on the diversity or on the prediction error variance (PEV) derived from the realized additive relationship matrix-best linear unbiased predictions model (RA-BLUP) were used to select the reference individuals. For the latter, we considered the mean of the PEV of the contrasts between each selection candidate and the mean of the population (PEVmean) and the mean of the expected reliabilities of the same contrasts (CDmean). These criteria were tested with phenotypic data collected on two diversity panels of maize (Zea mays L.) genotyped with a 50k SNPs array. In the two panels, samples chosen based on CDmean gave higher reliabilities than random samples for various calibration set sizes. CDmean also appeared superior to PEVmean, which can be explained by the fact that it takes into account the reduction of variance due to the relatedness between individuals. Selected samples were close to optimality for a wide range of trait heritabilities, which suggests that the strategy presented here can efficiently sample subsets in panels of inbred lines. A script to optimize reference samples based on CDmean is available on request.
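
    A stripped-down version of the PEV/CD computation can be sketched from a GBLUP-like mixed model with no fixed effects: PEV comes from the inverse coefficient matrix of the mixed-model equations. Note this per-individual CD is only illustrative; the paper's CDmean uses contrasts between each candidate and the population mean:

```python
import numpy as np

def cd_of_candidates(A, calib_idx, h2=0.5):
    """Expected reliability (CD) of genomic predictions for all individuals
    when only those in `calib_idx` are phenotyped (variances in units of
    the genetic variance; simplified GBLUP without fixed effects)."""
    n = A.shape[0]
    lam = (1 - h2) / h2                     # sigma_e^2 / sigma_g^2
    Z = np.zeros((len(calib_idx), n))
    Z[np.arange(len(calib_idx)), calib_idx] = 1.0
    C = Z.T @ Z + lam * np.linalg.inv(A)    # MME coefficient matrix
    pev = lam * np.diag(np.linalg.inv(C))   # prediction error variance
    return 1.0 - pev / np.diag(A)

# Toy relationship matrix: 0 and 1 are related, 2 is unrelated; only 0 is phenotyped
A = np.array([[1.0, 0.5, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
cd = cd_of_candidates(A, calib_idx=[0])
```

    Relatedness to the calibration set drives reliability: the phenotyped individual has the highest CD, its relative an intermediate one, and the unrelated individual a CD of zero.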

  4. Reliability Practice at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Pruessner, Paula S.; Li, Ming

    2008-01-01

    This paper briefly describes the Reliability and Maintainability (R&M) programs performed directly by the reliability branch at Goddard Space Flight Center (GSFC). The mission assurance requirements flow-down is explained. GSFC practices for PRA, reliability prediction, fault tree analysis, reliability block diagrams, FMEA, part stress and derating analysis, worst case analysis, trend analysis, and limited-life items are presented. Lessons learned are summarized and recommendations for improvement are identified.

  5. The Environmental Reward Observation Scale (EROS): development, validity, and reliability.

    PubMed

    Armento, Maria E A; Hopko, Derek R

    2007-06-01

    Researchers acknowledge a strong association between the frequency and duration of environmental reward and affective mood states, particularly in relation to the etiology, assessment, and treatment of depression. Given behavioral theories that outline environmental reward as a strong mediator of affect and the unavailability of an efficient, reliable, and valid self-report measure of environmental reward, we developed the Environmental Reward Observation Scale (EROS) and examined its psychometric properties. In Experiment 1, exploratory factor analysis supported a unidimensional 10-item measure with strong internal consistency and test-retest reliability. When administered to a replication sample, confirmatory factor analysis suggested an excellent fit to the 1-factor model and convergent/discriminant validity data supported the construct validity of the EROS. In Experiment 2, further support for the convergent validity of the EROS was obtained via moderate correlations with the Pleasant Events Schedule (PES; MacPhillamy & Lewinsohn, 1976). In Experiment 3, hierarchical regression supported the ecological validity of the EROS toward predicting daily diary reports of time spent in highly rewarding behaviors and activities. Above and beyond variance accounted for by depressive symptoms (BDI), the EROS was associated with significant incremental variance in accounting for time spent in both low and high reward behaviors. The EROS may represent a brief, reliable and valid measure of environmental reward that may improve the psychological assessment of negative mood states such as clinical depression.
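
    Internal consistency of a scale such as the EROS is typically summarized with Cronbach's alpha, which can be sketched as follows; the item responses are simulated around one common factor, not EROS data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha from an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated 10-item scale: every item reflects one latent trait plus noise
rng = np.random.default_rng(0)
trait = rng.normal(size=(50, 1))
items = trait + 0.5 * rng.normal(size=(50, 10))
alpha = cronbach_alpha(items)   # high, since items share the common factor
```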

  6. The Strengths and Difficulties Questionnaire: psychometric properties of the parent and teacher version in children aged 4-7.

    PubMed

    Stone, Lisanne L; Janssens, Jan M A M; Vermulst, Ad A; Van Der Maten, Marloes; Engels, Rutger C M E; Otten, Roy

    2015-01-01

    The Strengths and Difficulties Questionnaire is one of the most widely employed screening instruments. Although there is a large research body investigating its psychometric properties, reliability and validity are not yet fully tested using modern techniques. Therefore, we investigate reliability, construct validity, measurement invariance, and predictive validity of the parent and teacher version in children aged 4-7. In addition, we replicate previous studies by investigating test-retest reliability and criterion validity. In a Dutch community sample, 2,238 teachers and 1,513 parents filled out questionnaires regarding problem behaviors and parenting, while 1,831 children reported on sociometric measures at T1. These children were followed up during three consecutive years. Reliability was examined using Cronbach's alpha and McDonald's omega, construct validity was examined by confirmatory factor analysis, and predictive validity was examined by calculating developmental profiles and linking these to measures of inadequate parenting, parenting stress and social preference. Further, mean scores and percentiles were examined in order to establish norms. Omega was consistently higher than alpha. The original five-factor structure was replicated, and measurement invariance was established at the configural level. Further, higher SDQ scores were associated with future indices of higher inadequate parenting, higher parenting stress and lower social preference. Finally, previous results on test-retest reliability and criterion validity were replicated. This study is the first to show that SDQ scores are predictively valid, attesting to the feasibility of the SDQ as a screening instrument. Future research into the predictive validity of the SDQ is warranted.

  7. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the components' cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the lowest-lived component in the system at the same reliability (probability of survival). Where Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
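
    The claim that a series system's life at a given reliability is below that of its lowest-lived component can be checked numerically with two-parameter Weibull components; the eta (characteristic life) and beta (Weibull slope) values below are hypothetical, not the engine's:

```python
import math

def system_reliability(t, components):
    """Series-system reliability: product of two-parameter Weibull
    survival functions, one (eta, beta) pair per component."""
    return math.prod(math.exp(-(t / eta) ** beta) for eta, beta in components)

def life_at_reliability(target_r, components, hi=1e6):
    """Bisection for the life t at which R_sys(t) = target_r
    (R_sys is monotonically decreasing in t)."""
    lo = 0.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if system_reliability(mid, components) > target_r:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

parts = [(30000.0, 2.0), (45000.0, 3.0)]  # hypothetical disk and blade parameters
l5_system = life_at_reliability(0.95, parts)
l5_part1 = 30000.0 * (-math.log(0.95)) ** (1.0 / 2.0)  # lowest-lived component alone
```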

  8. Is the necrosis/wall ADC ratio useful for the differentiation of benign and malignant breast lesions?

    PubMed Central

    Durur-Karakaya, Afak; Karaman, Adem; Seker, Mehmet; Demirci, Elif; Alper, Fatih

    2017-01-01

    Objective: To determine whether the necrosis/wall apparent diffusion coefficient (ADC) ratio is useful for the malignant–benign differentiation of necrotic breast lesions. Methods: Breast MRI was performed using a 3-T system. In this retrospective study, calculation of the necrosis/wall ADC ratio was based on ADC values measured from the necrosis and from the wall of malignant and benign breast lesions by diffusion-weighted imaging (DWI). By synchronizing post-contrast T1 weighted images, the separate parts of wall and necrosis were maintained. All the diagnoses were pathologically confirmed. Statistical analyses were conducted using an independent sample t-test and receiver operating characteristic analysis. The intraclass and interclass correlations were evaluated. Results: A total of 66 female patients were enrolled, 38 of whom had necrotic breast carcinomas and 28 of whom had breast abscesses. The ADC values were obtained from both the wall and necrosis. The mean necrosis/wall ADC ratio (± standard deviation) was 1.61 ± 0.51 in carcinomas, and it was 0.65 ± 0.33 in abscesses. The area under the curve values for necrosis ADC, wall ADC and the necrosis/wall ADC ratio were 0.680, 0.068 and 0.942, respectively. A necrosis/wall ADC ratio cut-off value of 1.18 demonstrated a sensitivity of 97%, specificity of 93%, a positive-predictive value of 95%, a negative-predictive value of 96% and an accuracy of 95% in determining the malignant nature of necrotic breast lesions. There was good intra- and interclass reliability for the ADC values of both necrosis and wall. Conclusion: The necrosis/wall ADC ratio appears to be a reliable and promising tool for discriminating breast carcinomas from abscesses using DWI. Advances in knowledge: ADC values of the necrosis obtained by DWI are valuable for malignant-benign differentiation in necrotic breast lesions. The necrosis/wall ADC ratio appears to be a reliable and promising tool in the breast imaging field. PMID:28339285
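
    Diagnostic performance at a fixed ratio cutoff can be sketched as below; the ratios, labels and resulting metrics are toy values, not the study's 66-patient data:

```python
def diagnostic_metrics(ratios, is_malignant, cutoff=1.18):
    """Sensitivity, specificity, PPV, NPV and accuracy when lesions with
    necrosis/wall ADC ratio >= cutoff are called malignant."""
    pairs = list(zip(ratios, is_malignant))
    tp = sum(r >= cutoff and m for r, m in pairs)
    fn = sum(r < cutoff and m for r, m in pairs)
    tn = sum(r < cutoff and not m for r, m in pairs)
    fp = sum(r >= cutoff and not m for r, m in pairs)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / len(ratios),
    }

ratios = [1.6, 1.9, 1.1, 0.7, 0.5, 1.0]          # hypothetical ADC ratios
is_malignant = [True, True, True, False, False, False]
m = diagnostic_metrics(ratios, is_malignant)     # one carcinoma falls below the cutoff
```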

  9. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising means to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, which is represented by the proportion of observations that fall within the 5-quantile to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
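
    The reliability measure used here, the share of observations inside the 5-95% predictive interval, can be sketched as follows; the predictive draws and observations are simulated, not the Elbe/Danube damage records:

```python
import numpy as np

def interval_coverage(y_obs, y_samples, lo_q=0.05, hi_q=0.95):
    """Share of observations falling inside the per-building [5%, 95%]
    predictive interval derived from posterior/ensemble draws."""
    lo = np.quantile(y_samples, lo_q, axis=1)
    hi = np.quantile(y_samples, hi_q, axis=1)
    return float(np.mean((y_obs >= lo) & (y_obs <= hi)))

rng = np.random.default_rng(1)
y_samples = rng.uniform(0.0, 1.0, size=(200, 500))  # predictive draws per building
y_obs = rng.uniform(0.0, 1.0, size=200)             # observed relative damage
cov = interval_coverage(y_obs, y_samples)           # ~0.9 for a well-calibrated model
```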

  10. Validity of the Special Needs Education Assessment Tool (SNEAT), a Newly Developed Scale for Children with Disabilities.

    PubMed

    Kohara, Aiko; Han, ChangWan; Kwon, HaeJin; Kohzuki, Masahiro

    2015-11-01

    The improvement of the quality of life (QOL) of children with disabilities has been considered important. Therefore, the Special Needs Education Assessment Tool (SNEAT) was developed based on the concept of QOL to objectively evaluate the educational outcome of children with disabilities. SNEAT consists of 11 items in three domains: physical functioning, mental health, and social functioning. This study aimed to verify the reliability and construct validity of SNEAT using data from 93 children collected in the classes on independent activities of daily living for children with disabilities in Okinawa Prefecture between October and November 2014. Survey data were collected in a longitudinal prospective cohort study. The reliability of SNEAT was verified via the internal consistency method and the test-retest method; both the coefficient of Cronbach's α and the intra-class correlation coefficient were over 0.7. The validity of SNEAT was also verified via one-way repeated-measures ANOVA and the latent growth curve model. The scores of all the items and domains and the total scores obtained from one-way repeated-measures ANOVA were the same as the predicted scores. SNEAT is valid based on its goodness-of-fit values obtained using the latent growth curve model, where the values of the comparative fit index (0.983) and root mean square error of approximation (0.062) were within the goodness-of-fit range. These results indicate that SNEAT has high reliability and construct validity and may contribute to improving the QOL of children with disabilities in the classes on independent activities of daily living.

  11. The reliability of Fishman method of skeletal maturation for age estimation in children of South Indian population.

    PubMed

    Mohammed, Rezwana Begum; Kalyan, V Siva; Tircouveluri, Saritha; Vegesna, Goutham Chakravarthy; Chirla, Anil; Varma, D Maruthi

    2014-07-01

    Determining the age of a person in the absence of documentary evidence of birth is essential for legal and medico-legal purposes. The Fishman method of skeletal maturation is widely used for this purpose; however, its reliability across geographic populations is not well-established. In this study, we assessed various stages of carpal and metacarpal bone maturation and tested the reliability of the Fishman method of skeletal maturation for estimating age in a South Indian population. We also evaluated the correlation between chronological age (CA) and the age predicted by the Fishman method. Digital right hand-wrist radiographs of 330 individuals aged 9-20 years were obtained and the skeletal maturity stage of each subject was determined using the Fishman method. The skeletal maturation indicator scores were obtained and analyzed with reference to CA and sex. Data were analyzed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). The study subjects had a tendency toward late maturation, with the mean skeletal age (SA) being significantly lower (P < 0.05) than the mean CA at various skeletal maturity stages. Nevertheless, significant correlations were observed between SA and CA for males (r = 0.82) and females (r = 0.85). Interestingly, female subjects were observed to be advanced in SA compared with males. The Fishman method of skeletal maturation can be used as an alternative tool for assessing the mean age of an individual of unknown CA in South Indian children.

  12. DATA ASSIMILATION APPROACH FOR FORECAST OF SOLAR ACTIVITY CYCLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitiashvili, Irina N., E-mail: irina.n.kitiashvili@nasa.gov

    Numerous attempts to predict future solar cycles are mostly based on empirical relations derived from observations of previous cycles, and they yield a wide range of predicted strengths and durations of the cycles. Results obtained with current dynamo models also deviate strongly from each other, thus raising questions about criteria to quantify the reliability of such predictions. The primary difficulties in modeling future solar activity are shortcomings of both the dynamo models and observations that do not allow us to determine the current and past states of the global solar magnetic structure and its dynamics. Data assimilation is a relatively new approach to develop physics-based predictions and estimate their uncertainties in situations where the physical properties of a system are not well-known. This paper presents an application of the ensemble Kalman filter method for modeling and prediction of solar cycles through use of a low-order nonlinear dynamo model that includes the essential physics and can describe general properties of the sunspot cycles. Despite the simplicity of this model, the data assimilation approach provides reasonable estimates for the strengths of future solar cycles. In particular, the prediction of Cycle 24 calculated and published in 2008 is so far holding up quite well. In this paper, I will present my first attempt to predict Cycle 25 using the data assimilation approach, and discuss the uncertainties of that prediction.
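    A minimal sketch of the ensemble Kalman filter analysis step that underlies such data-assimilation forecasts, with a scalar state (e.g., a cycle amplitude) and invented numbers; this toy is not the paper's dynamo model:

```python
# Illustrative stochastic EnKF analysis step: each ensemble member assimilates
# a perturbed copy of one observation. State, spread, and observation error
# are all invented for illustration.
import random

random.seed(1)

def enkf_update(ensemble, obs, obs_err_var):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_err_var)          # scalar Kalman gain, H = identity
    sd = obs_err_var ** 0.5
    return [x + gain * (obs + random.gauss(0, sd) - x) for x in ensemble]

forecast = [random.gauss(120.0, 15.0) for _ in range(50)]   # forecast ensemble
analysis = enkf_update(forecast, obs=100.0, obs_err_var=25.0)
mean_a = sum(analysis) / len(analysis)
print(round(mean_a, 1))   # analysis mean is pulled toward the observation
```

The analysis ensemble both shifts toward the observation and contracts, which is how assimilation constrains an otherwise drifting low-order model.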

  13. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied to estimate the human error probability (HEP) of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface method and direct Monte Carlo simulation with Latin hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned via statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities, and the sensitivity of each probability distribution in the human reliability estimate was investigated. To quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected for its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
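    The competing-random-variables idea can be sketched with a Monte Carlo estimate of the HEP as the probability that the operators' performance time exceeds the phenomenological time available; the lognormal parameters below are invented, not those of the study:

```python
# Reliability-physics HEP sketch: failure occurs when the time needed by the
# operators (performance time) exceeds the time available before core damage
# (phenomenological time). Both distributions here are purely illustrative.
import random

random.seed(42)

def hep_estimate(n=100_000):
    failures = 0
    for _ in range(n):
        t_phenom = random.lognormvariate(3.4, 0.3)    # minutes available
        t_perform = random.lognormvariate(3.0, 0.5)   # minutes needed
        if t_perform > t_phenom:
            failures += 1
    return failures / n

hep = hep_estimate()
print(round(hep, 3))
```

Swapping in a different fitted distribution for either time directly exposes the sensitivity that the study examines.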

  14. A sparse autoencoder-based deep neural network for protein solvent accessibility and contact number prediction.

    PubMed

    Deng, Lei; Fan, Chao; Zeng, Zhiwen

    2017-12-28

    Direct prediction of the three-dimensional (3D) structures of proteins from one-dimensional (1D) sequences is a challenging problem. Significant structural characteristics such as solvent accessibility and contact number are essential for deriving restraints in modeling protein folding and protein 3D structure. Thus, accurately predicting these features is a critical step for 3D protein structure building. In this study, we present DeepSacon, a computational method that can effectively predict protein solvent accessibility and contact number by using a deep neural network, which is built based on a stacked autoencoder and a dropout method. The results demonstrate that our proposed DeepSacon achieves a significant improvement in prediction quality compared with the state-of-the-art methods. We obtain 0.70 three-state accuracy for solvent accessibility, and 0.33 15-state accuracy and 0.74 Pearson Correlation Coefficient (PCC) for the contact number on the 5729 monomeric soluble globular protein dataset. We also evaluate the performance on the CASP11 benchmark dataset, where DeepSacon achieves 0.68 three-state accuracy and 0.69 PCC for solvent accessibility and contact number, respectively. We have shown that DeepSacon can reliably predict solvent accessibility and contact number with a stacked sparse autoencoder and a dropout approach.

  15. Reliability of smartphone-based gait measurements for quantification of physical activity/inactivity levels

    PubMed Central

    Ebara, Takeshi; Azuma, Ryohei; Shoji, Naoto; Matsukawa, Tsuyoshi; Yamada, Yasuyuki; Akiyama, Tomohiro; Kurihara, Takahiro; Yamada, Shota

    2017-01-01

    Objectives: Objective measurements using built-in smartphone sensors that can measure physical activity/inactivity in daily working life have the potential to provide a new approach to assessing workers' health effects. The aim of this study was to elucidate the characteristics and reliability of built-in step counting sensors on smartphones for development of an easy-to-use objective measurement tool that can be applied in ergonomics or epidemiological research. Methods: To evaluate the reliability of step counting sensors embedded in seven major smartphone models, the 6-minute walk test was conducted and the following analyses of sensor precision and accuracy were performed: 1) relationship between actual step count and step count detected by sensors, 2) reliability between smartphones of the same model, and 3) false detection rates when sitting during office work, while riding the subway, and while driving. Results: On five of the seven models, the intraclass correlation coefficient (ICC(3,1)) showed high reliability, with a range of 0.956-0.993. On the other two models, however, ICC(3,1) ranged from 0.443 to 0.504, and the relative error ratios of the sensor-detected step count to the actual step count were ±48.7%-49.4%. The level of agreement between devices of the same model was ICC(3,1): 0.992-0.998. The false detection rates differed across the three sitting conditions. Conclusions: These results suggest the need for appropriate regulation of step counts measured by sensors, through means such as correction or calibration with a predictive model formula, in order to obtain the highly reliable measurement results that are sought in scientific investigation. PMID:28835575
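    The ICC(3,1) statistic used above (two-way mixed model, single measurement, consistency) can be sketched from an invented table of step counts (rows = walks, columns = repeated measurements):

```python
# ICC(3,1) sketch: (MS_rows - MS_error) / (MS_rows + (k-1) * MS_error),
# computed from a two-way ANOVA decomposition. Data are invented.

def icc_3_1(data):
    n = len(data)            # subjects (walks)
    k = len(data[0])         # repeated measurements
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_tot = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_tot - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

steps = [
    [612, 608, 615],
    [702, 698, 705],
    [554, 560, 549],
    [801, 797, 806],
]
icc = icc_3_1(steps)
print(round(icc, 3))
```

Large between-walk variation with small within-walk disagreement, as here, drives the ICC toward 1, matching the 0.956-0.993 range reported for the reliable handsets.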

  16. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations, and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost-effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable parameter values were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.

  17. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they correctly simulate and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.
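    The Jelinski-Moranda model investigated in the appended paper can be simulated in a few lines: N initial faults each contribute a constant hazard φ, so the program failure rate drops by φ after every repair. The parameters below are illustrative:

```python
# Toy Jelinski-Moranda simulation: with `remaining` faults left, the time to
# the next failure is exponential with rate phi * remaining. Repeating such
# runs is exactly the kind of data replication the report argues for.
import random

random.seed(7)

def simulate_jm(n_faults=30, phi=0.02):
    """Return the successive inter-failure times of one debugging run."""
    times = []
    for remaining in range(n_faults, 0, -1):
        rate = phi * remaining          # failure rate with `remaining` faults left
        times.append(random.expovariate(rate))
    return times

run = simulate_jm()
print(len(run))    # one inter-failure time per detected fault
```

Because each run is a fresh random draw, fitting the model to a single run can produce wildly different parameter estimates, which is the variance phenomenon the report describes.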

  18. The reliability and validity of ultrasound to quantify muscles in older adults: a systematic review

    PubMed Central

    Scafoglieri, Aldo; Jager‐Wittenaar, Harriët; Hobbelen, Johannes S.M.; van der Schans, Cees P.

    2017-01-01

    This review evaluates the reliability and validity of ultrasound to quantify muscles in older adults. The databases PubMed, Cochrane, and Cumulative Index to Nursing and Allied Health Literature were systematically searched for studies. In 17 studies, the reliability (n = 13) and validity (n = 8) of ultrasound to quantify muscles in community‐dwelling older adults (≥60 years) or a clinical population were evaluated. Four out of 13 reliability studies investigated both intra‐rater and inter‐rater reliability. Intraclass correlation coefficient (ICC) scores for reliability ranged from −0.26 to 1.00. The highest ICC scores were found for the vastus lateralis, rectus femoris, upper arm anterior, and the trunk (ICC = 0.72 to 1.00). All included validity studies found ICC scores ranging from 0.92 to 0.999. Two studies describing the validity of ultrasound to predict lean body mass showed good validity as compared with dual‐energy X‐ray absorptiometry (r² = 0.92 to 0.96). This systematic review shows that ultrasound is a reliable and valid tool for the assessment of muscle size in older adults. More high‐quality research is required to confirm these findings in both clinical and healthy populations. Furthermore, ultrasound assessment of small muscles needs further evaluation. Ultrasound to predict lean body mass is feasible; however, future research is required to validate prediction equations in older adults with varying function and health. PMID:28703496

  19. Reliability Driven Space Logistics Demand Analysis

    NASA Technical Reports Server (NTRS)

    Knezevic, J.

    1995-01-01

    Accurate selection of the quantity of logistic support resources has a strong influence on mission success, system availability and the cost of ownership. At the same time the accurate prediction of these resources depends on the accurate prediction of the reliability measures of the items involved. This paper presents a method for the advanced and accurate calculation of the reliability measures of complex space systems which are the basis for the determination of the demands for logistics resources needed during the operational life or mission of space systems. The applicability of the method presented is demonstrated through several examples.

  20. Reliability and Accuracy of Static Parameters Obtained From Ink and Pressure Platform Footprints.

    PubMed

    Zuil-Escobar, Juan Carlos; Martínez-Cepa, Carmen Belén; Martín-Urrialde, Jose Antonio; Gómez-Conesa, Antonia

    2016-09-01

    The purpose of this study was to evaluate the accuracy and the intrarater reliability of arch angle (AA), Staheli Index (SI), and Chippaux-Smirak Index (CSI) obtained from ink and pressure platform footprints. We obtained AA, SI, and CSI measurements from ink pedigraph footprints and pressure platform footprints in 40 healthy participants (aged 25.65 ± 5.187 years). Intrarater reliability was calculated for all parameters obtained using the 2 methods. Standard error of measurement and minimal detectable change were also calculated. A repeated-measure analysis of variance was used to identify differences between ink and pressure platform footprints. Intraclass correlation coefficient and Bland and Altman plots were used to assess similar parameters obtained using different methods. Intrarater reliability was >0.9 for all parameters and was slightly higher for the ink footprints. No statistical difference was reported in repeated-measure analysis of variance for any of the parameters. Intraclass correlation coefficient values from AA, SI, and CSI that were obtained using ink footprints and pressure platform footprints were excellent, ranging from 0.797 to 0.829. However, pressure platform overestimated AA and underestimated SI and CSI. Our study revealed that AA, SI, and CSI were similar regardless of whether the ink or pressure platform method was used. In addition, the parameters indicated high intrarater reliability and were reproducible. Copyright © 2016. Published by Elsevier Inc.
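    The Bland-Altman analysis mentioned above compares two measurement methods via the mean difference (bias) and 95% limits of agreement; a sketch with invented Chippaux-Smirak Index values:

```python
# Bland-Altman sketch: bias = mean of paired differences, limits of agreement
# = bias +/- 1.96 * SD of the differences. CSI values are invented.

def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_diff = sum(diffs) / n                                # systematic bias
    sd = (sum((d - mean_diff) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean_diff, (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)

ink      = [28.1, 31.4, 25.9, 34.2, 29.8, 27.5]   # CSI (%) from ink footprints
platform = [27.4, 30.9, 25.1, 33.8, 29.0, 26.8]   # CSI (%) from pressure platform

bias, (lo_lim, hi_lim) = bland_altman(ink, platform)
print(round(bias, 2), round(lo_lim, 2), round(hi_lim, 2))
```

A positive bias here would correspond to the platform underestimating CSI relative to ink, the direction of disagreement reported in the study.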

  1. Reliable prediction intervals with regression neural networks.

    PubMed

    Papadopoulos, Harris; Haralambous, Haris

    2011-10-01

    This paper proposes an extension to conventional regression neural networks (NNs) for replacing the point predictions they produce with prediction intervals that satisfy a required level of confidence. Our approach follows a novel machine learning framework, called Conformal Prediction (CP), for assigning reliable confidence measures to predictions without assuming anything more than that the data are independent and identically distributed (i.i.d.). We evaluate the proposed method on four benchmark datasets and on the problem of predicting Total Electron Content (TEC), which is an important parameter in trans-ionospheric links; for the latter we use a dataset of more than 60000 TEC measurements collected over a period of 11 years. Our experimental results show that the prediction intervals produced by our method are both well calibrated and tight enough to be useful in practice. Copyright © 2011 Elsevier Ltd. All rights reserved.
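    The split (inductive) variant of Conformal Prediction can be sketched in a few lines: absolute residuals of any point predictor are collected on a calibration set, and each new point prediction is widened by the appropriate residual quantile. The residuals and prediction below are invented:

```python
# Split-conformal regression interval sketch. Any underlying regressor (here
# left abstract) supplies point predictions; the calibration residuals set
# the interval half-width for a requested confidence level.
import math

def conformal_interval(cal_residuals, point_pred, confidence=0.95):
    rs = sorted(cal_residuals)
    n = len(rs)
    # Conformal quantile index: ceil((n + 1) * confidence)-th smallest residual.
    k = math.ceil((n + 1) * confidence) - 1
    q = rs[min(k, n - 1)]
    return point_pred - q, point_pred + q

cal_residuals = [0.3, 1.1, 0.7, 0.2, 1.8, 0.9, 0.5, 1.4, 0.6, 1.0,
                 0.4, 0.8, 1.2, 0.1, 1.6, 0.7, 0.9, 1.3, 0.5, 1.1]
lo, hi = conformal_interval(cal_residuals, point_pred=5.0)
print(round(lo, 2), round(hi, 2))
```

Under the i.i.d. assumption the paper relies on, intervals built this way cover the true value with at least the requested probability, regardless of the regressor used.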

  2. Reliability and Validity of the Load-Velocity Relationship to Predict the 1RM Back Squat.

    PubMed

    Banyard, Harry G; Nosaka, Kazunori; Haff, G Gregory

    2017-07-01

    Banyard, HG, Nosaka, K, and Haff, GG. Reliability and validity of the load-velocity relationship to predict the 1RM back squat. J Strength Cond Res 31(7): 1897-1904, 2017-This study investigated the reliability and validity of the load-velocity relationship to predict the free-weight back squat one repetition maximum (1RM). Seventeen strength-trained males performed three 1RM assessments on 3 separate days. All repetitions were performed to full depth with maximal concentric effort. Predicted 1RMs were calculated by entering the mean concentric velocity of the 1RM (V1RM) into an individualized linear regression equation, which was derived from the load-velocity relationship of 3 (20, 40, 60% of 1RM), 4 (20, 40, 60, 80% of 1RM), or 5 (20, 40, 60, 80, 90% of 1RM) incremental warm-up sets. The actual 1RM (140.3 ± 27.2 kg) was very stable between 3 trials (ICC = 0.99; SEM = 2.9 kg; CV = 2.1%; ES = 0.11). Predicted 1RM from 5 warm-up sets up to and including 90% of 1RM was the most reliable (ICC = 0.92; SEM = 8.6 kg; CV = 5.7%; ES = -0.02) and valid (r = 0.93; SEE = 10.6 kg; CV = 7.4%; ES = 0.71) of the predicted 1RM methods. However, all predicted 1RMs were significantly different (p ≤ 0.05; ES = 0.71-1.04) from the actual 1RM. Individual variation for the actual 1RM was small between trials ranging from -5.6 to 4.8% compared with the most accurate predictive method up to 90% of 1RM, which was more variable (-5.5 to 27.8%). Importantly, the V1RM (0.24 ± 0.06 m·s⁻¹) was unreliable between trials (ICC = 0.42; SEM = 0.05 m·s⁻¹; CV = 22.5%; ES = 0.14). The load-velocity relationship for the full depth free-weight back squat showed moderate reliability and validity but could not accurately predict 1RM, which was stable between trials. Thus, the load-velocity relationship 1RM prediction method used in this study cannot accurately modify sessional training loads because of large V1RM variability.
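    The individualized load-velocity method described above amounts to a simple linear regression solved at V1RM; a sketch with invented warm-up data:

```python
# Load-velocity 1RM prediction sketch: fit load as a linear function of mean
# concentric velocity over the warm-up sets, then evaluate at V1RM.
# All loads and velocities below are invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx            # slope, intercept

velocities = [1.00, 0.85, 0.70, 0.50, 0.35]  # mean concentric velocity (m/s)
loads      = [28, 56, 84, 112, 126]          # 20, 40, 60, 80, 90% of a 140 kg 1RM

slope, intercept = fit_line(velocities, loads)
v1rm = 0.24                                   # velocity of the 1RM attempt
predicted_1rm = slope * v1rm + intercept
print(round(predicted_1rm, 1))
```

The study's central caveat is visible in the last step: because V1RM itself varies substantially between sessions, the extrapolated load inherits that variability even when the fitted line is good.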

  3. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    PubMed

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability, so it is important to test new equipment that will be used for data collection. OBJECTIVE: To compare two anthropometric data-gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data was collected using a measuring tape and a Kinect-based 3D body scanner. It was evaluated in terms of precision by considering the regular and relative Technical Error of Measurement, and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement, and Coefficient of Variation. The results obtained showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods had better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. the apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result than the manual technique. Many companies design their products based on data obtained from 3D scanners; hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
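    The (absolute and relative) Technical Error of Measurement used above to quantify precision can be sketched as follows; the paired measurements are invented:

```python
# TEM sketch for two repeated measurements: TEM = sqrt(sum(d^2) / (2n)),
# relative TEM = 100 * TEM / grand mean. Waist-circumference values invented.

def tem(m1, m2):
    n = len(m1)
    return (sum((a - b) ** 2 for a, b in zip(m1, m2)) / (2 * n)) ** 0.5

def relative_tem(m1, m2):
    grand_mean = (sum(m1) + sum(m2)) / (2 * len(m1))
    return 100 * tem(m1, m2) / grand_mean

trial1 = [72.4, 85.1, 90.3, 78.6, 83.0]   # cm, first measurement
trial2 = [72.9, 84.6, 90.8, 78.1, 83.4]   # cm, repeat measurement

print(round(tem(trial1, trial2), 3))
print(round(relative_tem(trial1, trial2), 2))
```

Relative TEM expresses the measurement error as a percentage of the quantity measured, which makes precision comparable across body sites of different sizes.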

  4. Determining Criteria to Predict Repeatability of Performance in Older Adults: Using Coefficients of Variation for Strength and Functional Measures.

    PubMed

    Raj, Isaac Selva; Bird, Stephen R; Westfold, Ben A; Shield, Anthony J

    2017-01-01

    Reliable measures of muscle strength and functional capacity in older adults are essential. The aim of this study was to determine whether coefficients of variation (CVs) of individuals obtained at the first session can infer repeatability of performance in a subsequent session. Forty-eight healthy older adults (mean age 68.6 ± 6.1 years; age range 60-80 years) completed two assessment sessions, and on each occasion undertook: dynamometry for isometric and isokinetic quadriceps strength, 6 meter fast walk (6MFWT), timed up and go (TUG), stair climb and descent, and vertical jump. Significant linear relationships were observed between CVs in session 1 and the percentage difference between sessions 1 and 2 for torque at 60, 120, 240 and 360°/s, 6MFWT, TUG, stair climb, and stair descent. The results of this study could be used to establish criteria for determining an acceptably reliable performance in strength and functional tests.
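    The within-individual coefficient of variation from session 1, used above as the repeatability criterion, is straightforward to compute; the timed up and go (TUG) trial times below are invented:

```python
# Within-individual CV sketch: sample SD of repeated trials divided by their
# mean, expressed as a percentage. The three TUG times are invented.

def coefficient_of_variation(trials):
    n = len(trials)
    mean = sum(trials) / n
    sd = (sum((t - mean) ** 2 for t in trials) / (n - 1)) ** 0.5
    return 100 * sd / mean

tug_trials = [8.4, 8.9, 8.6]                # seconds
cv = coefficient_of_variation(tug_trials)
print(round(cv, 2))
```

The study's idea is that a participant with a low session-1 CV like this one can be expected to reproduce their performance in a later session.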

  5. Heterogeneous upper-bound finite element limit analysis of masonry walls out-of-plane loaded

    NASA Astrophysics Data System (ADS)

    Milani, G.; Zuccarello, F. A.; Olivito, R. S.; Tralli, A.

    2007-11-01

    A heterogeneous approach for FE upper bound limit analyses of out-of-plane loaded masonry panels is presented. Under the assumption of associated plasticity for the constituent materials, mortar joints are reduced to interfaces with a Mohr-Coulomb failure criterion with tension cut-off and a cap in compression, whereas for bricks both limited and unlimited strength are taken into account. At each interface, plastic dissipation can occur as a combination of out-of-plane shear, bending and torsion. In order to test the reliability of the proposed model, several dry-joint panels loaded out-of-plane and tested at the University of Calabria (Italy) are discussed. Numerical results are compared with experimental data for three different series of walls at different values of the applied in-plane compressive vertical loads. The comparisons show that reliable predictions of both collapse loads and failure mechanisms can be obtained by means of the numerical procedure employed.

  6. Operating Experience and Reliability Improvements on the 5 kW CW Klystron at Jefferson Lab

    NASA Astrophysics Data System (ADS)

    Nelson, R.; Holben, S.

    1997-05-01

    With substantial operating hours on the RF system, considerable information on reliability of the 5 kW CW klystrons has been obtained. High early failure rates led to examination of the operating conditions and failure modes. Internal ceramic contamination caused premature failure of gun potting material and ultimate tube demise through arcing or ceramic fracture. A planned course of repotting and reconditioning of approximately 300 klystrons, plus careful attention to operating conditions and periodic analysis of operational data, has substantially reduced the failure rate. It is anticipated that implementation of planned supplemental monitoring systems for the klystrons will allow most catastrophic failures to be avoided. By predicting end of life, tubes can be changed out before they fail, thus minimizing unplanned downtime. Initial tests have also been conducted on this same klystron operated at higher voltages with resultant higher output power. The outcome of these tests will provide information to be considered for future upgrades to the accelerator.

  7. Molecular docking and 3D-QSAR studies on inhibitors of DNA damage signaling enzyme human PARP-1.

    PubMed

    Fatima, Sabiha; Bathini, Raju; Sivan, Sree Kanth; Manga, Vijjulatha

    2012-08-01

    Poly (ADP-ribose) polymerase-1 (PARP-1) operates in a DNA damage signaling network. Molecular docking and three dimensional-quantitative structure activity relationship (3D-QSAR) studies were performed on human PARP-1 inhibitors. Docked conformation obtained for each molecule was used as such for 3D-QSAR analysis. Molecules were divided into a training set and a test set randomly in four different ways, partial least square analysis was performed to obtain QSAR models using the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). Derived models showed good statistical reliability that is evident from their r², q²(loo) and r²(pred) values. To obtain a consensus for predictive ability from all the models, average regression coefficient r²(avg) was calculated. CoMFA and CoMSIA models showed a value of 0.930 and 0.936, respectively. Information obtained from the best 3D-QSAR model was applied for optimization of lead molecule and design of novel potential inhibitors.

  8. A Note on Some Characteristics and Correlates of the Meier Art Test of Aesthetic Perception.

    ERIC Educational Resources Information Center

    Stallings, William M.; Anderson, Frances E.

    The reliability and the predictive and concurrent validity of the MATAP were investigated with the implicit goal of improving the prediction of course grades in the College of Fine and Applied Arts. It was found that reliability and validity coefficients were low, and it was suggested that the scoring system was a source of error variance. (MS)

  9. A Compatible Hardware/Software Reliability Prediction Model.

    DTIC Science & Technology

    1981-07-22

    machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting...software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several...capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed

  10. Cost prediction model for various payloads and instruments for the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Hoffman, F. E.

    1984-01-01

    The following tasks concerning cost parameters of the space shuttle were undertaken: (1) to develop a cost prediction model for various payload classes of instruments and experiments for the Space Shuttle Orbiter; and (2) to show the implications of various payload classes on the cost of reliability analysis, quality assurance, environmental design requirements, documentation, parts selection, and other reliability-enhancing activities.

  11. Combining molecular docking and QSAR studies for modeling the anti-tyrosinase activity of aromatic heterocycle thiosemicarbazone analogues

    NASA Astrophysics Data System (ADS)

    Dong, Huanhuan; Liu, Jing; Liu, Xiaoru; Yu, Yanying; Cao, Shuwen

    2018-01-01

    A collection of thirty-six aromatic heterocycle thiosemicarbazone analogues presenting a broad span of anti-tyrosinase activities was designed and obtained. A robust and reliable two-dimensional quantitative structure-activity relationship model, as evidenced by high q² and r² values (0.848 and 0.893, respectively), was built on the analogues to predict the quantitative chemical-biological relationship and to suggest new directions for modification. Inhibitory activities of the compounds were found to depend strongly on molecular shape and orbital energy. Substituents that produced large ovality and high highest-occupied molecular orbital energy values helped to improve the activity of these analogues. The molecular docking results provided visual evidence for the QSAR analysis and the inhibition mechanism. Based on these findings, two novel tyrosinase inhibitors, O04 and O05, with predicted IC50 values of 0.5384 and 0.8752 nM, were designed and suggested for further research.

  12. Production Planning and Simulation for Reverse Supply Chain

    NASA Astrophysics Data System (ADS)

    Murayama, Takeshi; Yoda, Mitsunobu; Eguchi, Toru; Oba, Fuminori

    This paper describes a production planning method for a reverse supply chain, in which a disassembly company takes reusable components from returned used products and supplies the reusable components for a product manufacturer. This method addresses the issue that the timings and quantities of returned products and reusable components obtained from them are unknown. This method first predicts the quantities of returned products and reusable components at each time period by using reliability models. Using the prediction result, the method performs production planning based on Material Requirements Planning (MRP). This method enables us to plan at each time period: the quantity of the products to be disassembled; the quantity of the reusable components to be used; and the quantity of the new components to be produced. The flow of the components and products through a forward and reverse supply chain is simulated to show the effectiveness of the method.
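    The MRP netting step described above can be sketched as follows: predicted reusable-component returns offset gross requirements, and only the shortfall is scheduled as new production. All quantities are invented:

```python
# MRP-style netting sketch for a reverse supply chain: per period, reusable
# components predicted to return (from reliability models) plus carried stock
# cover demand first; the remainder becomes new-component production.

def mrp_plan(gross_requirements, predicted_reusable, on_hand=0):
    plan = []
    stock = on_hand
    for need, reused in zip(gross_requirements, predicted_reusable):
        stock += reused                      # reusable components arriving
        new_production = max(0, need - stock)
        stock = max(0, stock - need)         # leftover stock after demand
        plan.append(new_production)
    return plan

gross = [100, 120, 110, 130]                 # components needed per period
reusable = [40, 55, 70, 60]                  # predicted returns per period

print(mrp_plan(gross, reusable, on_hand=20))  # [40, 65, 40, 70]
```

The better the reliability models predict the return stream, the less new production has to be planned as a safety buffer.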

  13. Rational Design of an Ultrasensitive Quorum-Sensing Switch.

    PubMed

    Zeng, Weiqian; Du, Pei; Lou, Qiuli; Wu, Lili; Zhang, Haoqian M; Lou, Chunbo; Wang, Hongli; Ouyang, Qi

    2017-08-18

    One of the purposes of synthetic biology is to develop rational methods that accelerate the design of genetic circuits, saving time and effort spent on experiments and providing reliably predictable circuit performance. We applied a reverse engineering approach to design an ultrasensitive transcriptional quorum-sensing switch. We want to explore how systems biology can guide synthetic biology in the choice of specific DNA sequences and their regulatory relations to achieve a targeted function. The workflow comprises network enumeration that achieves the target function robustly, experimental restriction of the obtained candidate networks, global parameter optimization via mathematical analysis, selection and engineering of parts based on these calculations, and finally, circuit construction based on the principles of standardization and modularization. The performance of realized quorum-sensing switches was in good qualitative agreement with the computational predictions. This study provides practical principles for the rational design of genetic circuits with targeted functions.

  14. Dimensionless embedding for nonlinear time series analysis

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; Aihara, Kazuyuki

    2017-09-01

    Recently, infinite-dimensional delay coordinates (InDDeCs) have been proposed for predicting high-dimensional dynamics instead of conventional delay coordinates. Although InDDeCs allow faster computation and more accurate short-term prediction, it is still not well known whether InDDeCs can be used in other applications of nonlinear time series analysis in which the underlying dynamics must be reconstructed from a scalar time series generated by a dynamical system. Here, we give theoretical support justifying the use of InDDeCs and provide numerical examples showing that they can be used in various applications: obtaining recurrence plots, correlation dimensions, and maximal Lyapunov exponents, as well as testing directional couplings and extracting slow driving forces. We demonstrate the performance of InDDeCs on weather data. Thus, InDDeCs can eventually realize "dimensionless embedding" while we enjoy faster and more reliable computations.

  15. Burgers approximation for two-dimensional flow past an ellipse

    NASA Technical Reports Server (NTRS)

    Dorrepaal, J. M.

    1982-01-01

    A linearization of the Navier-Stokes equation due to Burgers, in which vorticity is transported by the velocity field corresponding to continuous potential flow, is examined. The governing equations are solved exactly for the two-dimensional steady flow past an ellipse of arbitrary aspect ratio. The requirement of no slip along the surface of the ellipse results in an infinite algebraic system of linear equations for coefficients appearing in the solution. The system is truncated at a point which gives reliable results for Reynolds numbers R in the range 0 ≤ R ≤ 5. Predictions of the Burgers approximation regarding separation, drag and boundary layer behavior are investigated. In particular, Burgers linearization gives drag coefficients which are closer to observed experimental values than those obtained from Oseen's approximation. In the special case of flow past a circular cylinder, Burgers approximation predicts a boundary layer whose thickness is roughly proportional to R^(-1/2).

  16. Sediment sorting along tidal sand waves: A comparison between field observations and theoretical predictions

    NASA Astrophysics Data System (ADS)

    Van Oyen, Tomas; Blondeaux, Paolo; Van den Eynde, Dries

    2013-07-01

    A site-by-site comparison between field observations and theoretical predictions of sediment sorting patterns along tidal sand waves is performed for ten locations in the North Sea. At each site, the observed grain size distribution along the bottom topography and the geometry of the bed forms are described in detail, and the procedure used to obtain the model parameters is summarized. The model appears to describe accurately the wavelength of the observed sand waves for the majority of the locations, while still providing a reliable estimate for the remaining sites. In addition, it is found that for seven out of the ten locations the qualitative sorting process provided by the model agrees with the observed grain size distribution. A discussion of the site-by-site comparison is provided which, taking into account uncertainties in the field data, indicates that the model captures the major part of the key processes controlling the phenomenon.

  17. VEGA Launch Vehicle: VV02 Flight Campaign Thermal Analysis

    NASA Astrophysics Data System (ADS)

    Moroni, D.; Perugini, P.; Mancini, R.; Bonnet, M.

    2014-06-01

    A reliable tool for the prediction of temperature trends vs. time during the operative timeline of a launcher represents one of the key elements for the qualification of the launch vehicle itself. The correct evaluation of the thermal behaviour during the mission, both for the launcher elements (structures, electronic items, tanks, motors...) and for the payloads carried by the same launcher, is one of the preliminary activities to be performed before a flight campaign. For this purpose AVIO constructed a Thermal Mathematical Model (TMM) by means of the ESA software "ESATAN Thermal Modelling Suite (TMS)" [1], used for the prediction of the temperature trends on both VV01 (VEGA LV qualification flight) and VV02 (first VEGA LV commercial flight), with successful results in terms of post-flight comparison with the sensor data outputs. The aim of this paper is to show the correlation obtained by the AVIO VEGA LV SYS TMM in the frame of the VV02 flight.

  18. Prediction of the compression ratio for municipal solid waste using decision tree.

    PubMed

    Heshmati R, Ali Akbar; Mokhtari, Maryam; Shakiba Rad, Saeed

    2014-01-01

    The compression ratio of municipal solid waste (MSW) is an essential parameter for the evaluation of waste settlement and landfill design. However, no appropriate model has been proposed to estimate the waste compression ratio so far. In this study, a decision tree method was utilized to predict the waste compression ratio (C'c). The tree was constructed using Quinlan's M5 algorithm. A reliable database retrieved from the literature was used to develop a practical model that relates C'c to waste composition and properties, including dry density, dry-weight water content, and percentage of biodegradable organic waste. The performance of the developed model was examined in terms of several statistical criteria recommended in the literature, including the correlation coefficient, root mean squared error, mean absolute error, and mean bias error. The obtained results demonstrate that the suggested model is able to evaluate the compression ratio of MSW effectively.
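
    The abstract does not reproduce the M5 model tree itself, but the core operation such a tree repeats — choosing a split that minimizes squared error — can be sketched as follows (the variable names and toy values for organic content and C'c are hypothetical):

```python
import numpy as np

def best_split(x, y):
    """Find the threshold on x minimizing the summed squared error of
    predicting each side by its mean -- the elementary step a regression
    tree such as M5 applies recursively."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_t, best_sse = None, np.inf
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_t, best_sse = (xs[i - 1] + xs[i]) / 2.0, sse
    return best_t, best_sse

# Hypothetical data: compression ratio C'c rising with biodegradable
# organic content (%); values are illustrative only
organic = np.array([5.0, 10.0, 15.0, 40.0, 50.0, 60.0])
cc = np.array([0.10, 0.12, 0.11, 0.30, 0.33, 0.31])
threshold, sse = best_split(organic, cc)
```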

  19. Best Linear Unbiased Prediction (BLUP) for regional yield trials: a comparison to additive main effects and multiplicative interaction (AMMI) analysis.

    PubMed

    Piepho, H P

    1994-11-01

    Multilocation trials are often used to analyse the adaptability of genotypes in different environments and to find for each environment the genotype that is best adapted, i.e. that is highest yielding in that environment. For this purpose, it is of interest to obtain a reliable estimate of the mean yield of a cultivar in a given environment. This article compares two statistical estimation procedures for this task: the Additive Main Effects and Multiplicative Interaction (AMMI) analysis and Best Linear Unbiased Prediction (BLUP). A modification of a cross-validation procedure commonly used with AMMI is suggested for trials laid out as a randomized complete block design. The use of these procedures is exemplified using five faba bean datasets from German registration trials. BLUP was found to outperform AMMI in four of the five faba bean datasets.
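
    BLUP's advantage in this setting comes from shrinkage: the observed genotype-environment cell mean is pulled toward the additive (main-effects) prediction. A minimal numerical sketch, with variance components assumed known rather than estimated as in the paper:

```python
def blup_cell(y_ge, additive, var_ge, var_err, n_reps):
    """Shrink an observed genotype-environment cell mean toward the
    additive prediction.  lam -> 1 recovers the raw cell mean (full
    interaction); lam -> 0 recovers the pure main-effects model."""
    lam = var_ge / (var_ge + var_err / n_reps)
    return additive + lam * (y_ge - additive)

# Illustrative numbers (t/ha): observed cell mean 4.8, additive
# genotype + environment prediction 4.2, four replicates
est = blup_cell(y_ge=4.8, additive=4.2, var_ge=0.04, var_err=0.16, n_reps=4)
```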

  20. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    PubMed

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English-speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
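
    The statistical intuition behind the aggregate baseline approach — averaging noisy baselines raises the correlation with a later session — can be illustrated on synthetic data (a hedged simulation, not the NHL dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
ability = rng.normal(0.0, 1.0, n)                            # stable trait
sessions = ability[:, None] + rng.normal(0.0, 1.0, (n, 4))   # 4 noisy tests

single = sessions[:, 2]                   # most recent single baseline
aggregate = sessions[:, :3].mean(axis=1)  # mean of three prior baselines
target = sessions[:, 3]                   # session to be predicted

r_single = np.corrcoef(single, target)[0, 1]
r_aggregate = np.corrcoef(aggregate, target)[0, 1]
# Averaging baselines cancels measurement noise, so r_aggregate > r_single
```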

  1. Reliability study of biometrics "do not contact" in myopia.

    PubMed

    Migliorini, R; Fratipietro, M; Comberiati, A M; Pattavina, L; Arrico, L

    The aim of the study is a comparison between the refractive condition of the eye actually achieved after surgery and the expected refractive condition calculated via a biometer. The study was conducted in a random group of 38 eyes of patients undergoing surgery by phacoemulsification. The mean absolute error between the values predicted from the optical biometer measurements and those obtained post-operatively was around 0.47. Our study shows results not far from those reported in the literature; the mean absolute error, at 0.47 ± 0.11 SEM, is among the lowest values reported.

  2. Developing Mathematical Provisions for Assessment of Liquid Hydrocarbon Emissions in Emergency Situations

    NASA Astrophysics Data System (ADS)

    Zemenkova, M. Yu; Zemenkov, Yu D.; Shantarin, V. D.

    2016-10-01

    The paper reviews the development of methodology for calculation of hydrocarbon emissions during seepage and evaporation to monitor the reliability and safety of hydrocarbon storage and transportation. The authors have analyzed existing methods, models and techniques for assessing the amount of evaporated oil. Models used for predicting the material balance of multicomponent two-phase systems have been discussed. The results of modeling the open-air hydrocarbon evaporation from an oil spill are provided and exemplified by an emergency pit. Dependences and systems of differential equations have been obtained to assess parameters of mass transfer from the open surface of a liquid multicomponent mixture.

  3. Electronic cooling design and test validation

    NASA Astrophysics Data System (ADS)

    Murtha, W. B.

    1983-07-01

    An analytical computer model has been used to design a counterflow air-cooled heat exchanger according to the cooling, structural and geometric requirements of a U.S. Navy shipboard electronics cabinet, emphasizing high-reliability performance through the maintenance of electronic component junction temperatures below 110°C. Environmental testing of the resulting design has verified that the analytical predictions were conservative. Model correlation to the test data furnishes an upgraded capability for the evaluation of tactical effects, and has established a two-orders-of-magnitude growth potential for increased electronics capabilities through enhanced heat dissipation. Electronics cabinets of this type are destined for use with Vertical Launching System-type combatant vessel magazines.

  4. Dynamic fatigue testing of Zerodur glass-ceramic

    NASA Technical Reports Server (NTRS)

    Tucker, Dennis S.

    1988-01-01

    The inherent brittleness of glass invariably leads to a large variability in strength data and a time dependence in strength. Loading rate plays a large role in strength values: glass is found to be weaker when supporting loads over long periods of time as compared to glass which undergoes rapid loading. These properties complicate the structural design allowables for the utilization of glass components in an application such as the Advanced X-ray Astrophysics Facility (AXAF). The test methodology to obtain parameters which can be used to predict the reliability and lifetime of the Zerodur glass-ceramic to be used for the AXAF mirrors is described.

  5. Complexities’ day-to-day dynamic evolution analysis and prediction for a Didi taxi trip network based on complex network theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lin; Lu, Jian; Zhou, Jialin; Zhu, Jinqing; Li, Yunxuan; Wan, Qian

    2018-03-01

    Didi Dache is the most popular taxi order mobile app in China, providing online taxi-hailing service. The big database obtained from this app can be used to analyze the day-to-day dynamic evolution of the complexities of the Didi taxi trip network (DTTN) at the level of complex network dynamics. First, this paper proposes data cleaning and modeling methods for expressing Nanjing's DTTN as a complex network. Second, three consecutive weeks' data are cleaned to establish 21 DTTNs based on the proposed big data processing technology. Then, multiple topology measures that characterize the day-to-day dynamic evolution of the complexities of these networks are provided. Third, these measures of the 21 DTTNs are calculated and subsequently interpreted with their actual implications. They are used as a training set for a BP neural network designed to predict the evolution of DTTN complexities. Finally, the reliability of the designed BP neural network is verified by comparing its output with the actual data and with results obtained from an ARIMA method. Because network complexities are the basis for modeling cascading failures and conducting link prediction in complex systems, the proposed research framework not only provides a novel perspective for analyzing the DTTN at the level of aggregated system behavior, but can also be used to improve the DTTN management level.

  6. Pore network extraction from pore space images of various porous media systems

    NASA Astrophysics Data System (ADS)

    Yi, Zhixing; Lin, Mian; Jiang, Wenbin; Zhang, Zhaobin; Li, Haishan; Gao, Jian

    2017-04-01

    Pore network extraction, which is defined as the transformation from irregular pore space to a simplified network in the form of pores connected by throats, is significant to microstructure analysis and network modeling. A physically realistic pore network is not only a representation of the pore space in the sense of topology and morphology, but also a good tool for predicting transport properties accurately. We present a method to extract pore network by employing the centrally located medial axis to guide the construction of maximal-balls-like skeleton where the pores and throats are defined and parameterized. To validate our method, various rock samples including sand pack, sandstones, and carbonates were used to extract pore networks. The pore structures were compared quantitatively with the structures extracted by medial axis method or maximal ball method. The predicted absolute permeability and formation factor were verified against the theoretical solutions obtained by lattice Boltzmann method and finite volume method, respectively. The two-phase flow was simulated through the networks extracted from homogeneous sandstones, and the generated relative permeability curves were compared with the data obtained from experimental method and other numerical models. The results show that the accuracy of our network is higher than that of other networks for predicting transport properties, so the presented method is more reliable for extracting physically realistic pore network.

  7. A path-based measurement for human miRNA functional similarities using miRNA-disease associations

    NASA Astrophysics Data System (ADS)

    Ding, Pingjian; Luo, Jiawei; Xiao, Qiu; Chen, Xiangtao

    2016-09-01

    Compared with sequence and expression similarity, miRNA functional similarity is important for biological research and many applications such as miRNA clustering, miRNA function prediction, miRNA synergism identification and disease miRNA prioritization. However, existing methods typically utilize predicted miRNA targets, which suffer from high false-positive and false-negative rates, to calculate miRNA functional similarity. Meanwhile, it is difficult to achieve high reliability of miRNA functional similarity from miRNA-disease associations alone. Therefore, improved measurement of miRNA functional similarity is increasingly needed. In this study, we develop a novel path-based calculation method of miRNA functional similarity based on miRNA-disease associations, called MFSP. Compared with other methods, our method obtains higher average functional similarity for intra-family and intra-cluster selected groups, and lower average functional similarity for inter-family and inter-cluster miRNA pairs. In addition, smaller p-values are achieved when applying the Wilcoxon rank-sum test and the Kruskal-Wallis test to different miRNA groups. The relationship between miRNA functional similarity and other information sources is exhibited. Furthermore, the miRNA functional network constructed on the basis of MFSP is a scale-free and small-world network. Moreover, the higher AUC for miRNA-disease prediction indicates the ability of MFSP to uncover miRNA functional similarity.

  8. A new theory for X-ray diffraction

    PubMed Central

    Fewster, Paul F.

    2014-01-01

    This article proposes a new theory of X-ray scattering that has particular relevance to powder diffraction. The underlying concept of this theory is that the scattering from a crystal or crystallite is distributed throughout space: this leads to the effect that enhanced scatter can be observed at the ‘Bragg position’ even if the ‘Bragg condition’ is not satisfied. The scatter from a single crystal or crystallite, in any fixed orientation, has the fascinating property of contributing simultaneously to many ‘Bragg positions’. It also explains why diffraction peaks are obtained from samples with very few crystallites, which cannot be explained with the conventional theory. The intensity ratios for an Si powder sample are predicted with greater accuracy and the temperature factors are more realistic. Another consequence is that this new theory predicts a reliability in the intensity measurements which agrees much more closely with experimental observations compared to conventional theory that is based on ‘Bragg-type’ scatter. The role of dynamical effects (extinction etc.) is discussed and how they are suppressed with diffuse scattering. An alternative explanation for the Lorentz factor is presented that is more general and based on the capture volume in diffraction space. This theory, when applied to the scattering from powders, will evaluate the full scattering profile, including peak widths and the ‘background’. The theory should provide an increased understanding of the reliability of powder diffraction measurements, and may also have wider implications for the analysis of powder diffraction data, by increasing the accuracy of intensities predicted from structural models. PMID:24815975

  9. Vs30 and spectral response from collocated shallow, active- and passive-source Vs data at 27 sites in Puerto Rico

    USGS Publications Warehouse

    Odum, Jack K.; Stephenson, William J.; Williams, Robert A.; von Hillebrandt-Andrade, Christa

    2013-01-01

    Shear‐wave velocity (VS) and time‐averaged shear‐wave velocity to 30 m depth (VS30) are the key parameters used in seismic site response modeling and earthquake engineering design. Where VS data are limited, available data are often used to develop and refine map‐based proxy models of VS30 for predicting ground‐motion intensities. In this paper, we present shallow VS data from 27 sites in Puerto Rico. These data were acquired using a multimethod acquisition approach consisting of noninvasive, collocated, active‐source body‐wave (refraction/reflection), active‐source surface wave at nine sites, and passive‐source surface‐wave refraction microtremor (ReMi) techniques. VS‐versus‐depth models are constructed and used to calculate spectral response plots for each site. Factors affecting method reliability are analyzed with respect to site‐specific differences in bedrock VS and spectral response. At many but not all sites, body‐ and surface‐wave methods generally determine similar depths to bedrock, and it is the difference in bedrock VS that influences site amplification. The predicted resonant frequencies for the majority of the sites are observed to be within a relatively narrow bandwidth of 1–3.5 Hz. For a first‐order comparison of peak frequency position, predictive spectral response plots from eight sites are plotted along with seismograph instrument spectra derived from the time series of the 16 May 2010 Puerto Rico earthquake. We show how a multimethod acquisition approach using collocated arrays complements and corroborates VS results, thus adding confidence that reliable site characterization information has been obtained.
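
    The VS30 referred to throughout is the standard time-averaged velocity over the top 30 m, a harmonic-mean-like quantity. A short sketch with a hypothetical layered profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m:
    VS30 = 30 / sum(h_i / v_i), for layer thicknesses h_i (m, summing
    to 30) and shear-wave velocities v_i (m/s)."""
    travel_time = sum(h / v for h, v in layers)
    return 30.0 / travel_time

# Hypothetical three-layer profile: (thickness m, Vs m/s)
profile = [(10.0, 200.0), (10.0, 400.0), (10.0, 800.0)]
site_vs30 = vs30(profile)
# Slow shallow layers dominate: the result sits well below the
# arithmetic mean of the layer velocities (466.7 m/s)
```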

  10. Big Data Analytics for Modelling and Forecasting of Geomagnetic Field Indices

    NASA Astrophysics Data System (ADS)

    Wei, H. L.

    2016-12-01

    A massive amount of data is produced and stored in the research areas of space weather and space climate. However, the value of the vast majority of the data acquired every day may not be effectively or efficiently exploited when we try to forecast solar wind parameters and geomagnetic field indices from these recorded measurements, largely due to the challenges of dealing with big data, which are characterized by the 4V features: volume (a massively large amount of data), variety (a great number of different types of data), velocity (a requirement of quick processing of the data), and veracity (the trustworthiness and usability of the data). To obtain more reliable and accurate predictive models for geomagnetic field indices, models should be developed from a big data analytics perspective, or at least benefit from such a perspective. This study proposes a few data-based modelling frameworks which aim to produce more efficient predictive models for space weather parameter forecasting by means of system identification and big data analytics. More specifically, it aims to build more reliable mathematical models that characterize the relationship between solar wind parameters and geomagnetic field indices, for example the dependence of the Dst and Kp indices on a few solar wind and magnetic field parameters, namely solar wind velocity (V), southward interplanetary magnetic field (Bs), solar wind rectified electric field (VBs), and dynamic flow pressure (P). Examples are provided to illustrate how the proposed modelling approaches are applied to Dst and Kp index prediction.

  11. Accurate and reliable prediction of relative ligand binding potency in prospective drug discovery by way of a modern free-energy calculation protocol and force field.

    PubMed

    Wang, Lingle; Wu, Yujie; Deng, Yuqing; Kim, Byungchan; Pierce, Levi; Krilov, Goran; Lupyan, Dmitry; Robinson, Shaughnessy; Dahlgren, Markus K; Greenwood, Jeremy; Romero, Donna L; Masse, Craig; Knight, Jennifer L; Steinbrecher, Thomas; Beuming, Thijs; Damm, Wolfgang; Harder, Ed; Sherman, Woody; Brewer, Mark; Wester, Ron; Murcko, Mark; Frye, Leah; Farid, Ramy; Lin, Teng; Mobley, David L; Jorgensen, William L; Berne, Bruce J; Friesner, Richard A; Abel, Robert

    2015-02-25

    Designing tight-binding ligands is a primary objective of small-molecule drug discovery. Over the past few decades, free-energy calculations have benefited from improved force fields and sampling algorithms, as well as the advent of low-cost parallel computing. However, it has proven to be challenging to reliably achieve the level of accuracy that would be needed to guide lead optimization (∼5× in binding affinity) for a wide range of ligands and protein targets. Not surprisingly, widespread commercial application of free-energy simulations has been limited due to the lack of large-scale validation coupled with the technical challenges traditionally associated with running these types of calculations. Here, we report an approach that achieves an unprecedented level of accuracy across a broad range of target classes and ligands, with retrospective results encompassing 200 ligands and a wide variety of chemical perturbations, many of which involve significant changes in ligand chemical structures. In addition, we have applied the method in prospective drug discovery projects and found a significant improvement in the quality of the compounds synthesized that have been predicted to be potent. Compounds predicted to be potent by this approach have a substantial reduction in false positives relative to compounds synthesized on the basis of other computational or medicinal chemistry approaches. Furthermore, the results are consistent with those obtained from our retrospective studies, demonstrating the robustness and broad range of applicability of this approach, which can be used to drive decisions in lead optimization.

  12. Rapid Detection of Small Movements with GNSS Doppler Observables

    NASA Astrophysics Data System (ADS)

    Hohensinn, Roland; Geiger, Alain

    2017-04-01

    High-alpine terrain reacts very sensitively to varying environmental conditions. As an example, increasing temperatures cause thawing of permafrost areas. This, in turn, causes an increasing threat from natural hazards like debris flows (e.g. rock glaciers) or rockfalls. The Institute of Geodesy and Photogrammetry is contributing to alpine mass-movement monitoring systems in different project areas in the Swiss Alps. One main focus lies on providing geodetic mass-movement information derived from GNSS static solutions on a daily and a sub-daily basis, obtained with low-cost and autonomous GNSS stations. Another focus is on rapidly providing reliable geodetic information in real time, e.g. for integration in early warning systems. One way to achieve this is the estimation of accurate station velocities from observations of range rates, which can be obtained as Doppler observables from time derivatives of carrier phase measurements. The key for this method lies in precise modeling of the prominent effects contributing to the observed range rates: satellite velocity, atmospheric delay rates and relativistic effects. A suitable observation model is then devised which accounts for these predictions. The observation model, combined with a simple kinematic movement model, forms the basis for the parameter estimation. Based on the estimated station velocities, movements are then detected using a statistical test. To improve the reliability of the estimated parameters, an on-line quality control procedure is also employed. We will present the basic algorithms as well as results from first tests carried out with a low-cost GPS L1 phase receiver. With a u-blox module and a sampling rate of 5 Hz, accuracies on the mm/s level can be obtained and velocities down to 1 cm/s can be detected. Reliable and accurate station velocities and movement information can be provided within seconds.

  13. A new lifetime estimation model for a quicker LED reliability prediction

    NASA Astrophysics Data System (ADS)

    Hamon, B. H.; Mendizabal, L.; Feuillet, G.; Gasse, A.; Bataillou, B.

    2014-09-01

    LED reliability and lifetime prediction is a key point for solid-state lighting adoption. For this purpose, one hundred and fifty LEDs were aged for a reliability analysis. The LEDs were grouped into nine current-temperature stress conditions, with stress driving currents between 350 mA and 1 A and ambient temperatures between 85°C and 120°C. Using integrating-sphere and I(V) measurements, a cross study of the evolution of electrical and optical characteristics has been performed. Results show two main failure mechanisms regarding lumen maintenance: the first is the typically observed lumen depreciation, and the second is a much quicker depreciation related to an increase of the leakage and non-radiative currents. Models of the typical lumen depreciation and of the leakage resistance depreciation have been built from the electrical and optical measurements taken during the aging tests. The combination of these models enables a new method for quicker LED lifetime prediction, and the two models have been used for LED lifetime predictions.
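
    The abstract does not give the functional form of the lumen-depreciation model; a common assumption (made here purely for illustration, in the spirit of TM-21-style projection, not the authors' fitted model) is exponential decay, from which an L70 lifetime follows:

```python
import numpy as np

# Synthetic lumen-maintenance data following L(t) = B * exp(-alpha * t)
true_alpha, true_B = 2.0e-5, 1.0
t = np.linspace(0.0, 6000.0, 25)              # aging time, hours
lumen = true_B * np.exp(-true_alpha * t)      # relative luminous flux

# Log-linear least squares recovers the decay parameters
slope, intercept = np.polyfit(t, np.log(lumen), 1)
alpha, B = -slope, np.exp(intercept)

# Projected L70 lifetime: time until output falls to 70 % of initial
l70 = np.log(B / 0.7) / alpha
```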

  14. A New Method for the Evaluation and Prediction of Base Stealing Performance.

    PubMed

    Bricker, Joshua C; Bailey, Christopher A; Driggers, Austin R; McInnis, Timothy C; Alami, Arya

    2016-11-01

    Bricker, JC, Bailey, CA, Driggers, AR, McInnis, TC, and Alami, A. A new method for the evaluation and prediction of base stealing performance. J Strength Cond Res 30(11): 3044-3050, 2016 - The purposes of this study were to evaluate a new timing gate method for monitoring base stealing performance in terms of reliability, differences between it and traditional stopwatch-collected times, and its ability to predict base stealing performance. Twenty-five healthy collegiate baseball players performed maximal-effort base stealing trials with a right- and a left-handed pitcher. An infrared electronic timing system was used to calculate reaction time (RT) and total time (TT), whereas coaches' times (CT) were recorded with digital stopwatches. Reliability of the timing gate method (TGM) was evaluated with intraclass correlation coefficients (ICCs) and the coefficient of variation (CV). Differences between the TGM and traditional CT were assessed with paired-samples t tests and Cohen's d effect size estimates. Base stealing performance predictability of the TGM was evaluated with Pearson's bivariate correlations. Acceptable relative reliability was observed (ICCs 0.74-0.84). Absolute reliability measures were acceptable for TT (CVs = 4.4-4.8%), but elevated for RT (CVs = 32.3-35.5%). Statistical and practical differences were found between TT and CT (right p = 0.00, d = 1.28 and left p = 0.00, d = 1.49). The TGM TT appears to be a decent predictor of base stealing performance (r = -0.49 to -0.61). The authors recommend the TGM used in this investigation for athlete monitoring because it was found to be reliable, appears to be more precise than traditional CT measured with a stopwatch, provides an additional variable of value (RT), and may predict future performance.
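
    The absolute-reliability measure quoted (CV) is simply the trial-to-trial standard deviation expressed as a percentage of the mean; the trial values below are hypothetical:

```python
import numpy as np

def coefficient_of_variation(trials):
    """Within-athlete CV (%): trial-to-trial SD relative to the mean."""
    trials = np.asarray(trials, dtype=float)
    return 100.0 * trials.std(ddof=1) / trials.mean()

# Hypothetical repeated total times (s) for one athlete
cv_tt = coefficient_of_variation([3.50, 3.62, 3.55])
# A CV under ~5 % would fall in the range the study reports for TT
```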

  15. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    NASA Astrophysics Data System (ADS)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single-tree detection from airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) to per-tree aboveground biomass. Since ALS cannot measure DBH directly, DBH must be predicted from other ALS-measured tree-level structural parameters. A copula-based method is proposed in this study to predict DBH from the ALS-measured tree height and crown diameter, using a dataset measured in the Lassen National Forest in California. Instead of seeking an explicit mathematical equation for the underlying relationship between DBH and the other structural parameters, the copula-based prediction method utilizes the dependency between the cumulative distributions of these variables, and solves for DBH under the assumption that, for a single tree, the cumulative probability of each structural parameter is identical. Results show that, compared with benchmark least-squares linear regression and k-MSN imputation, the copula-based method obtains better DBH accuracy for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and that it contributes to the reduction of prediction uncertainty.
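
    The paper's stated assumption — that a single tree has the same cumulative probability for each structural parameter — reduces, in the marginals-only case, to quantile mapping: y_hat = F_y^{-1}(F_x(x)). The sketch below uses empirical CDFs and hypothetical height/DBH pairs; it omits the bivariate copula machinery itself:

```python
import numpy as np

def quantile_map(train_x, train_y, new_x):
    """Predict y for new_x by matching cumulative probabilities,
    F_y(y_hat) = F_x(new_x), using empirical CDFs -- a marginals-only
    simplification of the copula idea."""
    sx, sy = np.sort(train_x), np.sort(train_y)
    p = np.searchsorted(sx, new_x, side="right") / len(sx)
    return np.quantile(sy, np.clip(p, 0.0, 1.0))

# Hypothetical training pairs: DBH (cm) grows monotonically with height (m)
heights = np.array([8.0, 12.0, 16.0, 20.0, 24.0, 28.0])
dbhs = np.array([10.0, 15.0, 22.0, 30.0, 39.0, 50.0])

pred = quantile_map(heights, dbhs, np.array([18.0, 26.0]))
```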

  16. Epileptic Seizure Prediction Using Big Data and Deep Learning: Toward a Mobile System.

    PubMed

    Kiral-Kornek, Isabell; Roy, Subhrajit; Nurse, Ewan; Mashford, Benjamin; Karoly, Philippa; Carroll, Thomas; Payne, Daniel; Saha, Susmita; Baldassano, Steven; O'Brien, Terence; Grayden, David; Cook, Mark; Freestone, Dean; Harrer, Stefan

    2018-01-01

    Seizure prediction can increase independence and allow preventative treatment for patients with epilepsy. We present a proof-of-concept for a seizure prediction system that is accurate, fully automated, patient-specific, and tunable to an individual's needs. Intracranial electroencephalography (iEEG) data of ten patients obtained from a seizure advisory system were analyzed as part of a pseudoprospective seizure prediction study. First, a deep learning classifier was trained to distinguish between preictal and interictal signals. Second, classifier performance was tested on held-out iEEG data from all patients and benchmarked against the performance of a random predictor. Third, the prediction system was tuned so sensitivity or time in warning could be prioritized by the patient. Finally, a demonstration of the feasibility of deployment of the prediction system onto an ultra-low power neuromorphic chip for autonomous operation on a wearable device is provided. The prediction system achieved mean sensitivity of 69% and mean time in warning of 27%, significantly surpassing an equivalent random predictor for all patients by 42%. This study demonstrates that deep learning in combination with neuromorphic hardware can provide the basis for a wearable, real-time, always-on, patient-specific seizure warning system with low power consumption and reliable long-term performance. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Stability of individual loudness functions obtained by magnitude estimation and production

    NASA Technical Reports Server (NTRS)

    Hellman, R. P.

    1981-01-01

    A correlational analysis of individual magnitude estimation and production exponents at the same frequency is performed, as is an analysis of individual exponents produced in different sessions by the same procedure across frequency (250, 1000, and 3000 Hz). Taken as a whole, the results show that individual exponent differences do not decrease by counterbalancing magnitude estimation with magnitude production and that individual exponent differences remain stable over time despite changes in stimulus frequency. Further results show that although individual magnitude estimation and production exponents do not necessarily obey the .6 power law, it is possible to predict the slope of an equal-sensation function averaged for a group of listeners from individual magnitude estimation and production data. On the assumption that individual listeners with sensorineural hearing also produce stable and reliable magnitude functions, it is also shown that the slope of the loudness-recruitment function measured by magnitude estimation and production can be predicted for individuals with bilateral losses of long duration. Results obtained in normal and pathological ears thus suggest that individual listeners can produce loudness judgements that reveal, although indirectly, the input-output characteristic of the auditory system.

  18. Global mapping of highly pathogenic avian influenza H5N1 and H5Nx clade 2.3.4.4 viruses with spatial cross-validation

    PubMed Central

    Dhingra, Madhur S; Artois, Jean; Robinson, Timothy P; Linard, Catherine; Chaiban, Celia; Xenarios, Ioannis; Engler, Robin; Liechti, Robin; Kuznetsov, Dmitri; Xiao, Xiangming; Dobschuetz, Sophie Von; Claes, Filip; Newman, Scott H; Dauphin, Gwenaëlle; Gilbert, Marius

    2016-01-01

    Global disease suitability models are essential tools to inform surveillance systems and enable early detection. We present the first global suitability model of highly pathogenic avian influenza (HPAI) H5N1 and demonstrate that reliable predictions can be obtained at global scale. The best predictions are obtained using spatial predictor variables describing host distributions, rather than land-use or eco-climatic variables, with a strong association with domestic duck and extensively raised chicken densities. Our results also support a more systematic use of spatial cross-validation in large-scale disease suitability modelling, compared to standard random cross-validation, which can lead to unreliable measures of extrapolation accuracy. A global suitability model of the H5 clade 2.3.4.4 viruses, a group of viruses that recently spread extensively in Asia and the US, shows in comparison a lower spatial extrapolation capacity than the HPAI H5N1 models, with a stronger association with intensively raised chicken densities and anthropogenic factors. DOI: http://dx.doi.org/10.7554/eLife.19571.001 PMID:27885988
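    The contrast between spatial and random cross-validation can be sketched as below; this is an illustrative fold construction, not the authors' modelling pipeline, and the grid-cell blocking and point coordinates are assumptions. Spatial folds keep the test set geographically disjoint from training data, which gives a more honest measure of extrapolation accuracy.

```python
import random

def spatial_folds(points, cell_size):
    """Assign point indices to folds by spatial grid cell, so each
    test fold is geographically disjoint from the training data."""
    folds = {}
    for i, (x, y) in enumerate(points):
        key = (int(x // cell_size), int(y // cell_size))
        folds.setdefault(key, []).append(i)
    return list(folds.values())

def random_folds(n, k, seed=0):
    """Standard random k-fold assignment, for comparison."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

rng = random.Random(42)
points = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(40)]

sf = spatial_folds(points, cell_size=50)  # 2x2 blocks -> up to 4 folds
rf = random_folds(len(points), k=4)
```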

  19. Theoretical relationship between elastic wave velocity and electrical resistivity

    NASA Astrophysics Data System (ADS)

    Lee, Jong-Sub; Yoon, Hyung-Koo

    2015-05-01

    Elastic wave velocity and electrical resistivity have been commonly applied to estimate stratum structures and obtain subsurface soil design parameters. Both elastic wave velocity and electrical resistivity are related to the void ratio; the objective of this study is therefore to suggest a theoretical relationship between the two physical parameters. Gassmann theory and Archie's equation are applied to propose a new theoretical equation, which relates the compressional wave velocity to shear wave velocity and electrical resistivity. The piezo disk element (PDE) and bender element (BE) are used to measure the compressional and shear wave velocities, respectively. In addition, the electrical resistivity is obtained by using the electrical resistivity probe (ERP). The elastic wave velocity and electrical resistivity are recorded in several types of soils including sand, silty sand, silty clay, silt, and clay-sand mixture. The appropriate input parameters are determined based on the error norm in order to increase the reliability of the proposed relationship. The predicted compressional wave velocities from the shear wave velocity and electrical resistivity are similar to the measured compressional velocities. This study demonstrates that the new theoretical relationship may be effectively used to predict the unknown geophysical property from the measured values.
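    As a rough sketch of how resistivity and velocity both connect through pore volume, the fragment below inverts Archie's law for porosity and feeds it into Wyllie's time-average equation as a simple stand-in for the paper's Gassmann-based relation; all parameter values (cementation exponent m, tortuosity factor a, fluid and matrix velocities) are illustrative assumptions, not the study's calibrated inputs.

```python
def porosity_from_resistivity(R, Rw, a=1.0, m=2.0):
    """Invert Archie's law R = a * Rw * phi**(-m) for porosity phi."""
    return (a * Rw / R) ** (1.0 / m)

def vp_from_porosity(phi, v_fluid=1500.0, v_matrix=5500.0):
    """Wyllie time-average: 1/Vp = phi/Vf + (1 - phi)/Vm (velocities in m/s)."""
    return 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)

R, Rw = 10.0, 0.5          # formation and pore-water resistivity (ohm-m)
phi = porosity_from_resistivity(R, Rw)
vp = vp_from_porosity(phi)
```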

  20. Optimization of microwave-assisted extraction of total extract, stevioside and rebaudioside-A from Stevia rebaudiana (Bertoni) leaves, using response surface methodology (RSM) and artificial neural network (ANN) modelling.

    PubMed

    Ameer, Kashif; Bae, Seong-Woo; Jo, Yunhee; Lee, Hyun-Gyu; Ameer, Asif; Kwon, Joong-Ho

    2017-08-15

    Stevia rebaudiana (Bertoni) consists of stevioside and rebaudioside-A (Reb-A). We compared response surface methodology (RSM) and artificial neural network (ANN) modelling for their estimation and predictive capabilities in building effective models with maximum responses. A 5-level 3-factor central composite design was used to optimize microwave-assisted extraction (MAE) to obtain maximum yield of target responses as a function of extraction time (X1: 1-5 min), ethanol concentration (X2: 0-100%) and microwave power (X3: 40-200 W). Maximum values of the three output parameters: 7.67% total extract yield, 19.58 mg/g stevioside yield, and 15.3 mg/g Reb-A yield, were obtained under optimum extraction conditions of 4 min X1, 75% X2, and 160 W X3. The ANN model demonstrated higher efficiency than did the RSM model. Hence, RSM can demonstrate interaction effects of inherent MAE parameters on target responses, whereas ANN can reliably model the MAE process with better predictive and estimation capabilities. Copyright © 2017. Published by Elsevier Ltd.
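    The 5-level 3-factor central composite design mentioned above can be generated in coded units as a quick sketch (a generic rotatable CCD with one centre point, not necessarily the authors' exact run layout):

```python
from itertools import product

def central_composite_design(k, alpha=None):
    """Coded points of a rotatable CCD: 2**k factorial corners,
    2k axial points at +/- alpha, and one centre point (5 levels)."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25          # rotatability criterion
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    return corners + axial + [[0.0] * k]

design = central_composite_design(3)
print(len(design))   # 8 corners + 6 axial + 1 centre = 15 runs
```

    Each factor then takes five coded levels (-alpha, -1, 0, +1, +alpha), which are mapped linearly onto the real ranges such as 1-5 min or 40-200 W.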

  1. Recent developments in skin mimic systems to predict transdermal permeation.

    PubMed

    Waters, Laura J

    2015-01-01

    In recent years there has been a drive to create experimental techniques that can facilitate the accurate and precise prediction of transdermal permeation without the use of in vivo studies. This review considers why permeation data is essential, provides a brief summary as to how skin acts as a natural barrier to permeation and discusses why in vivo studies are undesirable. This is followed by an in-depth discussion on the extensive range of alternative methods that have been developed in recent years. All of the major 'skin mimic systems' are considered including: in vitro models using synthetic membranes, mathematical models including quantitative structure-permeability relationships (QSPRs), human skin equivalents and chromatographic based methods. All of these model-based systems are ideally trying to achieve the same end-point, namely a reliable in vitro-in vivo correlation, i.e. matching non-in vivo obtained data with that from human clinical trials. It is only by achieving this aim that any new method of obtaining permeation data can be acknowledged as a potential replacement for animal studies for the determination of transdermal permeation. In this review, the relevance and potential applicability of the various model systems will also be discussed.

  2. Reliability Growth and Its Applications to Dormant Reliability

    DTIC Science & Technology

    1981-12-01

    ability to make projections about future reliability (Ref 9:41-42). Barlow and Scheuer Model. Richard E. Barlow and Ernest M. Scheuer, of the University...Reliability Growth Prediction Models," Operations Research, 18(1):52-65 (January/February 1970). 7. Bauer, John, William Hadley, and Robert Dietz... Texarkana , Texas, May 1973. (AD 768 119). 10. Bonis, Austin J. "Reliability Growth Curves for One Shot Devices," Proceedings 1977 Annual Reliability and

  3. An evaluation and comparison of intraventricular, intraparenchymal, and fluid-coupled techniques for intracranial pressure monitoring in patients with severe traumatic brain injury.

    PubMed

    Vender, John; Waller, Jennifer; Dhandapani, Krishnan; McDonnell, Dennis

    2011-08-01

    Intracranial pressure measurements have become one of the mainstays of traumatic brain injury management. Various technologies exist to monitor intracranial pressure from a variety of locations. Transducers are usually placed to assess pressure in the brain parenchyma and the intra-ventricular fluid, which are the two most widely accepted compartmental monitoring sites. The individual reliability and inter-reliability of these devices with and without cerebrospinal fluid diversion is not clear. The predictive capability of monitors in both of these sites to local, regional, and global changes also needs further clarification. The technique of monitoring intraventricular pressure with a fluid-coupled transducer system is also reviewed. There has been little investigation into the relationship among pressure measurements obtained from these two sources using these three techniques. Eleven consecutive patients with severe, closed traumatic brain injury not requiring intracranial mass lesion evacuation were admitted into this prospective study. Each patient underwent placement of a parenchymal and intraventricular pressure monitor. The ventricular catheter tubing was also connected to a sensor for fluid-coupled measurement. Pressure from all three sources was measured hourly with and without ventricular drainage. Statistically significant correlation within each monitoring site was seen. No monitoring location was more predictive of global pressure changes or more responsive to pressure changes related to patient stimulation. However, the intraventricular pressure measurements were not reliable in the presence of cerebrospinal fluid drainage whereas the parenchymal measurements remained unaffected. Intraparenchymal pressure monitoring provides equivalent, statistically similar pressure measurements when compared to intraventricular monitors in all care and clinical settings. This is particularly valuable when uninterrupted cerebrospinal fluid drainage is desirable.

  4. Diagnostic accuracy assessment of cytopathological examination of feline sporotrichosis.

    PubMed

    Jessica, N; Sonia, R L; Rodrigo, C; Isabella, D F; Tânia, M P; Jeferson, C; Anna, B F; Sandro, A

    2015-11-01

    Sporotrichosis is an implantation mycosis caused by pathogenic species of the Sporothrix schenckii complex that affects humans and animals, especially cats. Its main forms of zoonotic transmission include scratching, biting and/or contact with the exudate from lesions of sick cats. In Brazil, an epidemic involving humans, dogs and cats has occurred since 1998. The definitive diagnosis of sporotrichosis is obtained by isolation of the fungus in culture; however, the result can take up to four weeks, which may delay the beginning of antifungal treatment in some cases. Cytopathological examination is often used in feline sporotrichosis diagnosis, but its accuracy parameters have not yet been established. The aim of this study was to evaluate the accuracy and reliability of cytopathological examination in the diagnosis of feline sporotrichosis. The present study included 244 cats from the metropolitan region of Rio de Janeiro, mostly males of reproductive age with three or more lesions in non-adjacent anatomical sites. To evaluate inter-observer reliability, two different observers performed the microscopic examination of the slides blindly. Test sensitivity was 84.9%. The positive predictive value, negative predictive value, positive likelihood ratio, negative likelihood ratio and accuracy were 86.0%, 24.4%, 2.02, 0.26 and 82.8%, respectively. The reliability between the two observers was considered substantial. We conclude that cytopathological examination is a sensitive, rapid and practical method for feline sporotrichosis diagnosis in outbreaks of this mycosis. © The Author 2015. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
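    The statistics reported above all follow from a standard 2x2 contingency table; a minimal sketch, using hypothetical counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy statistics for a diagnostic test from a
    2x2 table of true/false positives and negatives."""
    sens = tp / (tp + fn)                 # sensitivity (recall)
    spec = tn / (tn + fp)                 # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),            # positive predictive value
        "npv": tn / (tn + fn),            # negative predictive value
        "lr_pos": sens / (1 - spec),      # positive likelihood ratio
        "lr_neg": (1 - sens) / spec,      # negative likelihood ratio
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# hypothetical counts, for illustration only
m = diagnostic_metrics(tp=90, fp=15, fn=10, tn=35)
```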

  5. Reliability Analysis of Uniaxially Ground Brittle Materials

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

    1995-01-01

    The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
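    The two-parameter Weibull model and its size effect, which underpin the reliability analyses above, can be sketched as follows (illustrative stress and modulus values, not the paper's fitted parameters):

```python
import math

def weibull_failure_prob(sigma, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull failure probability with volume scaling:
    Pf = 1 - exp(-(V/V0) * (sigma/sigma0)**m)."""
    return 1.0 - math.exp(-volume_ratio * (sigma / sigma0) ** m)

def scaled_characteristic_strength(sigma0, m, volume_ratio):
    """Weibull size effect: characteristic strength falls as
    (V0/V)**(1/m) when the stressed volume grows."""
    return sigma0 * volume_ratio ** (-1.0 / m)

# illustrative values: sigma0 = 400 MPa, Weibull modulus m = 10
p_small = weibull_failure_prob(300.0, 400.0, 10)
p_large = weibull_failure_prob(300.0, 400.0, 10, volume_ratio=10.0)
```

    The size effect is why strength distributions measured on small flexure bars had to be rescaled before predicting the larger biaxial plate specimens.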

  6. Measurement of fatigue: Comparison of the reliability and validity of single-item and short measures to a comprehensive measure.

    PubMed

    Kim, Hee-Ju; Abraham, Ivo

    2017-01-01

    Evidence is needed on the clinicometric properties of single-item or short measures as alternatives to comprehensive measures. We examined whether two single-item fatigue measures (i.e., Likert scale, numeric rating scale) or a short fatigue measure were comparable to a comprehensive measure in reliability (i.e., internal consistency and test-retest reliability) and validity (i.e., convergent, concurrent, and predictive validity) in Korean young adults. For this quantitative study, we selected the Functional Assessment of Chronic Illness Therapy-Fatigue for the comprehensive measure and the Profile of Mood States-Brief, Fatigue subscale for the short measure; and constructed two single-item measures. A total of 368 students from four nursing colleges in South Korea participated. We used Cronbach's alpha and item-total correlation for internal consistency reliability and intraclass correlation coefficient for test-retest reliability. We assessed Pearson's correlation with a comprehensive measure for convergent validity, with perceived stress level and sleep quality for concurrent validity and the receiver operating characteristic curve for predictive validity. The short measure was comparable to the comprehensive measure in internal consistency reliability (Cronbach's alpha=0.81 vs. 0.88); test-retest reliability (intraclass correlation coefficient=0.66 vs. 0.61); convergent validity (r with comprehensive measure=0.79); concurrent validity (r with perceived stress=0.55, r with sleep quality=0.39) and predictive validity (area under curve=0.88). Single-item measures were not comparable to the comprehensive measure. A short fatigue measure exhibited similar levels of reliability and validity to the comprehensive measure in Korean young adults. Copyright © 2016 Elsevier Ltd. All rights reserved.
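    Cronbach's alpha, the internal-consistency statistic used above, is straightforward to compute; a minimal sketch on hypothetical item scores (not the study's data):

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, respondents in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        mu = sum(xs) / len(xs)
        return sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# three perfectly parallel items over four respondents -> alpha = 1
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```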

  7. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Traditional univariate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Yet reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multivariate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved, and thus for improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models, and the remaining records are used to evaluate predictive performance. We further stratify the sample by catchment, which allows us to study model performance in a spatial-transfer context. Flood damage is estimated at the scale of individual buildings in terms of relative damage. Predictive performance is assessed in terms of systematic deviation (mean bias), precision (mean absolute error) and reliability, the latter represented by the proportion of observations that fall within the 5%-95% predictive interval. The comparison of the univariate stage-damage function with the multivariate approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multivariate models reveal an additional source of uncertainty. However, predictive performance in terms of bias, precision and reliability is clearly improved in comparison to the univariate stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
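    The three evaluation scores described above (mean bias, mean absolute error, and an interval hit rate as the reliability measure) can be sketched as follows, using hypothetical observation and prediction values:

```python
def mean_bias(obs, pred):
    """Systematic deviation: average of (predicted - observed)."""
    return sum(p - o for o, p in zip(obs, pred)) / len(obs)

def mean_absolute_error(obs, pred):
    """Precision: average absolute deviation."""
    return sum(abs(p - o) for o, p in zip(obs, pred)) / len(obs)

def hit_rate(obs, lower, upper):
    """Reliability: share of observations falling inside the central
    predictive interval (e.g. the 5%-95% quantile band)."""
    hits = sum(1 for o, lo, hi in zip(obs, lower, upper) if lo <= o <= hi)
    return hits / len(obs)

# hypothetical relative-damage values
obs = [10.0, 20.0, 30.0]
pred = [12.0, 18.0, 33.0]
```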

  8. Reliability of IGBT in a STATCOM for Harmonic Compensation and Power Factor Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopi Reddy, Lakshmi Reddy; Tolbert, Leon M; Ozpineci, Burak

    With smart grid integration, there is a need to characterize the reliability of a power system by including the reliability of power semiconductors in grid-related applications. In this paper, the reliability of IGBTs in a STATCOM is presented for two operating modes: power factor correction and harmonic compensation. The STATCOM model is developed in EMTP, and analytical equations for the average conduction losses in an IGBT and a diode are derived and compared with experimental data. A commonly used reliability model is applied to predict the reliability of the IGBTs.

  9. Identifying dyslexia in adults: an iterative method using the predictive value of item scores and self-report questions.

    PubMed

    Tamboer, Peter; Vorst, Harrie C M; Oort, Frans J

    2014-04-01

    Methods for identifying dyslexia in adults vary widely between studies. Researchers have to decide how many tests to use, which tests are considered to be the most reliable, and how to determine cut-off scores. The aim of this study was to develop an objective and powerful method for diagnosing dyslexia. We took various methodological measures, most of which are new compared to previous methods. We used a large sample of Dutch first-year psychology students, we considered several options for exclusion and inclusion criteria, we collected as many cognitive tests as possible, we used six independent sources of biographical information for a criterion of dyslexia, we compared the predictive power of discriminant analyses and logistic regression analyses, we used both sum scores and item scores as predictor variables, we used self-report questions as predictor variables, and we retested the reliability of predictions with repeated prediction analyses using an adjusted criterion. We were able to identify 74 dyslexic and 369 non-dyslexic students. For 37 students, various predictions were too inconsistent for a final classification. The most reliable predictions were acquired with item scores and self-report questions. The main conclusion is that it is possible to identify dyslexia with a high reliability, although the exact nature of dyslexia is still unknown. We therefore believe that this study yielded valuable information for future methods of identifying dyslexia in Dutch as well as in other languages, and that this would be beneficial for comparing studies across countries.

  10. Evapotranspiration using a satellite-based surface energy balance with standardized ground control

    NASA Astrophysics Data System (ADS)

    Trezza, Ricardo

    This study evaluated the potential of using the Surface Energy Balance Algorithm for Land (SEBAL) as a means of estimating evapotranspiration (ET) at local and regional scales in southern Idaho. The original SEBAL model was refined during this study to provide better estimates of ET in agricultural areas and more reliable estimates of ET from other surfaces as well, including mountainous terrain. The modified version of SEBAL used in this study, termed SEBALID (ID stands for Idaho), includes standardization of the two SEBAL "anchor" pixels, the use of a water balance model to track topsoil moisture, adaptation of components of SEBAL for better prediction of the surface energy balance in mountainous and sloping terrain, and use of the ratio between actual ET and alfalfa reference evapotranspiration (ETr) as a means of integrating instantaneous ET to daily and seasonal values. Validation of the SEBALID model at a local scale was performed by comparing lysimeter ET measurements from the USDA-ARS facility at Kimberly, Idaho, with ET predictions by SEBAL using Landsat 5 TM imagery. Comparison of measured and predicted ET values was challenging due to the resolution of the Landsat thermal band (120 m x 120 m) and the relatively small size of the lysimeter fields. In the cases where thermal information was adequate, SEBALID predictions were close to the ET values measured in the lysimeters. Application of SEBALID at a regional scale was performed using Landsat 7 ETM+ and Landsat 5 TM imagery for the Eastern Snake Plain Aquifer (ESPA) region in Idaho during 2000. The results indicated that SEBALID performed well for predicting daily and seasonal ET for agricultural areas. Some unreasonable results were obtained for desert and basalt areas, due to uncertainties in the prediction of surface parameters. In mountains, even though validation of results was not possible, the ET values obtained reflected the improvements made to the original SEBAL algorithm.

  11. Predicting water quality by relating secchi-disk transparency and chlorophyll a measurements to satellite imagery for Michigan Inland Lakes, August 2002

    USGS Publications Warehouse

    Fuller, L.M.; Aichele, Stephen S.; Minnerick, R.J.

    2004-01-01

    Inland lakes are an important economic and environmental resource for Michigan. The U.S. Geological Survey and the Michigan Department of Environmental Quality have been cooperatively monitoring the quality of selected lakes in Michigan through the Lake Water Quality Assessment program. Through this program, approximately 730 of Michigan's 11,000 inland lakes will be monitored once during this 15-year study. Targeted lakes are sampled during spring turnover and again in late summer to characterize water quality. Because more extensive and more frequent sampling is not economically feasible in the Lake Water Quality Assessment program, the U.S. Geological Survey and Michigan Department of Environmental Quality investigated the use of satellite imagery as a means of estimating water quality in unsampled lakes. Satellite imagery has been successfully used in Minnesota, Wisconsin, and elsewhere to compute the trophic state of inland lakes from predicted secchi-disk measurements. Previous attempts of this kind in Michigan resulted in a poorer fit between observed and predicted data than was found for Minnesota or Wisconsin. This study tested whether estimates could be improved by using atmospherically corrected satellite imagery, whether a more appropriate regression model could be obtained for Michigan, and whether chlorophyll a concentrations could be reliably predicted from satellite imagery in order to compute the trophic state of inland lakes. Although atmospheric correction did not significantly improve estimates of lake-water quality, a new regression equation was identified that consistently yielded better results than an equation obtained from the literature. A stepwise regression was used to determine an equation that accurately predicts chlorophyll a concentrations in northern Lower Michigan.

  12. ‘Put Your Money Where Your Mouth Is!’: Effects of Streaks on Confidence and Betting in a Binary Choice Task

    PubMed Central

    Studer, Bettina; Limbrick-Oldfield, Eve H; Clark, Luke

    2015-01-01

    Human choice under uncertainty is influenced by erroneous beliefs about randomness. In simple binary choice tasks, such as red/black predictions in roulette, long outcome runs (e.g. red, red, red) typically increase the tendency to predict the other outcome (i.e. black), an effect labeled the “gambler's fallacy.” In these settings, participants may also attend to streaks in their predictive performance. Winning and losing streaks are thought to affect decision confidence, although prior work indicates conflicting directions. Over three laboratory experiments involving red/black predictions in a sequential roulette task, we sought to identify the effects of outcome runs and winning/losing streaks upon color predictions, decision confidence and betting behavior. Experiments 1 (n = 40) and 3 (n = 40) obtained trial-by-trial confidence ratings, with a win/no win payoff and a no loss/loss payoff, respectively. Experiment 2 (n = 39) obtained a trial-by-trial bet amount on an equivalent scale. In each experiment, the gambler's fallacy was observed on choice behavior after color runs and, in experiment 2, on betting behavior after color runs. Feedback streaks exerted no reliable influence on confidence ratings, in either payoff condition. Betting behavior, on the other hand, increased as a function of losing streaks. The increase in betting on losing streaks is interpreted as a manifestation of loss chasing; these data help clarify the psychological mechanisms underlying loss chasing and caution against the use of betting measures (“post-decision wagering”) as a straightforward index of decision confidence. © 2014 The Authors. Journal of Behavioral Decision Making published by John Wiley & Sons Ltd. PMID:26236092

  14. The redoubtable ecological periodic table

    EPA Science Inventory

    Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...

  15. Analysis of a simplified normalized covariance measure based on binary weighting functions for predicting the intelligibility of noise-suppressed speech.

    PubMed

    Chen, Fei; Loizou, Philipos C

    2010-12-01

    The normalized covariance measure (NCM) has been shown previously to predict reliably the intelligibility of noise-suppressed speech containing non-linear distortions. This study analyzes a simplified NCM measure that requires only a small number of bands (not necessarily contiguous) and uses simple binary (1 or 0) weighting functions. The rationale behind the use of a small number of bands is to account for the fact that the spectral information contained in contiguous or nearby bands is correlated and redundant. The modified NCM measure was evaluated with speech intelligibility scores obtained by normal-hearing listeners in 72 noisy conditions involving noise-suppressed speech corrupted by four different types of maskers (car, babble, train, and street interferences). High correlation (r = 0.8) was obtained with the modified NCM measure even when only one band was used. Further analysis revealed a masker-specific pattern of correlations when only one band was used, and bands with low correlation signified the corresponding envelopes that have been severely distorted by the noise-suppression algorithm and/or the masker. Correlation improved to r = 0.84 when only two disjoint bands (centered at 325 and 1874 Hz) were used. Even further improvements in correlation (r = 0.85) were obtained when three or four lower-frequency (<700 Hz) bands were selected.
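    A band-limited, binary-weighted correlation score in the spirit of the simplified NCM can be sketched as below; this omits the envelope extraction, band filtering, and SNR mapping of the actual measure, and the envelope values are hypothetical:

```python
def pearson_r(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def binary_weighted_score(clean_envs, proc_envs, weights):
    """Average per-band envelope correlation over the bands whose
    binary weight is 1 (simplified stand-in for the full NCM)."""
    used = [pearson_r(c, p)
            for c, p, w in zip(clean_envs, proc_envs, weights) if w == 1]
    return sum(used) / len(used)

# two hypothetical band envelopes; only the first band is weighted in
clean = [[1.0, 2.0, 3.0, 4.0], [4.0, 3.0, 2.0, 1.0]]
proc = [[2.0, 4.0, 6.0, 8.0], [5.0, 5.0, 5.0, 5.0]]
score = binary_weighted_score(clean, proc, weights=[1, 0])
```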

  16. Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System

    DTIC Science & Technology

    2010-09-13

    model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of

  17. Design of Oil-Lubricated Machine for Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    2007-01-01

    In the post-World War II era, the major technology drivers for improving the life, reliability, and performance of rolling-element bearings and gears have been the jet engine and the helicopter. By the late 1950s, most of the materials used for bearings and gears in the aerospace industry had been introduced into use. By the early 1960s, the life of most steels was increased over that experienced in the early 1940s, primarily by the introduction of vacuum degassing and vacuum melting processes in the late 1950s. The development of elastohydrodynamic (EHD) theory showed that most rolling bearings and gears have a thin film separating the contacting bodies during motion and it is that film which affects their lives. Computer programs modeling bearing and gear dynamics that incorporate probabilistic life prediction methods and EHD theory enable optimization of rotating machinery based on life and reliability. With improved manufacturing and processing, the potential improvement in bearing and gear life can be as much as 80 times that attainable in the early 1950s. The work presented summarizes the use of laboratory fatigue data for bearings and gears coupled with probabilistic life prediction and EHD theories to predict the life and reliability of a commercial turboprop gearbox. The resulting predictions are compared with field data.

  18. A model for prediction of fume formation rate in gas metal arc welding (GMAW), globular and spray modes, DC electrode positive.

    PubMed

    Dennis, J H; Hewitt, P J; Redding, C A; Workman, A D

    2001-03-01

Prediction of fume formation rate during metal arc welding and the composition of the fume are of interest to occupational hygienists concerned with risk assessment and to manufacturers of welding consumables. A model for GMAW (DC electrode positive) is described based on the welder-determined process parameters (current, wire feed rate and wire composition), on the surface area of molten metal in the arc and on the partial vapour pressures of the component metals of the alloy wire. The model is applicable to globular and spray welding transfer modes but not to dip mode. Metal evaporation from a droplet is evaluated for short time increments and total evaporation obtained by summation over the life of the droplet. The contribution of fume derived from the weld pool and spatter (particles of metal ejected from the arc) is discussed, as are limitations of the model. Calculated droplet temperatures are similar to values determined by other workers. A degree of relationship between predicted and measured fume formation rates is demonstrated, but the model does not at this stage provide a reliable predictive tool.
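The incremental-evaporation summation the abstract describes can be sketched as follows. The Langmuir maximum-evaporation flux stands in here for the paper's vapour-pressure treatment, and the droplet temperature, surface area and vapour pressure in the example are illustrative assumptions held constant over the droplet's life, whereas the model evaluates them per increment.

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def evaporation_rate(p_vapor_pa, molar_mass_kg, temp_k):
    """Langmuir maximum evaporation flux from a free surface, kg/(m^2 s)."""
    return p_vapor_pa * math.sqrt(
        molar_mass_kg / (2.0 * math.pi * R_GAS * temp_k))

def droplet_fume_mass(area_m2, p_vapor_pa, molar_mass_kg, temp_k,
                      droplet_life_s, dt=1e-4):
    """Total mass evaporated from one droplet, summed over its lifetime
    in short time increments as the abstract describes (conditions held
    constant here for simplicity)."""
    n_steps = max(1, round(droplet_life_s / dt))
    rate = evaporation_rate(p_vapor_pa, molar_mass_kg, temp_k)
    return sum(rate * area_m2 * dt for _ in range(n_steps))

# Illustrative values only: a small iron droplet near 2500 K
mass = droplet_fume_mass(area_m2=1e-6, p_vapor_pa=100.0,
                         molar_mass_kg=0.055845, temp_k=2500.0,
                         droplet_life_s=0.01)
```

Per-droplet masses of this kind, multiplied by the droplet transfer frequency (set by current and wire feed rate), give a fume formation rate that can be compared against measured values.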

  19. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai

    2011-01-01

A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems, they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
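A minimal version of this kind of Kalman filter-based remaining-useful-life estimate can be sketched as below. A linear drift model for the degrading parameter (e.g. normalized capacitance) is assumed for illustration; the paper's empirical degradation model, noise settings and failure threshold are not reproduced here.

```python
def kalman_rul(measurements, dt, threshold, q=1e-8, r=1e-4):
    """Track a degrading parameter with a 2-state (value, drift-rate)
    linear Kalman filter, then linearly extrapolate the filtered state
    to a failure threshold to estimate remaining useful life.
    The constant-drift model and noise levels q, r are illustrative."""
    x = [measurements[0], 0.0]            # state: [value, rate]
    p = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    for z in measurements[1:]:
        # predict: value drifts at the current estimated rate
        x = [x[0] + x[1] * dt, x[1]]
        p00 = p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + q
        # update with the new measurement of the value
        s = p00 + r
        k0, k1 = p00 / s, p10 / s
        y = z - x[0]
        x = [x[0] + k0 * y, x[1] + k1 * y]
        p = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    value, rate = x
    if rate >= 0:
        return float("inf")  # no degradation trend detected
    return (threshold - value) / rate     # time until threshold crossing

# Synthetic example: parameter decays 0.001 per hour from 1.0,
# failure declared at 0.8 of nominal.
data = [1.0 - 0.001 * i for i in range(50)]
rul_hours = kalman_rul(data, dt=1.0, threshold=0.8)
```

On this noiseless synthetic trend the filter recovers the drift rate and the extrapolated time to threshold; with real aging data the same predict/update cycle additionally smooths measurement noise.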

  20. Towards A Model-Based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai

    2012-01-01

A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems, they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
