Sample records for "develop accurate predictive"

  1. Predictability of the 2012 Great Arctic Cyclone on medium-range timescales

    NASA Astrophysics Data System (ADS)

    Yamagami, Akio; Matsueda, Mio; Tanaka, Hiroshi L.

    2018-03-01

    Arctic Cyclones (ACs) can have a significant impact on the Arctic region. Therefore, the accurate prediction of ACs is important in anticipating their associated environmental and societal costs. This study investigates the predictability of the 2012 Great Arctic Cyclone (AC12), which exhibited a minimum central pressure of 964 hPa on 6 August 2012, using five medium-range ensemble forecasts. We show that the development and position of AC12 were better predicted in forecasts initialized on and after 4 August 2012. In addition, the position of AC12 was more predictable than its development. A comparison of ensemble members, classified by their errors in predicting the development and position of AC12, revealed that an accurate prediction of upper-level fields, particularly temperature, was important for the prediction of this event. The predicted position of AC12 was influenced mainly by the prediction of the polar vortex, whereas the predicted development of AC12 depended primarily on the prediction of the merging of upper-level warm cores. Consequently, an accurate prediction of the polar vortex position and of the development of the warm core through merging resulted in better prediction of AC12.

  2. Heart rate during basketball game play and volleyball drills accurately predicts oxygen uptake and energy expenditure.

    PubMed

    Scribbans, T D; Berg, K; Narazaki, K; Janssen, I; Gurd, B J

    2015-09-01

    There is currently little information regarding the ability of metabolic prediction equations to accurately predict oxygen uptake and exercise intensity from heart rate (HR) during intermittent sport. The purpose of the present study was to develop and cross-validate equations appropriate for accurately predicting oxygen cost (VO2) and energy expenditure from HR during intermittent sport participation. Eleven healthy adult males (19.9±1.1 yrs) were recruited to establish the relationship between %VO2peak and %HRmax during low-intensity steady-state endurance (END), moderate-intensity interval (MOD) and high-intensity interval exercise (HI), as performed on a cycle ergometer. Three equations (END, MOD, and HI) for predicting %VO2peak based on %HRmax were developed. HR and VO2 were directly measured during basketball games (6 male, 20.8±1.0 yrs; 6 female, 20.0±1.3 yrs) and volleyball drills (12 female; 20.8±1.0 yrs). Comparisons were made between measured and predicted VO2 and energy expenditure using the 3 equations developed and 2 previously published equations. The END and MOD equations accurately predicted VO2 and energy expenditure, while the HI equation underestimated, and the previously published equations systematically overestimated, VO2 and energy expenditure. Intermittent sport VO2 and energy expenditure can be accurately predicted from heart rate data using either the END (%VO2peak = %HRmax x 1.008 - 17.17) or MOD (%VO2peak = %HRmax x 1.2 - 32) equation. These 2 simple equations provide an accessible and cost-effective method for accurate estimation of exercise intensity and energy expenditure during intermittent sport.
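    A minimal Python sketch of how the two reported equations could be applied in practice; the VO2peak value, body mass, and the 5 kcal per litre of O2 conversion used in the usage example are illustrative assumptions, not values from the study.

```python
def predict_pct_vo2peak(pct_hrmax: float, equation: str = "END") -> float:
    """Predict %VO2peak from %HRmax using the equations reported in the abstract.

    END: %VO2peak = %HRmax * 1.008 - 17.17
    MOD: %VO2peak = %HRmax * 1.2   - 32
    """
    if equation == "END":
        return pct_hrmax * 1.008 - 17.17
    if equation == "MOD":
        return pct_hrmax * 1.2 - 32.0
    raise ValueError("equation must be 'END' or 'MOD'")

# Hypothetical usage: a player averaging 85% of HRmax during game play.
pct_vo2 = predict_pct_vo2peak(85.0, "END")       # ~68.5 %VO2peak
vo2 = pct_vo2 / 100 * 45.0                       # assumed VO2peak of 45 mL/kg/min
energy_kcal_per_min = vo2 * 70 / 1000 * 5.0      # assumed 70 kg athlete, ~5 kcal per litre O2
print(f"{vo2:.1f} mL/kg/min, {energy_kcal_per_min:.1f} kcal/min")
```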

  3. Development of a noise prediction model based on advanced fuzzy approaches in typical industrial workrooms.

    PubMed

    Aliabadi, Mohsen; Golmohammadi, Rostam; Khotanlou, Hassan; Mansoorizadeh, Muharram; Salarpour, Amir

    2014-01-01

    Noise prediction is considered to be the best method for evaluating cost-preventative noise controls in industrial workrooms. One of the most important issues is the development of accurate models for analysis of the complex relationships among acoustic features affecting noise level in workrooms. In this study, advanced fuzzy approaches were employed to develop relatively accurate models for predicting noise in noisy industrial workrooms. The data were collected from 60 industrial embroidery workrooms in the Khorasan Province, East of Iran. The main acoustic and embroidery process features that influence the noise were used to develop prediction models using MATLAB software. A multiple regression technique was also employed and its results were compared with those of the fuzzy approaches. Prediction errors of all prediction models based on fuzzy approaches were within the acceptable level (lower than one dB). However, the neuro-fuzzy model (RMSE=0.53 dB and R²=0.88) slightly improved the accuracy of noise prediction compared with the generated fuzzy model. Moreover, fuzzy approaches provided more accurate predictions than did the regression technique. The developed models based on fuzzy approaches, as useful prediction tools, give professionals the opportunity to make optimal decisions about the effectiveness of acoustic treatment scenarios in embroidery workrooms.

  4. Motor system contribution to action prediction: Temporal accuracy depends on motor experience.

    PubMed

    Stapel, Janny C; Hunnius, Sabine; Meyer, Marlene; Bekkering, Harold

    2016-03-01

    Predicting others' actions is essential for well-coordinated social interactions. In two experiments including an infant population, this study addresses to what extent motor experience of an observer determines prediction accuracy for others' actions. Results show that infants who were proficient crawlers but inexperienced walkers predicted crawling more accurately than walking, whereas age groups mastering both skills (i.e. toddlers and adults) were equally accurate in predicting walking and crawling. Regardless of experience, human movements were predicted more accurately by all age groups than non-human movement control stimuli. This suggests that for predictions to be accurate, the observed act needs to be established in the motor repertoire of the observer. Through the acquisition of new motor skills, we also become better at predicting others' actions. The findings thus stress the relevance of motor experience for social-cognitive development. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. SAE for the prediction of road traffic status from taxicab operating data and bus smart card data

    NASA Astrophysics Data System (ADS)

    Zhengfeng, Huang; Pengjun, Zheng; Wenjun, Xu; Gang, Ren

    Road traffic status is important for trip decisions and traffic management, and thus should be predicted accurately. A contribution of this work is that we consider multi-modal data for traffic status prediction rather than relying on a single data source. With the substantial data from the Ningbo Passenger Transport Management Sector (NPTMS), we wished to determine whether it was possible to develop Stacked Autoencoders (SAEs) for accurately predicting road traffic status from taxicab operating data and bus smart card data. We show that the SAE performed better than a linear regression model and a Back Propagation (BP) neural network in determining the relationship between road traffic status and these factors. In an experiment using 26 months of data, we show that the SAE can produce highly accurate predictions (91% test accuracy) of road traffic status from daily taxicab operating data and bus smart card data.
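    A minimal sketch of a stacked autoencoder of the kind named in this record, using greedy layer-wise pretraining followed by supervised fine-tuning. The feature dimensions, layer sizes, and random placeholder data are assumptions for illustration; this is not the authors' architecture or dataset.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical feature matrix: per-link descriptors derived from taxicab GPS and
# bus smart-card records; y holds a discrete traffic-status label per sample.
X = np.random.rand(2000, 20).astype("float32")
y = np.random.randint(0, 3, size=2000)

enc1 = layers.Dense(64, activation="relu", name="enc1")
enc2 = layers.Dense(32, activation="relu", name="enc2")

# Greedy layer-wise pretraining: each encoder layer learns to reconstruct its input.
inp = keras.Input(shape=(20,))
ae1 = keras.Model(inp, layers.Dense(20)(enc1(inp)))
ae1.compile("adam", "mse")
ae1.fit(X, X, epochs=5, batch_size=64, verbose=0)

h1 = keras.Model(inp, enc1(inp)).predict(X, verbose=0)
inp2 = keras.Input(shape=(64,))
ae2 = keras.Model(inp2, layers.Dense(64)(enc2(inp2)))
ae2.compile("adam", "mse")
ae2.fit(h1, h1, epochs=5, batch_size=64, verbose=0)

# Stack the pretrained encoders, add a softmax head, and fine-tune end to end.
out = layers.Dense(3, activation="softmax")(enc2(enc1(inp)))
sae = keras.Model(inp, out)
sae.compile("adam", "sparse_categorical_crossentropy", metrics=["accuracy"])
sae.fit(X, y, epochs=10, batch_size=64, verbose=0)
```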

  6. Just-in-Time Correntropy Soft Sensor with Noisy Data for Industrial Silicon Content Prediction.

    PubMed

    Chen, Kun; Liang, Yu; Gao, Zengliang; Liu, Yi

    2017-08-08

    Development of accurate data-driven quality prediction models for industrial blast furnaces encounters several challenges, mainly because the collected data are nonlinear, non-Gaussian, and unevenly distributed. A just-in-time correntropy-based local soft sensing approach is presented to predict the silicon content in this work. Without cumbersome efforts for outlier detection, a correntropy support vector regression (CSVR) modeling framework is proposed to deal with soft sensor development and outlier detection simultaneously. Moreover, with a continuously updated database and a clustering strategy, a just-in-time CSVR (JCSVR) method is developed. Consequently, more accurate prediction and efficient implementations of JCSVR can be achieved. The better prediction performance of JCSVR is validated on online silicon content prediction, compared with traditional soft sensors.

  7. Just-in-Time Correntropy Soft Sensor with Noisy Data for Industrial Silicon Content Prediction

    PubMed Central

    Chen, Kun; Liang, Yu; Gao, Zengliang; Liu, Yi

    2017-01-01

    Development of accurate data-driven quality prediction models for industrial blast furnaces encounters several challenges, mainly because the collected data are nonlinear, non-Gaussian, and unevenly distributed. A just-in-time correntropy-based local soft sensing approach is presented to predict the silicon content in this work. Without cumbersome efforts for outlier detection, a correntropy support vector regression (CSVR) modeling framework is proposed to deal with soft sensor development and outlier detection simultaneously. Moreover, with a continuously updated database and a clustering strategy, a just-in-time CSVR (JCSVR) method is developed. Consequently, more accurate prediction and efficient implementations of JCSVR can be achieved. The better prediction performance of JCSVR is validated on online silicon content prediction, compared with traditional soft sensors. PMID:28786957

  8. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand suitable approaches to, and the development and validation process of, risk prediction models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was conducted. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  9. Development of advanced stability theory suction prediction techniques for laminar flow control. [on swept wings

    NASA Technical Reports Server (NTRS)

    Srokowski, A. J.

    1978-01-01

    The problem of obtaining accurate estimates of suction requirements on swept laminar flow control wings was discussed. A fast accurate computer code developed to predict suction requirements by integrating disturbance amplification rates was described. Assumptions and approximations used in the present computer code are examined in light of flow conditions on the swept wing which may limit their validity.

  10. Remaining dischargeable time prediction for lithium-ion batteries using unscented Kalman filter

    NASA Astrophysics Data System (ADS)

    Dong, Guangzhong; Wei, Jingwen; Chen, Zonghai; Sun, Han; Yu, Xiaowei

    2017-10-01

    To overcome range anxiety, one important strategy is to accurately predict the range or dischargeable time of the battery system. To accurately predict the remaining dischargeable time (RDT) of a battery, an RDT prediction framework based on accurate battery modeling and state estimation is presented in this paper. First, a simplified linearized equivalent-circuit model is developed to simulate the dynamic characteristics of a battery. Then, an online recursive least-squares algorithm and an unscented Kalman filter are employed to estimate the system matrices and the SOC at every prediction point. In addition, a discrete wavelet transform technique is employed to capture statistical information about the past dynamics of the input currents, which is utilized to predict the future battery currents. Finally, the RDT can be predicted based on the battery model, the SOC estimation results and the predicted future battery currents. The performance of the proposed methodology has been verified on a lithium-ion battery cell. Experimental results indicate that the proposed method provides accurate SOC and parameter estimation and that the predicted RDT can help address range anxiety.
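    The abstract names an online recursive least-squares step for identifying the equivalent-circuit parameters. A generic RLS update with a forgetting factor might look like the following sketch; the regressor construction, parameter values, and synthetic data are illustrative assumptions, not the paper's battery model.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.99):
    """One recursive least-squares update with forgetting factor lam.

    theta : current parameter estimate (e.g., equivalent-circuit terms)
    P     : covariance matrix of the estimate
    phi   : regressor vector built from measured current/voltage history
    y     : newly measured output (e.g., terminal voltage deviation from OCV)
    """
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)     # gain vector
    err = y - float(phi.T @ theta)            # prediction error on the new sample
    theta = theta + k * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Hypothetical usage: identify a 2-parameter model y ~ phi . theta online.
rng = np.random.default_rng(0)
true_theta = np.array([[0.05], [0.8]])
theta, P = np.zeros((2, 1)), np.eye(2) * 1e3
for _ in range(500):
    phi = rng.normal(size=2)
    y = float(phi @ true_theta) + rng.normal(scale=0.01)
    theta, P = rls_step(theta, P, phi, y)
print(theta.ravel())   # should approach [0.05, 0.8]
```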

  11. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in the medicine and health care sector. Classification and prediction are major applications of machine learning within AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most prominent machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of the related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.

  12. Estimating wildfire risk on a Mojave Desert landscape using remote sensing and field sampling

    USGS Publications Warehouse

    Van Linn, Peter F.; Nussear, Kenneth E.; Esque, Todd C.; DeFalco, Lesley A.; Inman, Richard D.; Abella, Scott R.

    2013-01-01

    Predicting wildfires that affect broad landscapes is important for allocating suppression resources and guiding land management. Wildfire prediction in the south-western United States is of specific concern because of the increasing prevalence and severe effects of fire on desert shrublands and the current lack of accurate fire prediction tools. We developed a fire risk model to predict fire occurrence in a north-eastern Mojave Desert landscape. First we developed a spatial model using remote sensing data to predict fuel loads based on field estimates of fuels. We then modelled fire risk (interactions of fuel characteristics and environmental conditions conducive to wildfire) using satellite imagery, our model of fuel loads, and spatial data on ignition potential (lightning strikes and distance to roads), topography (elevation and aspect) and climate (maximum and minimum temperatures). The risk model was developed during a fire year at our study landscape and validated at a nearby landscape; model performance was accurate and similar at both sites. This study demonstrates that remote sensing techniques used in combination with field surveys can accurately predict wildfire risk in the Mojave Desert and may be applicable to other arid and semiarid lands where wildfires are prevalent.

  13. Development of Tripropellant CFD Design Code

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Cheng, Gary C.; Anderson, Peter G.

    1998-01-01

    A tripropellant (e.g., GO2/H2/RP-1) CFD design code has been developed to predict the local mixing of multiple propellant streams as they are injected into a rocket motor. The code utilizes real fluid properties to account for the mixing and finite-rate combustion processes which occur near an injector faceplate, thus the analysis serves as a multi-phase homogeneous spray combustion model. Proper accounting of the combustion allows accurate gas-side temperature predictions which are essential for accurate wall heating analyses. The complex secondary flows which are predicted to occur near a faceplate cannot be quantitatively predicted by less accurate methodology. Test cases have been simulated to describe an axisymmetric tripropellant coaxial injector and a 3-dimensional RP-1/LO2 impinger injector system. The analysis has been shown to realistically describe such injector combustion flowfields. The code is also valuable for designing meaningful future experiments by determining the critical locations and types of measurements needed.

  14. An accurate model for predicting high frequency noise of nanoscale NMOS SOI transistors

    NASA Astrophysics Data System (ADS)

    Shen, Yanfei; Cui, Jie; Mohammadi, Saeed

    2017-05-01

    A nonlinear and scalable model suitable for predicting high frequency noise of N-type Metal Oxide Semiconductor (NMOS) transistors is presented. The model is developed for a commercial 45 nm CMOS SOI technology and its accuracy is validated through comparison with measured performance of a microwave low noise amplifier. The model employs the virtual source nonlinear core and adds parasitic elements to accurately simulate the RF behavior of multi-finger NMOS transistors up to 40 GHz. For the first time, the traditional long-channel thermal noise model is supplemented with an injection noise model to accurately represent the noise behavior of these short-channel transistors up to 26 GHz. The developed model is simple and easy to extract, yet very accurate.

  15. Parturition prediction and timing of canine pregnancy

    PubMed Central

    Kim, YeunHee; Travis, Alexander J.; Meyers-Wallen, Vicki N.

    2007-01-01

    An accurate method of predicting the date of parturition in the bitch is clinically useful to minimize or prevent reproductive losses by timely intervention. Similarly, an accurate method of timing canine ovulation and gestation is critical for development of assisted reproductive technologies, e.g. estrous synchronization and embryo transfer. This review discusses present methods for accurately timing canine gestational age and outlines their use in clinical management of high-risk pregnancies and embryo transfer research. PMID:17904630

  16. An extreme Arctic cyclone in August 2016 and its predictability on medium-range timescales

    NASA Astrophysics Data System (ADS)

    Yamagami, Akio; Matsueda, Mio; Tanaka, Hiroshi

    2017-04-01

    An extremely strong Arctic cyclone (AC) developed in August 2016. The AC exhibited a minimum sea level pressure (SLP) of 967.2 hPa and covered the entire Pacific sector of the Arctic Ocean at 0000 UTC on 16 August. At this time the AC was comparable to the strong AC observed in August 2012 in terms of horizontal extent, position, and intensity as measured by SLP. Two processes contributed to the explosive development of the AC: growth due to baroclinic instability, similar to extratropical cyclones, during the early part of the development stage, and later nonlinear development via the merging of upper warm cores. The AC was maintained for more than one month through multiple mergers with cyclones both generated in the Arctic and migrating northward from lower latitudes, as a result of the high cyclone activity in summer 2016. This study also investigated the predictability of the AC using operational medium-range ensemble forecasts from CMC (Canada), ECMWF (EU), JMA (Japan), NCEP (USA), and UKMO (UK), available in The Interactive Grand Global Ensemble (TIGGE) database. The minimum SLP of the AC at 0000 UTC on 16 August was well predicted 6 days in advance by ECMWF, 5 days by NCEP and UKMO, 4 days by CMC, and 3 days by JMA. The predictability of the minimum SLP of the AC in August 2016 was much higher than that of the AC in August 2012. Whereas most of the members predicted the cyclogenesis of the AC well, the growth due to baroclinic instability was weaker in some members. Even when the baroclinic growth was predicted well, the predicted AC did not develop if the nonlinear development via merging was not predicted accurately. Accurate prediction of the processes in both the early and later parts of the development stage was important for accurate prediction of the development of the AC.

  17. Development of estrogen receptor beta binding prediction model using large sets of chemicals.

    PubMed

    Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao

    2017-11-03

    We developed an ERβ binding prediction model to facilitate identification of chemicals that specifically bind ERβ or ERα, together with our previously developed ERα binding model. Decision Forest was used to train the ERβ binding prediction model based on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross-validation. Prediction confidence was analyzed using predictions from the cross-validations. Informative chemical features for ERβ binding were identified through analysis of the frequency data of chemical descriptors used in the models in the 5-fold cross-validations. 1000 permutations were conducted to assess chance correlation. The average accuracy of the 5-fold cross-validations was 93.14% with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the predictions. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ERβ binding prediction. Application of the prediction model to data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrate that ERβ binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ERα prediction model, this model could be expected to facilitate drug development through identification of chemicals that specifically bind ERβ or ERα.
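    A hedged sketch of the validation scheme described here (repeated 5-fold cross-validation plus permutation testing), written with scikit-learn. RandomForestClassifier stands in for the Decision Forest algorithm, and the descriptor matrix and binder labels are synthetic placeholders; repeats are reduced from 1000 to 20 for speed.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import (RepeatedStratifiedKFold, cross_val_score,
                                     permutation_test_score)

# Hypothetical stand-in: binary binder/non-binder labels with molecular descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=500) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# Repeated 5-fold cross-validation to estimate accuracy and its spread.
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=20, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")

# Permutation test: refit on label-shuffled data to estimate the chance-level score.
score, perm_scores, p_value = permutation_test_score(
    clf, X, y, cv=5, n_permutations=100, random_state=0
)
print(f"true score = {score:.3f}, permutation p-value = {p_value:.3f}")
```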

  18. Rapid prediction of particulate, humus and resistant fractions of soil organic carbon in reforested lands using infrared spectroscopy.

    PubMed

    Madhavan, Dinesh B; Baldock, Jeff A; Read, Zoe J; Murphy, Simon C; Cunningham, Shaun C; Perring, Michael P; Herrmann, Tim; Lewis, Tom; Cavagnaro, Timothy R; England, Jacqueline R; Paul, Keryn I; Weston, Christopher J; Baker, Thomas G

    2017-05-15

    Reforestation of agricultural lands with mixed-species environmental plantings can effectively sequester C. While accurate and efficient methods for predicting soil organic C content and composition have recently been developed for soils under agricultural land uses, such methods under forested land uses are currently lacking. This study aimed to develop a method using infrared spectroscopy for accurately predicting total organic C (TOC) and its fractions (particulate, POC; humus, HOC; and resistant, ROC organic C) in soils under environmental plantings. Soils were collected from 117 paired agricultural-reforestation sites across Australia. TOC fractions were determined in a subset of 38 reforested soils using physical fractionation by automated wet-sieving and ¹³C nuclear magnetic resonance (NMR) spectroscopy. Mid- and near-infrared spectra (MNIRS, 6000-450 cm⁻¹) were acquired from finely-ground soils from environmental plantings and agricultural land. Satisfactory prediction models based on MNIRS and partial least squares regression (PLSR) were developed for TOC and its fractions. Leave-one-out cross-validations of MNIRS-PLSR models indicated accurate predictions (R² > 0.90, negligible bias, ratio of performance to deviation > 3) and fraction-specific functional group contributions to beta coefficients in the models. TOC and its fractions were predicted using the cross-validated models and soil spectra for 3109 reforested and agricultural soils. The reliability of predictions, determined using k-nearest neighbour score distance, indicated that >80% of predictions were within the satisfactory inlier limit. The study demonstrated the utility of infrared spectroscopy (MNIRS-PLSR) to rapidly and economically determine TOC and its fractions and thereby accurately describe the effects of land use change such as reforestation on agricultural soils. Copyright © 2017 Elsevier Ltd. All rights reserved.
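    The modelling step described here (partial least squares regression on mid/near-infrared spectra with leave-one-out cross-validation) could be sketched as follows; the spectra, reference TOC values, and the choice of 10 latent variables are placeholders, not the study's data or settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical stand-ins: rows are finely ground soil samples, columns are
# absorbances across the MNIRS range; y is a measured TOC fraction.
X = np.random.rand(38, 500)
y = np.random.rand(38) * 60.0            # e.g., mg C per g soil

pls = PLSRegression(n_components=10)
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()

ss_res = np.sum((y - y_cv) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rpd = y.std(ddof=1) / np.sqrt(ss_res / len(y))   # ratio of performance to deviation
print(f"cross-validated R2 = {r2:.2f}, RPD = {rpd:.2f}")
```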

  19. Comparison of Predictive Modeling Methods of Aircraft Landing Speed

    NASA Technical Reports Server (NTRS)

    Diallo, Ousmane H.

    2012-01-01

    Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing the controller interventions necessary to avoid separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from operations of a major airline for a passenger transport aircraft type at the Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed prediction error by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network model errors represents an over 5% reduction compared to the RSE model errors, and at least a 10% reduction over the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state-of-the-art.

  20. A methodology for reduced order modeling and calibration of the upper atmosphere

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Linares, Richard

    2017-10-01

    Atmospheric drag is the largest source of uncertainty in accurately predicting the orbit of satellites in low Earth orbit (LEO). Accurately predicting drag for objects that traverse LEO is critical to space situational awareness. Atmospheric models used for orbital drag calculations can be characterized as either empirical or physics-based (first-principles based). Empirical models are fast to evaluate but offer limited real-time predictive/forecasting ability, while physics-based models offer greater predictive/forecasting ability but require dedicated parallel computational resources. In addition, calibration with accurate data is required for either type of model. This paper presents a new methodology based on proper orthogonal decomposition toward development of a quasi-physical, predictive, reduced order model that combines the speed of empirical models with the predictive/forecasting capabilities of physics-based models. The methodology is developed to reduce the high dimensionality of physics-based models while maintaining their capabilities. We develop the methodology using the Naval Research Lab's Mass Spectrometer Incoherent Scatter model and show that the diurnal and seasonal variations can be captured using a small number of modes and parameters. We also present calibration of the reduced order model using the CHAMP and GRACE accelerometer-derived densities. Results show that the method performs well for modeling and calibration of the upper atmosphere.
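    A minimal sketch of proper orthogonal decomposition via the thin SVD, the core of the reduced-order approach described here; the snapshot matrix is a random placeholder rather than actual density-model output, and the 99% energy cutoff is an illustrative choice.

```python
import numpy as np

# Hypothetical snapshot matrix: each column is a flattened density field from a
# physics-based model at one epoch.
n_grid, n_snapshots = 5000, 200
snapshots = np.random.rand(n_grid, n_snapshots)

mean_field = snapshots.mean(axis=1, keepdims=True)
fluctuations = snapshots - mean_field

# POD via the thin SVD: the leading left-singular vectors are the POD modes,
# and their coefficients form the reduced state.
U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99) + 1)    # modes retaining 99% of the variance

modes = U[:, :r]                              # n_grid x r basis
coeffs = modes.T @ fluctuations               # r x n_snapshots reduced representation
reconstruction = mean_field + modes @ coeffs  # low-rank approximation of the snapshots
```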

  1. Predicting maize phenology: Intercomparison of functions for developmental response to temperature

    USDA-ARS?s Scientific Manuscript database

    Accurate prediction of phenological development in maize is fundamental to determining crop adaptation and yield potential. A number of thermal functions are used in crop models, but their relative precision in predicting maize development has not been quantified. The objectives of this study were t...

  2. Ensemble framework based real-time respiratory motion prediction for adaptive radiotherapy applications.

    PubMed

    Tatinati, Sivanagaraja; Nazarpour, Kianoush; Tech Ang, Wei; Veluvolu, Kalyana C

    2016-08-01

    Successful treatment of tumors with motion-adaptive radiotherapy requires accurate prediction of respiratory motion, ideally with a prediction horizon larger than the latency of the radiotherapy system. Accurate prediction of respiratory motion is, however, a non-trivial task due to the presence of irregularities and intra-trace variabilities, such as baseline drift and temporal changes in the fundamental frequency pattern. In this paper, to enhance the accuracy of respiratory motion prediction, we propose a stacked regression ensemble framework that integrates heterogeneous respiratory motion prediction algorithms. We further address two crucial issues for developing a successful ensemble framework: (1) selection of appropriate prediction methods to ensemble (level-0 methods) from among the best existing prediction methods; and (2) finding a suitable generalization approach that can successfully exploit the relative advantages of the chosen level-0 methods. The efficacy of the developed ensemble framework is assessed with real respiratory motion traces acquired from 31 patients undergoing treatment. Results show that the developed ensemble framework improves the prediction performance significantly compared to the best existing methods. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
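    A small illustration of stacked regression (level-0 learners combined by a level-1 generalizer), here with scikit-learn's StackingRegressor; the choice of level-0 methods, the lag-embedding parameters, and the surrogate breathing trace are assumptions for illustration, not the methods or data used in the paper.

```python
import numpy as np
from sklearn.ensemble import StackingRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

# Predict the displacement `horizon` samples ahead from a sliding window of past samples.
def lag_matrix(trace, window=30, horizon=10):
    X, y = [], []
    for i in range(len(trace) - window - horizon):
        X.append(trace[i:i + window])
        y.append(trace[i + window + horizon])
    return np.asarray(X), np.asarray(y)

t = np.arange(3000) * 0.033
trace = np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)  # surrogate trace
X, y = lag_matrix(trace)

# Level-0 learners stand in for the heterogeneous predictors; the level-1 Ridge
# model learns how to combine them (stacked generalization).
stack = StackingRegressor(
    estimators=[
        ("krr", KernelRidge(alpha=1.0, kernel="rbf")),
        ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)),
    ],
    final_estimator=Ridge(alpha=1.0),
    cv=5,
)
split = int(0.8 * len(X))
stack.fit(X[:split], y[:split])
print("hold-out R2:", stack.score(X[split:], y[split:]))
```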

  3. Verification of a 2 kWe Closed-Brayton-Cycle Power Conversion System Mechanical Dynamics Model

    NASA Technical Reports Server (NTRS)

    Ludwiczak, Damian R.; Le, Dzu K.; McNelis, Anne M.; Yu, Albert C.; Samorezov, Sergey; Hervol, Dave S.

    2005-01-01

    Vibration test data from an operating 2 kWe closed-Brayton-cycle (CBC) power conversion system (PCS) located at the NASA Glenn Research Center was used for a comparison with a dynamic disturbance model of the same unit. This effort was performed to show that a dynamic disturbance model of a CBC PCS can be developed that can accurately predict the torque and vibration disturbance fields of such class of rotating machinery. The ability to accurately predict these disturbance fields is required before such hardware can be confidently integrated onto a spacecraft mission. Accurate predictions of CBC disturbance fields will be used for spacecraft control/structure interaction analyses and for understanding the vibration disturbances affecting the scientific instrumentation onboard. This paper discusses how test cell data measurements for the 2 kWe CBC PCS were obtained, the development of a dynamic disturbance model used to predict the transient torque and steady state vibration fields of the same unit, and a comparison of the two sets of data.

  4. Turbine Performance Optimization Task Status

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Turner, James E. (Technical Monitor)

    2001-01-01

    The capability to optimize turbine performance and accurately predict unsteady loads will allow for increased reliability, Isp, and thrust-to-weight ratio. The development of a fast, accurate aerodynamic design, analysis, and optimization system is required.

  5. Prediction of hearing loss among the noise-exposed workers in a steel factory using artificial intelligence approach.

    PubMed

    Aliabadi, Mohsen; Farhadian, Maryam; Darvishi, Ebrahim

    2015-08-01

    Prediction of hearing loss in noisy workplaces is considered an important aspect of hearing conservation programs. Artificial intelligence, as a new approach, can be used to predict complex phenomena such as hearing loss. Using artificial neural networks, this study aims to present an empirical model for predicting the hearing loss threshold among noise-exposed workers. Two hundred and ten workers employed in a steel factory were chosen, and their occupational exposure histories were collected. To determine the hearing loss threshold, an audiometric test was carried out using a calibrated audiometer. Personal noise exposure was also measured using a noise dosimeter at the workers' workstations. Finally, data on five variables that can influence hearing loss were used to develop the prediction model. Multilayer feed-forward neural networks with different structures were developed using MATLAB software. The network structures had one hidden layer with between 5 and 15 neurons. The best-performing network, with one hidden layer and ten neurons, could accurately predict the hearing loss threshold with RMSE = 2.6 dB and R² = 0.89. The results also confirmed that neural networks provide more accurate predictions than multiple regression. Since occupational hearing loss is frequently non-curable, the results of accurate prediction can be used by occupational health experts to modify and improve noise exposure conditions.
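    The network search described here (single hidden layer, roughly 5-15 neurons, evaluated by RMSE and R²) can be sketched with scikit-learn rather than MATLAB; the synthetic predictors and hearing-threshold targets below are placeholders, not the study's exposure data.

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in data: five exposure-related predictors per worker
# and a hearing-threshold target in dB.
rng = np.random.default_rng(1)
X = rng.normal(size=(210, 5))
y = 25 + 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=2.5, size=210)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

best = None
for n_hidden in range(5, 16):      # single hidden layer, 5-15 neurons as in the abstract
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=2000, random_state=1),
    )
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    if best is None or rmse < best[1]:
        best = (n_hidden, rmse, r2_score(y_te, model.predict(X_te)))

print(f"best hidden size = {best[0]}, RMSE = {best[1]:.2f} dB, R2 = {best[2]:.2f}")
```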

  6. Past, present and prospect of an Artificial Intelligence (AI) based model for sediment transport prediction

    NASA Astrophysics Data System (ADS)

    Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher

    2016-10-01

    An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing accurate models. These approaches are considered a powerful tool for solving any non-linear model, as they can deal easily with large amounts of data and sophisticated models. This paper is a review of all AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, complementary models have proved superior to classical modelling approaches.

  7. Predicting vapor-liquid phase equilibria with augmented ab initio interatomic potentials

    NASA Astrophysics Data System (ADS)

    Vlasiuk, Maryna; Sadus, Richard J.

    2017-06-01

    The ability of ab initio interatomic potentials to accurately predict vapor-liquid phase equilibria is investigated. Monte Carlo simulations are reported for the vapor-liquid equilibria of argon and krypton using recently developed accurate ab initio interatomic potentials. Seventeen interatomic potentials are studied, formulated from different combinations of two-body plus three-body terms. The simulation results are compared to either experimental or reference data for conditions ranging from the triple point to the critical point. It is demonstrated that the use of ab initio potentials enables systematic improvements to the accuracy of predictions via the addition of theoretically based terms. The contribution of three-body interactions is accounted for using the Axilrod-Teller-Muto plus other multipole contributions and the effective Marcelli-Wang-Sadus potentials. The results indicate that the predictive ability of recent interatomic potentials, obtained from quantum chemical calculations, is comparable to that of accurate empirical models. It is demonstrated that the Marcelli-Wang-Sadus potential can be used in combination with accurate two-body ab initio models for the computationally inexpensive and accurate estimation of vapor-liquid phase equilibria.

  8. Predicting vapor-liquid phase equilibria with augmented ab initio interatomic potentials.

    PubMed

    Vlasiuk, Maryna; Sadus, Richard J

    2017-06-28

    The ability of ab initio interatomic potentials to accurately predict vapor-liquid phase equilibria is investigated. Monte Carlo simulations are reported for the vapor-liquid equilibria of argon and krypton using recently developed accurate ab initio interatomic potentials. Seventeen interatomic potentials are studied, formulated from different combinations of two-body plus three-body terms. The simulation results are compared to either experimental or reference data for conditions ranging from the triple point to the critical point. It is demonstrated that the use of ab initio potentials enables systematic improvements to the accuracy of predictions via the addition of theoretically based terms. The contribution of three-body interactions is accounted for using the Axilrod-Teller-Muto plus other multipole contributions and the effective Marcelli-Wang-Sadus potentials. The results indicate that the predictive ability of recent interatomic potentials, obtained from quantum chemical calculations, is comparable to that of accurate empirical models. It is demonstrated that the Marcelli-Wang-Sadus potential can be used in combination with accurate two-body ab initio models for the computationally inexpensive and accurate estimation of vapor-liquid phase equilibria.

  9. Assessing the accuracy of software predictions of mammalian and microbial metabolites

    EPA Science Inventory

    New chemical development and hazard assessments benefit from accurate predictions of mammalian and microbial metabolites. Fourteen biotransformation libraries encoded in eight software packages that predict metabolite structures were assessed for their sensitivity (proportion of ...

  10. Overview of Aerothermodynamic Loads Definition Study

    NASA Technical Reports Server (NTRS)

    Povinelli, L. A.

    1985-01-01

    The Aerothermodynamic Loads Definition study was conducted to develop methods to more accurately predict the operating environment in space shuttle main engine (SSME) components. Development of steady and time-dependent, three-dimensional viscous computer codes, along with experimental verification and engine diagnostic testing, is considered. The steady, nonsteady, and transient operating loads are defined to accurately predict powerhead life. Improvements in the structural durability of the SSME turbine drive systems depend on knowledge of the aerothermodynamic behavior of the flow through the preburner, turbine, turnaround duct, gas manifold, and injector post regions.

  11. Fat-free mass prediction equations for bioelectric impedance analysis compared to dual energy X-ray absorptiometry in obese adolescents: a validation study.

    PubMed

    Hofsteenge, Geesje H; Chinapaw, Mai J M; Weijs, Peter J M

    2015-10-15

    In clinical practice, patient-friendly methods to assess body composition in obese adolescents are needed. Therefore, bioelectrical impedance analysis (BIA) related fat-free mass (FFM) prediction equations (FFM-BIA) were evaluated in obese adolescents (age 11-18 years) against FFM measured by dual-energy X-ray absorptiometry (FFM-DXA), and a new population-specific FFM-BIA equation was developed. After an overnight fast, the subjects attended the outpatient clinic. After measuring height and weight, a full-body dual-energy X-ray absorptiometry (DXA) scan and a BIA measurement were performed. Thirteen predictive FFM-BIA equations based on weight, height, age, resistance, reactance and/or impedance were systematically selected and compared to FFM-DXA. The accuracy of the FFM-BIA equations was evaluated by the percentage of adolescents predicted within 5% of the measured FFM-DXA, the mean percentage difference between predicted and measured values (bias), and the Root Mean Squared prediction Error (RMSE). Multiple linear regression was conducted to develop a new BIA equation. Validation was based on 103 adolescents (60% girls), age 14.5 (sd 1.7) years, weight 94.1 (sd 15.6) kg and FFM-DXA of 56.1 (sd 9.8) kg. The percentage of accurate estimations varied between equations from 0 to 68%; bias ranged from -29.3 to +36.3% and RMSE ranged from 2.8 to 12.4 kg. An alternative prediction equation was developed: FFM = 0.527 * H(cm)²/Imp + 0.306 * weight - 1.862 (R² = 0.92, SEE = 2.85 kg). The percentage of accurate predictions was 76%. Compared to DXA, the Gray equation underestimated the FFM by 0.4 kg (55.7 ± 8.3), had an RMSE of 3.2 kg, 63% accurate predictions and the smallest bias (-0.1%). When split by sex, the Gray equation had the narrowest range in accurate predictions, bias, and RMSE. For the assessment of FFM with BIA, the Gray FFM equation appears to be the most accurate, but 63% is still not an acceptable level of accuracy for obese adolescents. The new equation appears to be appropriate but awaits further validation. DXA measurement remains the method of choice for FFM in obese adolescents. Netherlands Trial Register (ISRCTN27626398).
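    A direct transcription of the newly developed equation reported above; the height, weight, and impedance values in the usage example are hypothetical.

```python
def ffm_new_equation(height_cm: float, weight_kg: float, impedance_ohm: float) -> float:
    """Fat-free mass (kg) from the population-specific equation in the abstract:
    FFM = 0.527 * H(cm)^2 / Impedance + 0.306 * weight - 1.862
    """
    return 0.527 * height_cm**2 / impedance_ohm + 0.306 * weight_kg - 1.862

# Hypothetical example values for an obese adolescent:
ffm = ffm_new_equation(height_cm=170.0, weight_kg=94.0, impedance_ohm=520.0)
fat_mass = 94.0 - ffm
print(f"FFM = {ffm:.1f} kg, fat mass = {fat_mass:.1f} kg")
```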

  12. Development of an aerodynamic theory capable of predicting surface loads on slender wings with vortex flow

    NASA Technical Reports Server (NTRS)

    Gloss, B. B.; Johnson, F. T.

    1976-01-01

    The Boeing Commercial Airplane Company developed an inviscid three-dimensional lifting surface method that shows promise in being able to accurately predict loads, subsonic and supersonic, on wings with leading-edge separation and reattachment.

  13. Development and evaluation of the photoload sampling technique

    Treesearch

    Robert E. Keane; Laura J. Dickinson

    2007-01-01

    Wildland fire managers need better estimates of fuel loading so they can accurately predict potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents the development and evaluation of a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common...

  14. Psychosis prediction and clinical utility in familial high-risk studies: Selective review, synthesis, and implications for early detection and intervention

    PubMed Central

    Shah, Jai L.; Tandon, Neeraj; Keshavan, Matcheri S.

    2016-01-01

    Aim: Accurate prediction of which individuals will go on to develop psychosis would assist early intervention and prevention paradigms. We sought to review investigations of prospective psychosis prediction based on markers and variables examined in longitudinal familial high-risk (FHR) studies. Methods: We performed literature searches in MedLine, PubMed and PsycINFO for articles assessing performance characteristics of predictive clinical tests in FHR studies of psychosis. Studies were included if they reported one or more predictive variables in subjects at FHR for psychosis. We complemented this search strategy with references drawn from articles, reviews, book chapters and monographs. Results: Across generations of familial high-risk projects, predictive studies have investigated behavioral, cognitive, psychometric, clinical, neuroimaging, and other markers. Recent analyses have incorporated multivariate and multi-domain approaches to risk ascertainment, although with still generally modest results. Conclusions: While a broad range of risk factors has been identified, no individual marker or combination of markers can at this time enable accurate prospective prediction of emerging psychosis for individuals at FHR. We outline the complex and multi-level nature of psychotic illness, the myriad of factors influencing its development, and methodological hurdles to accurate and reliable prediction. Prospects and challenges for future generations of FHR studies are discussed in the context of early detection and intervention strategies. PMID:23693118

  15. SPATIAL PREDICTION USING COMBINED SOURCES OF DATA

    EPA Science Inventory

    For improved environmental decision-making, it is important to develop new models for spatial prediction that accurately characterize important spatial and temporal patterns of air pollution. As the U .S. Environmental Protection Agency begins to use spatial prediction in the reg...

  16. Accurate approximation method for prediction of class I MHC affinities for peptides of length 8, 10 and 11 using prediction tools trained on 9mers.

    PubMed

    Lundegaard, Claus; Lund, Ole; Nielsen, Morten

    2008-06-01

    Several accurate prediction systems have been developed for prediction of class I major histocompatibility complex (MHC):peptide binding. Most of these are trained on binding affinity data of primarily 9mer peptides. Here, we show how prediction methods trained on 9mer data can be used for accurate binding affinity prediction of peptides of length 8, 10 and 11. The method gives the opportunity to predict peptides with a different length than nine for MHC alleles where no such peptides have been measured. As validation, the performance of this approach is compared to predictors trained on peptides of the peptide length in question. In this validation, the approximation method has an accuracy that is comparable to or better than methods trained on a peptide length identical to the predicted peptides. The algorithm has been implemented in the web-accessible servers NetMHC-3.0: http://www.cbs.dtu.dk/services/NetMHC-3.0, and NetMHCpan-1.1: http://www.cbs.dtu.dk/services/NetMHCpan-1.1

  17. Effective prediction of biodiversity in tidal flat habitats using an artificial neural network.

    PubMed

    Yoo, Jae-Won; Lee, Yong-Woo; Lee, Chang-Gun; Kim, Chang-Soo

    2013-02-01

    Accurate predictions of benthic macrofaunal biodiversity greatly benefit the efficient planning and management of habitat restoration efforts in tidal flat habitats. Artificial neural network (ANN) prediction models for such biodiversity were developed and tested based on 13 biophysical variables collected from 50 tidal flat sites along the coast of Korea during 1991-2006. The developed model showed high prediction performance during training, cross-validation and testing. Besides the training and testing procedures, an independent dataset from a different time period (2007-2010) was used to test the robustness and practical usage of the model. High prediction performance on the independent dataset (r = 0.84) validated the network's proper learning of the predictive relationship and its generality. Key influential variables identified by follow-up sensitivity analyses were related to topographic dimension, environmental heterogeneity, and water column properties. The study demonstrates the successful application of ANNs for the accurate prediction of benthic macrofaunal biodiversity and for understanding the dynamics of candidate variables. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. 3D gut-liver chip with a PK model for prediction of first-pass metabolism.

    PubMed

    Lee, Dong Wook; Ha, Sang Keun; Choi, Inwook; Sung, Jong Hwan

    2017-11-07

    Accurate prediction of first-pass metabolism is essential for improving the time and cost efficiency of the drug development process. Here, we have developed a microfluidic gut-liver co-culture chip that aims to reproduce the first-pass metabolism of oral drugs. This chip consists of two separate layers for gut (Caco-2) and liver (HepG2) cell lines, where cells can be co-cultured in both 2D and 3D forms. Both cell lines were maintained well in the chip, as verified by confocal microscopy and measurement of hepatic enzyme activity. We investigated the PK profile of paracetamol in the chip, and a corresponding PK model was constructed and used to predict PK profiles for different chip design parameters. Simulation results implied that a larger absorption surface area and a higher metabolic capacity are required to reproduce the in vivo PK profile of paracetamol more accurately. Our study suggests the possibility of reproducing the human PK profile on a chip, contributing to accurate prediction of the pharmacological effects of drugs.
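    A toy compartmental sketch of a first-pass PK model (gut lumen, liver, systemic medium); the model structure and all rate constants are illustrative assumptions and do not reproduce the chip's fitted parameters or the paracetamol data.

```python
import numpy as np
from scipy.integrate import solve_ivp

def first_pass(t, y, ka, q, cl_int, v_liver, v_sys):
    """dy/dt for a minimal gut -> liver -> systemic first-pass model."""
    gut_lumen, liver, systemic = y
    absorbed = ka * gut_lumen                   # absorption across the gut cell layer
    metabolized = cl_int * liver / v_liver      # hepatic intrinsic clearance
    d_gut = -absorbed
    d_liver = absorbed - metabolized - q * (liver / v_liver - systemic / v_sys)
    d_sys = q * (liver / v_liver - systemic / v_sys)
    return [d_gut, d_liver, d_sys]

dose = 100.0                                     # arbitrary units
sol = solve_ivp(first_pass, (0, 24), [dose, 0.0, 0.0],
                args=(0.5, 2.0, 1.5, 1.0, 5.0), dense_output=True)
t = np.linspace(0, 24, 100)
c_sys = sol.sol(t)[2] / 5.0                      # systemic concentration profile
print(f"peak systemic concentration ~ {c_sys.max():.2f}")
```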

  19. Type- and Subtype-Specific Influenza Forecast.

    PubMed

    Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey

    2017-03-01

    Prediction of the growth and decline of infectious disease incidence has advanced considerably in recent years. As these forecasts improve, their public health utility should increase, particularly as interventions are developed that make explicit use of forecast information. It is the task of the research community to increase the content and improve the accuracy of these infectious disease predictions. Presently, operational real-time forecasts of total influenza incidence are produced at the municipal and state level in the United States. These forecasts are generated using ensemble simulations depicting local influenza transmission dynamics, which have been optimized prior to forecast with observations of influenza incidence and data assimilation methods. Here, we explore whether forecasts targeted to predict influenza by type and subtype during 2003-2015 in the United States were more or less accurate than forecasts targeted to predict total influenza incidence. We found that forecasts separated by type/subtype generally produced more accurate predictions and, when summed, produced more accurate predictions of total influenza incidence. These findings indicate that monitoring influenza by type and subtype not only provides more detailed observational content but supports more accurate forecasting. More accurate forecasting can help officials better respond to and plan for current and future influenza activity. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Studying Individual Differences in Predictability with Gamma Regression and Nonlinear Multilevel Models

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew

    2010-01-01

    Statistical prediction remains an important tool for decisions in a variety of disciplines. An equally important issue is identifying factors that contribute to more or less accurate predictions. The time series literature includes well developed methods for studying predictability and volatility over time. This article develops…

  1. Developing a case-mix model for PPS.

    PubMed

    Goldberg, H B; Delargy, D

    2000-01-01

    Agencies are pinning hopes for success under PPS on an accurate case-mix adjustor. The Health Care Financing Administration (HCFA) tasked Abt Associates Inc. to develop a system to accurately predict the volume and type of home health services each patient requires, based on his or her characteristics (not the service actually received). HCFA wanted this system to be feasible, clinically logical, and valid and accurate. Authors Goldberg and Delargy explain how Abt approached this daunting task.

  2. Reliability of Degree-Day Models to Predict the Development Time of Plutella xylostella (L.) under Field Conditions.

    PubMed

    Marchioro, C A; Krechemer, F S; de Moraes, C P; Foerster, L A

    2015-12-01

    The diamondback moth, Plutella xylostella (L.), is a cosmopolitan pest of brassicaceous crops occurring in regions with highly distinct climate conditions. Several studies have investigated the relationship between temperature and P. xylostella development rate, providing degree-day models for populations from different geographical regions. However, there are no data available to date to demonstrate the suitability of such models to make reliable projections on the development time for this species in field conditions. In the present study, 19 models available in the literature were tested regarding their ability to accurately predict the development time of two cohorts of P. xylostella under field conditions. Only 11 out of the 19 models tested accurately predicted the development time for the first cohort of P. xylostella, but only seven for the second cohort. Five models correctly predicted the development time for both cohorts evaluated. Our data demonstrate that the accuracy of the models available for P. xylostella varies widely and therefore should be used with caution for pest management purposes.
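    For context, a degree-day model of the kind evaluated here typically accumulates daily thermal units above a base temperature until a thermal constant is reached; the base temperature, thermal constant, and weather trace in this sketch are illustrative assumptions, not values from the study.

```python
def daily_degree_days(t_min: float, t_max: float, t_base: float) -> float:
    """Average-method degree-day accrual for one day."""
    return max(0.0, (t_min + t_max) / 2.0 - t_base)

def predict_development_time(daily_min_max, t_base, thermal_constant):
    """Return the day on which accumulated degree-days reach the thermal constant
    (degree-days required to complete development), or None if never reached."""
    accumulated = 0.0
    for day, (t_min, t_max) in enumerate(daily_min_max, start=1):
        accumulated += daily_degree_days(t_min, t_max, t_base)
        if accumulated >= thermal_constant:
            return day
    return None

# Illustrative values only: base temperature 7 C, thermal constant 280 degree-days.
weather = [(12.0, 24.0), (13.5, 26.0), (11.0, 22.5)] * 20
print(predict_development_time(weather, t_base=7.0, thermal_constant=280.0))
```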

  3. Can phenological models predict tree phenology accurately under climate change conditions?

    NASA Astrophysics Data System (ADS)

    Chuine, Isabelle; Bonhomme, Marc; Legave, Jean Michel; García de Cortázar-Atauri, Inaki; Charrier, Guillaume; Lacointe, André; Améglio, Thierry

    2014-05-01

    The onset of the growing season of trees has globally advanced by 2.3 days/decade during the last 50 years because of global warming, and this trend is predicted to continue according to climate forecasts. The effect of temperature on plant phenology is, however, not linear because temperature has a dual effect on bud development. On one hand, low temperatures are necessary to break bud dormancy, and on the other hand higher temperatures are necessary to promote bud cell growth afterwards. Increasing phenological changes in temperate woody species have strong impacts on forest tree distribution and productivity, as well as on crop cultivation areas. Accurate predictions of tree phenology are therefore a prerequisite to understand and foresee the impacts of climate change on forests and agrosystems. Different process-based models have been developed in the last two decades to predict the date of budburst or flowering of woody species. There are two main families: (1) one-phase models, which consider only the ecodormancy phase and assume that endodormancy is always broken before adequate climatic conditions for cell growth occur; and (2) two-phase models, which consider both the endodormancy and ecodormancy phases and predict a date of dormancy break that varies from year to year. So far, one-phase models have been able to predict tree bud break and flowering accurately under historical climate. However, because they do not consider what happens prior to ecodormancy, and especially the possible negative effect of winter temperature warming on dormancy break, it seems unlikely that they can provide accurate predictions under future climate conditions. It is indeed well known that a lack of low temperature results in abnormal patterns of bud break and development in temperate fruit trees. Accurate modelling of the dormancy break date has thus become a major issue in phenology modelling. Two-phase phenological models predict that global warming should delay or compromise dormancy break at the species' equatorward range limits, leading to a delay in, or even the impossibility of, flowering or setting new leaves. These models are classically parameterized with flowering or budburst dates only, with no information on the dormancy break date, because this information is very scarce. We evaluated the efficiency of a set of process-based phenological models at accurately predicting the dormancy break dates of four fruit trees. Our results show that models calibrated solely with flowering or budburst dates do not accurately predict the dormancy break date. Providing the dormancy break date for model parameterization results in much more accurate simulation of the latter, with, however, a higher error than that on flowering or bud break dates. Most importantly, we also show that models not calibrated with dormancy break dates can generate significant differences in forecasted flowering or bud break dates when using climate scenarios. Our results highlight the urgent need for extensive measurements of dormancy break dates in forest and fruit trees to yield more robust projections of phenological changes in the near future.
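    A minimal two-phase (chilling then forcing) phenology sketch of the model family discussed in this record; the chilling and forcing thresholds and the temperature trace are illustrative assumptions only, not a calibrated model for any species.

```python
def predict_budburst(daily_mean_temps, t_chill=5.0, chill_req=60.0,
                     t_base=5.0, forcing_req=150.0):
    """Minimal two-phase (endodormancy + ecodormancy) phenology model.

    Phase 1: accumulate one chilling unit per day with mean temperature below
    t_chill until chill_req is reached (dormancy break).
    Phase 2: accumulate forcing units (degree-days above t_base) until
    forcing_req is reached (budburst).
    Returns (dormancy_break_day, budburst_day), with None if not reached.
    """
    chill, forcing = 0.0, 0.0
    dormancy_break = None
    for day, t in enumerate(daily_mean_temps, start=1):
        if dormancy_break is None:
            chill += 1.0 if t < t_chill else 0.0
            if chill >= chill_req:
                dormancy_break = day
        else:
            forcing += max(0.0, t - t_base)
            if forcing >= forcing_req:
                return dormancy_break, day
    return dormancy_break, None

# Illustrative winter-to-spring daily mean temperature trace (not real data):
temps = [2.0] * 80 + [4.0 + 0.2 * i for i in range(120)]
print(predict_budburst(temps))
```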

  4. Extension of Ko Straight-Beam Displacement Theory to Deformed Shape Predictions of Slender Curved Structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Fleischer, Van Tran

    2011-01-01

    The Ko displacement theory originally developed for shape predictions of straight beams is extended to shape predictions of curved beams. The surface strains needed for shape predictions were analytically generated from finite-element nodal stress outputs. With the aid of finite-element displacement outputs, mathematical functional forms for curvature-effect correction terms are established and incorporated into straight-beam deflection equations for shape predictions of both cantilever and two-point supported curved beams. The newly established deflection equations for cantilever curved beams could provide quite accurate shape predictions for different cantilever curved beams, including the quarter-circle cantilever beam. Furthermore, the newly formulated deflection equations for two-point supported curved beams could provide accurate shape predictions for a range of two-point supported curved beams, including the full-circular ring. Accuracy of the newly developed curved-beam deflection equations is validated through shape prediction analysis of curved beams embedded in the windward shallow spherical shell of a generic crew exploration vehicle. A single-point collocation method for optimization of shape predictions is discussed in detail.

  5. Relationship between the Prediction Accuracy of Tsunami Inundation and Relative Distribution of Tsunami Source and Observation Arrays: A Case Study in Tokyo Bay

    NASA Astrophysics Data System (ADS)

    Takagawa, T.

    2017-12-01

    A rapid and precise tsunami forecast based on offshore monitoring is attracting attention as a way to reduce human losses from devastating tsunami inundation. We developed a forecast method based on the combination of hierarchical Bayesian inversion with a pre-computed database and rapid post-computing of tsunami inundation. The method was applied to Tokyo Bay to evaluate the efficiency of observation arrays against three tsunamigenic earthquakes: a scenario earthquake at the Nankai trough and the historical Genroku (1703) and Enpo (1677) earthquakes. In general, a dense observation array near the tsunami source improves both the accuracy and the speed of a tsunami forecast. To examine the effect of observation time length, we used four data lengths of 5, 10, 20, and 45 minutes after earthquake occurrence. Prediction accuracy of tsunami inundation was evaluated against the simulated tsunami inundation areas around Tokyo Bay for the target earthquakes. The shortest time length needed for accurate prediction varied among the target earthquakes; here, accurate prediction means that the simulated values fall within the 95% credible intervals of the prediction. In the Enpo case, a 5-minute observation is sufficient for accurate prediction in Tokyo Bay, whereas 10 minutes and 45 minutes are needed for the Nankai trough and Genroku cases, respectively. These differences in the shortest time length for accurate prediction show a strong relationship with the relative distance between the tsunami source and the observation arrays. In the Enpo case, offshore tsunami observation points are densely distributed even within the source region, so accurate prediction can be achieved within 5 minutes; such rapid, precise prediction is useful for early warnings. Even in the worst case, Genroku, where fewer observation points are available near the source, accurate prediction can be obtained within 45 minutes, which is still useful for outlining the hazard at an early stage of the response.

  6. Predicting post-fire tree mortality for 12 western US conifers using the First-Order Fire Effects Model (FOFEM)

    Treesearch

    Sharon Hood; Duncan Lutes

    2017-01-01

    Accurate prediction of fire-caused tree mortality is critical for making sound land management decisions such as developing burning prescriptions and post-fire management guidelines. To improve efforts to predict post-fire tree mortality, we developed 3-year post-fire mortality models for 12 Western conifer species - white fir (Abies concolor [Gord. &...

  7. A computationally efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Dini, Paolo; Maughmer, Mark D.

    1989-01-01

    The goal is to accurately predict the characteristics of the laminar separation bubble and its effects on airfoil performance. Toward this end, a computational model of the separation bubble was developed and incorporated into the Eppler and Somers airfoil design and analysis program. Thus far, the focus of the research has been limited to developing a model that can accurately predict situations in which the interaction between the bubble and the inviscid velocity distribution is weak, the so-called short bubble. A summary of the research performed in the past nine months is presented. The bubble model in its present form is then described. Lastly, the performance of this model in predicting bubble characteristics is shown for a few cases.

  8. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also developed a theoretical argument for the effectiveness of these markers in trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection based on FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continued search for better alternatives is encouraged to enhance marker-based predictions for individual quantitative traits in molecular plant breeding. PMID:28729875
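
    For context, the regular genomic-selection baseline that trait-specific (FAST) marker sets are typically compared against can be sketched as a ridge regression of a quantitative trait on SNP genotypes. The genotypes, trait values, and hyperparameters below are synthetic placeholders, not data from the study.

      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_lines, n_snps = 200, 1000
      genotypes = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)  # 0/1/2 allele counts
      true_effects = np.zeros(n_snps)
      true_effects[:20] = rng.normal(0.0, 0.5, 20)          # a few causal markers
      trait = genotypes @ true_effects + rng.normal(0.0, 1.0, n_lines)

      model = Ridge(alpha=100.0)                            # shrinkage over all markers
      accuracy = cross_val_score(model, genotypes, trait, cv=5, scoring="r2")
      print("cross-validated R^2:", accuracy.mean())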

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wosnik, Martin; Bachant, Pete; Neary, Vincent Sinclair

    CACTUS, developed by Sandia National Laboratories, is an open-source code for the design and analysis of wind and hydrokinetic turbines. While it has undergone extensive validation for both vertical axis and horizontal axis wind turbines, and it has been demonstrated to accurately predict the performance of horizontal (axial-flow) hydrokinetic turbines, its ability to predict the performance of crossflow hydrokinetic turbines has yet to be tested. The present study addresses this problem by comparing the predicted performance curves derived from CACTUS simulations of the U.S. Department of Energy’s 1:6 scale reference model crossflow turbine to those derived by experimental measurements in a tow tank using the same model turbine at the University of New Hampshire. It shows that CACTUS cannot accurately predict the performance of this crossflow turbine, raising concerns on its application to crossflow hydrokinetic turbines generally. The lack of quality data on NACA 0021 foil aerodynamic (hydrodynamic) characteristics over the wide range of angles of attack (AoA) and Reynolds numbers is identified as the main cause for poor model prediction. A comparison of several different NACA 0021 foil data sources, derived using both physical and numerical modeling experiments, indicates significant discrepancies at the high AoA experienced by foils on crossflow turbines. Users of CACTUS for crossflow hydrokinetic turbines are, therefore, advised to limit its application to higher tip speed ratios (lower AoA), and to carefully verify the reliability and accuracy of their foil data. Accurate empirical data on the aerodynamic characteristics of the foil is the greatest limitation to predicting performance for crossflow turbines with semi-empirical models like CACTUS. Future improvements of CACTUS for crossflow turbine performance prediction will require the development of accurate foil aerodynamic characteristic data sets within the appropriate ranges of Reynolds numbers and AoA.

  10. Studies of HZE particle interactions and transport for space radiation protection purposes

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Wilson, John W.; Schimmerling, Walter; Wong, Mervyn

    1987-01-01

    The main emphasis is on developing general methods for accurately predicting high-energy heavy ion (HZE) particle interactions and transport for use by researchers in mission planning studies, in evaluating astronaut self-shielding factors, and in spacecraft shield design and optimization studies. The two research tasks are: (1) to develop computationally fast and accurate solutions to the Boltzmann (transport) equation; and (2) to develop accurate HZE interaction models, from fundamental physical considerations, for use as inputs into these transport codes. Accurate solutions to the HZE transport problem have been formulated through a combination of analytical and numerical techniques. In addition, theoretical models for the input interaction parameters are under development: stopping powers, nuclear absorption cross sections, and fragmentation parameters.

  11. Genome-wide prediction and analysis of human tissue-selective genes using microarray expression data

    PubMed Central

    2013-01-01

    Background Understanding how genes are expressed specifically in particular tissues is a fundamental question in developmental biology. Many tissue-specific genes are involved in the pathogenesis of complex human diseases. However, experimental identification of tissue-specific genes is time consuming and difficult. The accurate predictions of tissue-specific gene targets could provide useful information for biomarker development and drug target identification. Results In this study, we have developed a machine learning approach for predicting the human tissue-specific genes using microarray expression data. The lists of known tissue-specific genes for different tissues were collected from UniProt database, and the expression data retrieved from the previously compiled dataset according to the lists were used for input vector encoding. Random Forests (RFs) and Support Vector Machines (SVMs) were used to construct accurate classifiers. The RF classifiers were found to outperform SVM models for tissue-specific gene prediction. The results suggest that the candidate genes for brain or liver specific expression can provide valuable information for further experimental studies. Our approach was also applied for identifying tissue-selective gene targets for different types of tissues. Conclusions A machine learning approach has been developed for accurately identifying the candidate genes for tissue specific/selective expression. The approach provides an efficient way to select some interesting genes for developing new biomedical markers and improve our knowledge of tissue-specific expression. PMID:23369200
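
    A minimal sketch of the classification step described above, using a Random Forest on expression profiles. The data here are synthetic stand-ins; in the study the features are microarray expression values and the labels come from UniProt tissue-specificity annotations.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_genes, n_tissues = 500, 30
      expression = rng.normal(0.0, 1.0, size=(n_genes, n_tissues))
      labels = np.zeros(n_genes, dtype=int)
      labels[:100] = 1                                # pretend "tissue-specific" genes
      expression[:100, 0] += 3.0                      # elevated expression in one tissue

      clf = RandomForestClassifier(n_estimators=300, random_state=0)
      scores = cross_val_score(clf, expression, labels, cv=5, scoring="roc_auc")
      print("cross-validated AUC:", scores.mean())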

  12. A Battery Health Monitoring Framework for Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2014-01-01

    Batteries have seen an increased use in electric ground and air vehicles for commercial, military, and space applications as the primary energy source. An important aspect of using batteries in such contexts is battery health monitoring. Batteries must be carefully monitored such that the battery health can be determined, and end of discharge and end of usable life events may be accurately predicted. For planetary rovers, battery health estimation and prediction is critical to mission planning and decision-making. We develop a model-based approach utilizing computationally efficient and accurate electrochemistry models of batteries. An unscented Kalman filter yields state estimates, which are then used to predict the future behavior of the batteries and, specifically, end of discharge. The prediction algorithm accounts for possible future power demands on the rover batteries in order to provide meaningful results and an accurate representation of prediction uncertainty. The framework is demonstrated on a set of lithium-ion batteries powering a rover at NASA.
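
    The end-of-discharge prediction idea can be sketched as below, with a plain charge-counting model standing in for the electrochemistry model and filter used in the paper; the pack parameters and constant load profile are hypothetical.

      def predict_end_of_discharge(soc_estimate, capacity_ah, nominal_voltage,
                                   future_power_w, dt_s=1.0, soc_cutoff=0.05):
          """Integrate an assumed future power demand until the state of charge
          reaches the cutoff; return predicted time to EOD in seconds (or None)."""
          soc = soc_estimate
          for step, power in enumerate(future_power_w):
              current = power / nominal_voltage                 # amps (crude approximation)
              soc -= current * dt_s / 3600.0 / capacity_ah      # Ah drawn over capacity
              if soc <= soc_cutoff:
                  return (step + 1) * dt_s
          return None

      # Example: hypothetical 20 Ah pack at 28 V, estimated 60% SOC, constant 300 W rover load.
      print(predict_end_of_discharge(0.60, 20.0, 28.0, [300.0] * 100000))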

  13. Prediction of retention times in comprehensive two-dimensional gas chromatography using thermodynamic models.

    PubMed

    McGinitie, Teague M; Harynuk, James J

    2012-09-14

    A method was developed to accurately predict both the primary and secondary retention times for a series of alkanes, ketones and alcohols in a flow-modulated GC×GC system. This was accomplished through the use of a three-parameter thermodynamic model where ΔH, ΔS, and ΔC(p) for an analyte's interaction with the stationary phases in both dimensions are known. By coupling this thermodynamic model with a time-summation calculation, it was possible to accurately predict both (1)t(r) and (2)t(r) for all analytes. The model was able to predict retention times regardless of the temperature ramp used, with an average error of only 0.64% for (1)t(r) and an average error of only 2.22% for (2)t(r). The model shows promise for the accurate prediction of retention times in GC×GC for a wide range of compounds and is able to utilize data collected from 1D experiments. Copyright © 2012 Elsevier B.V. All rights reserved.
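
    A sketch of how a three-parameter thermodynamic model can be turned into a retention-time prediction by time summation over a temperature program. The parameter values, phase ratio, and constant hold-up time are illustrative simplifications, not the study's calibration.

      import math

      R = 8.314  # J/(mol K)

      def retention_factor(T, dH, dS, dCp, T_ref=373.15, phase_ratio=250.0):
          """k(T) from dH (J/mol), dS (J/mol/K), dCp (J/mol/K) referenced to T_ref."""
          dH_T = dH + dCp * (T - T_ref)
          dS_T = dS + dCp * math.log(T / T_ref)
          dG_T = dH_T - T * dS_T
          return math.exp(-dG_T / (R * T)) / phase_ratio

      def retention_time(dH, dS, dCp, T0=313.15, ramp=10.0 / 60.0, hold_up_s=30.0, dt=0.1):
          """Sum migration through a temperature ramp (K/s) until the analyte elutes."""
          t, x = 0.0, 0.0   # elapsed time (s), fraction of column traversed
          while x < 1.0:
              T = T0 + ramp * t
              k = retention_factor(T, dH, dS, dCp)
              x += dt / (hold_up_s * (1.0 + k))
              t += dt
          return t

      print(retention_time(dH=-60000.0, dS=-120.0, dCp=-50.0))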

  14. Resting Energy Expenditure Prediction in Recreational Athletes of 18–35 Years: Confirmation of Cunningham Equation and an Improved Weight-Based Alternative

    PubMed Central

    ten Haaf, Twan; Weijs, Peter J. M.

    2014-01-01

    Introduction Resting energy expenditure (REE) is expected to be higher in athletes because of their relatively high fat-free mass (FFM). Therefore, an REE predictive equation specific to recreational athletes may be required. The aim of this study was to validate existing REE predictive equations and to develop a new equation specific to recreational athletes. Methods 90 (53M, 37F) adult athletes, exercising on average 9.1±5.0 hours a week and 5.0±1.8 times a week, were included. REE was measured using indirect calorimetry (Vmax Encore n29); FFM and FM were measured using air displacement plethysmography. Multiple linear regression analysis was used to develop new FFM-based and weight-based REE predictive equations. The percentage of accurate predictions (within 10% of measured REE), percentage bias, root mean square error, and limits of agreement were calculated. Results The Cunningham equation, the new weight-based equation, and the new FFM-based equation performed equally well. De Lorenzo's equation predicted REE less accurately, but better than the other generally used REE predictive equations. Harris-Benedict, WHO, Schofield, Mifflin, and Owen all showed less than 50% accuracy. Conclusion For a population of (Dutch) recreational athletes, REE can be accurately predicted with the existing Cunningham equation. Since body composition measurement is not always possible, and other generally used equations fail, the new weight-based equation is advised for use in sports nutrition. PMID:25275434
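
    For reference, the Cunningham equation validated above and the within-10%-of-measured accuracy criterion can be expressed as follows; the FFM and measured REE values in the example are hypothetical, and the study's new weight-based equation is not reproduced here.

      def cunningham_ree(ffm_kg):
          """Resting energy expenditure (kcal/day) from fat-free mass (kg), Cunningham equation."""
          return 500.0 + 22.0 * ffm_kg

      def is_accurate(predicted, measured, tolerance=0.10):
          """Accurate prediction criterion: within 10% of the measured REE."""
          return abs(predicted - measured) <= tolerance * measured

      measured_ree = 1950.0                      # kcal/day, hypothetical indirect-calorimetry value
      predicted_ree = cunningham_ree(ffm_kg=65.0)
      print(predicted_ree, is_accurate(predicted_ree, measured_ree))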

  15. Models for predicting fuel consumption in sagebrush-dominated ecosystems

    Treesearch

    Clinton S. Wright

    2013-01-01

    Fuel consumption predictions are necessary to accurately estimate or model fire effects, including pollutant emissions during wildland fires. Fuel and environmental measurements on a series of operational prescribed fires were used to develop empirical models for predicting fuel consumption in big sagebrush (Artemisia tridentata Nutt.) ecosystems....

  16. Predicting intensity ranks of peptide fragment ions.

    PubMed

    Frank, Ari M

    2009-05-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal multiple reaction monitoring (MRM) transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html.
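
    The ranking idea can be illustrated with a generic boosted regressor over simple sequence-derived features. The features, data, and model below are synthetic stand-ins rather than the PepNovo+ feature set or its boosting formulation.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)
      n_fragments, n_features = 2000, 12
      features = rng.normal(size=(n_fragments, n_features))   # e.g. position, charge, residue flags
      scores = features[:, 0] + 0.5 * features[:, 1] + rng.normal(0.0, 0.3, n_fragments)
      ranks = scores.argsort().argsort()                       # toy global ranks standing in for
                                                               # within-spectrum intensity ranks

      X_train, X_test, y_train, y_test = train_test_split(features, ranks, random_state=0)
      model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
      model.fit(X_train, y_train)
      print("held-out R^2 on ranks:", model.score(X_test, y_test))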

  17. Predicting Intensity Ranks of Peptide Fragment Ions

    PubMed Central

    Frank, Ari M.

    2009-01-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal MRM transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html. PMID:19256476

  18. Mental models accurately predict emotion transitions.

    PubMed

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  19. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
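
    A minimal one-dimensional sketch of the unscented transform described above: sigma points of the current state distribution are propagated through a nonlinear function, and the transformed mean and variance are recovered from weighted sums. The nonlinear function here is an arbitrary stand-in for the EOL simulation, and the sigma-point parameter kappa is illustrative.

      import numpy as np

      def unscented_transform_1d(mean, var, func, kappa=2.0):
          """Propagate a 1-D Gaussian (mean, var) through func via sigma points."""
          n = 1
          spread = np.sqrt((n + kappa) * var)
          sigma_points = np.array([mean, mean + spread, mean - spread])
          weights = np.array([kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)])
          transformed = np.array([func(x) for x in sigma_points])
          t_mean = np.dot(weights, transformed)
          t_var = np.dot(weights, (transformed - t_mean) ** 2)
          return t_mean, t_var

      # Stand-in "EOL simulation": nonlinear mapping from a damage state to remaining life (hours).
      eol = lambda damage: 1000.0 * np.exp(-3.0 * damage)
      print(unscented_transform_1d(mean=0.2, var=0.01, func=eol, kappa=2.0))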

  20. Analysis of Material Sample Heated by Impinging Hot Hydrogen Jet in a Non-Nuclear Tester

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Foote, John; Litchford, Ron

    2006-01-01

    A computational conjugate heat transfer methodology was developed and anchored with data obtained from a hot-hydrogen-jet heated, non-nuclear materials tester, as a first step towards developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core nuclear thermal engine thrust chamber. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective, thermal radiative, and conjugate heat transfer. Predicted hot-hydrogen jet and material surface temperatures were compared with measurements. Predicted solid temperatures were compared with those obtained with a standard heat transfer code. The interrogation of the physics revealed that hydrogen dissociation and recombination reactions are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.

  1. A Micromechanics-Based Method for Multiscale Fatigue Prediction

    NASA Astrophysics Data System (ADS)

    Moore, John Allan

    An estimated 80% of all structural failures are due to mechanical fatigue, often resulting in catastrophic, dangerous, and costly failure events. However, an accurate model to predict fatigue remains an elusive goal. One of the major challenges is that fatigue is intrinsically a multiscale process, which depends on a structure's geometric design as well as its material's microscale morphology. The following work begins with a microscale study of fatigue nucleation around non-metallic inclusions. Based on this analysis, a novel multiscale method for fatigue prediction is developed. This method simulates macroscale geometries explicitly while concurrently calculating the simplified response of microscale inclusions, thus providing adequate detail on multiple scales for accurate fatigue life predictions. The methods herein provide insight into the multiscale nature of fatigue, while also developing a tool to aid in geometric design and material optimization for fatigue-critical devices such as biomedical stents and artificial heart valves.

  2. Evaluation of a microwave resonator for predicting grain moisture independent of bulk density

    USDA-ARS?s Scientific Manuscript database

    This work evaluated the ability of a planar whispering mode resonator to predict moisture considering moisture and densities expected in an on-harvester application. A calibration model was developed to accurately predict moisture over the moisture, density and temperature ranges evaluated. This mod...

  3. Prediction of Recidivism in Juvenile Offenders Based on Discriminant Analysis.

    ERIC Educational Resources Information Center

    Proefrock, David W.

    The recent development of strong statistical techniques has made accurate predictions of recidivism possible. To investigate the utility of discriminant analysis methodology in making predictions of recidivism in juvenile offenders, the court records of 271 male and female juvenile offenders, aged 12-16, were reviewed. A cross validation group…

  4. GlobalSoilMap France: High-resolution spatial modelling the soils of France up to two meter depth.

    PubMed

    Mulder, V L; Lacoste, M; Richer-de-Forges, A C; Arrouays, D

    2016-12-15

    This work presents the first GlobalSoilMap (GSM) products for France. We developed an automatic procedure for mapping the primary soil properties (clay, silt, sand, coarse elements, pH, soil organic carbon (SOC), cation exchange capacity (CEC) and soil depth). The procedure employed a data-mining technique and a straightforward method for estimating the 90% confidence intervals (CIs). The most accurate models were obtained for pH, sand and silt. Next, CEC, clay and SOC were predicted with reasonable accuracy. Coarse elements and soil depth were the least accurately predicted of all. Overall, all models were considered robust; important indicators for this were 1) the small difference in model diagnostics between the calibration and cross-validation sets, 2) the unbiased mean predictions, 3) the smaller spatial structure of the prediction residuals in comparison to the observations, and 4) the similar performance compared to other developed GlobalSoilMap products. Nevertheless, the confidence intervals (CIs) were rather wide for all soil properties. The median predictions became less reliable with increasing depth, as indicated by the increase of CIs with depth. In addition, model accuracy and the corresponding CIs varied with the soil variable of interest, soil depth and geographic location. These findings indicate that the CIs are as informative as the model diagnostics. In conclusion, the presented method resulted in reasonably accurate predictions for the majority of the soil properties. End users can employ the products for different purposes, as was demonstrated with some practical examples. The mapping routine is flexible for cloud computing and provides ample opportunity to be further developed when desired by its users. This allows regional and international GSM partners with fewer resources to develop their own products or, otherwise, to improve the current routine and work together towards a robust high-resolution digital soil map of the world. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing a joint distribution that takes into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.

  6. Predicting perturbation patterns from the topology of biological networks.

    PubMed

    Santolini, Marc; Barabási, Albert-László

    2018-06-20

    High-throughput technologies, offering an unprecedented wealth of quantitative data underlying the makeup of living systems, are changing biology. Notably, the systematic mapping of the relationships between biochemical entities has fueled the rapid development of network biology, offering a suitable framework to describe disease phenotypes and predict potential drug targets. However, our ability to develop accurate dynamical models remains limited, due in part to the limited knowledge of the kinetic parameters underlying these interactions. Here, we explore the degree to which we can make reasonably accurate predictions in the absence of the kinetic parameters. We find that simple dynamically agnostic models are sufficient to recover the strength and sign of the biochemical perturbation patterns observed in 87 biological models for which the underlying kinetics are known. Surprisingly, a simple distance-based model achieves 65% accuracy. We show that this predictive power is robust to topological and kinetic parameter perturbations, and we identify key network properties that can increase up to 80% the recovery rate of the true perturbation patterns. We validate our approach using experimental data on the chemotactic pathway in bacteria, finding that a network model of perturbation spreading predicts with ∼80% accuracy the directionality of gene expression and phenotype changes in knock-out and overproduction experiments. These findings show that the steady advances in mapping out the topology of biochemical interaction networks open avenues for accurate perturbation spread modeling, with direct implications for medicine and drug development.
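
    In the spirit of the distance-based model mentioned above, a toy sketch might score the impact of perturbing one node on every other node by a decay over shortest-path distance. The graph and decay factor are illustrative, not the study's model.

      import networkx as nx

      def perturbation_impact(graph, source, decay=0.5):
          """Predicted relative impact of perturbing `source` on every other node,
          decaying with network (shortest-path) distance."""
          lengths = nx.single_source_shortest_path_length(graph, source)
          return {node: decay ** dist for node, dist in lengths.items() if node != source}

      g = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D")])
      print(perturbation_impact(g, "A"))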

  7. Expediting SRM assay development for large-scale targeted proteomics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chaochao; Shi, Tujin; Brown, Joseph N.

    2014-08-22

    Due to their high sensitivity and specificity, targeted proteomics measurements, e.g. selected reaction monitoring (SRM), are becoming increasingly popular for biological and translational applications. Selection of optimal transitions and optimization of collision energy (CE) are important assay development steps for achieving sensitive detection and accurate quantification; however, these steps can be labor-intensive, especially for large-scale applications. Herein, we explored several options for accelerating SRM assay development, evaluated in the context of a relatively large set of 215 synthetic peptide targets. We first showed that HCD fragmentation is very similar to CID in triple quadrupole (QQQ) instrumentation, and that by selecting the top six y fragment ions from HCD spectra, >86% of the top transitions optimized by direct infusion on a QQQ instrument are covered. We also demonstrated that the CE calculated by existing prediction tools was less accurate for +3 precursors, and that a significant increase in transition intensity could be obtained using a new CE prediction equation constructed from the present experimental data. Overall, our study illustrates the feasibility of expediting the development of large numbers of high-sensitivity SRM assays through automation of transition selection and accurate prediction of optimal CE to improve both SRM throughput and measurement quality.
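
    CE optimization of this kind typically reduces to charge-dependent linear equations of the form CE = slope * (m/z) + intercept. The sketch below uses illustrative placeholder coefficients, not the equation fitted in this work.

      # slope, intercept per precursor charge state (placeholder values for illustration)
      CE_COEFFICIENTS = {2: (0.034, 3.3), 3: (0.044, 3.5)}

      def predict_collision_energy(precursor_mz, charge):
          """Linear CE prediction from precursor m/z and charge."""
          slope, intercept = CE_COEFFICIENTS[charge]
          return slope * precursor_mz + intercept

      print(predict_collision_energy(precursor_mz=785.8, charge=2))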

  8. Simplified methods of predicting aircraft rolling moments due to vortex encounters

    DOT National Transportation Integrated Search

    1977-05-01

    Computational methods suitable for fast and accurate prediction of rolling moments on aircraft : encountering wake vortices are presented. Appropriate modifications to strip theory are developed which account for the effects of finite wingspan. It is...

  9. Experimental evaluation of radiosity for room sound-field prediction.

    PubMed

    Hodgson, Murray; Nosal, Eva-Marie

    2006-08-01

    An acoustical radiosity model was evaluated for how it performs in predicting real room sound fields. This was done by comparing radiosity predictions with experimental results for three existing rooms--a squash court, a classroom, and an office. Radiosity predictions were also compared with those by ray tracing--a "reference" prediction model--for both specular and diffuse surface reflection. Comparisons were made for detailed and discretized echograms, sound-decay curves, sound-propagation curves, and the variations with frequency of four room-acoustical parameters--EDT, RT, D50, and C80. In general, radiosity and diffuse ray tracing gave very similar predictions. Predictions by specular ray tracing were often very different. Radiosity agreed well with experiment in some cases, less well in others. Definitive conclusions regarding the accuracy with which the rooms were modeled, or the accuracy of the radiosity approach, were difficult to draw. The results suggest that radiosity predicts room sound fields with some accuracy, at least as well as diffuse ray tracing and, in general, better than specular ray tracing. The predictions of detailed echograms are less accurate; those of derived room-acoustical parameters are more accurate. The results underline the need to develop experimental methods for accurately characterizing the absorptive and reflective characteristics of room surfaces, possibly including phase.

  10. Predicting the digestible energy of corn determined with growing swine from nutrient composition and cross-species measurements.

    PubMed

    Smith, B; Hassen, A; Hinds, M; Rice, D; Jones, D; Sauber, T; Iiams, C; Sevenich, D; Allen, R; Owens, F; McNaughton, J; Parsons, C

    2015-03-01

    The DE values of corn grain for pigs differ among corn sources. More accurate prediction of DE may improve diet formulation and reduce diet cost. Corn grain sources (n = 83) were assayed with growing swine (20 kg) in DE experiments with total collection of feces, with 3-wk-old broiler chicks in nitrogen-corrected apparent ME (AME) trials, and with cecectomized adult roosters in nitrogen-corrected true ME (TME) studies. Additional AME data for the corn grain source set were generated based on an existing near-infrared transmittance prediction model (near-infrared transmittance-predicted AME [NIT-AME]). Corn source nutrient composition was determined by wet chemistry methods. These data were then used to 1) test the accuracy of predicting swine DE of individual corn sources based on available literature equations and nutrient composition and 2) develop models for predicting DE of sources from nutrient composition and the cross-species information gathered above (AME, NIT-AME, and TME). The overall measured DE, AME, NIT-AME, and TME values were 4,105 ± 11, 4,006 ± 10, 4,004 ± 10, and 4,086 ± 12 kcal/kg DM, respectively. Prediction models were developed using 80% of the corn grain sources; the remaining 20% was reserved for validation of the developed prediction equations. Literature equations based on nutrient composition proved imprecise for predicting corn DE; the root mean square error of prediction ranged from 105 to 331 kcal/kg, equivalent to an error of 2.6 to 8.8%. Yet among the corn composition traits, 4-variable models developed in the current study provided adequate prediction of DE (model R2 ranging from 0.76 to 0.79 and root mean square error [RMSE] of 50 kcal/kg). When prediction equations were tested using the validation set, these models had a 1 to 1.2% error of prediction. Simple linear equations based on AME, NIT-AME, or TME provided an accurate prediction of DE for individual sources (R2 ranged from 0.65 to 0.73 and RMSE ranged from 50 to 61 kcal/kg). The percentage error of prediction based on the validation data set was greater (1.4%) for the TME model than for the NIT-AME or AME models (1 and 1.2%, respectively), indicating that swine DE values can be accurately predicted using AME or NIT-AME. In conclusion, regression equations developed from broiler measurements or from analyzed nutrient composition proved adequate to reliably predict the DE of commercially available corn hybrids for growing pigs.
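
    The simple linear calibration of swine DE against poultry AME described above can be sketched as follows. The synthetic values are centered near the reported means (about 4,105 and 4,006 kcal/kg DM), but the generating relationship and noise levels are hypothetical.

      import numpy as np

      rng = np.random.default_rng(3)
      ame = rng.normal(4006.0, 40.0, size=60)                     # kcal/kg DM, hypothetical sources
      de = 800.0 + 0.825 * ame + rng.normal(0.0, 50.0, size=60)   # hypothetical DE-AME relationship

      slope, intercept = np.polyfit(ame, de, 1)                   # ordinary least-squares fit
      predicted = intercept + slope * 4010.0
      print(f"DE predicted from AME 4010: {predicted:.0f} kcal/kg DM")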

  11. End-of-Discharge and End-of-Life Prediction in Lithium-Ion Batteries with Electrochemistry-Based Aging Models

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.

    2016-01-01

    As batteries become increasingly prevalent in complex systems such as aircraft and electric cars, monitoring and predicting battery state of charge and state of health becomes critical. In order to accurately predict the remaining battery power to support system operations for informed operational decision-making, age-dependent changes in dynamics must be accounted for. Using an electrochemistry-based model, we investigate how key parameters of the battery change as aging occurs, and develop models to describe aging through these key parameters. Using these models, we demonstrate how we can (i) accurately predict end-of-discharge for aged batteries, and (ii) predict the end-of-life of a battery as a function of anticipated usage. The approach is validated through an experimental set of randomized discharge profiles.

  12. Testing DRAINMOD-FOREST for predicting evapotranspiration in a mid-rotation pine plantation

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Ge Sun; George M. Chescheir; Asko Noormets; Devendra M. Amatya; R. Wayne Skaggs; John S. King; Steve McNulty; Michael Gavazzi; Guofang Miao; Jean-Christophe Domec

    2015-01-01

    Evapotranspiration (ET) is a key component of the hydrologic cycle in terrestrial ecosystems and accurate description of ET processes is essential for developing reliable ecohydrological models. This study investigated the accuracy of ET prediction by the DRAINMOD-FOREST after its calibration/validation for predicting commonly measured hydrological variables. The model...

  13. Sensorless Modeling of Varying Pulse Width Modulator Resolutions in Three-Phase Induction Motors

    PubMed Central

    Marko, Matthew David; Shevach, Glenn

    2017-01-01

    A sensorless algorithm was developed to predict rotor speeds in an electric three-phase induction motor. This sensorless model requires a measurement of the stator currents and voltages, and the rotor speed is predicted accurately without any mechanical measurement of the rotor speed. A model of an electric vehicle undergoing acceleration was built, and the sensorless prediction of the simulation rotor speed was determined to be robust even in the presence of fluctuating motor parameters and significant sensor errors. Studies were conducted for varying pulse width modulator resolutions, and the sensorless model was accurate for all resolutions of sinusoidal voltage functions. PMID:28076418

  14. Sensorless Modeling of Varying Pulse Width Modulator Resolutions in Three-Phase Induction Motors.

    PubMed

    Marko, Matthew David; Shevach, Glenn

    2017-01-01

    A sensorless algorithm was developed to predict rotor speeds in an electric three-phase induction motor. This sensorless model requires a measurement of the stator currents and voltages, and the rotor speed is predicted accurately without any mechanical measurement of the rotor speed. A model of an electric vehicle undergoing acceleration was built, and the sensorless prediction of the simulation rotor speed was determined to be robust even in the presence of fluctuating motor parameters and significant sensor errors. Studies were conducted for varying pulse width modulator resolutions, and the sensorless model was accurate for all resolutions of sinusoidal voltage functions.

  15. Development of equations for predicting Puerto Rican subtropical dry forest biomass and volume

    Treesearch

    Thomas J. Brandeis; Matthew Delaney; Bernard R. Parresol; Larry Royer

    2006-01-01

    Carbon accounting, forest health monitoring and sustainable management of the subtropical dry forests of Puerto Rico and other Caribbean Islands require an accurate assessment of forest aboveground biomass (AGB) and stem volume. One means of improving assessment accuracy is the development of predictive equations derived from locally collected data. Forest inventory...

  16. Development of equations for predicting Puerto Rican subtropical dry forest biomass and volume.

    Treesearch

    Thomas J. Brandeis; Matthew Delaney; Bernard R. Parresol; Larry Royer

    2006-01-01

    Carbon accounting, forest health monitoring and sustainable management of the subtropical dry forests of Puerto Rico and other Caribbean Islands require an accurate assessment of forest aboveground biomass (AGB) and stem volume. One means of improving assessment accuracy is the development of predictive equations derived from locally collected data. Forest inventory...

  17. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  18. Enhancing Elementary Pre-service Teachers' Plant Processes Conceptions

    NASA Astrophysics Data System (ADS)

    Thompson, Stephen L.; Lotter, Christine; Fann, Xumei; Taylor, Laurie

    2016-06-01

    Researchers examined how an inquiry-based instructional treatment emphasizing interrelated plant processes influenced 210 elementary pre-service teachers' (PTs) conceptions of three plant processes, photosynthesis, cellular respiration, and transpiration, and the interrelated nature of these processes. The instructional treatment required PTs to predict the fate of a healthy plant in a sealed terrarium (Plant-in-a-Jar), justify their predictions, observe the plant over a 5-week period, and complete guided inquiry activities centered on one of the targeted plant processes each week. Data sources included PTs' pre- and post-predictions with accompanying justifications, course artifacts such as weekly terrarium observations and science journal entries, and group models of the interrelated plant processes occurring within the sealed terraria. A subset of 33 volunteer PTs also completed interviews the week the Plant-in-a-Jar scenario was introduced and approximately 4 months after the instructional intervention ended. Pre- and post-predictions from all PTs as well as interview responses from the subgroup of PTs, were coded into categories based on key plant processes emphasized in the Next Generation Science Standards. Study findings revealed that PTs developed more accurate conceptions of plant processes and their interrelated nature as a result of the instructional intervention. Primary patterns of change in PTs' plant process conceptions included development of more accurate conceptions of how water is used by plants, more accurate conceptions of photosynthesis features, and more accurate conceptions of photosynthesis and cellular respiration as transformative processes.

  19. Mental models accurately predict emotion transitions

    PubMed Central

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373

  20. GENOMIC PREDICTOR OF RESPONSE AND SURVIVAL FOLLOWING TAXANE-ANTHRACYCLINE CHEMOTHERAPY FOR INVASIVE BREAST CANCER

    PubMed Central

    Hatzis, Christos; Pusztai, Lajos; Valero, Vicente; Booser, Daniel J.; Esserman, Laura; Lluch, Ana; Vidaurre, Tatiana; Holmes, Frankie; Souchon, Eduardo; Martin, Miguel; Cotrina, José; Gomez, Henry; Hubbard, Rebekah; Chacón, J. Ignacio; Ferrer-Lozano, Jaime; Dyer, Richard; Buxton, Meredith; Gong, Yun; Wu, Yun; Ibrahim, Nuhad; Andreopoulou, Eleni; Ueno, Naoto T.; Hunt, Kelly; Yang, Wei; Nazario, Arlene; DeMichele, Angela; O’Shaughnessy, Joyce; Hortobagyi, Gabriel N.; Symmans, W. Fraser

    2017-01-01

    CONTEXT Accurate prediction of who will (or won’t) have high probability of survival benefit from standard treatments is fundamental for individualized cancer treatment strategies. OBJECTIVE To develop a predictor of response and survival from chemotherapy for newly diagnosed invasive breast cancer. DESIGN Development of different predictive signatures for resistance and response to neoadjuvant chemotherapy (stratified according to estrogen receptor (ER) status) from gene expression microarrays of newly diagnosed breast cancer (310 patients). Then prediction of breast cancer treatment-sensitivity using the combination of signatures for: 1) sensitivity to endocrine therapy, 2) chemo-resistance, and 3) chemo-sensitivity. Independent validation (198 patients) and comparison with other reported genomic predictors of chemotherapy response. SETTING Prospective multicenter study to develop and test genomic predictors for neoadjuvant chemotherapy. PATIENTS Newly diagnosed HER2-negative breast cancer treated with chemotherapy containing sequential taxane and anthracycline-based regimens then endocrine therapy (if hormone receptor-positive). MAIN OUTCOME MEASURES Distant relapse-free survival (DRFS) if predicted treatment-sensitive and absolute risk reduction (ARR, difference in DRFS of the two predicted groups) at median follow-up (3 years), and their 95% confidence intervals (CI). RESULTS Patients in the independent validation cohort (99% clinical Stage II–III) who were predicted to be treatment-sensitive (28% of total) had DRFS of 92% (CI 85–100) and survival benefit compared to others (absolute risk reduction (ARR) 18%; CI 6–28). Predictions were accurate if breast cancer was ER-positive (30% predicted sensitive, DRFS 97%, CI 91–100; ARR 11%, CI 0.1–21) or ER-negative (26% predicted sensitive, DRFS 83%, CI 68–100; ARR 26%, CI 4–28), and were significant in multivariate analysis after adjusting for relevant clinical-pathologic characteristics. Other genomic predictors showed paradoxically worse survival if predicted to be responsive to chemotherapy. CONCLUSION A genomic predictor combining ER status, predicted chemo-resistance, predicted chemo-sensitivity, and predicted endocrine sensitivity accurately identified patients with survival benefit following taxane-anthracycline chemotherapy. PMID:21558518

  1. Rapid and accurate prediction of degradant formation rates in pharmaceutical formulations using high-performance liquid chromatography-mass spectrometry.

    PubMed

    Darrington, Richard T; Jiao, Jim

    2004-04-01

    Rapid and accurate stability prediction is essential to pharmaceutical formulation development. Commonly used stability prediction methods include monitoring parent drug loss at intended storage conditions or initial rate determination of degradants under accelerated conditions. Monitoring parent drug loss at the intended storage condition does not provide a rapid and accurate stability assessment because often <0.5% drug loss is all that can be observed in a realistic time frame, while the accelerated initial rate method in conjunction with extrapolation of rate constants using the Arrhenius or Eyring equations often introduces large errors in shelf-life prediction. In this study, the shelf life prediction of a model pharmaceutical preparation utilizing sensitive high-performance liquid chromatography-mass spectrometry (LC/MS) to directly quantitate degradant formation rates at the intended storage condition is proposed. This method was compared to traditional shelf life prediction approaches in terms of time required to predict shelf life and associated error in shelf life estimation. Results demonstrated that the proposed LC/MS method using initial rates analysis provided significantly improved confidence intervals for the predicted shelf life and required less overall time and effort to obtain the stability estimation compared to the other methods evaluated. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association.
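
    The initial-rates idea can be sketched as a linear fit to early degradant levels measured by LC/MS at the storage condition, followed by extrapolation to a specification limit. The sampling times, degradant levels, and the 0.2% limit below are illustrative, not the study's data.

      import numpy as np

      days = np.array([0, 7, 14, 28, 56])                              # sampling times at storage condition
      degradant_pct = np.array([0.000, 0.004, 0.009, 0.017, 0.035])    # % of label claim

      rate_per_day, intercept = np.polyfit(days, degradant_pct, 1)     # initial (linear) formation rate
      spec_limit_pct = 0.2
      shelf_life_days = (spec_limit_pct - intercept) / rate_per_day
      print(f"initial rate: {rate_per_day:.5f} %/day; predicted shelf life: {shelf_life_days:.0f} days")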

  2. Comparison of geostatistical interpolation and remote sensing techniques for estimating long-term exposure to ambient PM2.5 concentrations across the continental United States.

    PubMed

    Lee, Seung-Jae; Serre, Marc L; van Donkelaar, Aaron; Martin, Randall V; Burnett, Richard T; Jerrett, Michael

    2012-12-01

    A better understanding of the adverse health effects of chronic exposure to fine particulate matter (PM2.5) requires accurate estimates of PM2.5 variation at fine spatial scales. Remote sensing has emerged as an important means of estimating PM2.5 exposures, but relatively few studies have compared remote-sensing estimates to those derived from monitor-based data. We evaluated and compared the predictive capabilities of remote sensing and geostatistical interpolation. We developed a space-time geostatistical kriging model to predict PM2.5 over the continental United States and compared resulting predictions to estimates derived from satellite retrievals. The kriging estimate was more accurate for locations that were about 100 km from a monitoring station, whereas the remote sensing estimate was more accurate for locations that were > 100 km from a monitoring station. Based on this finding, we developed a hybrid map that combines the kriging and satellite-based PM2.5 estimates. We found that for most of the populated areas of the continental United States, geostatistical interpolation produced more accurate estimates than remote sensing. The differences between the estimates resulting from the two methods, however, were relatively small. In areas with extensive monitoring networks, the interpolation may provide more accurate estimates, but in the many areas of the world without such monitoring, remote sensing can provide useful exposure estimates that perform nearly as well.
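
    One simple way such a hybrid could be assembled, given the roughly 100 km crossover reported above, is to select the kriging estimate near monitors and the satellite estimate elsewhere. This selection rule is an assumption for illustration, not necessarily the exact combination used in the study.

      def hybrid_pm25(kriging_estimate, satellite_estimate, distance_to_monitor_km,
                      crossover_km=100.0):
          """Pick the PM2.5 estimate expected to be more accurate at this location."""
          if distance_to_monitor_km <= crossover_km:
              return kriging_estimate
          return satellite_estimate

      print(hybrid_pm25(kriging_estimate=11.2, satellite_estimate=9.8, distance_to_monitor_km=140.0))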

  3. Prediction of phospholipidosis-inducing potential of drugs by in vitro biochemical and physicochemical assays followed by multivariate analysis.

    PubMed

    Kuroda, Yukihiro; Saito, Madoka

    2010-03-01

    An in vitro method to predict the phospholipidosis-inducing potential of cationic amphiphilic drugs (CADs) was developed using biochemical and physicochemical assays. The following parameters were applied to principal component analysis: the physicochemical parameters pKa and clogP, the dissociation constant of CADs from phospholipid, inhibition of enzymatic phospholipid degradation, and the metabolic stability of CADs. In the score plot, phospholipidosis-inducing drugs (amiodarone, propranolol, imipramine, chloroquine) were plotted locally, forming a subspace of positive CADs, while non-inducing drugs (chlorpromazine, chloramphenicol, disopyramide, lidocaine) were scattered outside that subspace, allowing a clear discrimination between the two classes of CADs. CADs that often produce false results with conventional physicochemical or cell-based assay methods were accurately classified by our method. Basic and lipophilic disopyramide could be accurately predicted to be a non-phospholipidogenic drug. Moreover, chlorpromazine, which is often falsely predicted to be a phospholipidosis-inducing drug by in vitro methods, could be accurately classified. Because this method uses the pharmacokinetic parameters pKa, clogP, and metabolic stability, which are usually obtained in the early stages of drug development, it newly requires only two parameters: binding to phospholipid and inhibition of the lipid-degrading enzyme. Therefore, this method provides a cost-effective approach to predicting the phospholipidosis-inducing potential of a drug. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
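
    The multivariate step can be sketched with a standard PCA projection of the assay-derived parameters. The drug names come from the abstract, but the parameter values below are invented placeholders rather than the study's measurements.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      drugs = ["amiodarone", "propranolol", "imipramine", "chloroquine",
               "chlorpromazine", "chloramphenicol", "disopyramide", "lidocaine"]
      # columns: pKa, clogP, phospholipid binding, enzyme inhibition, metabolic stability (all placeholders)
      X = np.array([[8.5, 7.6, 0.9, 0.8, 0.7],
                    [9.4, 3.5, 0.7, 0.7, 0.6],
                    [9.5, 4.8, 0.8, 0.7, 0.8],
                    [10.3, 4.6, 0.8, 0.9, 0.9],
                    [9.3, 5.4, 0.6, 0.3, 0.2],
                    [5.5, 1.1, 0.1, 0.1, 0.5],
                    [10.4, 2.6, 0.2, 0.2, 0.8],
                    [8.0, 2.3, 0.1, 0.1, 0.3]])

      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      for name, (pc1, pc2) in zip(drugs, scores):
          print(f"{name:>15}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")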

  4. Fast and Accurate Prediction of Stratified Steel Temperature During Holding Period of Ladle

    NASA Astrophysics Data System (ADS)

    Deodhar, Anirudh; Singh, Umesh; Shukla, Rishabh; Gautham, B. P.; Singh, Amarendra K.

    2017-04-01

    Thermal stratification of liquid steel in a ladle during the holding period and the teeming operation has a direct bearing on the superheat available at the caster and hence on the caster set points such as casting speed and cooling rates. The changes in the caster set points are typically carried out based on temperature measurements at the end of tundish outlet. Thermal prediction models provide advance knowledge of the influence of process and design parameters on the steel temperature at various stages. Therefore, they can be used in making accurate decisions about the caster set points in real time. However, this requires both fast and accurate thermal prediction models. In this work, we develop a surrogate model for the prediction of thermal stratification using data extracted from a set of computational fluid dynamics (CFD) simulations, pre-determined using design of experiments technique. Regression method is used for training the predictor. The model predicts the stratified temperature profile instantaneously, for a given set of process parameters such as initial steel temperature, refractory heat content, slag thickness, and holding time. More than 96 pct of the predicted values are within an error range of ±5 K (±5 °C), when compared against corresponding CFD results. Considering its accuracy and computational efficiency, the model can be extended for thermal control of casting operations. This work also sets a benchmark for developing similar thermal models for downstream processes such as tundish and caster.
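
    The surrogate-modelling step can be sketched as a regression fitted to a design-of-experiments set of CFD runs and then evaluated instantaneously for new process parameters. The CFD outputs, parameter ranges, and the quadratic form below are synthetic stand-ins, not the study's trained predictor.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(4)
      # columns: initial steel temperature (K), refractory heat content (relative),
      #          slag thickness (m), holding time (min)
      doe = rng.uniform([1850.0, 0.5, 0.05, 10.0], [1900.0, 1.0, 0.20, 60.0], size=(40, 4))
      temp_drop = (0.3 * doe[:, 3] - 20.0 * doe[:, 2] - 5.0 * doe[:, 1]
                   + rng.normal(0.0, 0.5, 40))                  # pretend CFD result (K)

      surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      surrogate.fit(doe, temp_drop)
      print(surrogate.predict([[1880.0, 0.8, 0.10, 45.0]]))     # instantaneous prediction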

  5. Risk Prediction Models for Acute Kidney Injury in Critically Ill Patients: Opus in Progressu.

    PubMed

    Neyra, Javier A; Leaf, David E

    2018-05-31

    Acute kidney injury (AKI) is a complex systemic syndrome associated with high morbidity and mortality. Among critically ill patients admitted to intensive care units (ICUs), the incidence of AKI is as high as 50% and is associated with dismal outcomes. Thus, the development and validation of clinical risk prediction tools that accurately identify patients at high risk for AKI in the ICU is of paramount importance. We provide a comprehensive review of 3 clinical risk prediction tools that have been developed for incident AKI occurring in the first few hours or days following admission to the ICU. We found substantial heterogeneity among the clinical variables that were examined and included as significant predictors of AKI in the final models. The area under the receiver operating characteristic curves was ∼0.8 for all 3 models, indicating satisfactory model performance, though positive predictive values ranged from only 23 to 38%. Hence, further research is needed to develop more accurate and reproducible clinical risk prediction tools. Strategies for improved assessment of AKI susceptibility in the ICU include the incorporation of dynamic (time-varying) clinical parameters, as well as biomarker, functional, imaging, and genomic data. © 2018 S. Karger AG, Basel.

  6. Improving medical decisions for incapacitated persons: does focusing on "accurate predictions" lead to an inaccurate picture?

    PubMed

    Kim, Scott Y H

    2014-04-01

    The Patient Preference Predictor (PPP) proposal places a high priority on the accuracy of predicting patients' preferences and finds the performance of surrogates inadequate. However, the quest to develop a highly accurate, individualized statistical model has significant obstacles. First, it will be impossible to validate the PPP beyond the limit imposed by the 60%-80% reliability of people's preferences for future medical decisions--a figure no better than the known average accuracy of surrogates. Second, evidence supports the view that a sizable minority of persons may not even have preferences to predict. Third, many, perhaps most, people express their autonomy just as much by entrusting their loved ones to exercise their judgment as by desiring to specifically control future decisions. Surrogate decision making faces none of these issues and, in fact, may be more efficient, accurate, and authoritative than is commonly assumed.

  7. Overview of aerothermodynamic loads definition study

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    1991-01-01

    The objective of the Aerothermodynamic Loads Definition Study is to develop methods of accurately predicting the operating environment in advanced Earth-to-Orbit (ETO) propulsion systems, such as the Space Shuttle Main Engine (SSME) powerhead. Development of time averaged and time dependent three dimensional viscous computer codes as well as experimental verification and engine diagnostic testing are considered to be essential in achieving that objective. Time-averaged, nonsteady, and transient operating loads must all be well defined in order to accurately predict powerhead life. Described here is work in unsteady heat flow analysis, improved modeling of preburner flow, turbulence modeling for turbomachinery, computation of three dimensional flow with heat transfer, and unsteady viscous multi-blade row turbine analysis.

  8. Evaluation and comparison of predictive individual-level general surrogates.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Halloran, M Elizabeth

    2018-07-01

    An intermediate response measure that accurately predicts efficacy in a new setting at the individual level could be used both for prediction and personalized medical decisions. In this article, we define a predictive individual-level general surrogate (PIGS), which is an individual-level intermediate response that can be used to accurately predict individual efficacy in a new setting. While methods for evaluating trial-level general surrogates, which are predictors of trial-level efficacy, have been developed previously, few, if any, methods have been developed to evaluate individual-level general surrogates, and no methods have formalized the use of cross-validation to quantify the expected prediction error. Our proposed method uses existing methods of individual-level surrogate evaluation within a given clinical trial setting in combination with cross-validation over a set of clinical trials to evaluate surrogate quality and to estimate the absolute prediction error that is expected in a new trial setting when using a PIGS. Simulations show that our method performs well across a variety of scenarios. We use our method to evaluate and to compare candidate individual-level general surrogates over a set of multi-national trials of a pentavalent rotavirus vaccine.

  9. Developing a stochastic traffic volume prediction model for public-private partnership projects

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu

    2017-11-01

    Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to limited public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects under a Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, few studies have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models, which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes are modeled as a geometric Brownian motion (GBM) process. The Monte Carlo technique is then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
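
    A minimal sketch of the stochastic approach described above, with assumed drift, volatility, and initial-volume parameters (not values from the paper): simulate GBM traffic-volume paths and summarize them by Monte Carlo.

    ```python
    # Minimal sketch, assuming illustrative parameters: simulate annual traffic
    # volume paths with geometric Brownian motion and summarize them by
    # Monte Carlo, as the abstract describes.
    import numpy as np

    rng = np.random.default_rng(42)
    v0, mu, sigma = 20_000.0, 0.04, 0.12   # initial daily volume, drift, volatility (assumed)
    years, n_paths = 25, 10_000

    dt = 1.0
    shocks = rng.normal(size=(n_paths, years))
    log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * shocks, axis=1)
    volumes = v0 * np.exp(log_paths)        # shape: (n_paths, years)

    p10, p50, p90 = np.percentile(volumes[:, -1], [10, 50, 90])
    print(f"Year-{years} traffic volume: P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
    ```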

  10. Development and validation of a near-infrared spectroscopy method for the prediction of acrylamide content in French-fried potato

    USDA-ARS?s Scientific Manuscript database

    This study investigated the ability of near-infrared spectroscopy (NIRS) to predict acrylamide content in French-fried potato. Potato flour spiked with acrylamide (50-8000 µg/kg) was used to determine if acrylamide could be accurately predicted in a potato matrix. French fries produced with various ...

  11. A human judgment approach to epidemiological forecasting

    PubMed Central

    Farrow, David C.; Brooks, Logan C.; Rosenfeld, Roni

    2017-01-01

    Infectious diseases impose considerable burden on society, despite significant advances in technology and medicine over the past century. Advanced warning can be helpful in mitigating and preparing for an impending or ongoing epidemic. Historically, such a capability has lagged for many reasons, including in particular the uncertainty in the current state of the system and in the understanding of the processes that drive epidemic trajectories. Presently we have access to data, models, and computational resources that enable the development of epidemiological forecasting systems. Indeed, several recent challenges hosted by the U.S. government have fostered an open and collaborative environment for the development of these technologies. The primary focus of these challenges has been to develop statistical and computational methods for epidemiological forecasting, but here we consider a serious alternative based on collective human judgment. We created the web-based “Epicast” forecasting system which collects and aggregates epidemic predictions made in real-time by human participants, and with these forecasts we ask two questions: how accurate is human judgment, and how do these forecasts compare to their more computational, data-driven alternatives? To address the former, we assess by a variety of metrics how accurately humans are able to predict influenza and chikungunya trajectories. As for the latter, we show that real-time, combined human predictions of the 2014–2015 and 2015–2016 U.S. flu seasons are often more accurate than the same predictions made by several statistical systems, especially for short-term targets. We conclude that there is valuable predictive power in collective human judgment, and we discuss the benefits and drawbacks of this approach. PMID:28282375

  12. A human judgment approach to epidemiological forecasting.

    PubMed

    Farrow, David C; Brooks, Logan C; Hyun, Sangwon; Tibshirani, Ryan J; Burke, Donald S; Rosenfeld, Roni

    2017-03-01

    Infectious diseases impose considerable burden on society, despite significant advances in technology and medicine over the past century. Advanced warning can be helpful in mitigating and preparing for an impending or ongoing epidemic. Historically, such a capability has lagged for many reasons, including in particular the uncertainty in the current state of the system and in the understanding of the processes that drive epidemic trajectories. Presently we have access to data, models, and computational resources that enable the development of epidemiological forecasting systems. Indeed, several recent challenges hosted by the U.S. government have fostered an open and collaborative environment for the development of these technologies. The primary focus of these challenges has been to develop statistical and computational methods for epidemiological forecasting, but here we consider a serious alternative based on collective human judgment. We created the web-based "Epicast" forecasting system which collects and aggregates epidemic predictions made in real-time by human participants, and with these forecasts we ask two questions: how accurate is human judgment, and how do these forecasts compare to their more computational, data-driven alternatives? To address the former, we assess by a variety of metrics how accurately humans are able to predict influenza and chikungunya trajectories. As for the latter, we show that real-time, combined human predictions of the 2014-2015 and 2015-2016 U.S. flu seasons are often more accurate than the same predictions made by several statistical systems, especially for short-term targets. We conclude that there is valuable predictive power in collective human judgment, and we discuss the benefits and drawbacks of this approach.

  13. Assessing patient risk of central line-associated bacteremia via machine learning.

    PubMed

    Beeler, Cole; Dbeibo, Lana; Kelley, Kristen; Thatcher, Levi; Webb, Douglas; Bah, Amadou; Monahan, Patrick; Fowler, Nicole R; Nicol, Spencer; Judy-Malcolm, Alisa; Azar, Jose

    2018-04-13

    Central line-associated bloodstream infections (CLABSIs) contribute to increased morbidity, length of hospital stay, and cost. Despite progress in understanding the risk factors, there remains a need to accurately predict the risk of CLABSIs and, in real time, prevent them from occurring. A predictive model was developed using retrospective data from a large academic healthcare system. Models were developed with machine learning via construction of random forests using validated input variables. Fifteen variables accounted for the most significant effect on CLABSI prediction based on a retrospective study of 70,218 unique patient encounters between January 1, 2013, and May 31, 2016. The area under the receiver operating characteristic curve for the best-performing model was 0.82 in production. This model has multiple applications for resource allocation for CLABSI prevention, including serving as a tool to target patients at highest risk for potentially cost-effective but otherwise time-limited interventions. Machine learning can be used to develop accurate models to predict the risk of CLABSI in real time prior to the development of infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
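
    The following sketch illustrates the general modeling recipe (a random forest scored by AUC) on synthetic data; the variables, sample sizes, and results are hypothetical and not those of the published model.

    ```python
    # Minimal sketch (hypothetical data, not the study's variables): train a
    # random forest on patient-encounter features and report the AUC, the
    # metric used to evaluate the published model.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5_000, 15))            # 15 risk variables, as in the abstract
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5_000) > 1.5).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
    ```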

  14. When high working memory capacity is and is not beneficial for predicting nonlinear processes.

    PubMed

    Fischer, Helen; Holt, Daniel V

    2017-04-01

    Predicting the development of dynamic processes is vital in many areas of life. Previous findings are inconclusive as to whether higher working memory capacity (WMC) is always associated with using more accurate prediction strategies, or whether higher WMC can also be associated with using overly complex strategies that do not improve accuracy. In this study, participants predicted a range of systematically varied nonlinear processes based on exponential functions where prediction accuracy could or could not be enhanced using well-calibrated rules. Results indicate that higher WMC participants seem to rely more on well-calibrated strategies, leading to more accurate predictions for processes with highly nonlinear trajectories in the prediction region. Predictions of lower WMC participants, in contrast, point toward an increased use of simple exemplar-based prediction strategies, which perform just as well as more complex strategies when the prediction region is approximately linear. These results imply that with respect to predicting dynamic processes, working memory capacity limits are not generally a strength or a weakness, but that this depends on the process to be predicted.

  15. A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules.

    PubMed

    Ramakrishnan, Sridhar; Wesensten, Nancy J; Balkin, Thomas J; Reifman, Jaques

    2016-01-01

    Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss-from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges-and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. © 2016 Associated Professional Sleep Societies, LLC.

  16. Performance characterization of complex fuel port geometries for hybrid rocket fuel grains

    NASA Astrophysics Data System (ADS)

    Bath, Andrew

    This research investigated 3D printing and burning of fuel grains with complex geometry, and the development of software capable of modeling and predicting the regression of a cross-section of these complex fuel grains. The software predicted the geometry to a fair degree of accuracy, especially when enhanced corner rounding was turned on. The model has some drawbacks, notably that it is relatively slow and does not perfectly predict the regression. With corner rounding turned off, the model becomes much faster; although less accurate, it still predicts the resulting burn geometry reasonably well and is fast enough to be used for performance tuning or genetic algorithms. In addition to the modeling method, preliminary investigations into the burning behavior of fuel grains with a helical flow path were performed. The helix fuel grains have a regression rate nearly 3 times that of any other fuel grain geometry, primarily due to the enhancement of the friction coefficient between the flow and the flow path.

  17. Coating Life Prediction

    NASA Technical Reports Server (NTRS)

    Nesbitt, J. A.; Gedwill, M. A.

    1984-01-01

    Hot-section gas-turbine components typically require some form of coating for oxidation and corrosion protection. Efficient use of coatings requires reliable and accurate predictions of the protective life of the coating. Currently, engine inspections and component replacements are often made on a conservative basis. As a result, there is a constant need to improve and develop the life-prediction capability of metallic coatings for use in various service environments. The present work is aimed at developing an improved methodology for predicting metallic coating lives in an oxidizing environment and in a corrosive environment.

  18. In Search of Black Swans: Identifying Students at Risk of Failing Licensing Examinations.

    PubMed

    Barber, Cassandra; Hammond, Robert; Gula, Lorne; Tithecott, Gary; Chahine, Saad

    2018-03-01

    To determine which admissions variables and curricular outcomes are predictive of being at risk of failing the Medical Council of Canada Qualifying Examination Part 1 (MCCQE1), how quickly student risk of failure can be predicted, and to what extent predictive modeling is possible and accurate in estimating future student risk. Data from five graduating cohorts (2011-2015), Schulich School of Medicine & Dentistry, Western University, were collected and analyzed using hierarchical generalized linear models (HGLMs). Area under the receiver operating characteristic curve (AUC) was used to evaluate the accuracy of predictive models and determine whether they could be used to predict future risk, using the 2016 graduating cohort. Four predictive models were developed to predict student risk of failure at admissions, year 1, year 2, and pre-MCCQE1. The HGLM analyses identified gender, MCAT verbal reasoning score, two preclerkship course mean grades, and the year 4 summative objective structured clinical examination score as significant predictors of student risk. The predictive accuracy of the models varied. The pre-MCCQE1 model was the most accurate at predicting a student's risk of failing (AUC 0.66-0.93), while the admissions model was not predictive (AUC 0.25-0.47). Key variables predictive of students at risk were found. The predictive models developed suggest, while it is not possible to identify student risk at admission, we can begin to identify and monitor students within the first year. Using such models, programs may be able to identify and monitor students at risk quantitatively and develop tailored intervention strategies.

  19. Predicting Microbial Fuel Cell Biofilm Communities and Bioreactor Performance using Artificial Neural Networks.

    PubMed

    Lesnik, Keaton Larson; Liu, Hong

    2017-09-19

    The complex interactions that occur in mixed-species bioelectrochemical reactors, like microbial fuel cells (MFCs), make accurate predictions of performance outcomes under untested conditions difficult. While direct correlations between any individual waste-stream characteristic or microbial community structure and reactor performance have not been established, the growth of sequencing data and readily available computational power enables the development of alternative approaches. In the current study, 33 MFCs were evaluated under a range of conditions including eight separate substrates and three different wastewaters. Artificial Neural Networks (ANNs) were used to establish mathematical relationships between wastewater/solution characteristics, biofilm communities, and reactor performance. ANN models that incorporated biotic interactions predicted reactor performance outcomes more accurately than those that did not. The average percent error of power density predictions was 16.01 ± 4.35%, while the average percent errors of Coulombic efficiency and COD removal rate predictions were 1.77 ± 0.57% and 4.07 ± 1.06%, respectively. Predictions of power density improved to within 5.76 ± 3.16% percent error through classifying taxonomic data at the family rather than class level. Results suggest that the microbial communities and performance of bioelectrochemical systems can be accurately predicted using data-mining, machine-learning techniques.
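
    A small illustration of the kind of ANN regression described above, using synthetic inputs; the features, network size, and target relationship are assumptions for demonstration, not the study's data.

    ```python
    # Minimal sketch with synthetic data (not the study's dataset): a small
    # feed-forward ANN mapping solution characteristics plus community features
    # to power density, echoing the abstract's biotic-plus-abiotic input design.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(200, 8))   # e.g. conductivity, COD, pH + relative abundances (assumed)
    y = 1.0 + 2.0 * X[:, 0] + 1.5 * X[:, 3] + 0.2 * rng.normal(size=200)  # synthetic power density

    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5_000, random_state=1))
    ann.fit(X[:150], y[:150])
    pred = ann.predict(X[150:])
    print("Mean absolute error:", round(float(np.mean(np.abs(pred - y[150:]))), 3))
    ```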

  20. Resting energy expenditure prediction in recreational athletes of 18-35 years: confirmation of Cunningham equation and an improved weight-based alternative.

    PubMed

    ten Haaf, Twan; Weijs, Peter J M

    2014-01-01

    Resting energy expenditure (REE) is expected to be higher in athletes because of their relatively high fat-free mass (FFM). Therefore, an REE predictive equation specific to recreational athletes may be required. The aim of this study was to validate existing REE predictive equations and to develop a new recreational-athlete-specific equation. 90 (53 M, 37 F) adult athletes, exercising on average 9.1 ± 5.0 hours a week and 5.0 ± 1.8 times a week, were included. REE was measured using indirect calorimetry (Vmax Encore n29); FFM and FM were measured using air displacement plethysmography. Multiple linear regression analysis was used to develop new FFM-based and weight-based REE predictive equations. The percentage of accurate predictions (within 10% of measured REE), percentage bias, root mean square error, and limits of agreement were calculated. Results: The Cunningham equation, the new weight-based equation REE (kJ/d) = 49.940 × weight (kg) + 2459.053 × height (m) - 34.014 × age (y) + 799.257 × sex (M = 1, F = 0) + 122.502, and the new FFM-based equation REE (kJ/d) = 95.272 × FFM (kg) + 2026.161 performed equally well. De Lorenzo's equation predicted REE less accurately, but better than the other generally used REE predictive equations. Harris-Benedict, WHO, Schofield, Mifflin, and Owen all showed less than 50% accuracy. For a population of (Dutch) recreational athletes, the REE can accurately be predicted with the existing Cunningham equation. Since body composition measurement is not always possible, and other generally used equations fail, the new weight-based equation is advised for use in sports nutrition.
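
    For reference, the two new equations quoted in the abstract can be written directly as functions; the example inputs below are hypothetical.

    ```python
    # The two new equations reported in the abstract, transcribed directly;
    # the example inputs are hypothetical.
    def ree_weight_based(weight_kg: float, height_m: float, age_y: float, male: bool) -> float:
        """Resting energy expenditure (kJ/day), weight-based equation."""
        return (49.940 * weight_kg + 2459.053 * height_m
                - 34.014 * age_y + 799.257 * (1 if male else 0) + 122.502)

    def ree_ffm_based(ffm_kg: float) -> float:
        """Resting energy expenditure (kJ/day), fat-free-mass-based equation."""
        return 95.272 * ffm_kg + 2026.161

    print(round(ree_weight_based(75, 1.80, 25, True)))   # example recreational athlete
    print(round(ree_ffm_based(63)))
    ```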

  1. SnowyOwl: accurate prediction of fungal genes by using RNA-Seq and homology information to select among ab initio models

    PubMed Central

    2014-01-01

    Background Locating the protein-coding genes in novel genomes is essential to understanding and exploiting the genomic information but it is still difficult to accurately predict all the genes. The recent availability of detailed information about transcript structure from high-throughput sequencing of messenger RNA (RNA-Seq) delineates many expressed genes and promises increased accuracy in gene prediction. Computational gene predictors have been intensively developed for and tested in well-studied animal genomes. Hundreds of fungal genomes are now or will soon be sequenced. The differences of fungal genomes from animal genomes and the phylogenetic sparsity of well-studied fungi call for gene-prediction tools tailored to them. Results SnowyOwl is a new gene prediction pipeline that uses RNA-Seq data to train and provide hints for the generation of Hidden Markov Model (HMM)-based gene predictions and to evaluate the resulting models. The pipeline has been developed and streamlined by comparing its predictions to manually curated gene models in three fungal genomes and validated against the high-quality gene annotation of Neurospora crassa; SnowyOwl predicted N. crassa genes with 83% sensitivity and 65% specificity. SnowyOwl gains sensitivity by repeatedly running the HMM gene predictor Augustus with varied input parameters and selectivity by choosing the models with best homology to known proteins and best agreement with the RNA-Seq data. Conclusions SnowyOwl efficiently uses RNA-Seq data to produce accurate gene models in both well-studied and novel fungal genomes. The source code for the SnowyOwl pipeline (in Python) and a web interface (in PHP) is freely available from http://sourceforge.net/projects/snowyowl/. PMID:24980894

  2. Validation of Models Used to Inform Colorectal Cancer Screening Guidelines: Accuracy and Implications.

    PubMed

    Rutter, Carolyn M; Knudsen, Amy B; Marsh, Tracey L; Doria-Rose, V Paul; Johnson, Eric; Pabiniak, Chester; Kuntz, Karen M; van Ballegooijen, Marjolein; Zauber, Ann G; Lansdorp-Vogelaar, Iris

    2016-07-01

    Microsimulation models synthesize evidence about disease processes and interventions, providing a method for predicting long-term benefits and harms of prevention, screening, and treatment strategies. Because models often require assumptions about unobservable processes, assessing a model's predictive accuracy is important. We validated 3 colorectal cancer (CRC) microsimulation models against outcomes from the United Kingdom Flexible Sigmoidoscopy Screening (UKFSS) Trial, a randomized controlled trial that examined the effectiveness of one-time flexible sigmoidoscopy screening to reduce CRC mortality. The models incorporate different assumptions about the time from adenoma initiation to development of preclinical and symptomatic CRC. Analyses compare model predictions to study estimates across a range of outcomes to provide insight into the accuracy of model assumptions. All 3 models accurately predicted the relative reduction in CRC mortality 10 years after screening (predicted hazard ratios, with 95% percentile intervals: 0.56 [0.44, 0.71], 0.63 [0.51, 0.75], 0.68 [0.53, 0.83]; estimated with 95% confidence interval: 0.56 [0.45, 0.69]). Two models with longer average preclinical duration accurately predicted the relative reduction in 10-year CRC incidence. Two models with longer mean sojourn time accurately predicted the number of screen-detected cancers. All 3 models predicted too many proximal adenomas among patients referred to colonoscopy. Model accuracy can only be established through external validation. Analyses such as these are therefore essential for any decision model. Results supported the assumptions that the average time from adenoma initiation to development of preclinical cancer is long (up to 25 years), and mean sojourn time is close to 4 years, suggesting the window for early detection and intervention by screening is relatively long. Variation in dwell time remains uncertain and could have important clinical and policy implications. © The Author(s) 2016.

  3. Inverse and Predictive Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syracuse, Ellen Marie

    The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. Team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km; the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  4. Modeling methodology for the accurate and prompt prediction of symptomatic events in chronic diseases.

    PubMed

    Pagán, Josué; Risco-Martín, José L; Moya, José M; Ayala, José L

    2016-08-01

    Prediction of symptomatic crises in chronic diseases allows decisions to be taken before the symptoms occur, such as taking drugs to avoid the symptoms or activating medical alarms. The prediction horizon is in this case an important parameter, since it must accommodate the pharmacokinetics of the medication or the response time of medical services. This paper presents a study of the prediction limits of a chronic disease with symptomatic crises: migraine. For that purpose, this work develops a methodology to build predictive migraine models and to improve these predictions beyond the limits of the initial models. The maximum prediction horizon is analyzed, and its dependency on the selected features is studied. A strategy for model selection is proposed to tackle the trade-off between conservative but robust predictive models and less accurate predictions with longer horizons. The results show a prediction horizon close to 40 min, which is within the time range of the drug pharmacokinetics. Experiments were performed in a realistic scenario in which input data were acquired in an ambulatory clinical study through the deployment of a non-intrusive wireless body sensor network. Our results provide an effective methodology for selecting the prediction horizon in the development of prediction algorithms for diseases with symptomatic crises. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Reliable and accurate point-based prediction of cumulative infiltration using soil readily available characteristics: A comparison between GMDH, ANN, and MLR

    NASA Astrophysics Data System (ADS)

    Rahmati, Mehdi

    2017-08-01

    Developing accurate and reliable pedo-transfer functions (PTFs) to predict soil characteristics that are not readily available is a topic of major concern in soil science, and selecting appropriate predictors is a crucial factor in PTF development. The group method of data handling (GMDH), which finds an approximate relationship between a set of input and output variables, not only provides an explicit procedure for selecting the most essential PTF input variables but also yields more accurate and reliable estimates than other commonly applied methodologies. Therefore, the current research aimed to apply GMDH, in comparison with multivariate linear regression (MLR) and artificial neural networks (ANN), to develop several PTFs predicting soil cumulative infiltration on a point basis at specific time intervals (0.5-45 min) using soil readily available characteristics (RACs). Soil infiltration curves as well as several soil RACs, including soil primary particles (clay (CC), silt (Si), and sand (Sa)), saturated hydraulic conductivity (Ks), bulk (Db) and particle (Dp) densities, organic carbon (OC), wet-aggregate stability (WAS), electrical conductivity (EC), and soil antecedent (θi) and field-saturated (θfs) water contents, were measured at 134 different points in the Lighvan watershed, northwest Iran. Applying the GMDH, MLR, and ANN methodologies, several PTFs were developed to predict cumulative infiltration using two sets of selected soil RACs, including and excluding Ks. According to the test data, the PTFs developed by the GMDH and MLR procedures using all soil RACs, including Ks, gave more accurate (E values of 0.673-0.963) and reliable (CV values below 11 percent) predictions of cumulative infiltration at the specific time steps. In contrast, the ANN procedure had lower accuracy (E values of 0.356-0.890) and reliability (CV values up to 50 percent) than GMDH and MLR. The results also revealed that excluding Ks from the input variable list caused around a 30 percent decrease in PTF accuracy for all applied procedures. However, Ks exclusion appears to result in more practical PTFs, especially in the case of the GMDH network, because the remaining input variables are less time consuming to measure than Ks. In general, it is concluded that GMDH provides more accurate and reliable estimates of cumulative infiltration (a non-readily-available soil characteristic) with a minimum set of input variables (2-4) and can be a promising strategy for modeling soil infiltration, combining the advantages of the ANN and MLR methodologies.

  6. Future missions studies: Combining Schatten's solar activity prediction model with a chaotic prediction model

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    K. Schatten (1991) recently developed a method for combining his prediction model with our chaotic model. The philosophy behind this combined model and his method of combination is explained. Because the Schatten solar prediction model (KS) uses a dynamo to mimic solar dynamics, accurate prediction is limited to long-term solar behavior (10 to 20 years). The Chaotic prediction model (SA) uses the recently developed techniques of nonlinear dynamics to predict solar activity. It can be used to predict activity only up to the horizon. In theory, the chaotic prediction should be several orders of magnitude better than statistical predictions up to that horizon; beyond the horizon, chaotic predictions would theoretically be just as good as statistical predictions. Therefore, chaos theory puts a fundamental limit on predictability.

  7. Rapid analysis of composition and reactivity in cellulosic biomass feedstocks with near-infrared spectroscopy.

    PubMed

    Payne, Courtney E; Wolfrum, Edward J

    2015-01-01

    Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. We present individual model statistics to demonstrate model performance and validation samples to more accurately measure predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. It is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
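
    A minimal sketch of a PLS calibration of the type described above, fit to simulated spectra rather than the actual NIR data; the number of latent variables and the constituent relationship are assumptions.

    ```python
    # Minimal sketch with simulated spectra (not the study's data): fit a PLS
    # calibration that maps NIR absorbance spectra to a constituent such as
    # glucan content, the kind of model the abstract describes.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(7)
    n_samples, n_wavelengths = 120, 300
    spectra = rng.normal(size=(n_samples, n_wavelengths))
    glucan = spectra[:, 50] * 2.0 + spectra[:, 200] + rng.normal(scale=0.2, size=n_samples)

    pls = PLSRegression(n_components=8)
    pred = cross_val_predict(pls, spectra, glucan, cv=10).ravel()
    rmse = float(np.sqrt(np.mean((pred - glucan) ** 2)))
    print("Cross-validated RMSE:", round(rmse, 3))
    ```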

  8. A numerical solution of Duffing's equations including the prediction of jump phenomena

    NASA Technical Reports Server (NTRS)

    Moyer, E. T., Jr.; Ghasghai-Abdi, E.

    1987-01-01

    Numerical methodology for the solution of Duffing's differential equation is presented. Algorithms for the prediction of multiple equilibrium solutions and jump phenomena are developed. In addition, a filtering algorithm for producing steady state solutions is presented. The problem of a rigidly clamped circular plate subjected to cosinusoidal pressure loading is solved using the developed algorithms (the plate is assumed to be in the geometrically nonlinear range). The results accurately predict regions of solution multiplicity and jump phenomena.

  9. Ceramic Matrix Composites (CMC) Life Prediction Development - 2003

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R.; Calomino, Anthony M.; Verrilli, Michael J.; Thomas, David J.; Halbig, Michael C.; Opila, Elizabeth J.; Ellis, John R.

    2003-01-01

    Accurate life prediction is critical to successful use of ceramic matrix composites (CMCs). The tools to accomplish this are immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for many reusable and single mission launch vehicle propulsion and airframe applications. This paper describes an approach and progress made to satisfy the need to develop an integrated life prediction system that addresses mechanical durability and environmental degradation of C/SiC.

  10. Analytical prediction of digital signal crosstalk of FCC

    NASA Technical Reports Server (NTRS)

    Belleisle, A. P.

    1972-01-01

    The results are presented of a study effort whose aim was the development of accurate means of analyzing and predicting signal crosstalk in multi-wire digital data cables. A complete analytical model is developed for n + 1 wire systems of uniform transmission lines with arbitrary linear boundary conditions. In addition, a minimum set of parameter measurements required for the application of the model is presented. Comparisons between crosstalk predicted by this model and actual measured crosstalk are shown for a six-conductor ribbon cable.

  11. A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules

    PubMed Central

    Ramakrishnan, Sridhar; Wesensten, Nancy J.; Balkin, Thomas J.; Reifman, Jaques

    2016-01-01

    Study Objectives: Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss—from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges—and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. Methods: We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. Results: The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. Conclusions: The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. Citation: Ramakrishnan S, Wesensten NJ, Balkin TJ, Reifman J. A unified model of performance: validation of its predictions across different sleep/wake schedules. SLEEP 2016;39(1):249–262. PMID:26518594

  12. Sub-Model Partial Least Squares for Improved Accuracy in Quantitative Laser Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Clegg, S. M.; Frydenvang, J.

    2015-12-01

    One of the primary challenges faced by the ChemCam instrument on the Curiosity Mars rover is developing a regression model that can accurately predict the composition of the wide range of target types encountered (basalts, calcium sulfate, feldspar, oxides, etc.). The original calibration used 69 rock standards to train a partial least squares (PLS) model for each major element. By expanding the suite of calibration samples to >400 targets spanning a wider range of compositions, the accuracy of the model was improved, but some targets with "extreme" compositions (e.g. pure minerals) were still poorly predicted. We have therefore developed a simple method, referred to as "submodel PLS", to improve the performance of PLS across a wide range of target compositions. In addition to generating a "full" (0-100 wt.%) PLS model for the element of interest, we also generate several overlapping submodels (e.g. for SiO2, we generate "low" (0-50 wt.%), "mid" (30-70 wt.%), and "high" (60-100 wt.%) models). The submodels are generally more accurate than the "full" model for samples within their range because they are able to adjust for matrix effects that are specific to that range. To predict the composition of an unknown target, we first predict the composition with the submodels and the "full" model. Then, based on the predicted composition from the "full" model, the appropriate submodel prediction can be used (e.g. if the full model predicts a low composition, use the "low" model result, which is likely to be more accurate). For samples with "full" predictions that occur in a region of overlap between submodels, the submodel predictions are "blended" using a simple linear weighted sum. The submodel PLS method shows improvements in most of the major elements predicted by ChemCam and reduces the occurrence of negative predictions for low wt.% targets. Submodel PLS is currently being used in conjunction with ICA regression for the major element compositions of ChemCam data.
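
    The selection-and-blending logic described above can be sketched as follows, using the SiO2 submodel ranges quoted in the abstract (0-50, 30-70, and 60-100 wt.%); the prediction values in the example are hypothetical, and this is not the ChemCam calibration code itself.

    ```python
    # Minimal sketch of the "submodel" selection/blending idea: pick or blend
    # range-specific model predictions based on the full-range model's estimate.
    def blend_submodels(full_pred: float, low_pred: float, mid_pred: float, high_pred: float) -> float:
        """Return a composition estimate (wt.%) from overlapping submodels."""
        if full_pred < 30:                      # clearly in the "low" range
            return low_pred
        if full_pred < 50:                      # low/mid overlap: linear weighted sum
            w = (full_pred - 30) / 20
            return (1 - w) * low_pred + w * mid_pred
        if full_pred < 60:                      # clearly "mid"
            return mid_pred
        if full_pred < 70:                      # mid/high overlap
            w = (full_pred - 60) / 10
            return (1 - w) * mid_pred + w * high_pred
        return high_pred

    print(blend_submodels(full_pred=55, low_pred=52, mid_pred=56, high_pred=61))
    ```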

  13. Development and Preliminary Performance of a Risk Factor Screen to Predict Posttraumatic Psychological Disorder After Trauma Exposure

    PubMed Central

    Carlson, Eve B.; Palmieri, Patrick A.; Spain, David A.

    2017-01-01

    Objective We examined data from a prospective study of risk factors that increase vulnerability or resilience, exacerbate distress, or foster recovery to determine whether risk factors accurately predict which individuals will later have high posttraumatic (PT) symptom levels and whether brief measures of risk factors also accurately predict later symptom elevations. Method Using data from 129 adults exposed to traumatic injury of self or a loved one, we conducted receiver operating characteristic (ROC) analyses of 14 risk factors assessed by full-length measures, determined optimal cutoff scores and calculated predictive performance for the nine that were most predictive. For five risk factors, we identified sets of items that accounted for 90% of variance in total scores and calculated predictive performance for sets of brief risk measures. Results A set of nine risk factors assessed by full measures identified 89% of those who later had elevated PT symptoms (sensitivity) and 78% of those who did not (specificity). A set of four brief risk factor measures assessed soon after injury identified 86% of those who later had elevated PT symptoms and 72% of those who did not. Conclusions Use of sets of brief risk factor measures shows promise of accurate prediction of PT psychological disorder and probable PTSD or depression. Replication of predictive accuracy is needed in a new and larger sample. PMID:28622811
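
    A small sketch of the cutoff-selection step implied above: computing an ROC curve on simulated risk scores and choosing the threshold that maximizes Youden's J, then reporting sensitivity and specificity. The data and the Youden criterion are illustrative assumptions, not the study's exact procedure.

    ```python
    # Minimal sketch with simulated scores (not the study data): choose a risk
    # factor cutoff from an ROC curve and report sensitivity and specificity.
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(3)
    symptoms_elevated = rng.integers(0, 2, size=129)                 # 1 = elevated PT symptoms
    risk_score = symptoms_elevated * 1.2 + rng.normal(size=129)      # hypothetical risk measure

    fpr, tpr, thresholds = roc_curve(symptoms_elevated, risk_score)
    best = np.argmax(tpr - fpr)                                      # Youden's J statistic
    print(f"Cutoff={thresholds[best]:.2f}  sensitivity={tpr[best]:.2f}  specificity={1 - fpr[best]:.2f}")
    ```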

  14. Temperature-mediated development thresholds of Sparganothis sulfureana (Lepidoptera: Tortricidae) in cranberries

    USDA-ARS?s Scientific Manuscript database

    Larvae of Sparganothis sulfureana Clemens, frequently attack cranberries, often resulting in economic damage to the crop. Because temperature dictates insect growth rate, development can be accurately estimated based on daily temperature measurements. To better predict S. sulfureana development acro...

  15. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  16. The development of a plant risk evaluation (PRE) tool for assessing the invasive potential of ornamental plants.

    PubMed

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.

  17. The Development of a Plant Risk Evaluation (PRE) Tool for Assessing the Invasive Potential of Ornamental Plants

    PubMed Central

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W.; Reichard, Sarah; DiTomaso, Joseph M.

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when “needs further evaluation” classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When “needs further evaluation” classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program. PMID:25803830

  18. Can phenological models predict tree phenology accurately in the future? The unrevealed hurdle of endodormancy break.

    PubMed

    Chuine, Isabelle; Bonhomme, Marc; Legave, Jean-Michel; García de Cortázar-Atauri, Iñaki; Charrier, Guillaume; Lacointe, André; Améglio, Thierry

    2016-10-01

    The onset of the growing season of trees has advanced by 2.3 days per decade during the last 40 years in temperate Europe because of global warming. The effect of temperature on plant phenology is, however, not linear because temperature has a dual effect on bud development. On one hand, low temperatures are necessary to break bud endodormancy; on the other hand, higher temperatures are then necessary to promote bud cell growth. Different process-based models have been developed in recent decades to predict the date of budbreak of woody species. They predict that global warming should delay or compromise endodormancy break at a species' equatorward range limit, leading to a delay in, or even the impossibility of, flowering or setting new leaves. These models are classically parameterized with flowering or budbreak dates only, with no information on the endodormancy break date, because that information is very scarce. Here, we evaluated the ability of a set of phenological models to accurately predict the endodormancy break dates of three fruit trees. Our results show that models calibrated solely with budbreak dates usually do not accurately predict the endodormancy break date. Providing the endodormancy break date for model parameterization results in much more accurate prediction of that date, although still with a higher error than for budbreak dates. Most importantly, we show that models not calibrated with endodormancy break dates can generate large discrepancies in forecasted budbreak dates when using climate scenarios, compared with models calibrated with endodormancy break dates. This discrepancy increases with mean annual temperature and is therefore strongest after 2050 in the southernmost regions. Our results point to the urgent need for extensive measurements of endodormancy break dates in forest and fruit trees to yield more robust projections of phenological changes in the near future. © 2016 John Wiley & Sons Ltd.

  19. Simulation of cryogenic turbopump annular seals

    NASA Astrophysics Data System (ADS)

    Palazzolo, Alan B.

    1992-12-01

    The goal of the current work is to develop software that can accurately predict the dynamic coefficients, forces, leakage and horsepower loss for annular seals which have a potential for affecting the rotordynamic behavior of the pumps. The fruit of last year's research was the computer code SEALPAL which included capabilities for linear tapered geometry, Moody friction factor and inlet pre-swirl. This code produced results which in most cases compared very well with check cases presented in the literature. The TAMUSEAL I code, which was written to improve SEALPAL by correcting a bug and by adding more accurate integration algorithms and additional capabilities, was then used to predict dynamic coefficients and leakage for the NASA/Pratt and Whitney Alternate Turbopump Development (ATD) LOX Pump's seal.

  20. Competitive Abilities in Experimental Microcosms Are Accurately Predicted by a Demographic Index for R*

    PubMed Central

    Murrell, Ebony G.; Juliano, Steven A.

    2012-01-01

    Resource competition theory predicts that R*, the equilibrium resource amount yielding zero growth of a consumer population, should predict species' competitive abilities for that resource. This concept has been supported for unicellular organisms, but has not been well-tested for metazoans, probably due to the difficulty of raising experimental populations to equilibrium and measuring population growth rates for species with long or complex life cycles. We developed an index (Rindex) of R* based on demography of one insect cohort, growing from egg to adult in a non-equilibrium setting, and tested whether Rindex yielded accurate predictions of competitive abilities using mosquitoes as a model system. We estimated finite rate of increase (λ′) from demographic data for cohorts of three mosquito species raised with different detritus amounts, and estimated each species' Rindex using nonlinear regressions of λ′ vs. initial detritus amount. All three species' Rindex differed significantly, and accurately predicted competitive hierarchy of the species determined in simultaneous pairwise competition experiments. Our Rindex could provide estimates and rigorous statistical comparisons of competitive ability for organisms for which typical chemostat methods and equilibrium population conditions are impractical. PMID:22970128
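
    One way to picture the Rindex computation described above is to fit a saturating curve of λ′ against detritus amount and solve for the detritus amount at which λ′ = 1 (zero growth); the data points and curve form below are made up for illustration and are not the authors' dataset or model.

    ```python
    # Minimal sketch with made-up cohort data: fit a saturating curve of finite
    # rate of increase vs. detritus, then solve for the detritus amount giving
    # lambda' = 1 (zero growth) as an R*-style index.
    import numpy as np
    from scipy.optimize import curve_fit

    detritus = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])     # g per container (assumed)
    lam = np.array([0.55, 0.80, 1.05, 1.25, 1.40, 1.48])    # finite rate of increase (assumed)

    def monod(R, lam_max, k):
        return lam_max * R / (k + R)

    (lam_max, k), _ = curve_fit(monod, detritus, lam, p0=[1.5, 0.5])
    r_index = k / (lam_max - 1.0)       # solves lam_max * R / (k + R) = 1
    print(f"lam_max={lam_max:.2f}, k={k:.2f}, Rindex={r_index:.2f} g")
    ```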

  1. PSORTb 3.0: improved protein subcellular localization prediction with refined localization subcategories and predictive capabilities for all prokaryotes.

    PubMed

    Yu, Nancy Y; Wagner, James R; Laird, Matthew R; Melli, Gabor; Rey, Sébastien; Lo, Raymond; Dao, Phuong; Sahinalp, S Cenk; Ester, Martin; Foster, Leonard J; Brinkman, Fiona S L

    2010-07-01

    PSORTb has remained the most precise bacterial protein subcellular localization (SCL) predictor since it was first made available in 2003. However, the recall needs to be improved and no accurate SCL predictors yet make predictions for archaea, nor differentiate important localization subcategories, such as proteins targeted to a host cell or bacterial hyperstructures/organelles. Such improvements should preferably be encompassed in a freely available web-based predictor that can also be used as a standalone program. We developed PSORTb version 3.0 with improved recall, higher proteome-scale prediction coverage, and new refined localization subcategories. It is the first SCL predictor specifically geared for all prokaryotes, including archaea and bacteria with atypical membrane/cell wall topologies. It features an improved standalone program, with a new batch results delivery system complementing its web interface. We evaluated the most accurate SCL predictors using 5-fold cross-validation and also performed an independent proteomics analysis, showing that PSORTb 3.0 is the most accurate but can benefit from being complemented by Proteome Analyst predictions. http://www.psort.org/psortb (download open source software or use the web interface). psort-mail@sfu.ca Supplementary data are available at Bioinformatics online.

  2. pKa prediction of monoprotic small molecules the SMARTS way.

    PubMed

    Lee, Adam C; Yu, Jing-Yu; Crippen, Gordon M

    2008-10-01

    Realizing favorable absorption, distribution, metabolism, elimination, and toxicity profiles is a necessity due to the high attrition rate of lead compounds in drug development today. The ability to accurately predict bioavailability can help save time and money during the screening and optimization processes. As several robust programs already exist for predicting logP, we have turned our attention to the fast and robust prediction of pKa for small molecules. Using curated data from the Beilstein Database and Lange's Handbook of Chemistry, we have created a decision tree based on a novel set of SMARTS strings that can accurately predict the pKa for monoprotic compounds with R² of 0.94 and root mean squared error of 0.68. Leave-some-out (10%) cross-validation achieved Q² of 0.91 and root mean squared error of 0.80.
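    A minimal sketch of how a SMARTS-based decision rule for monoprotic pKa assignment might look in practice is given below, assuming RDKit is available. The SMARTS patterns and the pKa values attached to them are illustrative placeholders, not the curated tree from the paper.

```python
# Minimal sketch of a SMARTS-based decision tree for monoprotic pKa assignment.
# The SMARTS patterns and pKa values below are illustrative placeholders,
# not the curated tree from the paper. Requires RDKit.
from rdkit import Chem

# (SMARTS, assigned pKa) checked in order; the first match wins.
RULES = [
    ("[CX3](=O)[OX2H1]",  4.2),   # carboxylic acid
    ("c[OX2H1]",          9.9),   # phenol
    ("[NX3;H2;!$(NC=O)]", 10.6),  # primary aliphatic amine (conjugate acid)
]

def predict_pka(smiles: str):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return None
    for smarts, pka in RULES:
        if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts)):
            return pka
    return None  # no rule applies

print(predict_pka("CC(=O)O"))      # acetic acid -> 4.2
print(predict_pka("c1ccccc1O"))    # phenol      -> 9.9
```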

  3. [Effect of heat transfer in the packages on the stability of thiamine nitrate under uncontrolled temperature conditions].

    PubMed

    Nakamura, Toru; Yamaji, Takayuki; Takayama, Kozo

    2013-01-01

    To accurately predict the stability of thiamine nitrate as a model drug in pharmaceutical products under uncontrolled temperature conditions, the average reaction rate constant was determined, taking into account the heat transfer from the atmosphere to the product. Stability tests of thiamine nitrate in three packages with different heat transfer characteristics were performed under non-isothermal conditions. The observed stability data were compared with predictions based on a newly developed method, showing that the stability was well predicted when heat transfer was taken into account. By contrast, some deviations from the predictions were observed when heat transfer was neglected for the packages with low heat transfer. These results clearly show that heat transfer should be considered to ensure accurate prediction of the stability of commercial pharmaceutical products under non-isothermal atmospheres.
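    One way to picture the calculation described here is sketched below: the product temperature lags the ambient temperature through a simple heat-transfer law, and an Arrhenius rate is integrated over the resulting temperature history to get an average rate constant. The rate parameters, heat-transfer coefficient, and temperature profile are illustrative assumptions, not the study's values.

```python
# Minimal sketch of predicting first-order degradation under non-isothermal
# conditions: product temperature follows dT/dt = h*(T_air - T), and the
# degradation rate follows Arrhenius kinetics. All parameter values are
# illustrative assumptions, not the study's.
import numpy as np

R  = 8.314            # J/(mol K)
Ea = 80e3             # activation energy (J/mol), assumed
A  = 1.0e12           # pre-exponential factor (1/day), assumed
h  = 0.5              # heat-transfer rate constant (1/h), assumed, package dependent

hours = np.arange(0, 24 * 30)                       # 30 days, hourly steps
T_air = 298 + 8 * np.sin(2 * np.pi * hours / 24)    # ambient temperature (K)

# Product temperature by explicit Euler with a 1-hour step.
T_prod = np.empty_like(T_air)
T_prod[0] = T_air[0]
for i in range(1, len(hours)):
    T_prod[i] = T_prod[i - 1] + h * (T_air[i - 1] - T_prod[i - 1])

k = A * np.exp(-Ea / (R * T_prod)) / 24.0           # rate constant per hour
remaining = np.exp(-k.sum())                        # first-order loss over the period
print(f"average k = {k.mean():.2e} 1/h, fraction remaining after 30 days = {remaining:.3f}")
```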

  4. A new powerful parameterization tool for managing groundwater resources and predicting land subsidence in Las Vegas Valley

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Nunes, V. D.; Burbey, T. J.; Borggaard, J.

    2012-12-01

    More than 1.5 m of subsidence has been observed in Las Vegas Valley since 1935 as a result of groundwater pumping that commenced in 1905 (Bell, 2002). The compaction of the aquifer system has led to several large subsidence bowls and deleterious earth fissures. The highly heterogeneous aquifer system with its variably thick interbeds makes predicting the magnitude and location of subsidence extremely difficult. Several numerical groundwater flow models of the Las Vegas basin have been previously developed; however, none of them has been able to accurately simulate the observed subsidence patterns or magnitudes because of inadequate parameterization. To better manage groundwater resources and predict future subsidence we have updated and developed a more accurate groundwater management model for Las Vegas Valley by developing a new adjoint parameter estimation package (APE) that is used in conjunction with UCODE along with MODFLOW and the SUB (subsidence) and HFB (horizontal flow barrier) packages. The APE package is used with UCODE to automatically identify suitable parameter zonations and inversely calculate parameter values from hydraulic head and subsidence measurements, which are highly sensitive to both elastic (Ske) and inelastic (Skv) storage coefficients. With the advent of InSAR (interferometric synthetic aperture radar), distributed spatial and temporal subsidence measurements can be obtained, which greatly enhance the accuracy of parameter estimation. This automation process can remove user bias and provide a far more accurate and robust parameter zonation distribution. The outcome of this work is the most accurate and powerful tool to date for managing groundwater resources in Las Vegas Valley.

  5. External validation of a nomogram for prediction of side-specific extracapsular extension at robotic radical prostatectomy.

    PubMed

    Zorn, Kevin C; Gallina, Andrea; Hutterer, Georg C; Walz, Jochen; Shalhav, Arieh L; Zagaja, Gregory P; Valiquette, Luc; Gofrit, Ofer N; Orvieto, Marcelo A; Taxy, Jerome B; Karakiewicz, Pierre I

    2007-11-01

    Several staging tools have been developed for open radical prostatectomy (ORP) patients. However, the validity of these tools has never been formally tested in patients treated with robot-assisted laparoscopic radical prostatectomy (RALP). We tested the accuracy of an ORP-derived nomogram in predicting the rate of extracapsular extension (ECE) in a large RALP cohort. Serum prostate specific antigen (PSA) and side-specific clinical stage and biopsy Gleason sum information were used in a previously validated nomogram predicting side-specific ECE. The nomogram-derived predictions were compared with the observed rate of ECE, and the accuracy of the predictions was quantified. Each prostate lobe was analyzed independently. As complete data were available for 576 patients, the analyses targeted 1152 prostate lobes. Median age and serum PSA concentration at radical prostatectomy were 60 years and 5.4 ng/mL, respectively. The majority of side-specific clinical stages were T(1c) (993; 86.2%). Most side-specific biopsy Gleason sums were 6 (572; 49.7%). The median side-specific percentages of positive cores and of cancer were, respectively, 20.0% and 5.0%. At final pathologic review, 107 patients (18.6%) had ECE, and side-specific ECE was present in 117 patients (20.3%). The nomogram was 89% accurate in the RALP cohort v 84% in the previously reported ORP validation. The ORP side-specific ECE nomogram is highly accurate in the RALP population, suggesting that predictive and possibly prognostic tools developed in ORP patients may be equally accurate in their RALP counterparts.

  6. Assessment of driver stopping prediction models before and after the onset of yellow using two driving simulator datasets.

    PubMed

    Ghanipoor Machiani, Sahar; Abbas, Montasir

    2016-11-01

    Accurate modeling of driver decisions in dilemma zones (DZ), where drivers are not sure whether to stop or go at the onset of yellow, can be used to increase safety at signalized intersections. This study utilized data obtained from two different driving simulator studies (VT-SCORES and NADS datasets) to investigate the possibility of developing accurate driver-decision prediction/classification models in DZ. Canonical discriminant analysis was used to construct the prediction models, and two timeframes were considered. The first timeframe used data collected during green immediately before the onset of yellow, and the second timeframe used data collected during the first three seconds after the onset of yellow. Signal protection algorithms could use the results of the prediction model during the first timeframe to decide the best time for ending the green signal, and could use the results of the prediction model during the first three seconds of yellow to extend the clearance interval. It was found that the discriminant model using data collected during the first three seconds of yellow was the most accurate, at 99% accuracy. It was also found that data collection should focus on variables that are related to speed, acceleration, time, and distance to intersection, as opposed to secondary variables, such as pavement conditions, since secondary variables did not significantly change the accuracy of the prediction models. The results reveal a promising possibility for incorporating the developed models in traffic-signal controllers to improve DZ-protection strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.
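    A minimal sketch of a discriminant classifier for stop/go decisions at the onset of yellow is given below, in the spirit of the study. The features and the synthetic driver data are assumptions made only to keep the example runnable; real inputs would come from the simulator logs.

```python
# Minimal sketch of a discriminant classifier for stop/go decisions in the
# dilemma zone. Feature choices and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 300
speed      = rng.uniform(8, 20, n)      # m/s at yellow onset
distance   = rng.uniform(10, 90, n)     # m to stop line
accel      = rng.normal(0, 1, n)        # m/s^2
time_to_sl = distance / speed           # rough time to stop line (s)

X = np.column_stack([speed, distance, accel, time_to_sl])
# Hypothetical rule: drivers far away (in time) tend to stop, with some noise.
stop = (time_to_sl + 0.5 * rng.standard_normal(n) > 3.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, stop, test_size=0.3, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"held-out accuracy: {lda.score(X_te, y_te):.2%}")
```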

  7. Assessment of Erysiphe necator ascospore release models for use in the Mediterranean climate of western Oregon

    USDA-ARS?s Scientific Manuscript database

    Predictive models have been developed in several major grape growing regions to correlate environmental conditions to Erysiphe necator ascospore release; however, these models may not accurately predict ascospore release in other viticulture regions with differing climatic conditions. To assess asco...

  8. Predicting tree mortality following gypsy moth defoliation

    Treesearch

    D.E. Fosbroke; R.R. Hicks; K.W. Gottschalk

    1991-01-01

    Appropriate application of gypsy moth control strategies requires an accurate prediction of the distribution and intensity of tree mortality prior to defoliation. This prior information is necessary to better target investments in control activities where they are needed. This poster lays the groundwork for developing hazard-rating systems for forests of the...

  9. Effects of soil moisture on the diurnal pattern of pesticide emission: Numerical simulation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    Accurate prediction of pesticide volatilization is important for the protection of human and environmental health. Due to the complexity of the volatilization process, sophisticated predictive models are needed, especially for dry soil conditions. A mathematical model was developed to allow simulati...

  10. Mortality Probability Model III and Simplified Acute Physiology Score II

    PubMed Central

    Vasilevskis, Eduard E.; Kuzniewicz, Michael W.; Cason, Brian A.; Lane, Rondall K.; Dean, Mitzi L.; Clay, Ted; Rennie, Deborah J.; Vittinghoff, Eric; Dudley, R. Adams

    2009-01-01

    Background: To develop and compare ICU length-of-stay (LOS) risk-adjustment models using three commonly used mortality or LOS prediction models. Methods: Between 2001 and 2004, we performed a retrospective, observational study of 11,295 ICU patients from 35 hospitals in the California Intensive Care Outcomes Project. We compared the accuracy of the following three LOS models: a recalibrated acute physiology and chronic health evaluation (APACHE) IV-LOS model; and models developed using risk factors in the mortality probability model III at zero hours (MPM0) and the simplified acute physiology score (SAPS) II mortality prediction model. We evaluated models by calculating the following: (1) grouped coefficients of determination; (2) differences between observed and predicted LOS across subgroups; and (3) intraclass correlations of observed/expected LOS ratios between models. Results: The grouped coefficients of determination were APACHE IV with coefficients recalibrated to the LOS values of the study cohort (APACHE IVrecal) [R2 = 0.422], mortality probability model III at zero hours (MPM0 III) [R2 = 0.279], and simplified acute physiology score (SAPS II) [R2 = 0.008]. For each decile of predicted ICU LOS, the mean predicted LOS vs the observed LOS was significantly different (p ≤ 0.05) for three, two, and six deciles using APACHE IVrecal, MPM0 III, and SAPS II, respectively. Plots of the predicted vs the observed LOS ratios of the hospitals revealed a threefold variation in LOS among hospitals with high model correlations. Conclusions: APACHE IV and MPM0 III were more accurate than SAPS II for the prediction of ICU LOS. APACHE IV is the most accurate and best calibrated model. Although it is less accurate, MPM0 III may be a reasonable option if the data collection burden or the treatment effect bias is a consideration. PMID:19363210

  11. Development of the Metacognitive Skills of Prediction and Evaluation in Children With or Without Math Disability

    PubMed Central

    Garrett, Adia J.; Mazzocco, Michèle M. M.; Baker, Linda

    2009-01-01

    Metacognition refers to knowledge about one’s own cognition. The present study was designed to assess metacognitive skills that either precede or follow task engagement, rather than the processes that occur during a task. Specifically, we examined prediction and evaluation skills among children with (n = 17) or without (n = 179) mathematics learning disability (MLD), from grades 2 to 4. Children were asked to predict which of several math problems they could solve correctly; later, they were asked to solve those problems. They were asked to evaluate whether their solution to each of another set of problems was correct. Children’s ability to evaluate their answers to math problems improved from grade 2 to grade 3, whereas there was no change over time in the children’s ability to predict which problems they could solve correctly. Children with MLD were less accurate than children without MLD in evaluating both their correct and incorrect solutions, and they were less accurate at predicting which problems they could solve correctly. However, children with MLD were as accurate as their peers in correctly predicting that they could not solve specific math problems. The findings have implications for the usefulness of children’s self-review during mathematics problem solving. PMID:20084181

  12. Rapid and accurate prediction and scoring of water molecules in protein binding sites.

    PubMed

    Ross, Gregory A; Morris, Garrett M; Biggin, Philip C

    2012-01-01

    Water plays a critical role in ligand-protein interactions. However, it is still challenging to predict accurately not only where water molecules prefer to bind, but also which of those water molecules might be displaceable. The latter is often seen as a route to optimizing affinity of potential drug candidates. Using a protocol we call WaterDock, we show that the freely available AutoDock Vina tool can be used to predict accurately the binding sites of water molecules. WaterDock was validated using data from X-ray crystallography, neutron diffraction and molecular dynamics simulations and correctly predicted 97% of the water molecules in the test set. In addition, we combined data-mining, heuristic and machine learning techniques to develop probabilistic water molecule classifiers. When applied to WaterDock predictions in the Astex Diverse Set of protein ligand complexes, we could identify whether a water molecule was conserved or displaced to an accuracy of 75%. A second model predicted whether water molecules were displaced by polar groups or by non-polar groups to an accuracy of 80%. These results should prove useful for anyone wishing to undertake rational design of new compounds where the displacement of water molecules is being considered as a route to improved affinity.

  13. Estimation of state and material properties during heat-curing molding of composite materials using data assimilation: A numerical study.

    PubMed

    Matsuzaki, Ryosuke; Tachikawa, Takeshi; Ishizuka, Junya

    2018-03-01

    Accurate simulations of carbon fiber-reinforced plastic (CFRP) molding are vital for the development of high-quality products. However, such simulations are challenging and previous attempts to improve the accuracy of simulations by incorporating the data acquired from mold monitoring have not been completely successful. Therefore, in the present study, we developed a method to accurately predict various CFRP thermoset molding characteristics based on data assimilation, a process that combines theoretical and experimental values. The degree of cure as well as temperature and thermal conductivity distributions during the molding process were estimated using both temperature data and numerical simulations. An initial numerical experiment demonstrated that the internal mold state could be determined solely from the surface temperature values. A subsequent numerical experiment to validate this method showed that estimations based on surface temperatures were highly accurate in the case of degree of cure and internal temperature, although predictions of thermal conductivity were more difficult.

  14. An unexpected way forward: towards a more accurate and rigorous protein-protein binding affinity scoring function by eliminating terms from an already simple scoring function.

    PubMed

    Swanson, Jon; Audie, Joseph

    2018-01-01

    A fundamental and unsolved problem in biophysical chemistry is the development of a computationally simple, physically intuitive, and generally applicable method for accurately predicting and physically explaining protein-protein binding affinities from protein-protein interaction (PPI) complex coordinates. Here, we propose that the simplification of a previously described six-term PPI scoring function to a four term function results in a simple expression of all physically and statistically meaningful terms that can be used to accurately predict and explain binding affinities for a well-defined subset of PPIs that are characterized by (1) crystallographic coordinates, (2) rigid-body association, (3) normal interface size, and hydrophobicity and hydrophilicity, and (4) high quality experimental binding affinity measurements. We further propose that the four-term scoring function could be regarded as a core expression for future development into a more general PPI scoring function. Our work has clear implications for PPI modeling and structure-based drug design.

  15. A physical multifield model predicts the development of volume and structure in the human brain

    NASA Astrophysics Data System (ADS)

    Rooij, Rijk de; Kuhl, Ellen

    2018-03-01

    The prenatal development of the human brain is characterized by a rapid increase in brain volume and the development of a highly folded cortex. At the cellular level, these events are enabled by symmetric and asymmetric cell division in the ventricular regions of the brain followed by an outward cell migration towards the peripheral regions. The role of mechanics during brain development has been suggested and acknowledged in past decades, but remains insufficiently understood. Here we propose a mechanistic model that couples cell division, cell migration, and brain volume growth to accurately model the developing brain between weeks 10 and 29 of gestation. Our model accurately predicts a 160-fold volume increase from 1.5 cm³ at week 10 to 235 cm³ at week 29 of gestation. In agreement with human brain development, the cortex begins to form around week 22 and accounts for about 30% of the total brain volume at week 29. Our results show that cell division and coupling between cell density and volume growth are essential to accurately model brain volume development, whereas cell migration and diffusion contribute mainly to the development of the cortex. We demonstrate that complex folding patterns, including sinusoidal folds and creases, emerge naturally as the cortex develops, even for low stiffness contrasts between the cortex and subcortex.
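    A back-of-envelope check of the reported volume numbers is shown below; a single effective exponential between weeks 10 and 29 is an illustrative stand-in, not the authors' coupled cell-division/migration/growth model.

```python
# Back-of-envelope check of the reported volume increase (1.5 cm^3 at week 10
# to 235 cm^3 at week 29). A single exponential is an illustrative stand-in,
# not the coupled multifield model of the paper.
import numpy as np

V10, V29 = 1.5, 235.0                 # cm^3
fold = V29 / V10                      # ~157, i.e. roughly the reported 160-fold
r = np.log(V29 / V10) / (29 - 10)     # effective weekly growth rate

for week in (10, 15, 20, 25, 29):
    V = V10 * np.exp(r * (week - 10))
    print(f"week {week:2d}: ~{V:6.1f} cm^3")
print(f"fold increase: {fold:.0f}x, effective rate: {r:.3f} per week")
```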

  16. Regge calculus and observations. II. Further applications.

    NASA Astrophysics Data System (ADS)

    Williams, Ruth M.; Ellis, G. F. R.

    1984-11-01

    The method, developed in an earlier paper, for tracing geodesics of particles and light rays through Regge calculus space-times, is applied to a number of problems in the Schwarzschild geometry. It is possible to obtain accurate predictions of light bending by taking sufficiently small Regge blocks. Calculations of perihelion precession, Thomas precession, and the distortion of a ball of fluid moving on a geodesic can also show good agreement with the analytic solution. However, difficulties arise in obtaining accurate predictions for general orbits in these space-times. Applications to other problems in general relativity are discussed briefly.

  17. A phenology model for Sparganothis fruitworm in Cranberries

    USDA-ARS?s Scientific Manuscript database

    Larvae of Sparganothis sulfureana Clemens, frequently attack cranberries, often resulting in economic damage to the crop. Because temperature dictates insect growth rate, development can be accurately estimated based on daily temperature measurements. To better predict S. sulfureana development acro...

  18. Evaluation of axial pile bearing capacity based on pile driving analyzer (PDA) test using Neural Network

    NASA Astrophysics Data System (ADS)

    Maizir, H.; Suryanita, R.

    2018-01-01

    Over the past few decades, many methods have been developed to predict and evaluate the bearing capacity of driven piles. The problem remains complicated and unsettled: different soil tests and evaluation procedures produce widely different solutions. The key issue is therefore to identify methods that predict and evaluate pile bearing capacity with the required accuracy and consistency. Accurate prediction and evaluation of axial bearing capacity depend on several variables, such as soil type and pile diameter and length. This study uses artificial neural networks (ANNs), which map input data to target outputs, to obtain more accurate and consistent estimates of the axial bearing capacity of driven piles. An ANN model was developed to predict and evaluate the axial bearing capacity of piles from pile driving analyzer (PDA) test data for more than 200 selected records, and its predictions were compared with the PDA test results. The comparison indicates that the neural network model gives sound predictions and evaluations of the axial bearing capacity of piles.
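    The sketch below shows one way an ANN regression of this kind could be set up with scikit-learn. The input features and the synthetic training data are assumptions made only to keep the example runnable; a real model would use measured PDA records.

```python
# Minimal sketch of an ANN regression for axial pile capacity. Features and
# synthetic data are assumptions for illustration; real inputs would come
# from PDA test records.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(0.3, 0.6, n),   # pile diameter (m)
    rng.uniform(10, 40, n),     # pile length (m)
    rng.integers(1, 4, n),      # soil type code
])
# Hypothetical "true" capacity (kN) with noise, only to make the sketch runnable.
y = 800 * X[:, 0] + 60 * X[:, 1] + 150 * X[:, 2] + rng.normal(0, 50, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print(f"R^2 on held-out piles: {model.score(X_te, y_te):.2f}")
```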

  19. Assessing the accuracy of improved force-matched water models derived from Ab initio molecular dynamics simulations.

    PubMed

    Köster, Andreas; Spura, Thomas; Rutkai, Gábor; Kessler, Jan; Wiebeler, Hendrik; Vrabec, Jadran; Kühne, Thomas D

    2016-07-15

    The accuracy of water models derived from ab initio molecular dynamics simulations by means of an improved force-matching scheme is assessed for various thermodynamic, transport, and structural properties. It is found that although the resulting force-matched water models are typically less accurate than fully empirical force fields in predicting thermodynamic properties, they are nevertheless much more accurate than generally appreciated in reproducing the structure of liquid water, in fact outperforming most of the commonly used empirical water models. This development demonstrates the feasibility of routinely parametrizing computationally efficient yet predictive potential energy functions based on accurate ab initio molecular dynamics simulations for a large variety of different systems. © 2016 Wiley Periodicals, Inc.

  20. Development of a Physiologically-Based Pharmacokinetic Model of the Rat Central Nervous System

    PubMed Central

    Badhan, Raj K. Singh; Chenel, Marylore; Penny, Jeffrey I.

    2014-01-01

    Central nervous system (CNS) drug disposition is dictated by a drug’s physicochemical properties and its ability to permeate physiological barriers. The blood–brain barrier (BBB), blood-cerebrospinal fluid barrier and centrally located drug transporter proteins influence drug disposition within the central nervous system. Attainment of adequate brain-to-plasma and cerebrospinal fluid-to-plasma partitioning is important in determining the efficacy of centrally acting therapeutics. We have developed a physiologically-based pharmacokinetic model of the rat CNS which incorporates brain interstitial fluid (ISF), choroidal epithelial and total cerebrospinal fluid (CSF) compartments and accurately predicts CNS pharmacokinetics. The model yielded reasonable predictions of unbound brain-to-plasma partition ratio (Kpuu,brain) and CSF:plasma ratio (CSF:Plasmau) using a series of in vitro permeability and unbound fraction parameters. When using in vitro permeability data obtained from L-mdr1a cells to estimate rat in vivo permeability, the model successfully predicted, to within 4-fold, Kpuu,brain and CSF:Plasmau for 81.5% of compounds simulated. The model presented allows for simultaneous simulation and analysis of both brain biophase and CSF to accurately predict CNS pharmacokinetics from preclinical drug parameters routinely available during discovery and development pathways. PMID:24647103

  1. A Thermo-Poromechanics Finite Element Model for Predicting Arterial Tissue Fusion

    NASA Astrophysics Data System (ADS)

    Fankell, Douglas P.

    This work provides modeling efforts and supplemental experimental work performed towards the ultimate goal of modeling heat transfer, mass transfer, and deformation occurring in biological tissue, in particular during arterial fusion and cutting. Developing accurate models of these processes accomplishes two goals. First, accurate models would enable engineers to design devices to be safer and less expensive. Second, the mechanisms behind tissue fusion and cutting are widely unknown; models with the ability to accurately predict physical phenomena occurring in the tissue will allow for insight into the underlying mechanisms of the processes. This work presents three aims and the efforts in achieving them, leading to an accurate model of tissue fusion and more broadly the thermo-poromechanics (TPM) occurring within biological tissue. Chapters 1 and 2 provide the motivation for developing accurate TPM models of biological tissue and an overview of previous modeling efforts. In Chapter 3, a coupled thermo-structural finite element (FE) model with the ability to predict arterial cutting is offered. From the work presented in Chapter 3, it became obvious a more detailed model was needed. Chapter 4 meets this need by presenting small strain TPM theory and its implementation in an FE code. The model is then used to simulate thermal tissue fusion. These simulations show the model's promise in predicting the water content and temperature of arterial wall tissue during the fusion process, but it is limited by its small deformation assumptions. Chapters 5-7 attempt to address this limitation by developing and implementing a large deformation TPM FE model. Chapters 5, 6, and 7 present a thermodynamically consistent, large deformation TPM FE model and its ability to simulate tissue fusion. Ultimately, this work provides several methods of simulating arterial tissue fusion and the thermo-poromechanics of biological tissue. It is the first work, to the author's knowledge, to simulate the fully coupled TPM of biological tissue and the first to present a fully coupled large deformation TPM FE model. In doing so, a stepping stone for more advanced modeling of biological tissue has been laid.

  2. Predicting Gang Fight Participation in a General Youth Sample via the HEW Youth Development Model's Community Program Impact Scales, Age, and Sex.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The accurate prediction of violence has been in the spotlight of critical concern in recent years. To investigate the relative predictive power of peer pressure, youth perceived negative labeling, youth perceived access to educational and occupational roles, social alienation, self-esteem, sex, and age with regard to gang fight participation…

  3. Predictability of the future development of aggressive behavior of cranial dural arteriovenous fistulas based on decision tree analysis.

    PubMed

    Satomi, Junichiro; Ghaibeh, A Ammar; Moriguchi, Hiroki; Nagahiro, Shinji

    2015-07-01

    The severity of clinical signs and symptoms of cranial dural arteriovenous fistulas (DAVFs) is well correlated with their pattern of venous drainage. Although the presence of cortical venous drainage can be considered a potential predictor of aggressive DAVF behaviors, such as intracranial hemorrhage or progressive neurological deficits due to venous congestion, accurate statistical analyses are currently not available. Using a decision tree data mining method, the authors aimed at clarifying the predictability of the future development of aggressive DAVF behavior and at identifying the main causative factors. Of 266 DAVF patients, 89 were eligible for analysis. Under observational management, 51 patients presented with intracranial hemorrhage/infarction during the follow-up period. The authors created a decision tree able to assess the risk of developing aggressive DAVF behavior. Evaluated by 10-fold cross-validation, the decision tree's accuracy, sensitivity, and specificity were 85.28%, 88.33%, and 80.83%, respectively. The tree shows that the main factor in symptomatic patients was the presence of cortical venous drainage. In its absence, the lesion location determined the risk of a DAVF developing aggressive behavior. Decision tree analysis accurately predicts the future development of aggressive DAVF behavior.
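    A minimal sketch of a decision-tree classifier evaluated by 10-fold cross-validation, the evaluation scheme named in the abstract, is shown below. The feature names, synthetic labels, and tree depth are illustrative assumptions, not the clinical dataset or the published tree.

```python
# Minimal sketch of a decision-tree classifier evaluated by 10-fold
# cross-validation. Features and synthetic labels are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 89
X = np.column_stack([
    rng.integers(0, 2, n),   # cortical venous drainage (0/1)
    rng.integers(0, 4, n),   # lesion location code
    rng.integers(0, 2, n),   # symptomatic at presentation (0/1)
])
# Hypothetical outcome loosely tied to cortical venous drainage.
y = (X[:, 0] | (rng.random(n) < 0.2)).astype(int)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"10-fold accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")
```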

  4. Evaluating the Performance of a New Model for Predicting the Growth of Clostridium perfringens in Cooked, Uncured Meat and Poultry Products under Isothermal, Heating, and Dynamically Cooling Conditions.

    PubMed

    Huang, Lihan

    2016-07-01

    Clostridium perfringens type A is a significant public health threat and its spores may germinate, outgrow, and multiply during cooling of cooked meats. This study applies a new C. perfringens growth model in the USDA Integrated Pathogen Modeling Program-Dynamic Prediction (IPMP Dynamic Prediction) to predict the growth from spores of C. perfringens in cooked, uncured meat and poultry products using isothermal, dynamic heating, and cooling data reported in the literature. The residual errors of predictions (observation minus prediction) are analyzed, and the root-mean-square error (RMSE) calculated. For isothermal and heating profiles, each data point in the growth curves is compared. The mean residual errors (MRE) of predictions range from -0.40 to 0.02 Log colony forming units (CFU)/g, with a RMSE of approximately 0.6 Log CFU/g. For cooling, the end point predictions are conservative in nature, with an MRE of -1.16 Log CFU/g for single-rate cooling and -0.66 Log CFU/g for dual-rate cooling. The RMSE is between 0.6 and 0.7 Log CFU/g. Compared with other models reported in the literature, this model makes more accurate and fail-safe predictions. For cooling, the percentage of accurate plus fail-safe predictions is between 97.6% and 100%. Under criterion 1, the percentage of accurate predictions is 47.5% for single-rate cooling and 66.7% for dual-rate cooling, while fail-dangerous predictions are between 0% and 2.4%. This study demonstrates that IPMP Dynamic Prediction can be used by food processors and regulatory agencies as a tool to predict the growth of C. perfringens in uncured cooked meats and to evaluate the safety of cooked or heat-treated uncured meat and poultry products exposed to cooling deviations, or to develop customized cooling schedules. This study also demonstrates the need for more accurate data collection during cooling. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
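    The residual-error summaries used above (mean residual error and RMSE, in log CFU/g) reduce to a short calculation; the sketch below uses made-up observed and predicted values purely for illustration.

```python
# Minimal sketch of the residual-error summaries used to evaluate the growth
# model: mean residual error (observation - prediction) and RMSE in log CFU/g.
# The observed/predicted values below are made up for illustration.
import numpy as np

observed  = np.array([2.1, 3.4, 4.0, 5.2, 6.1])   # log CFU/g, hypothetical
predicted = np.array([2.5, 3.9, 4.6, 5.8, 6.9])   # log CFU/g, hypothetical

residuals = observed - predicted
mre  = residuals.mean()                 # negative values over-predict growth (fail-safe)
rmse = np.sqrt(np.mean(residuals ** 2))
print(f"MRE = {mre:.2f} log CFU/g, RMSE = {rmse:.2f} log CFU/g")
```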

  5. Integrating Growth Variability of the Ilium, Fifth Lumbar Vertebra, and Clavicle with Multivariate Adaptive Regression Splines Models for Subadult Age Estimation.

    PubMed

    Corron, Louise; Marchal, François; Condemi, Silvana; Telmon, Norbert; Chaumoitre, Kathia; Adalian, Pascal

    2018-05-31

    Subadult age estimation should rely on sampling and statistical protocols capturing development variability for more accurate age estimates. In this perspective, measurements were taken on the fifth lumbar vertebrae and/or clavicles of 534 French males and females aged 0-19 years and the ilia of 244 males and females aged 0-12 years. These variables were fitted in nonparametric multivariate adaptive regression splines (MARS) models with 95% prediction intervals (PIs) of age. The models were tested on two independent samples from Marseille and the Luis Lopes reference collection from Lisbon. Models using ilium width and module, maximum clavicle length, and lateral vertebral body heights were more than 92% accurate. Precision was lower for postpubertal individuals. Integrating punctual nonlinearities of the relationship between age and the variables and dynamic prediction intervals incorporated the normal increase in interindividual growth variability (heteroscedasticity of variance) with age for more biologically accurate predictions. © 2018 American Academy of Forensic Sciences.

  6. External validation of a simple clinical tool used to predict falls in people with Parkinson disease

    PubMed Central

    Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.

    2015-01-01

    Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time-consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76–0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412

  7. External validation of a simple clinical tool used to predict falls in people with Parkinson disease.

    PubMed

    Duncan, Ryan P; Cavanaugh, James T; Earhart, Gammon M; Ellis, Terry D; Ford, Matthew P; Foreman, K Bo; Leddy, Abigail L; Paul, Serene S; Canning, Colleen G; Thackeray, Anne; Dibble, Leland E

    2015-08-01

    Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. Copyright © 2015 Elsevier Ltd. All rights reserved.
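    A minimal sketch of how such a three-item tool can be scored and its discrimination checked with an ROC AUC is given below; the subject data and the outcome model are synthetic placeholders, not the study cohort.

```python
# Minimal sketch of scoring the three-item fall-risk tool and checking its
# discrimination with an ROC AUC. The subject data are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 171
fell_last_year = rng.integers(0, 2, n)
freezing       = rng.integers(0, 2, n)
slow_gait      = rng.integers(0, 2, n)           # gait velocity < 1.1 m/s

score = fell_last_year + freezing + slow_gait     # simple 0-3 risk score
# Hypothetical prospective outcome loosely tied to the score.
fell_next_6mo = (rng.random(n) < 0.15 + 0.2 * score).astype(int)

print(f"AUC = {roc_auc_score(fell_next_6mo, score):.2f}")
```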

  8. The New York State risk score for predicting in-hospital/30-day mortality following percutaneous coronary intervention.

    PubMed

    Hannan, Edward L; Farrell, Louise Szypulski; Walford, Gary; Jacobs, Alice K; Berger, Peter B; Holmes, David R; Stamato, Nicholas J; Sharma, Samin; King, Spencer B

    2013-06-01

    This study sought to develop a percutaneous coronary intervention (PCI) risk score for in-hospital/30-day mortality. Risk scores are simplified linear scores that provide clinicians with quick estimates of patients' short-term mortality rates for informed consent and to determine the appropriate intervention. Earlier PCI risk scores were based on in-hospital mortality. However, for PCI, a substantial percentage of patients die within 30 days of the procedure after discharge. New York's Percutaneous Coronary Interventions Reporting System was used to develop an in-hospital/30-day logistic regression model for patients undergoing PCI in 2010, and this model was converted into a simple linear risk score that estimates mortality rates. The score was validated by applying it to 2009 New York PCI data. Subsequent analyses evaluated the ability of the score to predict complications and length of stay. A total of 54,223 patients were used to develop the risk score. There are 11 risk factors that make up the score, with risk factor scores ranging from 1 to 9, and the highest total score is 34. The score was validated based on patients undergoing PCI in the previous year, and accurately predicted mortality for all patients as well as patients who recently suffered a myocardial infarction (MI). The PCI risk score developed here enables clinicians to estimate in-hospital/30-day mortality very quickly and quite accurately. It accurately predicts mortality for patients undergoing PCI in the previous year and for MI patients, and is also moderately related to perioperative complications and length of stay. Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
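    The sketch below illustrates the general shape of a point-based risk score of this kind: each risk factor contributes a small number of points and the total maps to an estimated mortality rate. The factors, point values, and mortality table are invented for illustration and are not the published New York score.

```python
# Minimal sketch of a point-based risk score: risk factors contribute points
# and the total maps to an estimated in-hospital/30-day mortality rate.
# Factors, points, and the mapping are illustrative assumptions only.
POINTS = {
    "age_over_80":             5,
    "shock_on_admission":      9,
    "ejection_fraction_lt_30": 4,
    "renal_failure":           3,
    "recent_MI":               4,
}

def estimated_mortality(total_points: int) -> float:
    """Hypothetical lookup from total points to estimated mortality (%)."""
    table = [(0, 0.2), (5, 0.8), (10, 2.5), (15, 7.0), (20, 18.0)]
    rate = table[0][1]
    for threshold, r in table:
        if total_points >= threshold:
            rate = r
    return rate

patient = {"age_over_80": True, "renal_failure": True}
total = sum(pts for factor, pts in POINTS.items() if patient.get(factor, False))
print(f"total score = {total}, estimated mortality ~ {estimated_mortality(total)}%")
```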

  9. Atmospheric dispersion prediction and source estimation of hazardous gas using artificial neural network, particle swarm optimization and expectation maximization

    NASA Astrophysics Data System (ADS)

    Qiu, Sihang; Chen, Bin; Wang, Rongxiao; Zhu, Zhengqiu; Wang, Yuan; Qiu, Xiaogang

    2018-04-01

    Hazardous gas leak accidents pose a potential threat to human safety. Predicting atmospheric dispersion and estimating its source are becoming increasingly important in emergency management. Current dispersion prediction and source estimation models cannot satisfy the requirements of emergency management because they do not offer high efficiency and high accuracy at the same time. In this paper, we develop a fast and accurate dispersion prediction and source estimation method based on an artificial neural network (ANN), particle swarm optimization (PSO), and expectation maximization (EM). The method uses a large number of pre-determined scenarios to train the ANN for dispersion prediction, so that the ANN can predict concentration distributions accurately and efficiently. PSO and EM are applied to estimate the source parameters, which effectively accelerates convergence. The method is verified against the Indianapolis field study with an SF6 release source. The results demonstrate the effectiveness of the method.
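    A minimal sketch of the PSO portion of such a workflow is given below: particles search over source strength and position to minimize the misfit between observed concentrations and a forward model. Here a crude analytic decay law stands in for the trained ANN surrogate; the formula, sensor layout, and "observations" are assumptions for illustration.

```python
# Minimal sketch of particle swarm optimization (PSO) for source estimation:
# particles search over (source strength Q, source x-position) to minimize
# the misfit between observations and a crude forward model. Everything
# below is illustrative, not the study's setup.
import numpy as np

rng = np.random.default_rng(3)

def plume(Q, x0, xs):
    """Very simplified ground-level centerline concentration at sensors xs."""
    d = np.maximum(xs - x0, 1e-3)            # downwind distance
    return Q / (d * np.sqrt(d))              # arbitrary decay law for the sketch

sensors = np.linspace(100, 1000, 8)
true_Q, true_x0 = 50.0, 20.0
obs = plume(true_Q, true_x0, sensors) * (1 + 0.05 * rng.standard_normal(8))

def cost(params):
    Q, x0 = params
    return np.sum((plume(Q, x0, sensors) - obs) ** 2)

# Plain global-best PSO.
n_particles, n_iter = 30, 200
lo, hi = np.array([1.0, 0.0]), np.array([200.0, 90.0])
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(f"estimated Q ~ {gbest[0]:.1f}, x0 ~ {gbest[1]:.1f} (true: 50, 20)")
```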

  10. Interpreting Disruption Prediction Models to Improve Plasma Control

    NASA Astrophysics Data System (ADS)

    Parsons, Matthew

    2017-10-01

    In order for the tokamak to be a feasible design for a fusion reactor, it is necessary to minimize damage to the machine caused by plasma disruptions. Accurately predicting disruptions is a critical capability for triggering any mitigative actions, and a modest amount of attention has been given to efforts that employ machine learning techniques to make these predictions. By monitoring diagnostic signals during a discharge, such predictive models look for signs that the plasma is about to disrupt. Typically these predictive models are interpreted simply to give a `yes' or `no' response as to whether a disruption is approaching. However, it is possible to extract further information from these models to indicate which input signals are more strongly correlated with the plasma approaching a disruption. If highly accurate predictive models can be developed, this information could be used in plasma control schemes to make better decisions about disruption avoidance. This work was supported by a Grant from the 2016-2017 Fulbright U.S. Student Program, administered by the Franco-American Fulbright Commission in France.

  11. A hybrid intelligent method for three-dimensional short-term prediction of dissolved oxygen content in aquaculture.

    PubMed

    Chen, Yingyi; Yu, Huihui; Cheng, Yanjun; Cheng, Qianqian; Li, Daoliang

    2018-01-01

    A precise predictive model is important for obtaining a clear understanding of the changes in dissolved oxygen content in crab ponds. Highly accurate interval forecasting of dissolved oxygen content is fundamental to reducing risk, and three-dimensional prediction can provide more accurate results and overall guidance. In this study, a hybrid three-dimensional (3D) dissolved oxygen content prediction model based on a radial basis function (RBF) neural network, K-means, and subtractive clustering was developed and named the subtractive clustering (SC)-K-means-RBF model. In this modeling process, K-means and subtractive clustering methods were employed to determine the hyperparameters required by the RBF neural network model. Comparison of the predictions of different traditional models validated the effectiveness and accuracy of the proposed hybrid SC-K-means-RBF model for three-dimensional prediction of dissolved oxygen content. Consequently, the proposed model can effectively display the three-dimensional distribution of dissolved oxygen content and serve as a guide for feeding and future studies.
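    The sketch below shows the core idea of an RBF network whose centers come from K-means: cluster the inputs, build Gaussian basis functions around the cluster centers, and fit the output weights linearly. Subtractive clustering is omitted, and the data, widths, and regularization are illustrative assumptions.

```python
# Minimal sketch of an RBF network with K-means-derived centers. Subtractive
# clustering is omitted; data and parameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
X = rng.uniform(0, 24, (300, 1))                          # hour of day, hypothetical input
y = 8 + 2 * np.sin(2 * np.pi * X[:, 0] / 24) + rng.normal(0, 0.3, 300)  # DO (mg/L)

# 1) Pick RBF centers with K-means.
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)
centers = km.cluster_centers_
width = 2.0                                               # assumed common RBF width

# 2) Build Gaussian RBF features and fit the output weights linearly.
def rbf_features(X):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

model = Ridge(alpha=1e-3).fit(rbf_features(X), y)
X_new = np.array([[6.0], [12.0], [18.0]])
print(model.predict(rbf_features(X_new)))                 # predicted DO at 6h, 12h, 18h
```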

  12. Gene expression models for prediction of longitudinal dispersion coefficient in streams

    NASA Astrophysics Data System (ADS)

    Sattar, Ahmed M. A.; Gharabaghi, Bahram

    2015-05-01

    Longitudinal dispersion is the key hydrologic process that governs transport of pollutants in natural streams. It is critical for spill action centers to be able to predict the pollutant travel time and break-through curves accurately following accidental spills in urban streams. This study presents a novel gene expression model for longitudinal dispersion developed using 150 published data sets of geometric and hydraulic parameters in natural streams in the United States, Canada, Europe, and New Zealand. The training and testing of the model were accomplished using randomly-selected 67% (100 data sets) and 33% (50 data sets) of the data sets, respectively. Gene expression programming (GEP) is used to develop empirical relations between the longitudinal dispersion coefficient and various control variables, including the Froude number which reflects the effect of reach slope, aspect ratio, and the bed material roughness on the dispersion coefficient. Two GEP models have been developed, and the prediction uncertainties of the developed GEP models are quantified and compared with those of existing models, showing improved prediction accuracy in favor of GEP models. Finally, a parametric analysis is performed for further verification of the developed GEP models. The main reason for the higher accuracy of the GEP models compared to the existing regression models is that exponents of the key variables (aspect ratio and bed material roughness) are not constants but a function of the Froude number. The proposed relations are both simple and accurate and can be effectively used to predict the longitudinal dispersion coefficients in natural streams.

  13. Toward accurate prediction of pKa values for internal protein residues: the importance of conformational relaxation and desolvation energy.

    PubMed

    Wallace, Jason A; Wang, Yuhang; Shi, Chuanyin; Pastoor, Kevin J; Nguyen, Bao-Linh; Xia, Kai; Shen, Jana K

    2011-12-01

    Proton uptake or release controls many important biological processes, such as energy transduction, virus replication, and catalysis. Accurate pKa prediction informs about proton pathways, thereby revealing detailed acid-base mechanisms. Physics-based methods in the framework of molecular dynamics simulations not only offer pKa predictions but also inform about the physical origins of pKa shifts and provide details of ionization-induced conformational relaxation and large-scale transitions. One such method is the recently developed continuous constant pH molecular dynamics (CPHMD) method, which has been shown to be an accurate and robust pKa prediction tool for naturally occurring titratable residues. To further examine the accuracy and limitations of CPHMD, we blindly predicted the pKa values for 87 titratable residues introduced in various hydrophobic regions of staphylococcal nuclease and variants. The predictions gave a root-mean-square deviation of 1.69 pK units from experiment, and only two pKa values had errors greater than 3.5 pK units. Analysis of the conformational fluctuation of titrating side-chains in the context of the errors of calculated pKa values indicates that explicit treatment of conformational flexibility and the associated dielectric relaxation gives CPHMD a distinct advantage. Analysis of the sources of errors suggests that more accurate pKa predictions can be obtained for the most deeply buried residues by improving the accuracy in calculating desolvation energies. Furthermore, it is found that the generalized Born implicit-solvent model underlying the current CPHMD implementation slightly distorts the local conformational environment such that the inclusion of an explicit-solvent representation may offer improvement of accuracy. Copyright © 2011 Wiley-Liss, Inc.

  14. Rapid analysis of composition and reactivity in cellulosic biomass feedstocks with near-infrared spectroscopy

    DOE PAGES

    Payne, Courtney E.; Wolfrum, Edward J.

    2015-03-12

    Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. Here are the results: We present individual model statistics to demonstrate model performance and validation samples to more accurately measure predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. In conclusion, it is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
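    A minimal sketch of a PLS calibration of this type, using scikit-learn's PLSRegression, is shown below. The synthetic "spectra" and composition values are placeholders; a real calibration would use measured NIR absorbances and reference wet-chemistry values.

```python
# Minimal sketch of a PLS calibration relating NIR spectra to composition.
# Spectra and composition values are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 120, 200
spectra = rng.normal(size=(n_samples, n_wavelengths))
# Hypothetical composition (wt%): glucan, xylan, lignin driven by a few bands.
comp = (spectra[:, [10, 50, 120]] @ np.array([[3.0, 1.0, 0.5],
                                              [1.0, 2.0, 0.5],
                                              [0.5, 0.5, 2.0]])
        + np.array([35.0, 22.0, 18.0])
        + rng.normal(0, 0.5, (n_samples, 3)))

pls = PLSRegression(n_components=8)
pred = cross_val_predict(pls, spectra, comp, cv=5)
rmsep = np.sqrt(((pred - comp) ** 2).mean(axis=0))
print("cross-validated RMSEP (wt%) for glucan, xylan, lignin:", rmsep.round(2))
```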

  15. Rapid analysis of composition and reactivity in cellulosic biomass feedstocks with near-infrared spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Courtney E.; Wolfrum, Edward J.

    Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. Here are the results: We present individual model statistics to demonstrate model performance and validation samples to more accurately measure predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. In conclusion, it is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.

  16. Predicting β-turns and their types using predicted backbone dihedral angles and secondary structures

    PubMed Central

    2010-01-01

    Background: β-turns are secondary structure elements usually classified as coil. Their prediction is important, because of their role in protein folding and their frequent occurrence in protein chains. Results: We have developed a novel method that predicts β-turns and their types using information from multiple sequence alignments, predicted secondary structures and, for the first time, predicted dihedral angles. Our method uses support vector machines, a supervised classification technique, and is trained and tested on three established datasets of 426, 547 and 823 protein chains. We achieve a Matthews correlation coefficient of up to 0.49, when predicting the location of β-turns, the highest reported value to date. Moreover, the additional dihedral information improves the prediction of β-turn types I, II, IV, VIII and "non-specific", achieving correlation coefficients up to 0.39, 0.33, 0.27, 0.14 and 0.38, respectively. Our results are more accurate than other methods. Conclusions: We have created an accurate predictor of β-turns and their types. Our method, called DEBT, is available online at http://comp.chem.nottingham.ac.uk/debt/. PMID:20673368

  17. Predicting beta-turns and their types using predicted backbone dihedral angles and secondary structures.

    PubMed

    Kountouris, Petros; Hirst, Jonathan D

    2010-07-31

    Beta-turns are secondary structure elements usually classified as coil. Their prediction is important, because of their role in protein folding and their frequent occurrence in protein chains. We have developed a novel method that predicts beta-turns and their types using information from multiple sequence alignments, predicted secondary structures and, for the first time, predicted dihedral angles. Our method uses support vector machines, a supervised classification technique, and is trained and tested on three established datasets of 426, 547 and 823 protein chains. We achieve a Matthews correlation coefficient of up to 0.49, when predicting the location of beta-turns, the highest reported value to date. Moreover, the additional dihedral information improves the prediction of beta-turn types I, II, IV, VIII and "non-specific", achieving correlation coefficients up to 0.39, 0.33, 0.27, 0.14 and 0.38, respectively. Our results are more accurate than other methods. We have created an accurate predictor of beta-turns and their types. Our method, called DEBT, is available online at http://comp.chem.nottingham.ac.uk/debt/.
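    The sketch below shows an SVM classifier evaluated with the Matthews correlation coefficient, the headline metric of this predictor. The synthetic features stand in for the sequence profiles, predicted secondary structures, and dihedral angles that DEBT actually uses.

```python
# Minimal sketch of an SVM classifier evaluated with the Matthews correlation
# coefficient. Features and labels are synthetic stand-ins for the real inputs.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(6)
n, d = 1000, 20
X = rng.normal(size=(n, d))
y = (X[:, :3].sum(axis=1) + 0.5 * rng.standard_normal(n) > 0).astype(int)  # turn / not turn

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print(f"MCC = {matthews_corrcoef(y_te, clf.predict(X_te)):.2f}")
```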

  18. DEVELOPING SITE-SPECIFIC MODELS FOR FORECASTING BACTERIA LEVELS AT COASTAL BEACHES

    EPA Science Inventory

    The U.S.Beaches Environmental Assessment and Coastal Health Act of 2000 authorizes studies of pathogen indicators in coastal recreation waters that develop appropriate, accurate, expeditious, and cost-effective methods (including predictive models) for quantifying pathogens in co...

  19. Distinguishing between the Permeability Relationships with Absorption and Metabolism To Improve BCS and BDDCS Predictions in Early Drug Discovery

    PubMed Central

    2015-01-01

    The biopharmaceutics classification system (BCS) and biopharmaceutics drug distribution classification system (BDDCS) are complementary classification systems that can improve, simplify, and accelerate drug discovery, development, and regulatory processes. Drug permeability has been widely accepted as a screening tool for determining intestinal absorption via the BCS during the drug development and regulatory approval processes. Currently, predicting clinically significant drug interactions during drug development is a known challenge for industry and regulatory agencies. The BDDCS, a modification of BCS that utilizes drug metabolism instead of intestinal permeability, predicts drug disposition and potential drug–drug interactions in the intestine, the liver, and most recently the brain. Although correlations between BCS and BDDCS have been observed with drug permeability rates, discrepancies have been noted in drug classifications between the two systems utilizing different permeability models, which are accepted as surrogate models for demonstrating human intestinal permeability by the FDA. Here, we recommend the most applicable permeability models for improving the prediction of BCS and BDDCS classifications. We demonstrate that the passive transcellular permeability rate, characterized by means of permeability models that are deficient in transporter expression and paracellular junctions (e.g., PAMPA and Caco-2), will most accurately predict BDDCS metabolism. These systems will inaccurately predict BCS classifications for drugs that particularly are substrates of highly expressed intestinal transporters. Moreover, in this latter case, a system more representative of complete human intestinal permeability is needed to accurately predict BCS absorption. PMID:24628254

  20. Distinguishing between the permeability relationships with absorption and metabolism to improve BCS and BDDCS predictions in early drug discovery.

    PubMed

    Larregieu, Caroline A; Benet, Leslie Z

    2014-04-07

    The biopharmaceutics classification system (BCS) and biopharmaceutics drug distribution classification system (BDDCS) are complementary classification systems that can improve, simplify, and accelerate drug discovery, development, and regulatory processes. Drug permeability has been widely accepted as a screening tool for determining intestinal absorption via the BCS during the drug development and regulatory approval processes. Currently, predicting clinically significant drug interactions during drug development is a known challenge for industry and regulatory agencies. The BDDCS, a modification of BCS that utilizes drug metabolism instead of intestinal permeability, predicts drug disposition and potential drug-drug interactions in the intestine, the liver, and most recently the brain. Although correlations between BCS and BDDCS have been observed with drug permeability rates, discrepancies have been noted in drug classifications between the two systems utilizing different permeability models, which are accepted as surrogate models for demonstrating human intestinal permeability by the FDA. Here, we recommend the most applicable permeability models for improving the prediction of BCS and BDDCS classifications. We demonstrate that the passive transcellular permeability rate, characterized by means of permeability models that are deficient in transporter expression and paracellular junctions (e.g., PAMPA and Caco-2), will most accurately predict BDDCS metabolism. These systems will inaccurately predict BCS classifications for drugs that particularly are substrates of highly expressed intestinal transporters. Moreover, in this latter case, a system more representative of complete human intestinal permeability is needed to accurately predict BCS absorption.

  1. Fuel consumption models for pine flatwoods fuel types in the southeastern United States

    Treesearch

    Clinton S. Wright

    2013-01-01

    Modeling fire effects, including terrestrial and atmospheric carbon fluxes and pollutant emissions during wildland fires, requires accurate predictions of fuel consumption. Empirical models were developed for predicting fuel consumption from fuel and environmental measurements on a series of operational prescribed fires in pine flatwoods ecosystems in the southeastern...

  2. Identification of drug metabolites in human plasma or serum integrating metabolite prediction, LC-HRMS and untargeted data processing.

    PubMed

    Jacobs, Peter L; Ridder, Lars; Ruijken, Marco; Rosing, Hilde; Jager, Nynke Gl; Beijnen, Jos H; Bas, Richard R; van Dongen, William D

    2013-09-01

    Comprehensive identification of human drug metabolites in first-in-man studies is crucial to avoid delays in later stages of drug development. We developed an efficient workflow for systematic identification of human metabolites in plasma or serum that combines metabolite prediction, high-resolution accurate mass LC-MS and MS vendor independent data processing. Retrospective evaluation of predictions for 14 (14)C-ADME studies published in the period 2007-January 2012 indicates that on average 90% of the major metabolites in human plasma can be identified by searching for accurate masses of predicted metabolites. Furthermore, the workflow can identify unexpected metabolites in the same processing run, by differential analysis of samples of drug-dosed subjects and (placebo-dosed, pre-dose or otherwise blank) control samples. To demonstrate the utility of the workflow we applied it to identify tamoxifen metabolites in serum of a breast cancer patient treated with tamoxifen. Previously published metabolites were confirmed in this study and additional metabolites were identified, two of which are discussed to illustrate the advantages of the workflow.
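    As a rough illustration of the accurate-mass search step in such a workflow, the sketch below matches measured LC-HRMS peak masses against the monoisotopic masses of predicted metabolites within a ppm tolerance; the masses, metabolite names, and the 5 ppm window are hypothetical placeholders rather than values from the study.

```python
# Hypothetical accurate-mass matching of predicted metabolites (all values illustrative).
predicted_metabolites = {
    "parent drug":          371.2274,
    "hydroxylation (+O)":   387.2223,
    "demethylation (-CH2)": 357.2117,
}
measured_peaks = [387.2220, 420.1501, 357.2125]  # m/z values from an untargeted run
TOL_PPM = 5.0                                    # assumed mass tolerance

for mz in measured_peaks:
    for name, mass in predicted_metabolites.items():
        error_ppm = abs(mz - mass) / mass * 1e6
        if error_ppm <= TOL_PPM:
            print(f"peak {mz:.4f} matches {name} ({error_ppm:.1f} ppm)")
```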

  3. Limb-Enhancer Genie: An accessible resource of accurate enhancer predictions in the developing limb

    DOE PAGES

    Monti, Remo; Barozzi, Iros; Osterwalder, Marco; ...

    2017-08-21

    Epigenomic mapping of enhancer-associated chromatin modifications facilitates the genome-wide discovery of tissue-specific enhancers in vivo. However, reliance on single chromatin marks leads to high rates of false-positive predictions. More sophisticated, integrative methods have been described, but commonly suffer from limited accessibility to the resulting predictions and reduced biological interpretability. Here we present the Limb-Enhancer Genie (LEG), a collection of highly accurate, genome-wide predictions of enhancers in the developing limb, available through a user-friendly online interface. We predict limb enhancers using a combination of > 50 published limb-specific datasets and clusters of evolutionarily conserved transcription factor binding sites, taking advantage of the patterns observed at previously in vivo validated elements. By combining different statistical models, our approach outperforms current state-of-the-art methods and provides interpretable measures of feature importance. Our results indicate that including a previously unappreciated score that quantifies tissue-specific nuclease accessibility significantly improves prediction performance. We demonstrate the utility of our approach through in vivo validation of newly predicted elements. Moreover, we describe general features that can guide the type of datasets to include when predicting tissue-specific enhancers genome-wide, while providing an accessible resource to the general biological community and facilitating the functional interpretation of genetic studies of limb malformations.

  4. Biomarker Surrogates Do Not Accurately Predict Sputum Eosinophils and Neutrophils in Asthma

    PubMed Central

    Hastie, Annette T.; Moore, Wendy C.; Li, Huashi; Rector, Brian M.; Ortega, Victor E.; Pascual, Rodolfo M.; Peters, Stephen P.; Meyers, Deborah A.; Bleecker, Eugene R.

    2013-01-01

    Background Sputum eosinophils (Eos) are a strong predictor of airway inflammation and exacerbations, and aid asthma management, whereas sputum neutrophils (Neu) indicate a different severe asthma phenotype, potentially less responsive to TH2-targeted therapy. Variables such as blood Eos, total IgE, fractional exhaled nitric oxide (FeNO) or FEV1% predicted may predict airway Eos, while age, FEV1% predicted, or blood Neu may predict sputum Neu. Availability and ease of measurement are useful characteristics, but accuracy in predicting airway Eos and Neu, individually or combined, is not established. Objectives To determine whether blood Eos, FeNO, and IgE accurately predict sputum eosinophils, and age, FEV1% predicted, and blood Neu accurately predict sputum neutrophils (Neu). Methods Subjects in the Wake Forest Severe Asthma Research Program (N=328) were characterized by blood and sputum cells, healthcare utilization, lung function, FeNO, and IgE. Multiple analytical techniques were utilized. Results Despite significant association with sputum Eos, blood Eos, FeNO and total IgE did not accurately predict sputum Eos, and combinations of these variables failed to improve prediction. Age, FEV1% predicted and blood Neu were similarly unsatisfactory for prediction of sputum Neu. Factor analysis and stepwise selection found FeNO, IgE and FEV1% predicted, but not blood Eos, correctly predicted 69% of sputum Eos

  5. Identifying and Tracking Pedestrians Based on Sensor Fusion and Motion Stability Predictions

    PubMed Central

    Musleh, Basam; García, Fernando; Otamendi, Javier; Armingol, José Mª; de la Escalera, Arturo

    2010-01-01

    The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle. PMID:22163639

  6. Identifying and tracking pedestrians based on sensor fusion and motion stability predictions.

    PubMed

    Musleh, Basam; García, Fernando; Otamendi, Javier; Armingol, José Maria; de la Escalera, Arturo

    2010-01-01

    The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle.
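    The "statistical validation gates and confidence regions" mentioned above are commonly implemented as a chi-square gate on the Mahalanobis distance between a new detection and the track's predicted position; the sketch below shows that idea with assumed positions, covariance, and gate probability, not the authors' parameters.

```python
# Chi-square validation gate for associating detections with a tracked pedestrian
# (illustrative numbers only).
import numpy as np
from scipy.stats import chi2

predicted_position = np.array([10.0, 2.0])        # predicted pedestrian position (m)
innovation_cov = np.array([[0.25, 0.0],           # assumed innovation covariance
                           [0.0, 0.25]])
gate_threshold = chi2.ppf(0.99, df=2)             # 99% confidence region in 2-D

def in_gate(detection: np.ndarray) -> bool:
    diff = detection - predicted_position
    mahalanobis_sq = diff @ np.linalg.inv(innovation_cov) @ diff
    return bool(mahalanobis_sq <= gate_threshold)

print(in_gate(np.array([10.4, 2.3])))   # True: consistent with the prediction
print(in_gate(np.array([13.0, 5.0])))   # False: rejected by the validation gate
```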

  7. Accurate prediction of energy expenditure using a shoe-based activity monitor.

    PubMed

    Sazonova, Nadezhda; Browning, Raymond C; Sazonov, Edward

    2011-07-01

    The aim of this study was to develop and validate a method for predicting energy expenditure (EE) using a footwear-based system with integrated accelerometer and pressure sensors. We developed a footwear-based device with an embedded accelerometer and insole pressure sensors for the prediction of EE. The data from the device can be used to perform accurate recognition of major postures and activities and to estimate EE using the acceleration, pressure, and posture/activity classification information in a branched algorithm without the need for individual calibration. We measured EE via indirect calorimetry as 16 adults (body mass index=19-39 kg·m-2) performed various low- to moderate-intensity activities and compared measured versus predicted EE using several models based on the acceleration and pressure signals. Inclusion of pressure data resulted in better accuracy of EE prediction during static postures such as sitting and standing. The activity-based branched model that included predictors from accelerometer and pressure sensors (BACC-PS) achieved the lowest error (e.g., root mean squared error (RMSE)=0.69 METs) compared with the accelerometer-only-based branched model BACC (RMSE=0.77 METs) and nonbranched model (RMSE=0.94-0.99 METs). Comparison of EE prediction models using data from both legs versus models using data from a single leg indicates that only one shoe needs to be equipped with sensors. These results suggest that foot acceleration combined with insole pressure measurement, when used in an activity-specific branched model, can accurately estimate the EE associated with common daily postures and activities. The accuracy and unobtrusiveness of a footwear-based device may make it an effective physical activity monitoring tool.
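    A branched model of the general kind described can be sketched as a posture/activity classifier followed by activity-specific regressions from sensor features to METs; the thresholds and coefficients below are invented for illustration and are not the calibration reported in the study.

```python
# Toy branched energy-expenditure estimator (all thresholds and coefficients are made up).
import numpy as np

BRANCH_COEF = {            # (intercept METs, slope per acceleration count)
    "sit":   (1.0, 0.000),
    "stand": (1.2, 0.000),
    "walk":  (1.5, 0.004),
}

def classify_posture(mean_pressure_kpa: float, accel_counts: float) -> str:
    """Stand-in for the posture/activity classifier driven by insole pressure and acceleration."""
    if accel_counts > 500:
        return "walk"
    return "stand" if mean_pressure_kpa > 20 else "sit"

def predict_mets(mean_pressure_kpa: float, accel_counts: float) -> float:
    intercept, slope = BRANCH_COEF[classify_posture(mean_pressure_kpa, accel_counts)]
    return intercept + slope * accel_counts

measured = np.array([1.0, 1.3, 3.1])    # indirect calorimetry reference (METs)
predicted = np.array([predict_mets(5, 50), predict_mets(30, 80), predict_mets(25, 900)])
print("RMSE (METs):", float(np.sqrt(np.mean((predicted - measured) ** 2))))
```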

  8. Predicting College Students' First Year Success: Should Soft Skills Be Taken into Consideration to More Accurately Predict the Academic Achievement of College Freshmen?

    ERIC Educational Resources Information Center

    Powell, Erica Dion

    2013-01-01

    This study presents a survey developed to measure the skills of entering college freshmen in the areas of responsibility, motivation, study habits, literacy, and stress management, and explores the predictive power of this survey as a measure of academic performance during the first semester of college. The survey was completed by 334 incoming…

  9. Computational modeling of human oral bioavailability: what will be next?

    PubMed

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability has opened new avenues for developing promising tools for oral bioavailability prediction.

  10. Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2001-01-01

    A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.

  11. Effect of accuracy of wind power prediction on power system operator

    NASA Technical Reports Server (NTRS)

    Schlueter, R. A.; Sigari, G.; Costi, T.

    1985-01-01

    This research project proposed a modified unit commitment that schedules connection and disconnection of generating units in response to load. A modified generation control is also proposed that controls steam units under automatic generation control, fast responding diesels, gas turbines and hydro units under a feedforward control, and wind turbine array output under a closed loop array control. This modified generation control and unit commitment require prediction of trend wind power variation one hour ahead and the prediction of error in this trend wind power prediction one half hour ahead. An improved meter for predicting trend wind speed variation is developed. Methods for accurately simulating the wind array power from a limited number of wind speed prediction records were developed. Finally, two methods for predicting the error in the trend wind power prediction were developed. This research provides a foundation for testing and evaluating the modified unit commitment and generation control that was developed to maintain operating reliability at a greatly reduced overall production cost for utilities with wind generation capacity.

  12. Towards Assessing the Human Trajectory Planning Horizon

    PubMed Central

    Nitsch, Verena; Meinzer, Dominik; Wollherr, Dirk

    2016-01-01

    Mobile robots are envisioned to cooperate closely with humans and to integrate seamlessly into a shared environment. For locomotion, these environments resemble traversable areas which are shared between multiple agents like humans and robots. The seamless integration of mobile robots into these environments requires accurate predictions of human locomotion. This work considers optimal control and model predictive control approaches for accurate trajectory prediction and proposes to integrate aspects of human behavior to improve their performance. Recently developed models are not able to reproduce accurately trajectories that result from sudden avoidance maneuvers. Particularly, the human locomotion behavior when handling disturbances from other agents poses a problem. The goal of this work is to investigate whether humans alter their trajectory planning horizon, in order to resolve abruptly emerging collision situations. By modeling humans as model predictive controllers, the influence of the planning horizon is investigated in simulations. Based on these results, an experiment is designed to identify, whether humans initiate a change in their locomotion planning behavior while moving in a complex environment. The results support the hypothesis, that humans employ a shorter planning horizon to avoid collisions that are triggered by unexpected disturbances. Observations presented in this work are expected to further improve the generalizability and accuracy of prediction methods based on dynamic models. PMID:27936015

  13. Towards Assessing the Human Trajectory Planning Horizon.

    PubMed

    Carton, Daniel; Nitsch, Verena; Meinzer, Dominik; Wollherr, Dirk

    2016-01-01

    Mobile robots are envisioned to cooperate closely with humans and to integrate seamlessly into a shared environment. For locomotion, these environments resemble traversable areas which are shared between multiple agents like humans and robots. The seamless integration of mobile robots into these environments requires accurate predictions of human locomotion. This work considers optimal control and model predictive control approaches for accurate trajectory prediction and proposes to integrate aspects of human behavior to improve their performance. Recently developed models are not able to reproduce accurately trajectories that result from sudden avoidance maneuvers. Particularly, the human locomotion behavior when handling disturbances from other agents poses a problem. The goal of this work is to investigate whether humans alter their trajectory planning horizon, in order to resolve abruptly emerging collision situations. By modeling humans as model predictive controllers, the influence of the planning horizon is investigated in simulations. Based on these results, an experiment is designed to identify, whether humans initiate a change in their locomotion planning behavior while moving in a complex environment. The results support the hypothesis, that humans employ a shorter planning horizon to avoid collisions that are triggered by unexpected disturbances. Observations presented in this work are expected to further improve the generalizability and accuracy of prediction methods based on dynamic models.

  14. Advanced Earth-to-orbit propulsion technology program overview: Impact of civil space technology initiative

    NASA Technical Reports Server (NTRS)

    Stephenson, Frank W., Jr.

    1988-01-01

    The NASA Earth-to-Orbit (ETO) Propulsion Technology Program is dedicated to advancing rocket engine technologies for the development of fully reusable engine systems that will enable space transportation systems to achieve low cost, routine access to space. The program addresses technology advancements in the areas of engine life extension/prediction, performance enhancements, reduced ground operations costs, and in-flight fault tolerant engine operations. The primary objective is to acquire increased knowledge and understanding of rocket engine chemical and physical processes in order to evolve more realistic analytical simulations of engine internal environments, to derive more accurate predictions of steady and unsteady loads, and using improved structural analyses, to more accurately predict component life and performance, and finally to identify and verify more durable advanced design concepts. In addition, efforts were focused on engine diagnostic needs and advances that would allow integrated health monitoring systems to be developed for enhanced maintainability, automated servicing, inspection, and checkout, and ultimately, in-flight fault tolerant engine operations.

  15. Entropies of negative incomes, Pareto-distributed loss, and financial crises.

    PubMed

    Gao, Jianbo; Hu, Jing; Mao, Xiang; Zhou, Mi; Gurbaxani, Brian; Lin, Johnny

    2011-01-01

    Health monitoring of world economy is an important issue, especially in a time of profound economic difficulty world-wide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy and distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19 year period from 1990-2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. The new metrics also provide evidence of phase transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions transition from a thin-tailed exponential distribution to the higher entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis initiating sector of the economy to other sectors.

  16. The solidification velocity of nickel and titanium alloys

    NASA Astrophysics Data System (ADS)

    Altgilbers, Alex Sho

    2002-09-01

    The solidification velocity of several Ni-Ti, Ni-Sn, Ni-Si, Ti-Al and Ti-Ni alloys was measured as a function of undercooling. From these results, a model for alloy solidification was developed that can be used to predict the solidification velocity as a function of undercooling more accurately. During this investigation a phenomenon was observed in the solidification velocity that is a direct result of the addition of the various alloying elements to nickel and titanium. The additions of the alloying elements resulted in an additional solidification velocity plateau at intermediate undercoolings. Past work has shown that a solidification velocity plateau at high undercoolings can be attributed to residual oxygen. It is shown that a logistic growth model is a more accurate model for predicting the solidification of alloys. Additionally, a numerical model is developed from a simple description of the effect of solute on the solidification velocity, which utilizes a Boltzmann logistic function to predict the plateaus that occur at intermediate undercoolings.
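    The Boltzmann logistic function referred to above can be written as a sigmoidal transition between two velocity plateaus, v(dT) = v_high + (v_low - v_high) / (1 + exp((dT - dT0)/w)); the sketch below assumes this generic parameterization with illustrative values rather than fitted alloy data.

```python
# Generic Boltzmann-type logistic model of solidification velocity vs. undercooling
# (parameter values are illustrative, not fitted to the alloys studied).
import numpy as np

def boltzmann_velocity(dT, v_low, v_high, dT0, width):
    """Sigmoidal transition from a low-velocity to a high-velocity plateau."""
    return v_high + (v_low - v_high) / (1.0 + np.exp((dT - dT0) / width))

undercooling = np.linspace(0, 300, 7)   # K
print(boltzmann_velocity(undercooling, v_low=0.5, v_high=30.0, dT0=150.0, width=20.0))
```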

  17. A reexamination of age-related variation in body weight and morphometry of Maryland nutria

    USGS Publications Warehouse

    Sherfy, M.H.; Mollett, T.A.; McGowan, K.R.; Daugherty, S.L.

    2006-01-01

    Age-related variation in morphometry has been documented for many species. Knowledge of growth patterns can be useful for modeling energetics, detecting physiological influences on populations, and predicting age. These benefits have shown value in understanding population dynamics of invasive species, particularly in developing efficient control and eradication programs. However, development and evaluation of descriptive and predictive models is a critical initial step in this process. Accordingly, we used data from necropsies of 1,544 nutria (Myocastor coypus) collected in Maryland, USA, to evaluate the accuracy of previously published models for prediction of nutria age from body weight. Published models underestimated body weights of our animals, especially for ages <3. We used cross-validation procedures to develop and evaluate models for describing nutria growth patterns and for predicting nutria age. We derived models from a randomly selected model-building data set (n = 192-193 M, 217-222 F) and evaluated them with the remaining animals (n = 487-488 M, 642-647 F). We used nonlinear regression to develop Gompertz growth-curve models relating morphometric variables to age. Predicted values of morphometric variables fell within the 95% confidence limits of their true values for most age classes. We also developed predictive models for estimating nutria age from morphometry, using linear regression of log-transformed age on morphometric variables. The evaluation data set corresponded with 95% prediction intervals from the new models. Predictive models for body weight and length provided greater accuracy and less bias than models for foot length and axillary girth. Our growth models accurately described age-related variation in nutria morphometry, and our predictive models provided accurate estimates of ages from morphometry that will be useful for live-captured individuals. Our models offer better accuracy and precision than previously published models, providing a capacity for modeling energetics and growth patterns of Maryland nutria as well as an empirical basis for determining population age structure from live-captured animals.
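    A Gompertz growth curve of the form weight(age) = A * exp(-b * exp(-k * age)) can be fitted by nonlinear least squares; the sketch below uses fabricated age/weight pairs and starting values, not the Maryland nutria data.

```python
# Fitting a Gompertz growth curve with nonlinear least squares (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def gompertz(age, A, b, k):
    return A * np.exp(-b * np.exp(-k * age))

age = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0])       # years
weight = np.array([1.2, 2.5, 4.8, 6.3, 7.1, 7.9, 8.2])     # kg, fabricated

params, _ = curve_fit(gompertz, age, weight, p0=[8.0, 2.0, 1.0])
print("A, b, k =", np.round(params, 2))
print("predicted weight at age 2.5 yr:", round(gompertz(2.5, *params), 2), "kg")
```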

  18. Machine learning bandgaps of double perovskites

    PubMed Central

    Pilania, G.; Mannodi-Kanakkithodi, A.; Uberuaga, B. P.; Ramprasad, R.; Gubernatis, J. E.; Lookman, T.

    2016-01-01

    The ability to make rapid and accurate predictions on bandgaps of double perovskites is of much practical interest for a range of applications. While quantum mechanical computations for high-fidelity bandgaps are enormously computation-time intensive and thus impractical in high throughput studies, informatics-based statistical learning approaches can be a promising alternative. Here we demonstrate a systematic feature-engineering approach and a robust learning framework for efficient and accurate predictions of electronic bandgaps of double perovskites. After evaluating a set of more than 1.2 million features, we identify lowest occupied Kohn-Sham levels and elemental electronegativities of the constituent atomic species as the most crucial and relevant predictors. The developed models are validated and tested using the best practices of data science and further analyzed to rationalize their prediction performance. PMID:26783247
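    As a hedged illustration of the overall workflow (elemental descriptors in, bandgap out), the sketch below fits a kernel ridge regression to synthetic descriptors; the feature meanings, hyperparameters, and data are placeholders and do not reproduce the published models.

```python
# Descriptor-to-bandgap regression sketch on synthetic data (not the published model).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_compounds = 120
X = rng.uniform(0.8, 4.0, size=(n_compounds, 4))   # stand-ins for electronegativities, occupied levels, etc.
bandgap_eV = 0.9 * X[:, 0] + 0.4 * X[:, 2] + rng.normal(0, 0.1, n_compounds)

model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.1)
scores = cross_val_score(model, X, bandgap_eV, cv=5, scoring="neg_root_mean_squared_error")
print(f"cross-validated RMSE: {-scores.mean():.2f} eV")
```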

  19. Male pattern baldness (image)

    MedlinePlus

    Male pattern baldness is a sex-linked characteristic that is passed from mother to child. A man can more accurately predict his chances of developing male pattern baldness by observing his mother's father than ...

  20. Accurate Binding Free Energy Predictions in Fragment Optimization.

    PubMed

    Steinbrecher, Thomas B; Dahlgren, Markus; Cappel, Daniel; Lin, Teng; Wang, Lingle; Krilov, Goran; Abel, Robert; Friesner, Richard; Sherman, Woody

    2015-11-23

    Predicting protein-ligand binding free energies is a central aim of computational structure-based drug design (SBDD)--improved accuracy in binding free energy predictions could significantly reduce costs and accelerate project timelines in lead discovery and optimization. The recent development and validation of advanced free energy calculation methods represents a major step toward this goal. Accurately predicting the relative binding free energy changes of modifications to ligands is especially valuable in the field of fragment-based drug design, since fragment screens tend to deliver initial hits of low binding affinity that require multiple rounds of synthesis to gain the requisite potency for a project. In this study, we show that a free energy perturbation protocol, FEP+, which was previously validated on drug-like lead compounds, is suitable for the calculation of relative binding strengths of fragment-sized compounds as well. We study several pharmaceutically relevant targets with a total of more than 90 fragments and find that the FEP+ methodology, which uses explicit solvent molecular dynamics and physics-based scoring with no parameters adjusted, can accurately predict relative fragment binding affinities. The calculations afford R(2)-values on average greater than 0.5 compared to experimental data and RMS errors of ca. 1.1 kcal/mol overall, demonstrating significant improvements over the docking and MM-GBSA methods tested in this work and indicating that FEP+ has the requisite predictive power to impact fragment-based affinity optimization projects.

  1. High-temperature fatigue in metals - A brief review of life prediction methods developed at the Lewis Research Center of NASA

    NASA Technical Reports Server (NTRS)

    Halford, G. R.

    1983-01-01

    The presentation focuses primarily on the progress we at NASA Lewis Research Center have made. The understanding of the phenomenological processes of high temperature fatigue of metals for the purpose of calculating lives of turbine engine hot section components is discussed. Improved understanding resulted in the development of accurate and physically correct life prediction methods such as Strain-Range partitioning for calculating creep fatigue interactions and the Double Linear Damage Rule for predicting potentially severe interactions between high and low cycle fatigue. Examples of other life prediction methods are also discussed. Previously announced in STAR as A83-12159

  2. Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space

    DOE PAGES

    Hansen, Katja; Biegler, Franziska; Ramakrishnan, Raghunathan; ...

    2015-06-04

    Simultaneously accurate and efficient prediction of molecular properties throughout chemical compound space is a critical ingredient toward rational compound design in chemical and pharmaceutical industries. Aiming toward this goal, we develop and apply a systematic hierarchy of efficient empirical methods to estimate atomization and total energies of molecules. These methods range from a simple sum over atoms, to addition of bond energies, to pairwise interatomic force fields, reaching to the more sophisticated machine learning approaches that are capable of describing collective interactions between many atoms or bonds. In the case of equilibrium molecular geometries, even simple pairwise force fields demonstrate prediction accuracy comparable to benchmark energies calculated using density functional theory with hybrid exchange-correlation functionals; however, accounting for the collective many-body interactions proves to be essential for approaching the “holy grail” of chemical accuracy of 1 kcal/mol for both equilibrium and out-of-equilibrium geometries. This remarkable accuracy is achieved by a vectorized representation of molecules (so-called Bag of Bonds model) that exhibits strong nonlocality in chemical space. The same representation allows us to predict accurate electronic properties of molecules, such as their polarizability and molecular frontier orbital energies.

  3. Machine Learning Predictions of Molecular Properties: Accurate Many-Body Potentials and Nonlocality in Chemical Space

    PubMed Central

    2015-01-01

    Simultaneously accurate and efficient prediction of molecular properties throughout chemical compound space is a critical ingredient toward rational compound design in chemical and pharmaceutical industries. Aiming toward this goal, we develop and apply a systematic hierarchy of efficient empirical methods to estimate atomization and total energies of molecules. These methods range from a simple sum over atoms, to addition of bond energies, to pairwise interatomic force fields, reaching to the more sophisticated machine learning approaches that are capable of describing collective interactions between many atoms or bonds. In the case of equilibrium molecular geometries, even simple pairwise force fields demonstrate prediction accuracy comparable to benchmark energies calculated using density functional theory with hybrid exchange-correlation functionals; however, accounting for the collective many-body interactions proves to be essential for approaching the “holy grail” of chemical accuracy of 1 kcal/mol for both equilibrium and out-of-equilibrium geometries. This remarkable accuracy is achieved by a vectorized representation of molecules (so-called Bag of Bonds model) that exhibits strong nonlocality in chemical space. In addition, the same representation allows us to predict accurate electronic properties of molecules, such as their polarizability and molecular frontier orbital energies. PMID:26113956
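    A minimal sketch of a Bag-of-Bonds-style featurization is given below, assuming simple Coulomb-type pair terms Z_i*Z_j/r_ij grouped by element pair, sorted, and zero-padded to fixed bag sizes; the toy geometry and bag sizes are illustrative only.

```python
# Bag-of-Bonds-style vectorization of a molecule (illustrative assumptions throughout).
import itertools
import numpy as np

ATOMIC_NUMBER = {"H": 1, "C": 6, "O": 8}

def bag_of_bonds(symbols, coords, bag_sizes):
    """Return a fixed-length vector of sorted Coulomb pair terms per element-pair bag."""
    bags = {key: [] for key in bag_sizes}
    for (i, si), (j, sj) in itertools.combinations(enumerate(symbols), 2):
        key = tuple(sorted((si, sj)))
        r = np.linalg.norm(np.asarray(coords[i]) - np.asarray(coords[j]))
        bags[key].append(ATOMIC_NUMBER[si] * ATOMIC_NUMBER[sj] / r)
    feature = []
    for key, size in bag_sizes.items():
        vals = sorted(bags[key], reverse=True)[:size]
        feature.extend(vals + [0.0] * (size - len(vals)))
    return np.array(feature)

# toy water-like geometry (angstrom) and bag sizes, purely for illustration
symbols = ["O", "H", "H"]
coords = [(0.0, 0.0, 0.0), (0.96, 0.0, 0.0), (-0.24, 0.93, 0.0)]
print(bag_of_bonds(symbols, coords, {("H", "O"): 4, ("H", "H"): 2, ("O", "O"): 1}))
```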

  4. All-atom 3D structure prediction of transmembrane β-barrel proteins from sequences.

    PubMed

    Hayat, Sikander; Sander, Chris; Marks, Debora S; Elofsson, Arne

    2015-04-28

    Transmembrane β-barrels (TMBs) carry out major functions in substrate transport and protein biogenesis but experimental determination of their 3D structure is challenging. Encouraged by successful de novo 3D structure prediction of globular and α-helical membrane proteins from sequence alignments alone, we developed an approach to predict the 3D structure of TMBs. The approach combines the maximum-entropy evolutionary coupling method for predicting residue contacts (EVfold) with a machine-learning approach (boctopus2) for predicting β-strands in the barrel. In a blinded test for 19 TMB proteins of known structure that have a sufficient number of diverse homologous sequences available, this combined method (EVfold_bb) predicts hydrogen-bonded residue pairs between adjacent β-strands at an accuracy of ∼70%. This accuracy is sufficient for the generation of all-atom 3D models. In the transmembrane barrel region, the average 3D structure accuracy [template-modeling (TM) score] of top-ranked models is 0.54 (ranging from 0.36 to 0.85), with a higher (44%) number of residue pairs in correct strand-strand registration than in earlier methods (18%). Although the nonbarrel regions are predicted less accurately overall, the evolutionary couplings identify some highly constrained loop residues and, for FecA protein, the barrel including the structure of a plug domain can be accurately modeled (TM score = 0.68). Lower prediction accuracy tends to be associated with insufficient sequence information and we therefore expect increasing numbers of β-barrel families to become accessible to accurate 3D structure prediction as the number of available sequences increases.

  5. Predicting survival across chronic interstitial lung disease: the ILD-GAP model.

    PubMed

    Ryerson, Christopher J; Vittinghoff, Eric; Ley, Brett; Lee, Joyce S; Mooney, Joshua J; Jones, Kirk D; Elicker, Brett M; Wolters, Paul J; Koth, Laura L; King, Talmadge E; Collard, Harold R

    2014-04-01

    Risk prediction is challenging in chronic interstitial lung disease (ILD) because of heterogeneity in disease-specific and patient-specific variables. Our objective was to determine whether mortality is accurately predicted in patients with chronic ILD using the GAP model, a clinical prediction model based on sex, age, and lung physiology, that was previously validated in patients with idiopathic pulmonary fibrosis. Patients with idiopathic pulmonary fibrosis (n=307), chronic hypersensitivity pneumonitis (n=206), connective tissue disease-associated ILD (n=281), idiopathic nonspecific interstitial pneumonia (n=45), or unclassifiable ILD (n=173) were selected from an ongoing database (N=1,012). Performance of the previously validated GAP model was compared with novel prediction models in each ILD subtype and the combined cohort. Patients with follow-up pulmonary function data were used for longitudinal model validation. The GAP model had good performance in all ILD subtypes (c-index, 74.6 in the combined cohort), which was maintained at all stages of disease severity and during follow-up evaluation. The GAP model had similar performance compared with alternative prediction models. A modified ILD-GAP Index was developed for application across all ILD subtypes to provide disease-specific survival estimates using a single risk prediction model. This was done by adding a disease subtype variable that accounted for better adjusted survival in connective tissue disease-associated ILD, chronic hypersensitivity pneumonitis, and idiopathic nonspecific interstitial pneumonia. The GAP model accurately predicts risk of death in chronic ILD. The ILD-GAP model accurately predicts mortality in major chronic ILD subtypes and at all stages of disease.
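    The discrimination statistic reported for these models is the concordance index (c-index); a small sketch of Harrell's c-index on synthetic survival data is given below (times, events, and risk scores are made up).

```python
# Harrell's concordance index on synthetic survival data.
import itertools

def c_index(times, events, risk_scores):
    """Fraction of usable pairs in which the higher-risk subject has the earlier event."""
    concordant, usable = 0.0, 0
    for i, j in itertools.combinations(range(len(times)), 2):
        first, second = (i, j) if times[i] < times[j] else (j, i)
        if not events[first]:          # pair unusable if the earlier time is censored
            continue
        usable += 1
        if risk_scores[first] > risk_scores[second]:
            concordant += 1.0
        elif risk_scores[first] == risk_scores[second]:
            concordant += 0.5
    return concordant / usable

print(c_index(times=[2, 5, 7, 10], events=[1, 1, 0, 1], risk_scores=[0.9, 0.6, 0.5, 0.2]))
```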

  6. Predictive sensor method and apparatus

    NASA Technical Reports Server (NTRS)

    Cambridge, Vivien J.; Koger, Thomas L.

    1993-01-01

    A microprocessor and electronics package employing predictive methodology was developed to accelerate the response time of slowly responding hydrogen sensors. The system developed improved sensor response time from approximately 90 seconds to 8.5 seconds. The microprocessor works in real time, providing accurate hydrogen concentrations corrected for fluctuations in sensor output resulting from changes in atmospheric pressure and temperature. Following the successful development of the hydrogen sensor system, the system and predictive methodology were adapted to a commercial medical thermometer probe. Results of the experiment indicate that, with some customization of hardware and software, response time improvements are possible for medical thermometers as well as other slowly responding sensors.

  7. Prediction of Scour below Flip Bucket using Soft Computing Techniques

    NASA Astrophysics Data System (ADS)

    Azamathulla, H. Md.; Ab Ghani, Aminuddin; Azazi Zakaria, Nor

    2010-05-01

    The accurate prediction of the depth of scour around hydraulic structures (trajectory spillways) has been based on experimental studies, and the equations developed are mainly empirical in nature. This paper evaluates the performance of soft computing (intelligence) techniques, the Adaptive Neuro-Fuzzy System (ANFIS) and the Genetic Expression Programming (GEP) approach, in the prediction of scour below a flip bucket spillway. The results are very promising and support the use of these intelligent techniques in the prediction of highly non-linear scour parameters.

  8. Development of Predictive Models of Injury for the Lower Extremity, Lumbar, and Thoracic Spine after Discharge from Physical Rehabilitation

    DTIC Science & Technology

    2016-10-01

    prediction models will vary by age and sex. Hypothesis 3: A multi-factorial prediction model that accurately predicts risk of new and recurring injuries... members for injury risk after they have been cleared to return to duty from an injury is of great importance. The purpose of this project is to determine... It turns out that many patients are not formally discharged from rehabilitation. Many of them “self-discharge” and just stop coming back, either

  9. Life prediction technologies for aeronautical propulsion systems

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.

    1990-01-01

    Fatigue and fracture problems continue to occur in aeronautical gas turbine engines. Components whose useful life is limited by these failure modes include turbine hot-section blades, vanes, and disks. Safety considerations dictate that catastrophic failures be avoided, while economic considerations dictate that noncatastrophic failures occur as infrequently as possible. Therefore, the design decision involves making the tradeoff between engine performance and durability. LeRC has contributed to the aeropropulsion industry in the area of life prediction technology for over 30 years, developing creep and fatigue life prediction methodologies for hot-section materials. At the present time, emphasis is being placed on the development of methods capable of handling both thermal and mechanical fatigue under severe environments. Recent accomplishments include the development of more accurate creep-fatigue life prediction methods such as the total strain version of LeRC's strain-range partitioning (SRP) and the HOST-developed cyclic damage accumulation (CDA) model. Other examples include the development of a more accurate cumulative fatigue damage rule - the double damage curve approach (DDCA), which provides greatly improved accuracy in comparison with usual cumulative fatigue design rules. Other accomplishments in the area of high-temperature fatigue crack growth may also be mentioned. Finally, we are looking to the future and are beginning to do research on the advanced methods which will be required for development of advanced materials and propulsion systems over the next 10-20 years.
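    For context, the baseline linear cumulative-damage (Palmgren-Miner) rule that approaches such as the DDCA aim to improve upon can be sketched in a few lines; the cycle counts and lives below are illustrative.

```python
# Palmgren-Miner linear damage accumulation (illustrative block loading).
loading_blocks = [
    {"applied_cycles": 2_000, "cycles_to_failure": 10_000},   # high-cycle block
    {"applied_cycles": 300,   "cycles_to_failure": 1_000},    # low-cycle block
]

damage = sum(b["applied_cycles"] / b["cycles_to_failure"] for b in loading_blocks)
status = "failure predicted" if damage >= 1.0 else "below the failure criterion"
print(f"accumulated damage = {damage:.2f} ({status})")
```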

  10. On-line prediction of the glucose concentration of CHO cell cultivations by NIR and Raman spectroscopy: Comparative scalability test with a shake flask model system.

    PubMed

    Kozma, Bence; Hirsch, Edit; Gergely, Szilveszter; Párta, László; Pataki, Hajnalka; Salgó, András

    2017-10-25

    In this study, near-infrared (NIR) and Raman spectroscopy were compared in parallel to predict the glucose concentration of Chinese hamster ovary cell cultivations. A shake flask model system was used to quickly generate spectra similar to bioreactor cultivations, thereby accelerating the development of a working model prior to actual cultivations. Automated variable selection and several pre-processing methods were tested iteratively during model development using spectra from six shake flask cultivations. The target was to achieve the lowest error of prediction for the glucose concentration in two independent shake flasks. The best model was then used to test the scalability of the two techniques by predicting spectra of a 10 l and a 100 l scale bioreactor cultivation. The NIR spectroscopy based model could follow the trend of the glucose concentration but it was not sufficiently accurate for bioreactor monitoring. On the other hand, the Raman spectroscopy based model predicted the concentration of glucose at both cultivation scales sufficiently accurately, with an error around 4 mM (0.72 g/l), which is satisfactory for the on-line bioreactor monitoring purposes of the biopharma industry. Therefore, the shake flask model system was proven to be suitable for scalable spectroscopic model development.
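    One common calibration pipeline for this kind of spectra-to-concentration problem is derivative pre-processing followed by partial least squares (PLS) regression; the sketch below assumes that pipeline on synthetic spectra and is not the authors' exact model.

```python
# Assumed spectra-to-glucose calibration: Savitzky-Golay derivative + PLS regression
# on synthetic spectra (not the authors' model).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 300
glucose_mM = rng.uniform(5, 40, n_samples)                          # reference concentrations
peak = np.exp(-0.5 * ((np.arange(n_channels) - 150) / 10) ** 2)     # one glucose-correlated band
spectra = glucose_mM[:, None] * peak + rng.normal(0, 0.5, (n_samples, n_channels))

X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)  # baseline suppression

train, test = slice(0, 30), slice(30, None)
model = PLSRegression(n_components=3).fit(X[train], glucose_mM[train])
pred = model.predict(X[test]).ravel()
print(f"RMSEP: {np.sqrt(np.mean((pred - glucose_mM[test]) ** 2)):.2f} mM")
```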

  11. Mapping ecological systems with a random forest model: tradeoffs between errors and bias

    Treesearch

    Emilie Grossmann; Janet Ohmann; James Kagan; Heather May; Matthew Gregory

    2010-01-01

    New methods for predictive vegetation mapping allow improved estimations of plant community composition across large regions. Random Forest (RF) models limit over-fitting problems of other methods, and are known for making accurate classification predictions from noisy, nonnormal data, but can be biased when plot samples are unbalanced. We developed two contrasting...
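    A minimal Random Forest classification sketch of this kind, with class weighting as one common way to counter unbalanced plot samples, follows; the predictors and community labels are synthetic stand-ins.

```python
# Random Forest community-type classifier with class weighting (synthetic plots).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_plots = 600
X = rng.normal(size=(n_plots, 5))                                           # e.g. climate/terrain/spectral predictors
y = rng.choice(["pine", "oak", "shrub"], size=n_plots, p=[0.7, 0.2, 0.1])   # unbalanced classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
rf.fit(X_tr, y_tr)
print("held-out accuracy:", round(rf.score(X_te, y_te), 2))
```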

  12. Development of a Decision Support System to Predict Physicians' Rehabilitation Protocols for Patients with Knee Osteoarthritis

    ERIC Educational Resources Information Center

    Hawamdeh, Ziad M.; Alshraideh, Mohammad A.; Al-Ajlouni, Jihad M.; Salah, Imad K.; Holm, Margo B.; Otom, Ali H.

    2012-01-01

    To design a medical decision support system (MDSS) that would accurately predict the rehabilitation protocols prescribed by the physicians for patients with knee osteoarthritis (OA) using only their demographic and clinical characteristics. The demographic and clinical variables for 170 patients receiving one of three treatment protocols for knee…

  13. Probability-based collaborative filtering model for predicting gene-disease associations.

    PubMed

    Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan

    2017-12-28

    Accurately predicting pathogenic human genes has been challenging in recent research. Considering extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expenses. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our model. Firstly, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Secondly, we develop modified model II with personal heterogeneous regularization to enhance the accuracy of the aforementioned models. In this model, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with the results of four state-of-the-art approaches. The results show that PCFM performs better than other advanced approaches. The PCFM model can be leveraged for predictions of disease genes, especially for new human genes or diseases with no known relationships.
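    The latent-factorization idea underlying such collaborative-filtering models can be sketched as low-dimensional gene and disease vectors whose dot product scores an association; the tiny association matrix and the plain L2-regularized SGD below are illustrative assumptions, not PCFM itself.

```python
# Toy latent-factor model for gene-disease association scores (not PCFM itself).
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[1, 0, 1],      # rows: genes, columns: diseases, 1 = known association
              [1, 1, 0],
              [0, 1, 1]], dtype=float)
k, lam, lr = 2, 0.05, 0.05
G = rng.normal(scale=0.1, size=(R.shape[0], k))   # gene factors
D = rng.normal(scale=0.1, size=(R.shape[1], k))   # disease factors

for _ in range(2000):                              # plain L2-regularized SGD
    for i in range(R.shape[0]):
        for j in range(R.shape[1]):
            err = R[i, j] - G[i] @ D[j]
            G[i] += lr * (err * D[j] - lam * G[i])
            D[j] += lr * (err * G[i] - lam * D[j])

print(np.round(G @ D.T, 2))                        # reconstructed association scores
```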

  14. Investigation on temporal evolution of the grain refinement in copper under high strain rate loading via in-situ synchrotron measurement and predictive modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Pooja Nitin; Shin, Yung C.; Sun, Tao

    Synchrotron X-rays are integrated with a modified Kolsky tension bar to conduct in situ tracking of the grain refinement mechanism operating during the dynamic deformation of metals. Copper with an initial average grain size of 36 μm is refined to 6.3 μm when loaded at a constant high strain rate of 1200 s-1. The synchrotron measurements revealed the temporal evolution of the grain refinement mechanism in terms of the initiation and rate of refinement throughout the loading test. A multiscale coupled probabilistic cellular automata based recrystallization model has been developed to predict the microstructural evolution occurring during dynamic deformation processes. The model accurately predicts the initiation of the grain refinement mechanism with a predicted final average grain size of 2.4 μm. As a result, the model also accurately predicts the temporal evolution in terms of the initiation and extent of refinement when compared with the experimental results.

  15. Investigation on temporal evolution of the grain refinement in copper under high strain rate loading via in-situ synchrotron measurement and predictive modeling

    DOE PAGES

    Shah, Pooja Nitin; Shin, Yung C.; Sun, Tao

    2017-10-03

    Synchrotron X-rays are integrated with a modified Kolsky tension bar to conduct in situ tracking of the grain refinement mechanism operating during the dynamic deformation of metals. Copper with an initial average grain size of 36 μm is refined to 6.3 μm when loaded at a constant high strain rate of 1200 s-1. The synchrotron measurements revealed the temporal evolution of the grain refinement mechanism in terms of the initiation and rate of refinement throughout the loading test. A multiscale coupled probabilistic cellular automata based recrystallization model has been developed to predict the microstructural evolution occurring during dynamic deformation processes. The model accurately predicts the initiation of the grain refinement mechanism with a predicted final average grain size of 2.4 μm. As a result, the model also accurately predicts the temporal evolution in terms of the initiation and extent of refinement when compared with the experimental results.

  16. Gaussian mixture models as flux prediction method for central receivers

    NASA Astrophysics Data System (ADS)

    Grobler, Annemarie; Gauché, Paul; Smit, Willie

    2016-05-01

    Flux prediction methods are crucial to the design and operation of central receiver systems. Current methods such as the circular and elliptical (bivariate) Gaussian prediction methods are often used in field layout design and aiming strategies. For experimental or small central receiver systems, the flux profile of a single heliostat often deviates significantly from the circular and elliptical Gaussian models. Therefore a novel method of flux prediction was developed by incorporating the fitting of Gaussian mixture models onto flux profiles produced by flux measurement or ray tracing. A method was also developed to predict the Gaussian mixture model parameters of a single heliostat for a given time using image processing. Recording the predicted parameters in a database ensures that more accurate predictions are made in a shorter time frame.
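    The sketch below illustrates the core idea by fitting a two-component, two-dimensional Gaussian mixture to sampled flux "hits" on a receiver plane; the synthetic sample and the component count are assumptions.

```python
# Fitting a 2-D Gaussian mixture to synthetic flux samples on a receiver plane.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
hits = np.vstack([
    rng.multivariate_normal([0.0, 0.0], [[0.04, 0.0], [0.0, 0.02]], 2000),
    rng.multivariate_normal([0.3, 0.1], [[0.01, 0.0], [0.0, 0.03]], 1000),
])
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(hits)
print("weights:", np.round(gmm.weights_, 2))
print("means:\n", np.round(gmm.means_, 2))

# relative flux density at a point on the receiver (probability density of the fitted mixture)
point = np.array([[0.1, 0.0]])
print("relative flux at point:", float(np.exp(gmm.score_samples(point))[0]))
```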

  17. Using radiance predicted by the P3 approximation in a spherical geometry to predict tissue optical properties

    NASA Astrophysics Data System (ADS)

    Dickey, Dwayne J.; Moore, Ronald B.; Tulip, John

    2001-01-01

    For photodynamic therapy of solid tumors, such as prostatic carcinoma, to be achieved, an accurate model to predict tissue parameters and light dose must be found. Presently, most analytical light dosimetry models are fluence based and are not clinically viable for tissue characterization. Other methods of predicting optical properties, such as Monte Carlo, are accurate but far too time consuming for clinical application. However, radiance predicted by the P3-Approximation, an analytical solution to the transport equation, may be a viable and accurate alternative. The P3-Approximation accurately predicts optical parameters in intralipid/methylene blue based phantoms in a spherical geometry. The optical parameters furnished by the radiance, when introduced into fluence predicted by both the P3-Approximation and Grosjean Theory, correlate well with experimental data. The P3-Approximation also predicts the optical properties of prostate tissue, agreeing with documented optical parameters. The P3-Approximation could be the clinical tool necessary to facilitate PDT of solid tumors because of the limited number of invasive measurements required and the speed with which accurate calculations can be performed.

  18. Predicting turns in proteins with a unified model.

    PubMed

    Song, Qi; Li, Tonghua; Cong, Peisheng; Sun, Jiangming; Li, Dapeng; Tang, Shengnan

    2012-01-01

    Turns are a critical element of the structure of a protein; turns play a crucial role in loops, folds, and interactions. Current prediction methods are well developed for the prediction of individual turn types, including α-turn, β-turn, and γ-turn, etc. However, for further protein structure and function prediction it is necessary to develop a uniform model that can accurately predict all types of turns simultaneously. In this study, we present a novel approach, TurnP, which offers the ability to investigate all the turns in a protein based on a unified model. The main characteristics of TurnP are: (i) using newly exploited features of structural evolution information (secondary structure and shape string of protein) based on structure homologies, (ii) considering all types of turns in a unified model, and (iii) practical capability of accurate prediction of all turns simultaneously for a query. TurnP utilizes predicted secondary structures and predicted shape strings, both of which have greater accuracy, based on innovative technologies which were both developed by our group. Then, sequence and structural evolution features, namely the profile of sequence, the profile of secondary structures, and the profile of shape strings, are generated by sequence and structure alignment. When TurnP was validated on a non-redundant dataset (4,107 entries) by five-fold cross-validation, we achieved an accuracy of 88.8% and a sensitivity of 71.8%, which exceeded the best state-of-the-art predictors of individual turn types. Newly determined sequences and the EVA and CASP9 datasets were used as independent tests, and the results we achieved were outstanding for turn predictions and confirmed the good performance of TurnP for practical applications.

  19. Predicting Turns in Proteins with a Unified Model

    PubMed Central

    Song, Qi; Li, Tonghua; Cong, Peisheng; Sun, Jiangming; Li, Dapeng; Tang, Shengnan

    2012-01-01

    Motivation Turns are a critical element of the structure of a protein; turns play a crucial role in loops, folds, and interactions. Current prediction methods are well developed for the prediction of individual turn types, including α-turn, β-turn, and γ-turn, etc. However, for further protein structure and function prediction it is necessary to develop a uniform model that can accurately predict all types of turns simultaneously. Results In this study, we present a novel approach, TurnP, which offers the ability to investigate all the turns in a protein based on a unified model. The main characteristics of TurnP are: (i) using newly exploited features of structural evolution information (secondary structure and shape string of protein) based on structure homologies, (ii) considering all types of turns in a unified model, and (iii) practical capability of accurate prediction of all turns simultaneously for a query. TurnP utilizes predicted secondary structures and predicted shape strings, both of which have greater accuracy, based on innovative technologies which were both developed by our group. Then, sequence and structural evolution features, namely the profile of sequence, the profile of secondary structures, and the profile of shape strings, are generated by sequence and structure alignment. When TurnP was validated on a non-redundant dataset (4,107 entries) by five-fold cross-validation, we achieved an accuracy of 88.8% and a sensitivity of 71.8%, which exceeded the best state-of-the-art predictors of individual turn types. Newly determined sequences and the EVA and CASP9 datasets were used as independent tests, and the results we achieved were outstanding for turn predictions and confirmed the good performance of TurnP for practical applications. PMID:23144872

  20. PSSP-RFE: accurate prediction of protein structural class by recursive feature extraction from PSI-BLAST profile, physical-chemical property and functional annotations.

    PubMed

    Li, Liqi; Cui, Xiang; Yu, Sanjiu; Zhang, Yuan; Luo, Zhong; Yang, Hua; Zhou, Yue; Zheng, Xiaoqi

    2014-01-01

    Protein structure prediction is critical to functional annotation of the massively accumulated biological sequences, which prompts an imperative need for the development of high-throughput technologies. As a first and key step in protein structure prediction, protein structural class prediction becomes an increasingly challenging task. Amongst most homological-based approaches, the accuracies of protein structural class prediction are sufficiently high for high similarity datasets, but still far from being satisfactory for low similarity datasets, i.e., below 40% in pairwise sequence similarity. Therefore, we present a novel method for accurate and reliable protein structural class prediction for both high and low similarity datasets. This method is based on Support Vector Machine (SVM) in conjunction with integrated features from position-specific score matrix (PSSM), PROFEAT and Gene Ontology (GO). A feature selection approach, SVM-RFE, is also used to rank the integrated feature vectors through recursively removing the feature with the lowest ranking score. The definitive top features selected by SVM-RFE are input into the SVM engines to predict the structural class of a query protein. To validate our method, jackknife tests were applied to seven widely used benchmark datasets, reaching overall accuracies between 84.61% and 99.79%, which are significantly higher than those achieved by state-of-the-art tools. These results suggest that our method could serve as an accurate and cost-effective alternative to existing methods in protein structural classification, especially for low similarity datasets.
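    SVM-based recursive feature elimination (SVM-RFE) can be sketched as repeatedly fitting a linear SVM and discarding the lowest-weighted features; the random feature matrix, toy labels, and number of retained features below are placeholders.

```python
# SVM-RFE feature ranking sketch on random placeholder features.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                      # 200 proteins x 50 integrated features
y = (X[:, 3] + 0.5 * X[:, 17] > 0).astype(int)      # toy structural-class labels

svm = SVC(kernel="linear")                          # linear kernel exposes weights for ranking
selector = RFE(estimator=svm, n_features_to_select=10, step=1).fit(X, y)
print("top-ranked feature indices:", np.flatnonzero(selector.support_))
```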

  1. Developing and testing temperature models for regulated systems: a case study on the Upper Delaware River

    USGS Publications Warehouse

    Cole, Jeffrey C.; Maloney, Kelly O.; Schmid, Matthias; McKenna, James E.

    2014-01-01

    Water temperature is an important driver of many processes in riverine ecosystems. If reservoirs are present, their releases can greatly influence downstream water temperatures. Models are important tools in understanding the influence these releases may have on the thermal regimes of downstream rivers. In this study, we developed and tested a suite of models to predict river temperature at a location downstream of two reservoirs in the Upper Delaware River (USA), a section of river that is managed to support a world-class coldwater fishery. Three empirical models were tested, including a Generalized Least Squares Model with a cosine trend (GLScos), AutoRegressive Integrated Moving Average (ARIMA), and Artificial Neural Network (ANN). We also tested one mechanistic Heat Flux Model (HFM) that was based on energy gain and loss. Predictor variables used in model development included climate data (e.g., solar radiation, wind speed, etc.) collected from a nearby weather station and temperature and hydrologic data from upstream U.S. Geological Survey gages. Models were developed with a training dataset that consisted of data from 2008 to 2011; they were then independently validated with a test dataset from 2012. Model accuracy was evaluated using root mean square error (RMSE), Nash Sutcliffe efficiency (NSE), percent bias (PBIAS), and index of agreement (d) statistics. Model forecast success was evaluated using baseline-modified prime index of agreement (md) at the one, three, and five day predictions. All five models accurately predicted daily mean river temperature across the entire training dataset (RMSE = 0.58–1.311, NSE = 0.99–0.97, d = 0.98–0.99); ARIMA was most accurate (RMSE = 0.57, NSE = 0.99), but each model, other than ARIMA, showed short periods of under- or over-predicting observed warmer temperatures. For the training dataset, all models besides ARIMA had overestimation bias (PBIAS = −0.10 to −1.30). Validation analyses showed all models performed well; the HFM model was the most accurate compared to other models (RMSE = 0.92, both NSE = 0.98, d = 0.99) and the ARIMA model was least accurate (RMSE = 2.06, NSE = 0.92, d = 0.98); however, all models had an overestimation bias (PBIAS = −4.1 to −10.20). Aside from the one day forecast ARIMA model (md = 0.53), all models forecasted fairly well at the one, three, and five day forecasts (md = 0.77–0.96). Overall, we were successful in developing models predicting daily mean temperature across a broad range of temperatures. These models, specifically the GLScos, ANN, and HFM, may serve as important tools for predicting conditions and managing thermal releases in regulated river systems such as the Delaware River. Further model development may be important in customizing predictions for particular biological or ecological needs, or for particular temporal or spatial scales.
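    The evaluation statistics named above (RMSE, NSE, PBIAS, and the index of agreement d) can be computed as in the sketch below; the observed and simulated series are synthetic, and the PBIAS sign convention matches the text (negative values indicate overestimation).

```python
# Goodness-of-fit statistics for observed vs. simulated daily mean temperature (synthetic series).
import numpy as np

def rmse(obs, sim):
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nse(obs, sim):
    return float(1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

def pbias(obs, sim):
    # negative when the model overestimates, matching the convention above
    return float(100 * np.sum(obs - sim) / np.sum(obs))

def index_of_agreement(obs, sim):
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1 - np.sum((sim - obs) ** 2) / denom)

obs = np.array([12.1, 13.4, 15.0, 16.2, 14.8])   # observed water temperature (degC)
sim = np.array([12.5, 13.1, 15.4, 16.0, 15.3])   # model predictions
print(rmse(obs, sim), nse(obs, sim), pbias(obs, sim), index_of_agreement(obs, sim))
```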

  2. Developing and testing temperature models for regulated systems: A case study on the Upper Delaware River

    NASA Astrophysics Data System (ADS)

    Cole, Jeffrey C.; Maloney, Kelly O.; Schmid, Matthias; McKenna, James E.

    2014-11-01

    Water temperature is an important driver of many processes in riverine ecosystems. If reservoirs are present, their releases can greatly influence downstream water temperatures. Models are important tools in understanding the influence these releases may have on the thermal regimes of downstream rivers. In this study, we developed and tested a suite of models to predict river temperature at a location downstream of two reservoirs in the Upper Delaware River (USA), a section of river that is managed to support a world-class coldwater fishery. Three empirical models were tested, including a Generalized Least Squares Model with a cosine trend (GLScos), AutoRegressive Integrated Moving Average (ARIMA), and Artificial Neural Network (ANN). We also tested one mechanistic Heat Flux Model (HFM) that was based on energy gain and loss. Predictor variables used in model development included climate data (e.g., solar radiation, wind speed, etc.) collected from a nearby weather station and temperature and hydrologic data from upstream U.S. Geological Survey gages. Models were developed with a training dataset that consisted of data from 2008 to 2011; they were then independently validated with a test dataset from 2012. Model accuracy was evaluated using root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS), and index of agreement (d) statistics. Model forecast success was evaluated using baseline-modified prime index of agreement (md) at the one, three, and five day predictions. All five models accurately predicted daily mean river temperature across the entire training dataset (RMSE = 0.58-1.311, NSE = 0.99-0.97, d = 0.98-0.99); ARIMA was most accurate (RMSE = 0.57, NSE = 0.99), but each model, other than ARIMA, showed short periods of under- or over-predicting observed warmer temperatures. For the training dataset, all models besides ARIMA had overestimation bias (PBIAS = -0.10 to -1.30). Validation analyses showed all models performed well; the HFM model was the most accurate of the models compared (RMSE = 0.92, NSE = 0.98, d = 0.99) and the ARIMA model was least accurate (RMSE = 2.06, NSE = 0.92, d = 0.98); however, all models had an overestimation bias (PBIAS = -4.1 to -10.20). Aside from the one day forecast ARIMA model (md = 0.53), all models forecasted fairly well at the one, three, and five day forecasts (md = 0.77-0.96). Overall, we were successful in developing models predicting daily mean temperature across a broad range of temperatures. These models, specifically the GLScos, ANN, and HFM, may serve as important tools for predicting conditions and managing thermal releases in regulated river systems such as the Delaware River. Further model development may be important in customizing predictions for particular biological or ecological needs, or for particular temporal or spatial scales.

  3. VisitSense: Sensing Place Visit Patterns from Ambient Radio on Smartphones for Targeted Mobile Ads in Shopping Malls.

    PubMed

    Kim, Byoungjip; Kang, Seungwoo; Ha, Jin-Young; Song, Junehwa

    2015-07-16

    In this paper, we introduce a novel smartphone framework called VisitSense that automatically detects and predicts a smartphone user's place visits from ambient radio to enable behavioral targeting for mobile ads in large shopping malls. VisitSense enables mobile app developers to adopt visit-pattern-aware mobile advertising for shopping mall visitors in their apps. It also benefits mobile users by allowing them to receive highly relevant mobile ads that are aware of their place visit patterns in shopping malls. To achieve this goal, VisitSense employs accurate visit detection and prediction methods. For accurate visit detection, we develop a change-based detection method that takes into account changes in ambient radio stability and in user mobility. It performs well in large shopping malls, where ambient radio is quite noisy and causes existing algorithms to easily fail. In addition, we propose a causality-based visit prediction model that captures the causality in sequential visit patterns for effective prediction. We have developed a VisitSense prototype system, and a visit-pattern-aware mobile advertising application that is based on it. Furthermore, we deploy the system in the COEX Mall, one of the largest shopping malls in Korea, and conduct diverse experiments to show the effectiveness of VisitSense.

  4. Water Quality, Cyanobacteria, and Environmental Factors and Their Relations to Microcystin Concentrations for Use in Predictive Models at Ohio Lake Erie and Inland Lake Recreational Sites, 2013-14

    USGS Publications Warehouse

    Francy, Donna S.; Graham, Jennifer L.; Stelzer, Erin A.; Ecker, Christopher D.; Brady, Amie M. G.; Pam Struffolino,; Loftin, Keith A.

    2015-11-06

    The results of this study showed that water-quality and environmental variables are promising for use in site-specific daily or long-term predictive models. In order to develop more accurate models to predict toxin concentrations at freshwater lake sites, data need to be collected more frequently and for consecutive days in future studies.

  5. Development of Physics-Based Hurricane Wave Response Functions: Application to Selected Sites on the U.S. Gulf Coast

    NASA Astrophysics Data System (ADS)

    McLaughlin, P. W.; Kaihatu, J. M.; Irish, J. L.; Taylor, N. R.; Slinn, D.

    2013-12-01

    Recent hurricane activity in the Gulf of Mexico has led to a need for accurate, computationally efficient prediction of hurricane damage so that communities can better assess risk of local socio-economic disruption. This study focuses on developing robust, physics-based non-dimensional equations that accurately predict maximum significant wave height at different locations near a given hurricane track. These equations (denoted as Wave Response Functions, or WRFs) were developed from presumed physical dependencies between wave heights and hurricane characteristics and fit with data from numerical models of waves and surge under hurricane conditions. After curve fitting, constraints that correct for a fully developed sea state were used to limit the wind wave growth. When applied to the region near Gulfport, MS, back prediction of maximum significant wave height yielded root mean square errors of 0.22-0.42 m at open-coast stations and 0.07-0.30 m at bay stations when compared to the numerical model data. The WRF method was also applied to Corpus Christi, TX and Panama City, FL with similar results. Back prediction errors will be included in uncertainty evaluations connected to risk calculations using joint probability methods. These methods require thousands of simulations to quantify extreme value statistics, thus requiring the use of reduced methods such as the WRF to represent the relevant physical processes.

  6. Time-Accurate Numerical Prediction of Free Flight Aerodynamics of a Finned Projectile

    DTIC Science & Technology

    2005-09-01

    develop (with fewer dollars) more lethal and effective munitions. The munitions must stay abreast of the latest technology available to our...consuming. Computer simulations can and have provided an effective means of determining the unsteady aerodynamics and flight mechanics of guided projectile...Recently, the time-accurate technique was used to obtain improved results for Magnus moment and roll damping moment of a spinning projectile at transonic

  7. Developing a dengue forecast model using machine learning: A case study in China.

    PubMed

    Guo, Pi; Liu, Tao; Zhang, Qin; Wang, Li; Xiao, Jianpeng; Zhang, Qingying; Luo, Ganfeng; Li, Zhihao; He, Jianfeng; Zhang, Yonghui; Ma, Wenjun

    2017-10-01

    In China, dengue remains an important public health issue with expanded areas and increased incidence recently. Accurate and timely forecasts of dengue incidence in China are still lacking. We aimed to use the state-of-the-art machine learning algorithms to develop an accurate predictive model of dengue. Weekly dengue cases, Baidu search queries and climate factors (mean temperature, relative humidity and rainfall) during 2011-2014 in Guangdong were gathered. A dengue search index was constructed for developing the predictive models in combination with climate factors. The observed year and week were also included in the models to control for the long-term trend and seasonality. Several machine learning algorithms, including the support vector regression (SVR) algorithm, step-down linear regression model, gradient boosted regression tree algorithm (GBM), negative binomial regression model (NBM), least absolute shrinkage and selection operator (LASSO) linear regression model and generalized additive model (GAM), were used as candidate models to predict dengue incidence. Performance and goodness of fit of the models were assessed using the root-mean-square error (RMSE) and R-squared measures. The residuals of the models were examined using the autocorrelation and partial autocorrelation function analyses to check the validity of the models. The models were further validated using dengue surveillance data from five other provinces. The epidemics during the last 12 weeks and the peak of the 2014 large outbreak were accurately forecasted by the SVR model selected by a cross-validation technique. Moreover, the SVR model had the consistently smallest prediction error rates for tracking the dynamics of dengue and forecasting the outbreaks in other areas in China. The proposed SVR model achieved a superior performance in comparison with other forecasting techniques assessed in this study. The findings can help the government and community respond early to dengue epidemics.
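
    As a hedged sketch of the winning approach (support vector regression with a radial-basis kernel, tuned by cross-validation on time-ordered splits), the example below trains on synthetic weekly features resembling those described above; the data, hyperparameter grid, and holdout length are illustrative assumptions, not the study's configuration.

```python
# Illustrative SVR forecaster of weekly case counts from search-index and climate covariates.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

rng = np.random.default_rng(1)
n_weeks = 208
X = np.column_stack([
    rng.normal(size=n_weeks),            # dengue search index (synthetic)
    20 + 8 * rng.random(n_weeks),        # mean temperature
    50 + 40 * rng.random(n_weeks),       # relative humidity
    rng.gamma(2.0, 20.0, n_weeks),       # rainfall
    np.arange(n_weeks) // 52,            # year, to control the long-term trend
    np.arange(n_weeks) % 52,             # week of year, to control seasonality
])
y = rng.poisson(5 + 3 * np.maximum(X[:, 0], 0))   # weekly case counts (synthetic)

# Cross-validated hyperparameter search with time-ordered splits.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(model, {"svr__C": [1, 10, 100], "svr__epsilon": [0.1, 1.0]},
                    cv=TimeSeriesSplit(n_splits=5), scoring="neg_root_mean_squared_error")
grid.fit(X[:-12], y[:-12])                 # train on all but the last 12 weeks
print("held-out RMSE:", np.sqrt(np.mean((grid.predict(X[-12:]) - y[-12:]) ** 2)))
```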

  8. Prediction of Chemical Respiratory Sensitizers Using GARD, a Novel In Vitro Assay Based on a Genomic Biomarker Signature

    PubMed Central

    Albrekt, Ann-Sofie; Borrebaeck, Carl A. K.; Lindstedt, Malin

    2015-01-01

    Background Repeated exposure to certain low molecular weight (LMW) chemical compounds may result in the development of allergic reactions in the skin or in the respiratory tract. In most cases, a certain LMW compound selectively sensitizes the skin, giving rise to allergic contact dermatitis (ACD), or the respiratory tract, giving rise to occupational asthma (OA). To limit the occurrence of allergic diseases, efforts are currently being made to develop predictive assays that accurately identify chemicals capable of inducing such reactions. However, while a few promising methods for prediction of skin sensitization have been described, to date no validated method, in vitro or in vivo, exists that is able to accurately classify chemicals as respiratory sensitizers. Results Recently, we presented the in vitro based Genomic Allergen Rapid Detection (GARD) assay as a novel testing strategy for classification of skin sensitizing chemicals based on measurement of a genomic biomarker signature. We have expanded the applicability domain of the GARD assay to also classify respiratory sensitizers by identifying a separate biomarker signature containing 389 differentially regulated genes for respiratory sensitizers in comparison to non-respiratory sensitizers. By using an independent data set in combination with supervised machine learning, we validated the assay, showing that the identified genomic biomarker is able to accurately classify respiratory sensitizers. Conclusions We have identified a genomic biomarker signature for classification of respiratory sensitizers. Combining this newly identified biomarker signature with our previously identified biomarker signature for classification of skin sensitizers, we have developed a novel in vitro testing strategy with a potent ability to predict both skin and respiratory sensitization in the same sample. PMID:25760038

  9. Prediction of Exposure Level of Energetic Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, M. H. Y.; Blattnig, S.

    2016-12-01

    The potential for exposure to large solar particle events (SPEs) with fluxes that extend to high energies is a major concern during interplanetary transfer and extravehicular activities (EVAs) on the lunar and Martian surfaces. Prediction of the sporadic occurrence of SPEs is not accurate on either near- or long-term timescales, while the expected frequency of such events is strongly influenced by solar cycle activity. In the development of NASA's operational strategies, real-time estimation of exposure to SPEs has been considered so that adequate responses can be applied in a timely manner to reduce exposures to well below the exposure limits. Previously, the organ doses of large historical SPEs had been calculated by using the complete energy spectra of each event and then developing a prediction model for blood-forming organ (BFO) dose based solely on an assumed value of integrated fluence above 30 MeV (Φ30) for an otherwise unspecified future SPE. While BFO dose is determined primarily by solar protons with high energies, it was reasoned that more accurate BFO dose prediction models could be developed using integrated fluence above 60 MeV (Φ60) and above 100 MeV (Φ100) as predictors instead of Φ30. In the current study, re-analysis of major SPEs (in which the proton spectra of the ground level enhancement [GLE] events since 1956 are correctly described by Band functions) has been used in evaluation of exposure levels. More accurate prediction models for BFO dose and NASA effective dose are then developed using integrated fluence above 200 MeV (Φ200), which by far has the most weight in the calculation of doses for deep-seated organs from exposure to extreme SPEs (GLEs or sub-GLEs). The unconditional probability of a BFO dose exceeding a pre-specified BFO dose limit is simultaneously calculated by taking into account the distribution of the predictor (Φ30, Φ60, Φ100, or Φ200) as estimated from historical SPEs. These results can be applied to the development of approaches to improve radiation protection of astronauts and the optimization of mission planning for future space missions.

  10. Joint Prediction of Longitudinal Development of Cortical Surfaces and White Matter Fibers from Neonatal MRI

    PubMed Central

    Rekik, Islem; Li, Gang; Yap, Pew-Thian; Chen, Geng; Lin, Weili; Shen, Dinggang

    2017-01-01

    The human brain can be modeled as multiple interrelated shapes (or a multishape), each for characterizing one aspect of the brain, such as the cortex and white matter pathways. Predicting the developing multishape is a very challenging task due to the contrasting nature of the developmental trajectories of the constituent shapes: smooth for the cortical surface and non-smooth for white matter tracts due to changes such as bifurcation. We recently addressed this problem and proposed an approach for predicting the multishape developmental spatiotemporal trajectories of infant brains based only on neonatal MRI data using a set of geometric, dynamic, and fiber-to-surface connectivity features. In this paper, we propose two key innovations to further improve the prediction of multishape evolution. First, for a more accurate cortical surface prediction, instead of simply relying on one neonatal atlas to guide the prediction of the multishape, we propose to use multiple neonatal atlases to build a spatially heterogeneous atlas using the multidirectional varifold representation. This individualizes the atlas by locally maximizing its similarity to the testing baseline cortical shape for each cortical region, thereby better representing the baseline testing cortical surface, which underpins the multishape prediction process. Second, for temporally consistent fiber prediction, we propose to reliably estimate spatiotemporal connectivity features using low-rank tensor completion, thereby capturing the variability and richness of the temporal development of fibers. Experimental results confirm that the proposed variants significantly improve the prediction performance of our original multishape prediction framework for both cortical surface and fiber tract shapes at 3, 6, and 9 months of age. Our pioneering model will pave the way for learning how to predict the evolution of anatomical shapes with abnormal changes. Ultimately, devising accurate shape evolution prediction models that can help quantify and predict the severity of a brain disorder as it progresses will be of great aid in individualized treatment planning. PMID:28284800
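
    The second innovation above relies on low-rank completion to fill in missing spatiotemporal connectivity features. As a hedged illustration of the underlying idea, the sketch below performs low-rank completion of a matrix (the flattened analogue of a tensor) by iterative singular-value soft-thresholding; the data, rank, threshold, and iteration count are all invented.

```python
# Toy low-rank completion by iterative singular-value thresholding (soft-impute style).
import numpy as np

rng = np.random.default_rng(9)
true = rng.normal(size=(60, 4)) @ rng.normal(size=(4, 30))   # rank-4 ground truth
mask = rng.random(true.shape) < 0.6                          # 60% of entries observed
observed = np.where(mask, true, 0.0)

X = observed.copy()
for _ in range(200):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - 0.5, 0.0)                 # soft-threshold the singular values
    X_low = (U * s) @ Vt
    X = np.where(mask, observed, X_low)          # keep observed entries, impute the rest

err = np.linalg.norm((X - true)[~mask]) / np.linalg.norm(true[~mask])
print("relative error on missing entries:", round(err, 3))
```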

  11. Joint prediction of longitudinal development of cortical surfaces and white matter fibers from neonatal MRI.

    PubMed

    Rekik, Islem; Li, Gang; Yap, Pew-Thian; Chen, Geng; Lin, Weili; Shen, Dinggang

    2017-05-15

    The human brain can be modeled as multiple interrelated shapes (or a multishape), each for characterizing one aspect of the brain, such as the cortex and white matter pathways. Predicting the developing multishape is a very challenging task due to the contrasting nature of the developmental trajectories of the constituent shapes: smooth for the cortical surface and non-smooth for white matter tracts due to changes such as bifurcation. We recently addressed this problem and proposed an approach for predicting the multishape developmental spatiotemporal trajectories of infant brains based only on neonatal MRI data using a set of geometric, dynamic, and fiber-to-surface connectivity features. In this paper, we propose two key innovations to further improve the prediction of multishape evolution. First, for a more accurate cortical surface prediction, instead of simply relying on one neonatal atlas to guide the prediction of the multishape, we propose to use multiple neonatal atlases to build a spatially heterogeneous atlas using the multidirectional varifold representation. This individualizes the atlas by locally maximizing its similarity to the testing baseline cortical shape for each cortical region, thereby better representing the baseline testing cortical surface, which underpins the multishape prediction process. Second, for temporally consistent fiber prediction, we propose to reliably estimate spatiotemporal connectivity features using low-rank tensor completion, thereby capturing the variability and richness of the temporal development of fibers. Experimental results confirm that the proposed variants significantly improve the prediction performance of our original multishape prediction framework for both cortical surface and fiber tract shapes at 3, 6, and 9 months of age. Our pioneering model will pave the way for learning how to predict the evolution of anatomical shapes with abnormal changes. Ultimately, devising accurate shape evolution prediction models that can help quantify and predict the severity of a brain disorder as it progresses will be of great aid in individualized treatment planning.

  12. Information Technology Supports Integration of Satellite Imagery with Irrigation Management in California's Central Valley

    USDA-ARS?s Scientific Manuscript database

    Remotely sensed data can potentially be used to develop crop coefficient estimates over large areas and make irrigation scheduling more practical, convenient, and accurate. A demonstration system is being developed under NASA's Terrestrial Observation and Prediction System (TOPS) to automatically r...

  13. THE ART OF DATA MINING THE MINEFIELDS OF TOXICITY DATABASES TO LINK CHEMISTRY TO BIOLOGY

    EPA Science Inventory

    Toxicity databases have a special role in predictive toxicology, providing ready access to historical information throughout the workflow of discovery, development, and product safety processes in drug development as well as in review by regulatory agencies. To provide accurate i...

  14. Coarse-Graining Polymer Field Theory for Fast and Accurate Simulations of Directed Self-Assembly

    NASA Astrophysics Data System (ADS)

    Liu, Jimmy; Delaney, Kris; Fredrickson, Glenn

    To design effective manufacturing processes using polymer directed self-assembly (DSA), the semiconductor industry benefits greatly from having a complete picture of stable and defective polymer configurations. Field-theoretic simulations are an effective way to study these configurations and predict defect populations. Self-consistent field theory (SCFT) is a particularly successful theory for studies of DSA. Although other models exist that are faster to simulate, these models are phenomenological or derived through asymptotic approximations, often leading to a loss of accuracy relative to SCFT. In this study, we employ our recently-developed method to produce an accurate coarse-grained field theory for diblock copolymers. The method uses a force- and stress-matching strategy to map output from SCFT simulations into parameters for an optimized phase field model. This optimized phase field model is just as fast as existing phenomenological phase field models, but makes more accurate predictions of polymer self-assembly, both in bulk and in confined systems. We study the performance of this model under various conditions, including its predictions of domain spacing, morphology and defect formation energies.

  15. A hybrid intelligent method for three-dimensional short-term prediction of dissolved oxygen content in aquaculture

    PubMed Central

    Yu, Huihui; Cheng, Yanjun; Cheng, Qianqian; Li, Daoliang

    2018-01-01

    A precise predictive model is important for obtaining a clear understanding of the changes in dissolved oxygen content in crab ponds. Highly accurate interval forecasting of dissolved oxygen content is fundamental to reduce risk, and three-dimensional prediction can provide more accurate results and overall guidance. In this study, a hybrid three-dimensional (3D) dissolved oxygen content prediction model based on a radial basis function (RBF) neural network, K-means and subtractive clustering was developed and named the subtractive clustering (SC)-K-means-RBF model. In this modeling process, K-means and subtractive clustering methods were employed to enhance the hyperparameters required in the RBF neural network model. The comparison of the predicted results of different traditional models validated the effectiveness and accuracy of the proposed hybrid SC-K-means-RBF model for three-dimensional prediction of dissolved oxygen content. Consequently, the proposed model can effectively display the three-dimensional distribution of dissolved oxygen content and serve as a guide for feeding and future studies. PMID:29466394
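
    As a rough illustration of the modelling idea (clustering to place RBF centres, then a radial-basis network for regression), the sketch below uses K-means from scikit-learn and a least-squares output layer; the subtractive-clustering step, the 3D pond geometry, and all data are placeholder assumptions rather than the published model.

```python
# Minimal RBF network with K-means-selected centres; illustrative only.
import numpy as np
from sklearn.cluster import KMeans

def rbf_design_matrix(X, centres, width):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(2)
X = rng.random((300, 3))                          # e.g. depth, position, time (synthetic)
y = np.sin(4 * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)   # dissolved oxygen (synthetic)

centres = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
width = 0.3                                       # RBF width; tune on validation data
Phi = rbf_design_matrix(X, centres, width)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None) # linear output layer by least squares

y_hat = rbf_design_matrix(X[:5], centres, width) @ weights
print(y_hat)
```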

  16. The PREM score: a graphical tool for predicting survival in very preterm births.

    PubMed

    Cole, T J; Hey, E; Richmond, S

    2010-01-01

    To develop a tool for predicting survival to term in babies born more than 8 weeks early using only information available at or before birth. 1456 non-malformed very preterm babies of 22-31 weeks' gestation born in 2000-3 in the north of England and 3382 births of 23-31 weeks born in 2000-4 in Trent. Survival to term, predicted from information available at birth, and at the onset of labour or delivery. Development of a logistic regression model (the prematurity risk evaluation measure or PREM score) based on gestation, birth weight for gestation and base deficit from umbilical cord blood. Gestation was by far the most powerful predictor of survival to term, and as few as 5 extra days can double the chance of survival. Weight for gestation also had a powerful but non-linear effect on survival, with weight between the median and 85th centile predicting the highest survival. Using this information survival can be predicted almost as accurately before birth as after, although base deficit further improves the prediction. A simple graph is described that shows how the two main variables gestation and weight for gestation interact to predict the chance of survival. The PREM score can be used to predict the chance of survival at or before birth almost as accurately as existing measures influenced by post-delivery condition, to balance risk at entry into a controlled trial and to adjust for differences in "case mix" when assessing the quality of perinatal care.
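
    For readers who want to see the general form of such a score, the sketch below fits a logistic-regression survival model to synthetic data using gestation, a weight-for-gestation term (with a quadratic component to mimic the non-linear effect described above), and base deficit. The coefficients and data are invented for illustration and are not the published PREM score.

```python
# Hedged illustration of a PREM-style logistic-regression survival model on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 1000
gestation = rng.uniform(22, 31, n)                 # completed weeks
weight_z = rng.normal(0, 1, n)                     # weight-for-gestation z-score
base_deficit = rng.normal(5, 3, n)                 # mmol/L from cord blood

# Synthetic outcome: survival odds improve steeply with gestation,
# peak near the upper-middle weight centiles, and fall with base deficit.
logit = -20 + 0.8 * gestation + 1.0 * weight_z - 0.6 * weight_z ** 2 - 0.1 * base_deficit
survived = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gestation, weight_z, weight_z ** 2, base_deficit])
model = LogisticRegression(max_iter=1000).fit(X, survived)
print("predicted survival probability:",
      model.predict_proba([[27.0, 0.5, 0.25, 4.0]])[0, 1])
```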

  17. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting residual strength of fuselage shell-type structures; and the development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  18. Parameterized reduced-order models using hyper-dual numbers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fike, Jeffrey A.; Brake, Matthew Robert

    2013-10-01

    The goal of most computational simulations is to accurately predict the behavior of a real, physical system. Accurate predictions often require very computationally expensive analyses and so reduced order models (ROMs) are commonly used. ROMs aim to reduce the computational cost of the simulations while still providing accurate results by including all of the salient physics of the real system in the ROM. However, real, physical systems often deviate from the idealized models used in simulations due to variations in manufacturing or other factors. One approach to this issue is to create a parameterized model in order to characterize the effect of perturbations from the nominal model on the behavior of the system. This report presents a methodology for developing parameterized ROMs, which is based on Craig-Bampton component mode synthesis and the use of hyper-dual numbers to calculate the derivatives necessary for the parameterization.
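
    Hyper-dual numbers give exact first and second derivatives of a function in a single evaluation, which is what makes them attractive for building the parameterized ROMs described above. The following is a generic, self-contained sketch of hyper-dual arithmetic (addition, multiplication and sine only), not the report's implementation.

```python
# Compact hyper-dual number sketch: exact first and second derivatives, no finite differences.
import math

class HyperDual:
    def __init__(self, real, e1=0.0, e2=0.0, e1e2=0.0):
        self.r, self.e1, self.e2, self.e1e2 = real, e1, e2, e1e2

    def __add__(self, o):
        return HyperDual(self.r + o.r, self.e1 + o.e1, self.e2 + o.e2, self.e1e2 + o.e1e2)

    def __mul__(self, o):
        return HyperDual(self.r * o.r,
                         self.r * o.e1 + self.e1 * o.r,
                         self.r * o.e2 + self.e2 * o.r,
                         self.r * o.e1e2 + self.e1 * o.e2 + self.e2 * o.e1 + self.e1e2 * o.r)

def hd_sin(x):
    s, c = math.sin(x.r), math.cos(x.r)
    return HyperDual(s, c * x.e1, c * x.e2, c * x.e1e2 - s * x.e1 * x.e2)

# f(x) = x * sin(x); seed the perturbation parts to read off df/dx and d2f/dx2.
x = HyperDual(1.3, 1.0, 1.0, 0.0)
f = x * hd_sin(x)
print("f  =", f.r)          # x*sin(x)
print("f' =", f.e1)         # sin(x) + x*cos(x)
print("f''=", f.e1e2)       # 2*cos(x) - x*sin(x)
```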

  19. Approximating high-dimensional dynamics by barycentric coordinates with linear programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirata, Yoshito, E-mail: yoshito@sat.t.u-tokyo.ac.jp; Aihara, Kazuyuki; Suzuki, Hideyuki

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and predictions. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit for the relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends the barycentric coordinates to high-dimensional phase space by employing linear programming while allowing for the approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the radial basis function model that is widely used. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.
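
    The core construction (barycentric weights fitted by linear programming, with the approximation error made explicit) can be sketched with scipy's linprog. Everything below, including the synthetic library of states and the L1 error formulation, is an illustrative assumption rather than the authors' exact formulation.

```python
# Barycentric coordinates by linear programming, then one-step-ahead prediction; illustrative only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
d, n = 5, 40                       # state dimension, number of library points
library = rng.normal(size=(n, d))  # states x_i observed at times t_i
successors = library + 0.1 * rng.normal(size=(n, d))  # states at t_i + 1
query = library[:10].mean(axis=0) + 0.05 * rng.normal(size=d)

# Decision variables: n weights w followed by d error slacks t.
# Minimize sum(t)  s.t.  |library^T w - query| <= t,  sum(w) = 1,  w >= 0.
c = np.concatenate([np.zeros(n), np.ones(d)])
A_ub = np.block([[library.T, -np.eye(d)],
                 [-library.T, -np.eye(d)]])
b_ub = np.concatenate([query, -query])
A_eq = np.concatenate([np.ones(n), np.zeros(d)])[None, :]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + d), method="highs")

w = res.x[:n]
prediction = successors.T @ w      # propagate the same weights one step forward
print("max barycentric weight:", w.max(), "prediction:", prediction)
```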

  20. Approximating high-dimensional dynamics by barycentric coordinates with linear programming.

    PubMed

    Hirata, Yoshito; Shiro, Masanori; Takahashi, Nozomu; Aihara, Kazuyuki; Suzuki, Hideyuki; Mas, Paloma

    2015-01-01

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and predictions. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit for the relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends the barycentric coordinates to high-dimensional phase space by employing linear programming while allowing for the approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the radial basis function model that is widely used. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.

  1. Taxi-Out Time Prediction for Departures at Charlotte Airport Using Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong; Malik, Waqar; Jung, Yoon C.

    2016-01-01

    Predicting the taxi-out times of departures accurately is important for improving airport efficiency and takeoff time predictability. In this paper, we attempt to apply machine learning techniques to actual traffic data at Charlotte Douglas International Airport for taxi-out time prediction. To find the key factors affecting aircraft taxi times, surface surveillance data is first analyzed. From this data analysis, several variables, including terminal concourse, spot, runway, departure fix and weight class, are selected for taxi time prediction. Then, various machine learning methods such as linear regression, support vector machines, k-nearest neighbors, random forest, and neural network models are applied to actual flight data. Different traffic flow and weather conditions at Charlotte airport are also taken into account for more accurate prediction. The taxi-out time prediction results show that linear regression and random forest techniques can provide the most accurate prediction in terms of root-mean-square errors. We also discuss the operational complexity and uncertainties that make it difficult to predict the taxi times accurately.
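
    A minimal version of one of the better-performing approaches named above (random forest regression over one-hot-encoded surface factors) might look like the sketch below; the airport features, category values, and traffic data are invented stand-ins, not the Charlotte surveillance dataset.

```python
# Random forest regression of taxi-out time on categorical surface factors; illustrative only.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "concourse": rng.choice(list("ABCDE"), n),
    "runway": rng.choice(["18C", "18L", "36R"], n),
    "weight_class": rng.choice(["small", "large", "heavy"], n),
    "queue_length": rng.integers(0, 20, n),          # departures ahead in queue (synthetic)
})
df["taxi_out_min"] = 8 + 0.9 * df["queue_length"] + rng.normal(0, 2, n)

cat_cols = ["concourse", "runway", "weight_class"]
pre = ColumnTransformer([("cat", OneHotEncoder(handle_unknown="ignore"), cat_cols)],
                        remainder="passthrough")
model = make_pipeline(pre, RandomForestRegressor(n_estimators=200, random_state=0))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="taxi_out_min"), df["taxi_out_min"], test_size=0.2, random_state=0)
model.fit(X_train, y_train)
rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print("taxi-out RMSE (min):", round(rmse, 2))
```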

  2. High-Temperature Cast Aluminum for Efficient Engines

    NASA Astrophysics Data System (ADS)

    Bobel, Andrew C.

    Accurate thermodynamic databases are the foundation of predictive microstructure and property models. An initial assessment of the commercially available Thermo-Calc TCAL2 database and the proprietary aluminum database of QuesTek demonstrated a large degree of deviation with respect to equilibrium precipitate phase prediction in the compositional region of interest when compared to 3-D atom probe tomography (3DAPT) and transmission electron microscopy (TEM) experimental results. New compositional measurements of the Q-phase (Al-Cu-Mg-Si phase) led to a remodeling of the Q-phase thermodynamic description in the CALPHAD databases which has produced significant improvements in the phase prediction capabilities of the thermodynamic model. Due to the unique morphologies of strengthening precipitate phases commonly utilized in high-strength cast aluminum alloys, the development of new microstructural evolution models to describe both rod and plate particle growth was critical for accurate mechanistic strength models which rely heavily on precipitate size and shape. Particle size measurements through both 3DAPT and TEM experiments were used in conjunction with literature results of many alloy compositions to develop a physical growth model for the independent prediction of rod radii and rod length evolution. In addition a machine learning (ML) model was developed for the independent prediction of plate thickness and plate diameter evolution as a function of alloy composition, aging temperature, and aging time. The developed models are then compared with physical growth laws developed for spheres and modified for ellipsoidal morphology effects. Analysis of the effect of particle morphology on strength enhancement has been undertaken by modification of the Orowan-Ashby equation for 〈110〉 alpha-Al oriented finite rods in addition to an appropriate version for similarly oriented plates. A mechanistic strengthening model was developed for cast aluminum alloys containing both rod and plate-like precipitates. The model accurately accounts for the temperature dependence of particle nucleation and growth, solid solution strengthening, Si eutectic strength, and base aluminum yield strength. Strengthening model predictions of tensile yield strength are in excellent agreement with experimental observations over a wide range of aluminum alloy systems, aging temperatures, and test conditions. The developed models enable the prediction of the required particle morphology and volume fraction necessary to achieve target property goals in the design of future aluminum alloys. The effect of partitioning elements to the Q-phase was also considered for the potential to control the nucleation rate, reduce coarsening, and control the evolution of particle morphology. Elements were selected based on density functional theory (DFT) calculations showing the prevalence of certain elements to partition to the Q-phase. 3DAPT experiments were performed on Q-phase containing wrought alloys with these additions and show segregation of certain elements to the Q-phase with relative agreement to DFT predictions.

  3. Exploring the knowledge behind predictions in everyday cognition: an iterated learning study.

    PubMed

    Stephens, Rachel G; Dunn, John C; Rao, Li-Lin; Li, Shu

    2015-10-01

    Making accurate predictions about events is an important but difficult task. Recent work suggests that people are adept at this task, making predictions that reflect surprisingly accurate knowledge of the distributions of real quantities. Across three experiments, we used an iterated learning procedure to explore the basis of this knowledge: to what extent is domain experience critical to accurate predictions and how accurate are people when faced with unfamiliar domains? In Experiment 1, two groups of participants, one resident in Australia, the other in China, predicted the values of quantities familiar to both (movie run-times), unfamiliar to both (the lengths of Pharaoh reigns), and familiar to one but unfamiliar to the other (cake baking durations and the lengths of Beijing bus routes). While predictions from both groups were reasonably accurate overall, predictions were inaccurate in the selectively unfamiliar domains and, surprisingly, predictions by the China-resident group were also inaccurate for a highly familiar domain: local bus route lengths. Focusing on bus routes, two follow-up experiments with Australia-resident groups clarified the knowledge and strategies that people draw upon, plus important determinants of accurate predictions. For unfamiliar domains, people appear to rely on extrapolating from (not simply directly applying) related knowledge. However, we show that people's predictions are subject to two sources of error: in the estimation of quantities in a familiar domain and extension to plausible values in an unfamiliar domain. We propose that the key to successful predictions is not simply domain experience itself, but explicit experience of relevant quantities.

  4. Development and Validation of a Near-Infrared Spectroscopy Method for the Prediction of Acrylamide Content in French-Fried Potato.

    PubMed

    Adedipe, Oluwatosin E; Johanningsmeier, Suzanne D; Truong, Van-Den; Yencho, G Craig

    2016-03-02

    This study investigated the ability of near-infrared spectroscopy (NIRS) to predict acrylamide content in French-fried potato. Potato flour spiked with acrylamide (50-8000 μg/kg) was used to determine if acrylamide could be accurately predicted in a potato matrix. French fries produced with various pretreatments and cook times (n = 84) and obtained from quick-service restaurants (n = 64) were used for model development and validation. Acrylamide was quantified using gas chromatography-mass spectrometry, and reflectance spectra (400-2500 nm) of each freeze-dried sample were captured on a Foss XDS Rapid Content Analyzer-NIR spectrometer. Partial least-squares (PLS) discriminant analysis and PLS regression modeling demonstrated that NIRS could accurately detect acrylamide content as low as 50 μg/kg in the model potato matrix. Prediction errors of 135 μg/kg (R² = 0.98) and 255 μg/kg (R² = 0.93) were achieved with the best PLS models for acrylamide prediction in Russet Norkotah French-fried potato and multiple samples of unknown varieties, respectively. The findings indicate that NIRS can be used as a screening tool in potato breeding and potato processing research to reduce acrylamide in the food supply.
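
    Partial least-squares regression of a constituent concentration on spectra is straightforward to prototype. The sketch below is an assumption-laden illustration with synthetic spectra; the number of latent components, wavelengths, and sample counts are placeholders rather than the study's calibration.

```python
# PLS regression of acrylamide content on synthetic NIR spectra; illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(6)
n_samples, n_wavelengths = 148, 1050
spectra = rng.normal(size=(n_samples, n_wavelengths))
acrylamide = 50 + 2000 * np.abs(spectra[:, 100]) + 50 * rng.normal(size=n_samples)  # ug/kg, synthetic

pls = PLSRegression(n_components=8)          # choose the component count by cross-validation in practice
pred = cross_val_predict(pls, spectra, acrylamide, cv=10).ravel()

rmsep = np.sqrt(np.mean((pred - acrylamide) ** 2))
r2 = 1 - np.sum((pred - acrylamide) ** 2) / np.sum((acrylamide - acrylamide.mean()) ** 2)
print(f"cross-validated RMSEP = {rmsep:.0f} ug/kg, R^2 = {r2:.2f}")
```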

  5. Predicting the Overall Spatial Quality of Automotive Audio Systems

    NASA Astrophysics Data System (ADS)

    Koya, Daisuke

    The spatial quality of automotive audio systems is often compromised due to their non-ideal listening environments. Automotive audio systems need to be developed quickly due to industry demands. A suitable perceptual model could evaluate the spatial quality of automotive audio systems with similar reliability to formal listening tests but take less time. Such a model is developed in this research project by adapting an existing model of spatial quality for automotive audio use. The requirements for the adaptation were investigated in a literature review. A perceptual model called QESTRAL was reviewed, which predicts the overall spatial quality of domestic multichannel audio systems. It was determined that automotive audio systems are likely to be impaired in terms of the spatial attributes that were not considered in developing the QESTRAL model, but metrics are available that might predict these attributes. To establish whether the QESTRAL model in its current form can accurately predict the overall spatial quality of automotive audio systems, MUSHRA listening tests using headphone auralisation with head tracking were conducted to collect results to be compared against predictions by the model. Based on guideline criteria, the model in its current form could not accurately predict the overall spatial quality of automotive audio systems. To improve prediction performance, the QESTRAL model was recalibrated and modified using existing metrics of the model, those that were proposed from the literature review, and newly developed metrics. The most important metrics for predicting the overall spatial quality of automotive audio systems included those that were interaural cross-correlation (IACC) based, relate to localisation of the frontal audio scene, and account for the perceived scene width in front of the listener. Modifying the model for automotive audio systems did not invalidate its use for domestic audio systems. The resulting model predicts the overall spatial quality of 2- and 5-channel automotive audio systems with a cross-validation performance of R² = 0.85 and root-mean-square error (RMSE) = 11.03%.

  6. Machine learning bandgaps of double perovskites

    DOE PAGES

    Pilania, G.; Mannodi-Kanakkithodi, A.; Uberuaga, B. P.; ...

    2016-01-19

    The ability to make rapid and accurate predictions on bandgaps of double perovskites is of much practical interest for a range of applications. While quantum mechanical computations for high-fidelity bandgaps are enormously computation-time intensive and thus impractical in high throughput studies, informatics-based statistical learning approaches can be a promising alternative. Here we demonstrate a systematic feature-engineering approach and a robust learning framework for efficient and accurate predictions of electronic bandgaps of double perovskites. After evaluating a set of more than 1.2 million features, we identify lowest occupied Kohn-Sham levels and elemental electronegativities of the constituent atomic species as the most crucial and relevant predictors. As a result, the developed models are validated and tested using the best practices of data science and further analyzed to rationalize their prediction performance.

  7. New PDS will predict performance of pallets made with used parts

    Treesearch

    John W. Clarke; Marshall S. White; Philip A. Araman

    2001-01-01

    The Pallet Design System (PDS) is a computer design program developed by Virginia Tech, the National Wooden Pallet & Container Association, and the U.S. Forest Service to quickly and accurately predict the performance of new wood pallets. PDS has been upgraded annually since its original version in 1984. All of the previous upgrades, however, have continued to...

  8. Comparing aboveground biomass predictions for an uneven-aged pine-dominated stand using local, regional, and national models

    Treesearch

    D.C. Bragg; K.M. McElligott

    2013-01-01

    Sequestration by Arkansas forests removes carbon dioxide from the atmosphere, storing this carbon in biomass that fills a number of critical ecological and socioeconomic functions. We need a better understanding of the contribution of forests to the carbon cycle, including the accurate quantification of tree biomass. Models have long been developed to predict...

  9. Hydroelastic response of a floating runway to cnoidal waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ertekin, R. C., E-mail: ertekin@hawaii.edu; Xia, Dingwu

    2014-02-15

    The hydroelastic response of mat-type Very Large Floating Structures (VLFSs) to severe sea conditions, such as tsunamis and hurricanes, must be assessed for safety and survivability. An efficient and robust nonlinear hydroelastic model is required to predict accurately the motion of and the dynamic loads on a VLFS due to such large waves. We develop a nonlinear theory to predict the hydroelastic response of a VLFS in the presence of cnoidal waves and compare the predictions with the linear theory that is also developed here. This hydroelastic problem is formulated by directly coupling the structure with the fluid, by use of the Level I Green-Naghdi theory for the fluid motion and the Kirchhoff thin plate theory for the runway. The coupled fluid-structure system, together with the appropriate jump conditions, is solved in two dimensions by the finite-difference method. The numerical model is used to study the nonlinear response of a VLFS to storm waves which are modeled by use of the cnoidal-wave theory. Parametric studies show that the nonlinearity of the waves is very important in accurately predicting the dynamic bending moment and wave run-up on a VLFS in high seas.

  10. Alcohol-related hot-spot analysis and prediction : final report.

    DOT National Transportation Integrated Search

    2017-05-01

    This project developed methods to more accurately identify alcohol-related crash hot spots, ultimately allowing for more effective and efficient enforcement and safety campaigns. Advancements in accuracy came from improving the calculation of spatial...

  11. Modeling the shape and composition of the human body using dual energy X-ray absorptiometry images

    PubMed Central

    Shepherd, John A.; Fan, Bo; Schwartz, Ann V.; Cawthon, Peggy; Cummings, Steven R.; Kritchevsky, Stephen; Nevitt, Michael; Santanasto, Adam; Cootes, Timothy F.

    2017-01-01

    There is growing evidence that body shape and regional body composition are strong indicators of metabolic health. The purpose of this study was to develop statistical models that accurately describe holistic body shape, thickness, and leanness. We hypothesized that there are unique body shape features that are predictive of mortality beyond standard clinical measures. We developed algorithms to process whole-body dual-energy X-ray absorptiometry (DXA) scans into body thickness and leanness images. We performed statistical appearance modeling (SAM) and principal component analysis (PCA) to efficiently encode the variance of body shape, leanness, and thickness across a sample of 400 older Americans from the Health ABC study. The sample included 200 cases and 200 controls based on 6-year mortality status, matched on sex, race and BMI. The final model contained 52 points outlining the torso, upper arms, thighs, and bony landmarks. Correlation analyses were performed on the PCA parameters to identify body shape features that vary across groups and with metabolic risk. Stepwise logistic regression was performed to identify sex and race, and predict mortality risk as a function of body shape parameters. These parameters are novel body composition features that uniquely identify body phenotypes of different groups and predict mortality risk. Three parameters from a SAM of body leanness and thickness accurately identified sex (training AUC = 0.99) and six accurately identified race (training AUC = 0.91) in the sample dataset. Three parameters from a SAM of only body thickness predicted mortality (training AUC = 0.66, validation AUC = 0.62). Further study is warranted to identify specific shape/composition features that predict other health outcomes. PMID:28423041
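
    The statistical appearance modelling step reduces each subject's landmark (and, in the study, thickness/leanness) data to a handful of principal-component parameters. A bare-bones sketch of that reduction with scikit-learn PCA is shown below; the landmark data are random placeholders and no appearance channel is included.

```python
# Point-based statistical shape model via PCA on stacked landmark coordinates; illustrative only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_subjects, n_landmarks = 400, 52
shapes = rng.normal(size=(n_subjects, n_landmarks * 2))   # (x, y) per landmark, assumed pre-aligned

pca = PCA(n_components=0.95)              # keep components explaining 95% of variance
params = pca.fit_transform(shapes)        # per-subject shape parameters (the model's features)
print("modes retained:", pca.n_components_)

# Reconstruct a subject's shape from its low-dimensional parameters.
reconstructed = pca.inverse_transform(params[:1]).reshape(n_landmarks, 2)
```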

  12. Phonological and Non-Phonological Language Skills as Predictors of Early Reading Performance

    ERIC Educational Resources Information Center

    Batson-Magnuson, LuAnn

    2010-01-01

    Accurate prediction of early childhood reading performance could help identify at-risk students, aid in the development of evidence-based intervention strategies, and further our theoretical understanding of reading development. This study assessed the validity of the Developmental Indicator for the Assessment of Learning (DIAL) language-based…

  13. Assessment of juvenile pigs to serve as human pediatric surrogates for preclinical formulation pharmacokinetic testing

    USDA-ARS?s Scientific Manuscript database

    Pediatric drug development is hampered by the various biological, clinical, and formulation challenges associated with age-based populations. A primary cause for this lack of development is the inability to accurately predict ontogenic changes that affect pharmacokinetics (PK) in children using trad...

  14. Methods for predicting properties and tailoring salt solutions for industrial processes

    NASA Technical Reports Server (NTRS)

    Ally, Moonis R.

    1993-01-01

    An algorithm developed at Oak Ridge National Laboratory accurately and quickly predicts thermodynamic properties of concentrated aqueous salt solutions. This algorithm is much simpler and much faster than other modeling schemes and is unique because it can predict solution behavior at very high concentrations and under varying conditions. Typical industrial applications of this algorithm would be in manufacture of inorganic chemicals by crystallization, thermal storage, refrigeration and cooling, extraction of metals, emissions controls, etc.

  15. A preliminary ferritic-martensitic stainless steel constitution diagram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balmforth, M.C.; Lippold, J.C.

    1998-01-01

    This paper describes preliminary research to develop a constitution diagram that will more accurately predict the microstructure of ferritic and martensitic stainless steel weld deposits. A button melting technique was used to produce a wide range of compositions using mixtures of conventional ferritic and martensitic stainless steels, including types 403, 409, 410, 430, 439 and 444. These samples were prepared metallographically, and the vol-% ferrite and martensite was determined quantitatively. In addition, the hardness and ferrite number (FN) were measured. Using this data, a preliminary constitution diagram is proposed that provides a more accurate method for predicting the microstructures of arc welds in ferritic and martensitic stainless steels.

  16. Accurate prediction of severe allergic reactions by a small set of environmental parameters (NDVI, temperature).

    PubMed

    Notas, George; Bariotakis, Michail; Kalogrias, Vaios; Andrianaki, Maria; Azariadis, Kalliopi; Kampouri, Errika; Theodoropoulou, Katerina; Lavrentaki, Katerina; Kastrinakis, Stelios; Kampa, Marilena; Agouridakis, Panagiotis; Pirintsos, Stergios; Castanas, Elias

    2015-01-01

    Severe allergic reactions of unknown etiology, necessitating a hospital visit, have an important impact on the lives of affected individuals and impose a major economic burden on societies. The prediction of clinically severe allergic reactions would be of great importance, but current attempts have been limited by the lack of a well-founded applicable methodology and the wide spatiotemporal distribution of allergic reactions. The valid prediction of severe allergies (and especially those needing hospital treatment) in a region could alert health authorities and implicated individuals to take appropriate preemptive measures. In the present report we have collected visits for serious allergic reactions of unknown etiology from two major hospitals on the island of Crete, for two distinct time periods (validation and test sets). We have used the Normalized Difference Vegetation Index (NDVI), a satellite-based, freely available measurement, which is an indicator of live green vegetation in a given geographic area, and a set of meteorological data to develop a model capable of describing and predicting severe allergic reaction frequency. Our analysis has retained NDVI and temperature as accurate identifiers and predictors of increased hospital visits for severe allergic reactions. Our approach may contribute towards the development of satellite-based modules for the prediction of severe allergic reactions in specific, well-defined geographical areas. It could also probably be used for the prediction of other environment-related diseases and conditions.

  17. Accurate Prediction of Severe Allergic Reactions by a Small Set of Environmental Parameters (NDVI, Temperature)

    PubMed Central

    Andrianaki, Maria; Azariadis, Kalliopi; Kampouri, Errika; Theodoropoulou, Katerina; Lavrentaki, Katerina; Kastrinakis, Stelios; Kampa, Marilena; Agouridakis, Panagiotis; Pirintsos, Stergios; Castanas, Elias

    2015-01-01

    Severe allergic reactions of unknown etiology, necessitating a hospital visit, have an important impact on the lives of affected individuals and impose a major economic burden on societies. The prediction of clinically severe allergic reactions would be of great importance, but current attempts have been limited by the lack of a well-founded applicable methodology and the wide spatiotemporal distribution of allergic reactions. The valid prediction of severe allergies (and especially those needing hospital treatment) in a region could alert health authorities and implicated individuals to take appropriate preemptive measures. In the present report we have collected visits for serious allergic reactions of unknown etiology from two major hospitals on the island of Crete, for two distinct time periods (validation and test sets). We have used the Normalized Difference Vegetation Index (NDVI), a satellite-based, freely available measurement, which is an indicator of live green vegetation in a given geographic area, and a set of meteorological data to develop a model capable of describing and predicting severe allergic reaction frequency. Our analysis has retained NDVI and temperature as accurate identifiers and predictors of increased hospital visits for severe allergic reactions. Our approach may contribute towards the development of satellite-based modules for the prediction of severe allergic reactions in specific, well-defined geographical areas. It could also probably be used for the prediction of other environment-related diseases and conditions. PMID:25794106

  18. Comparisons of rational engineering correlations of thermophoretically-augmented particle mass transfer with STAN5-predictions for developing boundary layers

    NASA Technical Reports Server (NTRS)

    Gokoglu, S. A.; Rosner, D. E.

    1984-01-01

    Modification of the code STAN5 to properly include thermophoretic mass transport, and examination of selected test cases of developing boundary layers that include variable properties, viscous dissipation, transition to turbulence, and transpiration cooling. Under conditions representative of current and projected GT operation, local application of St(M)/St(M),o correlations evidently provides accurate and economical engineering design predictions, especially for suspended particles characterized by Schmidt numbers outside of the heavy vapor range.

  19. Quokka: a comprehensive tool for rapid and accurate prediction of kinase family-specific phosphorylation sites in the human proteome.

    PubMed

    Li, Fuyi; Li, Chen; Marquez-Lago, Tatiana T; Leier, André; Akutsu, Tatsuya; Purcell, Anthony W; Smith, A Ian; Lithgow, Trevor; Daly, Roger J; Song, Jiangning; Chou, Kuo-Chen

    2018-06-27

    Kinase-regulated phosphorylation is a ubiquitous type of post-translational modification (PTM) in both eukaryotic and prokaryotic cells. Phosphorylation plays fundamental roles in many signalling pathways and biological processes, such as protein degradation and protein-protein interactions. Experimental studies have revealed that signalling defects caused by aberrant phosphorylation are highly associated with a variety of human diseases, especially cancers. In light of this, a number of computational methods aiming to accurately predict protein kinase family-specific or kinase-specific phosphorylation sites have been established, thereby facilitating phosphoproteomic data analysis. In this work, we present Quokka, a novel bioinformatics tool that allows users to rapidly and accurately identify human kinase family-regulated phosphorylation sites. Quokka was developed by using a variety of sequence scoring functions combined with an optimized logistic regression algorithm. We evaluated Quokka based on well-prepared up-to-date benchmark and independent test datasets, curated from the Phospho.ELM and UniProt databases, respectively. The independent test demonstrates that Quokka improves the prediction performance compared with state-of-the-art computational tools for phosphorylation prediction. In summary, our tool provides users with high-quality predicted human phosphorylation sites for hypothesis generation and biological validation. The Quokka webserver and datasets are freely available at http://quokka.erc.monash.edu/. Supplementary data are available at Bioinformatics online.

  20. Study on elevated-temperature flow behavior of Ni-Cr-Mo-B ultra-heavy-plate steel via experiment and modelling

    NASA Astrophysics Data System (ADS)

    Gao, Zhi-yu; Kang, Yu; Li, Yan-shuai; Meng, Chao; Pan, Tao

    2018-04-01

    Elevated-temperature flow behavior of a novel Ni-Cr-Mo-B ultra-heavy-plate steel was investigated by conducting hot compressive deformation tests on a Gleeble-3800 thermo-mechanical simulator over a temperature range of 1123–1423 K, at strain rates from 0.01 s⁻¹ to 10 s⁻¹, and with a height reduction of 70%. Based on the experimental results, a classic strain-compensated Arrhenius-type model, a new revised strain-compensated Arrhenius-type model, and a classic modified Johnson-Cook constitutive model were developed for predicting the high-temperature deformation behavior of the steel. The predictability of these models was comparatively evaluated in terms of statistical parameters including correlation coefficient (R), average absolute relative error (AARE), average root mean square error (RMSE), normalized mean bias error (NMBE) and relative error. The statistical results indicate that the new revised strain-compensated Arrhenius-type model could accurately predict the elevated-temperature flow stress of the steel over the entire range of process conditions. However, the values predicted by the classic modified Johnson-Cook model did not agree well with the experimental values, and the classic strain-compensated Arrhenius-type model could track the deformation behavior more accurately than the modified Johnson-Cook model, but less accurately than the new revised strain-compensated Arrhenius-type model. In addition, the reasons for the differences in predictability among these models were discussed in detail.
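
    For orientation, a strain-compensated Arrhenius-type (sine-hyperbolic) constitutive model has the general form Z = ε̇·exp(Q/RT) with σ = (1/α)·arcsinh[(Z/A)^(1/n)], where α, n, Q, and ln A are fitted as functions of strain. The sketch below encodes that structure with invented placeholder coefficients; it is not the fitted model for this steel.

```python
# Generic strain-compensated Arrhenius-type flow-stress model; placeholder coefficients only.
import numpy as np

R = 8.314  # J/(mol K)

def material_params(strain):
    # In practice alpha, n, Q and ln(A) are each fitted as polynomials in strain.
    alpha = 0.012 - 0.002 * strain            # MPa^-1 (invented)
    n = 5.0 - 0.8 * strain                    # (invented)
    Q = (420.0 - 30.0 * strain) * 1e3         # J/mol (invented)
    lnA = 36.0 - 2.0 * strain                 # (invented)
    return alpha, n, Q, lnA

def flow_stress(strain, strain_rate, T_kelvin):
    alpha, n, Q, lnA = material_params(strain)
    Z = strain_rate * np.exp(Q / (R * T_kelvin))                # Zener-Hollomon parameter
    return np.arcsinh((Z / np.exp(lnA)) ** (1.0 / n)) / alpha   # flow stress in MPa

print(flow_stress(strain=0.4, strain_rate=1.0, T_kelvin=1273.0))
```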

  1. PHOENIX: a scoring function for affinity prediction derived using high-resolution crystal structures and calorimetry measurements.

    PubMed

    Tang, Yat T; Marshall, Garland R

    2011-02-28

    Binding affinity prediction is one of the most critical components to computer-aided structure-based drug design. Despite advances in first-principle methods for predicting binding affinity, empirical scoring functions that are fast and only relatively accurate are still widely used in structure-based drug design. With the increasing availability of X-ray crystallographic structures in the Protein Data Bank and continuing application of biophysical methods such as isothermal titration calorimetry to measure thermodynamic parameters contributing to binding free energy, sufficient experimental data exists that scoring functions can now be derived by separating enthalpic (ΔH) and entropic (TΔS) contributions to binding free energy (ΔG). PHOENIX, a scoring function to predict binding affinities of protein-ligand complexes, utilizes the increasing availability of experimental data to improve binding affinity predictions by the following: model training and testing using high-resolution crystallographic data to minimize structural noise, independent models of enthalpic and entropic contributions fitted to thermodynamic parameters assumed to be thermodynamically biased to calculate binding free energy, use of shape and volume descriptors to better capture entropic contributions. A set of 42 descriptors and 112 protein-ligand complexes were used to derive functions using partial least-squares for change of enthalpy (ΔH) and change of entropy (TΔS) to calculate change of binding free energy (ΔG), resulting in a predictive r2 (r(pred)2) of 0.55 and a standard error (SE) of 1.34 kcal/mol. External validation using the 2009 version of the PDBbind "refined set" (n = 1612) resulted in a Pearson correlation coefficient (R(p)) of 0.575 and a mean error (ME) of 1.41 pK(d). Enthalpy and entropy predictions were of limited accuracy individually. However, their difference resulted in a relatively accurate binding free energy. While the development of an accurate and applicable scoring function was an objective of this study, the main focus was evaluation of the use of high-resolution X-ray crystal structures with high-quality thermodynamic parameters from isothermal titration calorimetry for scoring function development. With the increasing application of structure-based methods in molecular design, this study suggests that using high-resolution crystal structures, separating enthalpy and entropy contributions to binding free energy, and including descriptors to better capture entropic contributions may prove to be effective strategies toward rapid and accurate calculation of binding affinity.
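
    The decomposition underlying PHOENIX rests on the thermodynamic identity relating the three quantities; the scoring function fits the enthalpic and entropic terms separately (by partial least-squares on the descriptors) and recombines them:

      \[
      \Delta G = \Delta H - T\Delta S
      \]

    Errors in the two fitted terms can partially cancel in the difference, which is consistent with the reported observation that the individual enthalpy and entropy predictions were of limited accuracy while the resulting binding free energy was relatively accurate.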

  2. Application of a data assimilation method via an ensemble Kalman filter to reactive urea hydrolysis transport modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juxiu Tong; Bill X. Hu; Hai Huang

    2014-03-01

    With the growing importance of water resources in the world, remediation of anthropogenic contamination involving reactive solute transport becomes even more important. A good understanding of reactive rate parameters such as kinetic parameters is the key to accurately predicting reactive solute transport processes and designing corresponding remediation schemes. For modeling reactive solute transport, it is very difficult to estimate chemical reaction rate parameters due to the complexity of chemical reaction processes and limited available data. To obtain the reactive rate parameters for reactive urea hydrolysis transport modeling and to achieve more accurate prediction of the chemical concentrations, we developed a data assimilation method based on an ensemble Kalman filter (EnKF) to calibrate reactive rate parameters for modeling urea hydrolysis transport in a synthetic one-dimensional column at laboratory scale and to update the model prediction. We applied a constrained EnKF method to impose physically meaningful constraints on the updated reactive rate parameters and the predicted solute concentrations after the data assimilation calibration. From the study results we concluded that the data assimilation method via the EnKF could efficiently improve the chemical reactive rate parameters and, at the same time, the solute concentration prediction. The more data we assimilated, the more accurate the reactive rate parameters and concentration predictions became. The filter divergence problem was also solved in this study.
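
    The analysis step of such an EnKF calibration can be sketched in a few lines. In this sketch the state vector would stack the reaction-rate parameters with the predicted concentrations; the observation operator, ensemble size, and noise levels are illustrative assumptions rather than the study's configuration.

      # Minimal ensemble Kalman filter analysis step (stochastic EnKF with
      # perturbed observations). States could stack reaction-rate parameters and
      # predicted concentrations; all dimensions and noise levels are illustrative.
      import numpy as np

      def enkf_update(ensemble, obs, H, obs_std, rng):
          """ensemble: (n_state, n_members); obs: (n_obs,); H: (n_obs, n_state)."""
          n_obs, n_members = len(obs), ensemble.shape[1]
          A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
          HX = H @ ensemble
          HA = HX - HX.mean(axis=1, keepdims=True)              # predicted-obs anomalies
          P_yy = HA @ HA.T / (n_members - 1) + (obs_std**2) * np.eye(n_obs)
          P_xy = A @ HA.T / (n_members - 1)
          K = P_xy @ np.linalg.inv(P_yy)                        # Kalman gain
          perturbed = obs[:, None] + rng.normal(0.0, obs_std, size=(n_obs, n_members))
          return ensemble + K @ (perturbed - HX)
          # A constrained variant would additionally clip non-physical values
          # (e.g. negative rate parameters or concentrations) after the update.

      rng = np.random.default_rng(0)
      ens = rng.normal(1.0, 0.2, size=(3, 50))     # 3 states, 50 ensemble members
      H = np.array([[0.0, 0.0, 1.0]])              # observe only the last state
      updated = enkf_update(ens, np.array([1.3]), H, obs_std=0.05, rng=rng)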

  3. Radiation from advanced solid rocket motor plumes

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-01-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors that affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent database, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry-standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  4. Prediction of individual brain maturity using fMRI.

    PubMed

    Dosenbach, Nico U F; Nardos, Binyam; Cohen, Alexander L; Fair, Damien A; Power, Jonathan D; Church, Jessica A; Nelson, Steven M; Wig, Gagan S; Vogel, Alecia C; Lessov-Schlaggar, Christina N; Barnes, Kelly Anne; Dubis, Joseph W; Feczko, Eric; Coalson, Rebecca S; Pruett, John R; Barch, Deanna M; Petersen, Steven E; Schlaggar, Bradley L

    2010-09-10

    Group functional connectivity magnetic resonance imaging (fcMRI) studies have documented reliable changes in human functional brain maturity over development. Here we show that support vector machine-based multivariate pattern analysis extracts sufficient information from fcMRI data to make accurate predictions about individuals' brain maturity across development. The use of only 5 minutes of resting-state fcMRI data from 238 scans of typically developing volunteers (ages 7 to 30 years) allowed prediction of individual brain maturity as a functional connectivity maturation index. The resultant functional maturation curve accounted for 55% of the sample variance and followed a nonlinear asymptotic growth curve shape. The greatest relative contribution to predicting individual brain maturity was made by the weakening of short-range functional connections between the adult brain's major functional networks.
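
    A minimal analogue of this analysis is a support vector regression of age on a vector of functional-connectivity strengths; the sketch below uses random placeholder data rather than the published fcMRI features.

      # Sketch: predict age (a brain-maturity proxy) from functional-connectivity
      # features with support vector regression. Data are random placeholders.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(1)
      n_subjects, n_connections = 238, 200
      X = rng.normal(size=(n_subjects, n_connections))      # connectivity strengths
      age = rng.uniform(7, 30, size=n_subjects)             # ages 7-30 years

      model = SVR(kernel="linear", C=1.0)
      predicted_age = cross_val_predict(model, X, age, cv=5)
      r2 = np.corrcoef(predicted_age, age)[0, 1] ** 2
      print(f"cross-validated R^2 on placeholder data: {r2:.2f}")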

  5. Prediction of Flow Stress in Cadmium Using Constitutive Equation and Artificial Neural Network Approach

    NASA Astrophysics Data System (ADS)

    Sarkar, A.; Chakravartty, J. K.

    2013-10-01

    A model is developed to predict the constitutive flow behavior of cadmium during compression testing using an artificial neural network (ANN). The inputs of the neural network are strain, strain rate, and temperature, whereas flow stress is the output. Experimental data obtained from compression tests in the temperature range -30 to 70 °C, strain range 0.1 to 0.6, and strain rate range 10⁻³ to 1 s⁻¹ are employed to develop the model. A three-layer feed-forward ANN is trained with the Levenberg-Marquardt training algorithm. It has been shown that the developed ANN model can efficiently and accurately predict the deformation behavior of cadmium; the trained network predicted the flow stress better than the constitutive-equation approach.
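
    A comparable feed-forward network can be set up in a few lines. Note that scikit-learn does not provide the Levenberg-Marquardt optimizer used in the paper, so L-BFGS is substituted here, and the training data are placeholders rather than the cadmium measurements.

      # Sketch: three-layer feed-forward ANN mapping (strain, strain rate,
      # temperature) to flow stress. Placeholder data; L-BFGS replaces
      # Levenberg-Marquardt, which scikit-learn does not provide.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)
      X = np.column_stack([
          rng.uniform(0.1, 0.6, 500),          # strain
          10 ** rng.uniform(-3, 0, 500),       # strain rate, s^-1
          rng.uniform(243, 343, 500),          # temperature, K (-30 to 70 C)
      ])
      flow_stress = rng.uniform(20, 120, 500)  # placeholder target, MPa

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                                         max_iter=5000, random_state=0))
      model.fit(X, flow_stress)
      print(model.predict(X[:3]))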

  6. Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.

    PubMed

    Loschen, Christoph; Klamt, Andreas

    2015-06-01

    The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation and many other thermodynamics properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. For all property predictions COSMO-RS has been used. The basic concept of COSMO-RS consists of using the screening charge density as computed from first principles calculations in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations using a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties, which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant costly quantum chemical calculations are avoided resulting in a significant speed-up and ease-of-use. © 2015 Royal Pharmaceutical Society.

  7. Resources and Commitment as Critical Factors in the Development of "Gifted" Athletes

    ERIC Educational Resources Information Center

    Baker, Joseph; Cote, Jean

    2003-01-01

    Several sport-specific talent detection models have been developed over the last 30 years (Durand-Bush & Salmela, 2001). However, these models have failed in at least one important standard of judgment--accurately predicting who will develop into an elite level athlete. The authors believe that the WICS model presented by Robert Sternberg also…

  8. Child Psychopathy: Theories, Measurement, and Relations with the Development and Persistence of Conduct Problems

    ERIC Educational Resources Information Center

    Kotler, Julie S.; McMahon, Robert J.

    2005-01-01

    To develop more accurate explanatory and predictive models of child and adolescent conduct problems, interest has grown in examining psychopathic traits in youth. The presence or absence of these traits may help to identify unique etiological pathways in the development of antisocial behavior. The current review provides a detailed summary and…

  9. Toward Objective Monitoring of Ingestive Behavior in Free-living Population

    PubMed Central

    Sazonov, Edward S.; Schuckers, Stephanie A.C.; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Melanson, Edward L.; Neuman, Michael R.; Hill, James O.

    2010-01-01

    Understanding of eating behaviors associated with obesity requires objective and accurate monitoring of food intake patterns. Accurate methods are available for measuring total energy expenditure and its components in free-living populations, but methods for measuring food intake in free-living people are far less accurate and involve self-reporting or subjective monitoring. We suggest that chews and swallows can be used for objective monitoring of ingestive behavior. This hypothesis was verified in a human study involving 20 subjects. Chews and swallows were captured during periods of quiet resting, talking, and meals of varying size. The counts of chews and swallows along with other derived metrics were used to build prediction models for detection of food intake, differentiation between liquids and solids, and for estimation of the mass of ingested food. The proposed prediction models were able to detect periods of food intake with >95% accuracy and a fine time resolution of 30 s, differentiate solid foods from liquids with >91% accuracy, and predict mass of ingested food with >91% accuracy for solids and >83% accuracy for liquids. In earlier publications, we have shown that chews and swallows can be captured by noninvasive sensors that could be developed into a wearable device. Thus, the proposed methodology could lead to the development of an innovative new way of assessing human eating behavior in free-living conditions. PMID:19444225

  10. Toward objective monitoring of ingestive behavior in free-living population.

    PubMed

    Sazonov, Edward S; Schuckers, Stephanie A C; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Melanson, Edward L; Neuman, Michael R; Hill, James O

    2009-10-01

    Understanding of eating behaviors associated with obesity requires objective and accurate monitoring of food intake patterns. Accurate methods are available for measuring total energy expenditure and its components in free-living populations, but methods for measuring food intake in free-living people are far less accurate and involve self-reporting or subjective monitoring. We suggest that chews and swallows can be used for objective monitoring of ingestive behavior. This hypothesis was verified in a human study involving 20 subjects. Chews and swallows were captured during periods of quiet resting, talking, and meals of varying size. The counts of chews and swallows along with other derived metrics were used to build prediction models for detection of food intake, differentiation between liquids and solids, and for estimation of the mass of ingested food. The proposed prediction models were able to detect periods of food intake with >95% accuracy and a fine time resolution of 30 s, differentiate solid foods from liquids with >91% accuracy, and predict mass of ingested food with >91% accuracy for solids and >83% accuracy for liquids. In earlier publications, we have shown that chews and swallows can be captured by noninvasive sensors that could be developed into a wearable device. Thus, the proposed methodology could lead to the development of an innovative new way of assessing human eating behavior in free-living conditions.

  11. Predicting plant biomass accumulation from image-derived parameters

    PubMed Central

    Chen, Dijun; Shi, Rongli; Pape, Jean-Michel; Neumann, Kerstin; Graner, Andreas; Chen, Ming; Klukas, Christian

    2018-01-01

    Background: Image-based high-throughput phenotyping technologies have been rapidly developed in plant science recently, and they provide a great potential to gain more valuable information than traditional destructive methods. Predicting plant biomass is regarded as a key purpose for plant breeders and ecologists. However, it is a great challenge to find a predictive biomass model across experiments. Results: In the present study, we constructed 4 predictive models to examine the quantitative relationship between image-based features and plant biomass accumulation. Our methodology has been applied to 3 consecutive barley (Hordeum vulgare) experiments with control and stress treatments. The results proved that plant biomass can be accurately predicted from image-based parameters using a random forest model. The high prediction accuracy based on this model will contribute to relieving the phenotyping bottleneck in biomass measurement in breeding applications. The prediction performance remains relatively high across experiments under similar conditions. The relative contribution of individual features for predicting biomass was further quantified, revealing new insights into the phenotypic determinants of the plant biomass outcome. Furthermore, these methods could also be used to determine the most important image-based features related to plant biomass accumulation, which would be promising for subsequent genetic mapping to uncover the genetic basis of biomass. Conclusions: We have developed quantitative models to accurately predict plant biomass accumulation from image data. We anticipate that the analysis results will be useful to advance our views of the phenotypic determinants of plant biomass outcome, and that the statistical methods can be broadly applied to other plant species. PMID:29346559
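
    The modelling step described here, a random forest regression from image-derived features to biomass with per-feature importance scores, can be sketched as follows; the feature names and data are illustrative placeholders, not the published barley dataset.

      # Sketch: random forest regression of plant biomass on image-derived
      # features, with feature importances as a proxy for each trait's relative
      # contribution. Feature names and data are illustrative placeholders.
      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(3)
      features = pd.DataFrame({
          "projected_area": rng.uniform(50, 500, 300),
          "plant_height": rng.uniform(5, 80, 300),
          "compactness": rng.uniform(0.1, 1.0, 300),
          "mean_greenness": rng.uniform(0.2, 0.9, 300),
      })
      biomass = rng.uniform(1, 40, 300)   # placeholder fresh-weight values, g

      rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
      rf.fit(features, biomass)
      print("out-of-bag R^2:", rf.oob_score_)
      print(dict(zip(features.columns, rf.feature_importances_.round(3))))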

  12. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  13. Prediction of near-term breast cancer risk using a Bayesian belief network

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Ramalingam, Pandiyarajan; Hariharan, Harishwaran; Leader, Joseph K.; Gur, David

    2013-03-01

    Accurately predicting near-term breast cancer risk is an important prerequisite for establishing an optimal personalized breast cancer screening paradigm. In previous studies, we investigated and tested the feasibility of developing a unique near-term breast cancer risk prediction model based on a new risk factor associated with bilateral mammographic density asymmetry between the left and right breasts of a woman using a single feature. In this study we developed a multi-feature based Bayesian belief network (BBN) that combines bilateral mammographic density asymmetry with three other popular risk factors, namely (1) age, (2) family history, and (3) average breast density, to further increase the discriminatory power of our cancer risk model. A dataset involving "prior" negative mammography examinations of 348 women was used in the study. Among these women, 174 had breast cancer detected and verified in the next sequential screening examinations, and 174 remained negative (cancer-free). A BBN was applied to predict the risk of each woman having cancer detected six to 18 months later following the negative screening mammography. The prediction results were compared with those using single features. The prediction accuracy was significantly increased when using the BBN. The area under the ROC curve increased from an AUC=0.70 to 0.84 (p<0.01), while the positive predictive value (PPV) and negative predictive value (NPV) also increased from a PPV=0.61 to 0.78 and an NPV=0.65 to 0.75, respectively. This study demonstrates that a multi-feature based BBN can more accurately predict the near-term breast cancer risk than with a single feature.

  14. A physical-based gas-surface interaction model for rarefied gas flow simulation

    NASA Astrophysics Data System (ADS)

    Liang, Tengfei; Li, Qi; Ye, Wenjing

    2018-01-01

    Empirical gas-surface interaction models, such as the Maxwell model and the Cercignani-Lampis model, are widely used as the boundary condition in rarefied gas flow simulations. The accuracy of these models in the prediction of macroscopic behavior of rarefied gas flows is less satisfactory in some cases, especially highly non-equilibrium ones. Molecular dynamics simulations can accurately resolve the gas-surface interaction process at the atomic scale, and hence can predict accurate macroscopic behavior; they are, however, too computationally expensive to be applied to real problems. In this work, a statistical, physics-based gas-surface interaction model, which complies with the basic relations of a boundary condition, is developed based on the framework of the washboard model. By virtue of its physical basis, this new model is capable of capturing some important relations/trends that the classic empirical models fail to model correctly. As such, the new model is much more accurate than the classic models while remaining more efficient than MD simulations. Therefore, it can serve as a more accurate and efficient boundary condition for rarefied gas flow simulations.

  15. Effect of species rarity on the accuracy of species distribution models for reptiles and amphibians in southern California

    USGS Publications Warehouse

    Franklin, J.; Wejnert, K.E.; Hathaway, S.A.; Rochester, C.J.; Fisher, R.N.

    2009-01-01

    Aim: Several studies have found that more accurate predictive models of species' occurrences can be developed for rarer species; however, one recent study found the relationship between range size and model performance to be an artefact of sample prevalence, that is, the proportion of presence versus absence observations in the data used to train the model. We examined the effect of model type, species rarity class, species' survey frequency, detectability and manipulated sample prevalence on the accuracy of distribution models developed for 30 reptile and amphibian species. Location: Coastal southern California, USA. Methods: Classification trees, generalized additive models and generalized linear models were developed using species presence and absence data from 420 locations. Model performance was measured using sensitivity, specificity and the area under the curve (AUC) of the receiver-operating characteristic (ROC) plot based on twofold cross-validation, or on bootstrapping. Predictors included climate, terrain, soil and vegetation variables. Species were assigned to rarity classes by experts. The data were sampled to generate subsets with varying ratios of presences and absences to test for the effect of sample prevalence. Join count statistics were used to characterize spatial dependence in the prediction errors. Results: Species in classes with higher rarity were more accurately predicted than common species, and this effect was independent of sample prevalence. Although positive spatial autocorrelation remained in the prediction errors, it was weaker than was observed in the species occurrence data. The differences in accuracy among model types were slight. Main conclusions: Using a variety of modelling methods, more accurate species distribution models were developed for rarer than for more common species. This was presumably because it is difficult to discriminate suitable from unsuitable habitat for habitat generalists, and not as an artefact of the effect of sample prevalence on model estimation. © 2008 The Authors.
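
    The prevalence manipulation at the heart of this design can be mimicked by resampling presences and absences to a fixed ratio before fitting and scoring an occurrence model. The sketch below uses synthetic data and a plain logistic regression standing in for the GLM; none of the variables are the study's actual predictors.

      # Sketch: fit an occurrence model at a controlled sample prevalence and
      # score it with twofold cross-validated AUC. Data are synthetic placeholders.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(6)
      X = rng.normal(size=(420, 5))                  # climate/terrain/soil stand-ins
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 420) > 0.8).astype(int)

      def resample_to_prevalence(X, y, prevalence, n_total, rng):
          """Draw presences and absences so their ratio matches a target prevalence."""
          pres, abs_ = np.where(y == 1)[0], np.where(y == 0)[0]
          n_pres = int(round(prevalence * n_total))
          idx = np.concatenate([rng.choice(pres, n_pres, replace=True),
                                rng.choice(abs_, n_total - n_pres, replace=True)])
          return X[idx], y[idx]

      Xs, ys = resample_to_prevalence(X, y, prevalence=0.5, n_total=400, rng=rng)
      probs = cross_val_predict(LogisticRegression(max_iter=1000), Xs, ys,
                                cv=2, method="predict_proba")[:, 1]
      print("twofold cross-validated AUC:", round(roc_auc_score(ys, probs), 3))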

  16. Optimization and real-time control for laser treatment of heterogeneous soft tissues.

    PubMed

    Feng, Yusheng; Fuentes, David; Hawkins, Andrea; Bass, Jon M; Rylander, Marissa Nichole

    2009-01-01

    Predicting the outcome of thermotherapies in cancer treatment requires an accurate characterization of the bioheat transfer processes in soft tissues. Due to the biological and structural complexity of tumor (soft tissue) composition and vasculature, it is often very difficult to obtain reliable tissue properties, which are one of the key factors for accurate prediction of treatment outcome. Efficient algorithms employing in vivo thermal measurements to determine heterogeneous thermal tissue properties, in conjunction with a detailed sensitivity analysis, can produce essential information for model development and optimal control. The goals of this paper are to present a general formulation of the bioheat transfer equation for heterogeneous soft tissues, review models and algorithms developed for cell damage, heat shock proteins, and soft tissues with nanoparticle inclusion, and demonstrate an overall computational strategy for developing a laser treatment framework with the ability to perform real-time robust calibrations and optimal control. This computational strategy can be applied to other thermotherapies using heat sources such as radio frequency or high-intensity focused ultrasound.
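
    A general formulation of the bioheat transfer equation for heterogeneous soft tissue is usually written in the Pennes form; the version below is the commonly used statement with spatially varying properties and an added laser source term, not necessarily the paper's exact formulation:

      \[
      \rho c \frac{\partial T}{\partial t} = \nabla \cdot \bigl(k(\mathbf{x})\,\nabla T\bigr) + \omega_b c_b \bigl(T_a - T\bigr) + Q_m + Q_{\mathrm{laser}}(\mathbf{x}, t)
      \]

    Here \rho c is the volumetric heat capacity of the tissue, k(\mathbf{x}) the heterogeneous thermal conductivity, \omega_b the blood perfusion rate, c_b the blood heat capacity, T_a the arterial temperature, Q_m the metabolic heat generation, and Q_laser the laser heat source.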

  17. A simple method for HPLC retention time prediction: linear calibration using two reference substances.

    PubMed

    Sun, Lei; Jin, Hong-Yu; Tian, Run-Tao; Wang, Ming-Juan; Liu, Li-Na; Ye, Liu-Ping; Zuo, Tian-Tian; Ma, Shuang-Cheng

    2017-01-01

    Analysis of related substances in pharmaceutical chemicals and of multiple components in traditional Chinese medicines requires a large number of reference substances to identify chromatographic peaks accurately, but reference substances are costly. Thus, the relative retention (RR) method has been widely adopted in pharmacopoeias and the literature for characterizing the HPLC behavior of reference substances that are unavailable. The problem is that it is difficult to reproduce the RR on different columns because of the error between measured and predicted retention times (tR) in some cases. Therefore, it is useful to develop a simple alternative method for accurate prediction of tR. In the present study, based on the thermodynamic theory of HPLC, a method named linear calibration using two reference substances (LCTRS) was proposed. The method includes three steps: two-point prediction, validation by multiple-point regression, and sequential matching. The tR of compounds on an HPLC column can be calculated from standard retention times and a linear relationship. The method was validated for two medicines on 30 columns. It was demonstrated that the LCTRS method is simple, yet more accurate and more robust across different HPLC columns than the RR method. Hence, quality standards using the LCTRS method are easy to reproduce in different laboratories, with a lower cost in reference substances.
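
    The two-point step amounts to fitting a straight line from the retention times of the two reference substances measured on a standard column to their retention times on the working column, and applying that line to the other analytes. The sketch below illustrates only this linear mapping; all retention times are made-up values, not data from the study.

      # Sketch of the two-point linear calibration idea: map retention times
      # measured on a standard column to a different column using two reference
      # substances. All retention times below are made-up illustrations.
      def two_point_calibration(ref_std, ref_new):
          """ref_std/ref_new: (tR of reference 1, tR of reference 2) on each column."""
          slope = (ref_new[1] - ref_new[0]) / (ref_std[1] - ref_std[0])
          intercept = ref_new[0] - slope * ref_std[0]
          return lambda t_std: slope * t_std + intercept

      predict_tr = two_point_calibration(ref_std=(6.2, 18.5), ref_new=(5.8, 17.1))
      for name, t_std in [("impurity A", 9.4), ("impurity B", 14.8)]:
          print(name, round(predict_tr(t_std), 2), "min (predicted on the new column)")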

  18. Development of a binder fracture test to determine fracture energy.

    DOT National Transportation Integrated Search

    2012-04-01

    It has been found that binder testing methods in current specifications do not accurately predict cracking performance at intermediate temperatures. Fracture energy has been determined to be strongly correlated to fracture resistance of asphalt mixtu...

  19. A probabilistic model to predict clinical phenotypic traits from genome sequencing.

    PubMed

    Chen, Yun-Ching; Douville, Christopher; Wang, Cheng; Niknafs, Noushin; Yeo, Grace; Beleva-Guthrie, Violeta; Carter, Hannah; Stenson, Peter D; Cooper, David N; Li, Biao; Mooney, Sean; Karchin, Rachel

    2014-09-01

    Genetic screening is becoming possible on an unprecedented scale. However, its utility remains controversial. Although most variant genotypes cannot be easily interpreted, many individuals nevertheless attempt to interpret their genetic information. Initiatives such as the Personal Genome Project (PGP) and Illumina's Understand Your Genome are sequencing thousands of adults, collecting phenotypic information and developing computational pipelines to identify the most important variant genotypes harbored by each individual. These pipelines consider database and allele frequency annotations and bioinformatics classifications. We propose that the next step will be to integrate these different sources of information to estimate the probability that a given individual has specific phenotypes of clinical interest. To this end, we have designed a Bayesian probabilistic model to predict the probability of dichotomous phenotypes. When applied to a cohort from PGP, predictions of Gilbert syndrome, Graves' disease, non-Hodgkin lymphoma, and various blood groups were accurate, as individuals manifesting the phenotype in question exhibited the highest, or among the highest, predicted probabilities. Thirty-eight PGP phenotypes (26%) were predicted with area-under-the-ROC curve (AUC)>0.7, and 23 (15.8%) of these were statistically significant, based on permutation tests. Moreover, in a Critical Assessment of Genome Interpretation (CAGI) blinded prediction experiment, the models were used to match 77 PGP genomes to phenotypic profiles, generating the most accurate prediction of 16 submissions, according to an independent assessor. Although the models are currently insufficiently accurate for diagnostic utility, we expect their performance to improve with growth of publicly available genomics data and model refinement by domain experts.

  20. Toward structure prediction of cyclic peptides.

    PubMed

    Yu, Hongtao; Lin, Yu-Shan

    2015-02-14

    Cyclic peptides are a promising class of molecules that can be used to target specific protein-protein interactions. A computational method to accurately predict their structures would substantially advance the development of cyclic peptides as modulators of protein-protein interactions. Here, we develop a computational method that integrates bias-exchange metadynamics simulations, a Boltzmann reweighting scheme, dihedral principal component analysis and a modified density peak-based cluster analysis to provide a converged structural description for cyclic peptides. Using this method, we evaluate the performance of a number of popular protein force fields on a model cyclic peptide. All the tested force fields seem to over-stabilize the α-helix and PPII/β regions in the Ramachandran plot, commonly populated by linear peptides and proteins. Our findings suggest that re-parameterization of a force field that well describes the full Ramachandran plot is necessary to accurately model cyclic peptides.
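
    One ingredient of the pipeline, dihedral principal component analysis, is conventionally done by projecting each backbone dihedral onto its sine and cosine before running PCA so that angular periodicity is respected. The sketch below shows only that step, with random angles standing in for reweighted simulation snapshots.

      # Sketch of dihedral PCA: encode each backbone dihedral as (cos, sin) so the
      # 360-degree periodicity is handled, then run ordinary PCA. Angles are
      # random placeholders for reweighted metadynamics snapshots.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      n_frames, n_dihedrals = 1000, 12        # e.g. phi/psi of a small cyclic peptide
      angles = rng.uniform(-np.pi, np.pi, size=(n_frames, n_dihedrals))

      features = np.concatenate([np.cos(angles), np.sin(angles)], axis=1)
      pca = PCA(n_components=2)
      projection = pca.fit_transform(features)   # frames projected onto dPC1/dPC2
      print("explained variance ratio:", pca.explained_variance_ratio_)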

  1. Prediction of discretization error using the error transport equation

    NASA Astrophysics Data System (ADS)

    Celik, Ismail B.; Parsons, Don Roscoe

    2017-06-01

    This study focuses on an approach to quantify the discretization error associated with numerical solutions of partial differential equations by solving an error transport equation (ETE). The goal is to develop a method that can be used to adequately predict the discretization error using the numerical solution on only one grid/mesh. The primary problem associated with solving the ETE is the formulation of the error source term which is required for accurately predicting the transport of the error. In this study, a novel approach is considered which involves fitting the numerical solution with a series of locally smooth curves and then blending them together with a weighted spline approach. The result is a continuously differentiable analytic expression that can be used to determine the error source term. Once the source term has been developed, the ETE can easily be solved using the same solver that is used to obtain the original numerical solution. The new methodology is applied to the two-dimensional Navier-Stokes equations in the laminar flow regime. A simple unsteady flow case is also considered. The discretization error predictions based on the methodology presented in this study are in good agreement with the 'true error'. While in most cases the error predictions are not quite as accurate as those from Richardson extrapolation, the results are reasonable and only require one numerical grid. The current results indicate that there is much promise going forward with the newly developed error source term evaluation technique and the ETE.
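
    Schematically, for a solution error e = u - u_h of a convective-diffusive problem, the error obeys a transport equation of the same form as the governing equation, driven by a truncation-error source term; a generic statement of this idea (not the paper's exact equations) is

      \[
      \frac{\partial e}{\partial t} + \nabla \cdot (\mathbf{v}\, e) - \nabla \cdot (\nu\, \nabla e) = -\tau_h ,
      \]

    where \tau_h is the local truncation error, which in this work is estimated from the spline-smoothed numerical solution.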

  2. Development and validation of a machine learning algorithm and hybrid system to predict the need for life-saving interventions in trauma patients.

    PubMed

    Liu, Nehemiah T; Holcomb, John B; Wade, Charles E; Batchinsky, Andriy I; Cancio, Leopoldo C; Darrah, Mark I; Salinas, José

    2014-02-01

    Accurate and effective diagnosis of actual injury severity can be problematic in trauma patients. Inherent physiologic compensatory mechanisms may prevent accurate diagnosis and mask true severity in many circumstances. The objective of this project was the development and validation of a multiparameter machine learning algorithm and system capable of predicting the need for life-saving interventions (LSIs) in trauma patients. Statistics based on means, slopes, and maxima of various vital sign measurements corresponding to 79 trauma patient records generated over 110,000 feature sets, which were used to develop, train, and implement the system. Comparisons among several machine learning models proved that a multilayer perceptron would best implement the algorithm in a hybrid system consisting of a machine learning component and basic detection rules. Additionally, 295,994 feature sets from 82 h of trauma patient data showed that the system can obtain 89.8 % accuracy within 5 min of recorded LSIs. Use of machine learning technologies combined with basic detection rules provides a potential approach for accurately assessing the need for LSIs in trauma patients. The performance of this system demonstrates that machine learning technology can be implemented in a real-time fashion and potentially used in a critical care environment.

  3. The prediction of intelligence in preschool children using alternative models to regression.

    PubMed

    Finch, W Holmes; Chang, Mei; Davis, Andrew S; Holden, Jocelyn E; Rothlisberg, Barbara A; McIntosh, David E

    2011-12-01

    Statistical prediction of an outcome variable using multiple independent variables is a common practice in the social and behavioral sciences. For example, neuropsychologists are sometimes called upon to provide predictions of preinjury cognitive functioning for individuals who have suffered a traumatic brain injury. Typically, these predictions are made using standard multiple linear regression models with several demographic variables (e.g., gender, ethnicity, education level) as predictors. Prior research has shown conflicting evidence regarding the ability of such models to provide accurate predictions of outcome variables such as full-scale intelligence (FSIQ) test scores. The present study had two goals: (1) to demonstrate the utility of a set of alternative prediction methods that have been applied extensively in the natural sciences and business but have not been frequently explored in the social sciences and (2) to develop models that can be used to predict premorbid cognitive functioning in preschool children. Predictions of Stanford-Binet 5 FSIQ scores for preschool-aged children are used to compare the performance of a multiple regression model with several of these alternative methods. Results demonstrate that classification and regression trees provided more accurate predictions of FSIQ scores than the more traditional regression approach. Implications of these results are discussed.

  4. A prediction model for early death in non-small cell lung cancer patients following curative-intent chemoradiotherapy.

    PubMed

    Jochems, Arthur; El-Naqa, Issam; Kessler, Marc; Mayo, Charles S; Jolly, Shruti; Matuszak, Martha; Faivre-Finn, Corinne; Price, Gareth; Holloway, Lois; Vinod, Shalini; Field, Matthew; Barakat, Mohamed Samir; Thwaites, David; de Ruysscher, Dirk; Dekker, Andre; Lambin, Philippe

    2018-02-01

    Early death after a treatment can be seen as a therapeutic failure. Accurate prediction of patients at risk for early mortality is crucial to avoiding unnecessary harm and reducing costs. The goal of our work is two-fold: first, to evaluate the performance of a previously published model for early death in our cohorts; second, to develop a prognostic model for early death prediction following radiotherapy. Patients with NSCLC treated with chemoradiotherapy or radiotherapy alone were included in this study. Four different cohorts from different countries were available for this work (N = 1540). The previous model used age, gender, performance status, tumor stage, income deprivation, no previous treatment given (yes/no) and body mass index to make predictions. A random forest model was developed by learning on the Maastro cohort (N = 698). The new model used performance status, age, gender, T and N stage, total tumor volume (cc), total tumor dose (Gy) and chemotherapy timing (none, sequential, concurrent) to make predictions. Death within 4 months of receiving the first radiotherapy fraction was used as the outcome. Early death rates ranged from 6 to 11% within the four cohorts. The previous model performed with AUC values ranging from 0.54 to 0.64 on the validation cohorts. Our newly developed model had improved AUC values ranging from 0.62 to 0.71 on the validation cohorts. Using advanced machine learning methods and informative variables, prognostic models for early mortality can be developed. Development of accurate prognostic tools for early mortality is important to inform patients about treatment options and optimize care.

  5. Developing a tuberculosis transmission model that accounts for changes in population health.

    PubMed

    Oxlade, Olivia; Schwartzman, Kevin; Benedetti, Andrea; Pai, Madhukar; Heymann, Jody; Menzies, Dick

    2011-01-01

    Simulation models are useful in policy planning for tuberculosis (TB) control. To accurately assess interventions, important modifiers of the epidemic should be accounted for in evaluative models. Improvements in population health were associated with the declining TB epidemic in the pre-antibiotic era and may be relevant today. The objective of this study was to develop and validate a TB transmission model that accounted for changes in population health. We developed a deterministic TB transmission model, using reported data from the pre-antibiotic era in England. Change in adjusted life expectancy, used as a proxy for general health, was used to determine the rate of change of key epidemiological parameters. Predicted outcomes included risk of TB infection and TB mortality. The model was validated in the setting of the Netherlands and then applied to modern Peru. The model, developed in the setting of England, predicted TB trends in the Netherlands very accurately. The R(2) value for correlation between observed and predicted data was 0.97 and 0.95 for TB infection and mortality, respectively. In Peru, the predicted decline in incidence prior to the expansion of "Directly Observed Treatment Short Course" (The DOTS strategy) was 3.7% per year (observed = 3.9% per year). After DOTS expansion, the predicted decline was very similar to the observed decline of 5.8% per year. We successfully developed and validated a TB model, which uses a proxy for population health to estimate changes in key epidemiology parameters. Population health contributed significantly to improvement in TB outcomes observed in Peru. Changing population health should be incorporated into evaluative models for global TB control.

  6. Prediction Models for 30-Day Mortality and Complications After Total Knee and Hip Arthroplasties for Veteran Health Administration Patients With Osteoarthritis.

    PubMed

    Harris, Alex Hs; Kuo, Alfred C; Bowe, Thomas; Gupta, Shalini; Nordin, David; Giori, Nicholas J

    2018-05-01

    Statistical models to preoperatively predict patients' risk of death and major complications after total joint arthroplasty (TJA) could improve the quality of preoperative management and informed consent. Although risk models for TJA exist, they have limitations including poor transparency and/or unknown or poor performance. Thus, it is currently impossible to know how well currently available models predict short-term complications after TJA, or if newly developed models are more accurate. We sought to develop and conduct cross-validation of predictive risk models, and report details and performance metrics as benchmarks. Over 90 preoperative variables were used as candidate predictors of death and major complications within 30 days for Veterans Health Administration patients with osteoarthritis who underwent TJA. Data were split into 3 samples-for selection of model tuning parameters, model development, and cross-validation. C-indexes (discrimination) and calibration plots were produced. A total of 70,569 patients diagnosed with osteoarthritis who received primary TJA were included. C-statistics and bootstrapped confidence intervals for the cross-validation of the boosted regression models were highest for cardiac complications (0.75; 0.71-0.79) and 30-day mortality (0.73; 0.66-0.79) and lowest for deep vein thrombosis (0.59; 0.55-0.64) and return to the operating room (0.60; 0.57-0.63). Moderately accurate predictive models of 30-day mortality and cardiac complications after TJA in Veterans Health Administration patients were developed and internally cross-validated. By reporting model coefficients and performance metrics, other model developers can test these models on new samples and have a procedure and indication-specific benchmark to surpass. Published by Elsevier Inc.

  7. Organ-on-a-Chip Technology for Reproducing Multiorgan Physiology.

    PubMed

    Lee, Seung Hwan; Sung, Jong Hwan

    2018-01-01

    In the drug development process, the accurate prediction of drug efficacy and toxicity is important in order to reduce the cost, labor, and effort involved. For this purpose, conventional 2D cell culture models are used in the early phase of drug development. However, the differences between the in vitro and the in vivo systems have caused the failure of drugs in the later phase of the drug-development process. Therefore, there is a need for a novel in vitro model system that can provide accurate information for evaluating the drug efficacy and toxicity through a closer recapitulation of the in vivo system. Recently, the idea of using microtechnology for mimicking the microscale tissue environment has become widespread, leading to the development of "organ-on-a-chip." Furthermore, the system is further developed for realizing a multiorgan model for mimicking interactions between multiple organs. These advancements are still ongoing and are aimed at ultimately developing "body-on-a-chip" or "human-on-a-chip" devices for predicting the response of the whole body. This review summarizes recently developed organ-on-a-chip technologies, and their applications for reproducing multiorgan functions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Predicting the Dynamic Crushing Response of a Composite Honeycomb Energy Absorber Using Solid-Element-Based Models in LS-DYNA

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.

    2010-01-01

    This paper describes an analytical study that was performed as part of the development of an externally deployable energy absorber (DEA) concept. The concept consists of a composite honeycomb structure that can be stowed until needed to provide energy attenuation during a crash event, much like an external airbag system. One goal of the DEA development project was to generate a robust and reliable Finite Element Model (FEM) of the DEA that could be used to accurately predict its crush response under dynamic loading. The results of dynamic crush tests of 50-, 104-, and 68-cell DEA components are presented, and compared with simulation results from a solid-element FEM. Simulations of the FEM were performed in LS-DYNA(Registered TradeMark) to compare the capabilities of three different material models: MAT 63 (crushable foam), MAT 26 (honeycomb), and MAT 126 (modified honeycomb). These material models are evaluated to determine if they can be used to accurately predict both the uniform crushing and final compaction phases of the DEA for normal and off-axis loading conditions

  9. Decision-Tree Analysis for Predicting First-Time Pass/Fail Rates for the NCLEX-RN® in Associate Degree Nursing Students.

    PubMed

    Chen, Hsiu-Chin; Bennett, Sean

    2016-08-01

    Little evidence shows the use of decision-tree algorithms in identifying predictors and analyzing their associations with pass rates for the NCLEX-RN(®) in associate degree nursing students. This longitudinal and retrospective cohort study investigated whether a decision-tree algorithm could be used to develop an accurate prediction model for the students' passing or failing the NCLEX-RN. This study used archived data from 453 associate degree nursing students in a selected program. The chi-squared automatic interaction detection analysis of the decision trees module was used to examine the effect of the collected predictors on passing/failing the NCLEX-RN. The actual percentage scores of Assessment Technologies Institute®'s RN Comprehensive Predictor(®) accurately identified students at risk of failing. The classification model correctly classified 92.7% of the students for passing. This study applied the decision-tree model to analyze a sequence database for developing a prediction model for early remediation in preparation for the NCLEX-RN. [J Nurs Educ. 2016;55(8):454-457.]. Copyright 2016, SLACK Incorporated.

  10. Augmenting the SCaN Link Budget Tool with Validated Atmospheric Propagation

    NASA Technical Reports Server (NTRS)

    Steinkerchner, Leo; Welch, Bryan

    2017-01-01

    In any Earth-Space or Space-Earth communications link, atmospheric effects cause significant signal attenuation. In order to develop a communications system that is cost effective while meeting appropriate performance requirements, it is important to accurately predict these effects for the given link parameters. This project aimed to develop a Matlab(TradeMark) (The MathWorks, Inc.) program that could augment the existing Space Communications and Navigation (SCaN) Link Budget Tool with accurate predictions of atmospheric attenuation of both optical and radio-frequency signals according to the SCaN Optical Link Assessment Model Version 5 and the International Telecommunications Union, Radiocommunications Sector (ITU-R) atmospheric propagation loss model, respectively. When compared to data collected from the Advanced Communications Technology Satellite (ACTS), the radio-frequency model predicted attenuation to within 1.3 dB of loss for 95% of measurements. Ultimately, this tool will be integrated into the SCaN Center for Engineering, Networks, Integration, and Communications (SCENIC) user interface in order to support analysis of existing SCaN systems and planning capabilities for future NASA missions.
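
    The ITU-R propagation chain combines several loss terms; the rain-attenuation piece, for example, builds on the power-law specific attenuation of Recommendation ITU-R P.838, gamma_R = k * R^alpha, with k and alpha taken from frequency- and polarization-dependent tables. The sketch below shows only that power law with placeholder coefficients; it is not the project's Matlab implementation.

      # Sketch of the ITU-R P.838 power-law for rain specific attenuation and a
      # crude path attenuation estimate. The k and alpha values are placeholders;
      # real ones come from the frequency/polarization tables in the Recommendation.
      def specific_attenuation(rain_rate_mm_per_h, k, alpha):
          """Specific attenuation gamma_R in dB/km: gamma_R = k * R**alpha."""
          return k * rain_rate_mm_per_h ** alpha

      def path_attenuation(rain_rate_mm_per_h, k, alpha, effective_path_km):
          """Very simplified: specific attenuation times an effective path length."""
          return specific_attenuation(rain_rate_mm_per_h, k, alpha) * effective_path_km

      # Placeholder coefficients for illustration only.
      print(path_attenuation(rain_rate_mm_per_h=25.0, k=0.05, alpha=1.1,
                             effective_path_km=4.0), "dB")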

  11. Prediction of Lunar Reconnaissance Orbiter Reaction Wheel Assembly Angular Momentum Using Regression Analysis

    NASA Technical Reports Server (NTRS)

    DeHart, Russell

    2017-01-01

    This study determines the feasibility of creating a tool that can accurately predict Lunar Reconnaissance Orbiter (LRO) reaction wheel assembly (RWA) angular momentum, weeks or even months into the future. LRO is a three-axis stabilized spacecraft that was launched on June 18, 2009. While typically nadir-pointing, LRO conducts many types of slews to enable novel science collection. Momentum unloads have historically been performed approximately once every two weeks with the goal of maintaining system total angular momentum below 70 Nms; however, flight experience shows that the models developed before launch are overly conservative, with many momentum unloads being performed before system angular momentum surpasses 50 Nms. A more accurate model of RWA angular momentum growth would improve momentum unload scheduling and decrease the frequency of these unloads. Since some LRO instruments must be deactivated during momentum unloads (and, in the case of one instrument, decontaminated for 24 hours thereafter), a decrease in the frequency of unloads increases science collection. This study develops a new model to predict LRO RWA angular momentum. Regression analysis of data from October 2014 to October 2015 was used to develop relationships between solar beta angle, slew specifications, and RWA angular momentum growth. The resulting model predicts RWA angular momentum using input solar beta angle and mission schedule data. This model was used to predict RWA angular momentum from October 2013 to October 2014. Predictions agree well with telemetry; of the 23 momentum unloads performed from October 2013 to October 2014, the mean and median magnitude of the RWA total angular momentum prediction error at the time of the momentum unloads were 3.7 and 2.7 Nms, respectively. The magnitude of the largest RWA total angular momentum prediction error was 10.6 Nms. Development of a tool that uses the models presented herein is currently underway.

  12. An Interpretable Machine Learning Model for Accurate Prediction of Sepsis in the ICU.

    PubMed

    Nemati, Shamim; Holder, Andre; Razmi, Fereshteh; Stanley, Matthew D; Clifford, Gari D; Buchman, Timothy G

    2018-04-01

    Sepsis is among the leading causes of morbidity, mortality, and cost overruns in critically ill patients. Early intervention with antibiotics improves survival in septic patients. However, no clinically validated system exists for real-time prediction of sepsis onset. We aimed to develop and validate an Artificial Intelligence Sepsis Expert algorithm for early prediction of sepsis. Observational cohort study. Academic medical center from January 2013 to December 2015. Over 31,000 admissions to the ICUs at two Emory University hospitals (development cohort), in addition to over 52,000 ICU patients from the publicly available Medical Information Mart for Intensive Care-III ICU database (validation cohort). Patients who met the Third International Consensus Definitions for Sepsis (Sepsis-3) prior to or within 4 hours of their ICU admission were excluded, resulting in roughly 27,000 and 42,000 patients within our development and validation cohorts, respectively. None. High-resolution vital signs time series and electronic medical record data were extracted. A set of 65 features (variables) were calculated on an hourly basis and passed to the Artificial Intelligence Sepsis Expert algorithm to predict onset of sepsis in the following T hours (where T = 12, 8, 6, or 4). Artificial Intelligence Sepsis Expert was used to predict onset of sepsis in the following T hours and to produce a list of the most significant contributing factors. For the 12-, 8-, 6-, and 4-hour ahead prediction of sepsis, Artificial Intelligence Sepsis Expert achieved an area under the receiver operating characteristic curve in the range of 0.83-0.85. Performance of the Artificial Intelligence Sepsis Expert on the development and validation cohorts was indistinguishable. Using data available in the ICU in real-time, Artificial Intelligence Sepsis Expert can accurately predict the onset of sepsis in an ICU patient 4-12 hours prior to clinical recognition. A prospective study is necessary to determine the clinical utility of the proposed sepsis prediction model.

  13. Determination of main components and anaerobic rumen digestibility of aquatic plants in vitro using near-infrared-reflectance spectroscopy.

    PubMed

    Yue, Zheng-Bo; Zhang, Meng-Lin; Sheng, Guo-Ping; Liu, Rong-Hua; Long, Ying; Xiang, Bing-Ren; Wang, Jin; Yu, Han-Qing

    2010-04-01

    A near-infrared-reflectance (NIR) spectroscopy-based method is established to determine the main components of aquatic plants as well as their anaerobic rumen biodegradability. The developed method is more rapid and accurate than conventional chemical analysis and biodegradability tests. Moisture, volatile solids, Klason lignin and ash in entire aquatic plants could be accurately predicted using this method, with coefficient of determination (r²) values of 0.952, 0.916, 0.939 and 0.950, respectively. In addition, the anaerobic rumen biodegradability of aquatic plants, represented as biogas and methane yields, could also be predicted well. Continuous wavelet transform pretreatment of the NIR spectral data greatly enhances the robustness and predictive ability of the NIR spectral analysis. These results indicate that NIR spectroscopy could be used to predict the main components of aquatic plants and their anaerobic biodegradability. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
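
    The usual workflow behind such NIR calibrations is a partial least-squares regression from pre-processed spectra to the laboratory reference values. The sketch below uses random placeholder spectra and omits the continuous wavelet transform pretreatment named in the abstract.

      # Sketch: PLS regression from NIR spectra to a measured component (e.g.
      # Klason lignin). Spectra and reference values are random placeholders; the
      # continuous wavelet transform pre-processing used in the study is omitted.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)
      spectra = rng.normal(size=(80, 700))        # 80 samples x 700 wavelengths
      lignin = rng.uniform(5, 30, size=80)        # reference values, % dry matter

      pls = PLSRegression(n_components=8)
      pred = cross_val_predict(pls, spectra, lignin, cv=5).ravel()
      r2 = np.corrcoef(pred, lignin)[0, 1] ** 2
      print(f"cross-validated r^2 on placeholder data: {r2:.2f}")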

  14. Accurate prediction of cellular co-translational folding indicates proteins can switch from post- to co-translational folding

    PubMed Central

    Nissley, Daniel A.; Sharma, Ajeet K.; Ahmed, Nabeel; Friedrich, Ulrike A.; Kramer, Günter; Bukau, Bernd; O'Brien, Edward P.

    2016-01-01

    The rates at which domains fold and codons are translated are important factors in determining whether a nascent protein will co-translationally fold and function or misfold and malfunction. Here we develop a chemical kinetic model that calculates a protein domain's co-translational folding curve during synthesis using only the domain's bulk folding and unfolding rates and codon translation rates. We show that this model accurately predicts the course of co-translational folding measured in vivo for four different protein molecules. We then make predictions for a number of different proteins in yeast and find that synonymous codon substitutions, which change translation-elongation rates, can switch some protein domains from folding post-translationally to folding co-translationally—a result consistent with previous experimental studies. Our approach explains essential features of co-translational folding curves and predicts how varying the translation rate at different codon positions along a transcript's coding sequence affects this self-assembly process. PMID:26887592
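
    The chemical kinetic picture can be stated compactly: while the ribosome dwells at codon i, the probability that the domain is folded relaxes according to the bulk folding and unfolding rates for that nascent-chain length. The following is a schematic restatement under that assumption, not the authors' exact equations:

      \[
      \frac{dP_F}{dt} = k_{F,i}\,\bigl(1 - P_F\bigr) - k_{U,i}\,P_F , \qquad 0 \le t \lesssim \frac{1}{k_{A,i}},
      \]

    where k_{F,i} and k_{U,i} are the folding and unfolding rates with i codons translated and k_{A,i} is the translation rate of codon i, so slower codons give the domain more time to approach its equilibrium folded population before the chain is extended.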

  15. Exploring a new quantitative image marker to assess benefit of chemotherapy to ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-03-01

    Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study has shown that total psoas area (TPA) measured on a preoperative cross-sectional CT image might be a good image marker for predicting the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from the CT image is difficult due to the fuzzy boundary or connection of TPA to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment TPA from abdominal CT images more accurately and to assess the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients of receiving Bevacizumab-based chemotherapy. The ICAD scheme was applied to identify a CT image slice of interest located at the level of the third lumbar vertebra (L3). The cross-sections of the right and left TPA are segmented using a set of adaptively adjusted boundary conditions, and TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparing with manual segmentation results, we found that the ICAD scheme could yield higher accuracy and consistency for this task. Using a new ICAD scheme can provide clinical researchers with a useful tool to more efficiently and accurately extract TPA as well as muscle radiation attenuation as new image markers, and allow them to investigate their discriminatory power for predicting progression-free survival and/or overall survival of the cancer patients before and after chemotherapy.

  16. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    PubMed Central

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280
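
    The odds ratios quoted above come from logistic regression of L. monocytogenes isolation against field-level factors; a minimal sketch of that style of analysis is shown below, with made-up column names and synthetic data rather than the study's swab records.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical swab-level data: 1 = L. monocytogenes isolated from the swab
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "positive":     rng.binomial(1, 0.2, 500),
    "near_water":   rng.binomial(1, 0.3, 500),
    "near_pasture": rng.binomial(1, 0.25, 500),
})

X = sm.add_constant(df[["near_water", "near_pasture"]])
fit = sm.Logit(df["positive"], X).fit(disp=False)

# exponentiated coefficients are the odds ratios reported in such studies
print(np.exp(fit.params))
```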

  17. Predicting Team Performance through Human Behavioral Sensing and Quantitative Workflow Instrumentation

    DTIC Science & Technology

    2016-07-27

    make risk-informed decisions during serious games. Statistical models of intra-game performance were developed to determine whether behaviors in... specific facets of the gameplay workflow were predictive of analytical performance and game outcomes. A study of over seventy instrumented teams revealed... more accurate game decisions. Keywords: Humatics · Serious Games · Human-System Interaction · Instrumentation · Teamwork · Communication Analysis

  18. A Maximal Graded Exercise Test to Accurately Predict VO2max in 18-65-Year-Old Adults

    ERIC Educational Resources Information Center

    George, James D.; Bradshaw, Danielle I.; Hyde, Annette; Vehrs, Pat R.; Hager, Ronald L.; Yanowitz, Frank G.

    2007-01-01

    The purpose of this study was to develop an age-generalized regression model to predict maximal oxygen uptake (VO2max) based on a maximal treadmill graded exercise test (GXT; George, 1996). Participants (N = 100), ages 18-65 years, reached a maximal level of exertion (mean plus or minus standard deviation [SD]; maximal heart rate [HR…

  19. Improved Modeling of Finite-Rate Turbulent Combustion Processes in Research Combustors

    NASA Technical Reports Server (NTRS)

    VanOverbeke, Thomas J.

    1998-01-01

    The objective of this thesis is to further develop and test a stochastic model of turbulent combustion in recirculating flows. There is a requirement to increase the accuracy of multi-dimensional combustion predictions. As turbulence affects reaction rates, this interaction must be more accurately evaluated. In this work, a more physically correct way of handling the interaction of turbulence with combustion is further developed and tested. As turbulence involves randomness, stochastic modeling is used. Averaged values such as temperature and species concentration are found by integrating the probability density function (pdf) over the range of the scalar. The model in this work does not assume the pdf type, but solves for the evolution of the pdf using the Monte Carlo solution technique. The model is further developed by including a more robust reaction solver, more accurate thermodynamics, and more accurate transport elements. The stochastic method is coupled with the Semi-Implicit Method for Pressure-Linked Equations (SIMPLE), which is used to solve for velocity, pressure, turbulent kinetic energy, and dissipation; the pdf solver solves for temperature and species concentration. Thus, the method is partially familiar to combustor engineers. The method is compared to benchmark experimental data and baseline calculations. The baseline method was tested on isothermal flows, evaporating sprays, and combusting sprays. Pdf and baseline predictions were performed for three diffusion flames and one premixed flame. The pdf method predicted lower combustion rates than the baseline method, in agreement with the data, except for the premixed flame, where the baseline and stochastic predictions bounded the experimental data. The use of a continuous mixing model or a relax-to-mean mixing model had little effect on the prediction of average temperature. Two grids were used in a hydrogen diffusion flame simulation; grid density did not affect the predictions except for peak temperature and tangential velocity. The hybrid pdf method took longer and required more memory, but it has a theoretical basis for extension to many reaction steps, which cannot be said of current turbulent combustion models.
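
    The relax-to-mean mixing model mentioned above is commonly written as a linear relaxation of each notional particle's scalar toward the local mean; the sketch below shows a single Monte Carlo mixing step under that assumption. The mixing constant and turbulence frequency are placeholders, and this illustrates the general technique rather than the thesis's implementation.

```python
import numpy as np

def relax_to_mean_step(phi, omega, dt, c_phi=2.0):
    """One mixing step for an ensemble of Monte Carlo notional particles.

    phi   : array of particle scalar values (e.g., mixture fraction)
    omega : turbulence frequency epsilon/k (1/s) -- placeholder value below
    c_phi : mixing constant (commonly taken near 2.0; treated as an assumption)
    """
    mean = phi.mean()
    # d(phi)/dt = -0.5 * c_phi * omega * (phi - <phi>), integrated exactly over dt
    return mean + (phi - mean) * np.exp(-0.5 * c_phi * omega * dt)

particles = np.random.rand(10000)            # initial scatter of mixture fraction
mixed = relax_to_mean_step(particles, omega=50.0, dt=1e-3)
print(particles.var(), mixed.var())          # variance decays, mean is preserved
```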

  20. Predictive Model for the Meniscus-Guided Coating of High-Quality Organic Single-Crystalline Thin Films.

    PubMed

    Janneck, Robby; Vercesi, Federico; Heremans, Paul; Genoe, Jan; Rolin, Cedric

    2016-09-01

    A model that describes solvent evaporation dynamics in meniscus-guided coating techniques is developed. In combination with a single fitting parameter, it is shown that this formula can accurately predict a processing window for various coating conditions. Organic thin-film transistors (OTFTs), fabricated by a zone-casting setup, indeed show the best performance at the predicted coating speeds, with mobilities reaching 7 cm² V⁻¹ s⁻¹. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Forming limit prediction by an evolving non-quadratic yield criterion considering the anisotropic hardening and r-value evolution

    NASA Astrophysics Data System (ADS)

    Lian, Junhe; Shen, Fuhui; Liu, Wenqi; Münstermann, Sebastian

    2018-05-01

    Constitutive model development has been driven toward an increasingly accurate, fine-resolution description of material behaviour in response to changes in various environmental variables. The evolving features of anisotropic behaviour during deformation have therefore drawn particular attention due to their possible impact on the sheet metal forming industry. An evolving non-associated Hill48 (enHill48) model was recently proposed and applied to forming limit prediction by coupling it with the modified maximum force criterion. On the one hand, that study showed the importance of including anisotropic evolution for accurate forming limit prediction. On the other hand, it also showed that the enHill48 model introduces an instability region that abruptly reduces the predicted formability. Therefore, in this study, an alternative model that is based on the associated flow rule and provides similar anisotropic predictive capability is extended to capture the evolving effects and is further applied to forming limit prediction. The final results are compared with experimental data as well as with the results of the enHill48 model.

  2. MOWGLI: prediction of protein-MannOse interacting residues With ensemble classifiers usinG evoLutionary Information.

    PubMed

    Pai, Priyadarshini P; Mondal, Sukanta

    2016-10-01

    Proteins interact with carbohydrates to perform various cellular functions. Of the many carbohydrate ligands that proteins bind, mannose constitutes an important class, playing important roles in host defense mechanisms. Accurate identification of mannose-interacting residues (MIR) may provide important clues to decipher the underlying mechanisms of protein-mannose interactions during infections. This study proposes an approach using an ensemble of base classifiers for prediction of MIR from evolutionary information in the form of position-specific scoring matrices. The base classifiers are random forests trained on different subsets of the training data set Dset128 using 10-fold cross-validation. The optimized ensemble of base classifiers, MOWGLI, is then used to predict MIR on protein chains of the test data set Dtestset29, where it showed a promising performance with 92.0% prediction accuracy. An overall improvement of 26.6% in precision was observed upon comparison with the state of the art. It is hoped that this approach, yielding enhanced predictions, could eventually be used for applications in drug design and vaccine development.
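
    The following is a rough sketch of the ensemble idea described above (base random forests trained on different subsets of the training data, combined by averaging their predicted probabilities), assuming residues have already been encoded as fixed-length PSSM-window feature vectors; it is not the MOWGLI implementation itself.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold

def train_ensemble(X, y, n_folds=10, seed=0):
    """Train one random forest per cross-validation training subset."""
    models = []
    for idx, _ in KFold(n_splits=n_folds, shuffle=True, random_state=seed).split(X):
        rf = RandomForestClassifier(n_estimators=200, random_state=seed)
        rf.fit(X[idx], y[idx])
        models.append(rf)
    return models

def predict_ensemble(models, X, threshold=0.5):
    """Average the base classifiers' positive-class probabilities."""
    probs = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
    return (probs >= threshold).astype(int)

# X: (n_residues, window_size * 20) PSSM-derived features; y: 1 = mannose-interacting
X = np.random.rand(1000, 17 * 20)
y = np.random.binomial(1, 0.1, 1000)
models = train_ensemble(X, y)
print(predict_ensemble(models, X[:5]))
```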

  3. A research program for improving heat transfer prediction for the laminar to turbulent transition region of turbine vanes/blades

    NASA Technical Reports Server (NTRS)

    Simon, Frederick F.

    1993-01-01

    A program sponsored by NASA for the investigation of the heat transfer in the transition region of turbine vanes and blades with the objective of improving the capability for predicting heat transfer is described. The accurate prediction of gas-side heat transfer is important to the determination of turbine longevity, engine performance, and developmental costs. The need for accurate predictions will become greater as the operating temperatures and stage loading levels of advanced turbine engines increase. The present methods for predicting transition shear stress and heat transfer on turbine blades are based on incomplete knowledge and are largely empirical. To meet the objective of the NASA program, a team approach consisting of researchers from government, universities, a research institute, and a small business is presented. The research is divided into the areas of experiments, direct numerical simulations (DNS), and turbulence modeling. A summary of the results to date is given for the above research areas in a high-disturbance environment (bypass transition) with a discussion of the model development necessary for use in numerical codes.

  4. gCUP: rapid GPU-based HIV-1 co-receptor usage prediction for next-generation sequencing.

    PubMed

    Olejnik, Michael; Steuwer, Michel; Gorlatch, Sergei; Heider, Dominik

    2014-11-15

    Next-generation sequencing (NGS) has large potential in HIV diagnostics, and genotypic prediction models have been developed and successfully tested in recent years. However, albeit highly accurate, these computational models lack the computational efficiency to reach their full potential. In this study, we demonstrate the use of graphics processing units (GPUs) in combination with a computational prediction model for HIV tropism. Our new model, named gCUP, parallelized and optimized for GPUs, is highly accurate and can classify >175,000 sequences per second on an NVIDIA GeForce GTX 460. The computational efficiency of our new model is the next step in enabling NGS technologies to reach clinical significance in HIV diagnostics. Moreover, our approach is not limited to HIV tropism prediction, but can also be easily adapted to other settings, e.g., drug resistance prediction. The source code can be downloaded at http://www.heiderlab.de. Contact: d.heider@wz-straubing.de. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Ligand and structure-based methodologies for the prediction of the activity of G protein-coupled receptor ligands

    NASA Astrophysics Data System (ADS)

    Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.

    2009-11-01

    Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.

  6. VisitSense: Sensing Place Visit Patterns from Ambient Radio on Smartphones for Targeted Mobile Ads in Shopping Malls

    PubMed Central

    Kim, Byoungjip; Kang, Seungwoo; Ha, Jin-Young; Song, Junehwa

    2015-01-01

    In this paper, we introduce a novel smartphone framework called VisitSense that automatically detects and predicts a smartphone user’s place visits from ambient radio to enable behavioral targeting for mobile ads in large shopping malls. VisitSense enables mobile app developers to adopt visit-pattern-aware mobile advertising for shopping mall visitors in their apps. It also benefits mobile users by allowing them to receive highly relevant mobile ads that are aware of their place visit patterns in shopping malls. To achieve this goal, VisitSense employs accurate visit detection and prediction methods. For accurate visit detection, we develop a change-based detection method that takes into consideration the stability change of ambient radio and the mobility change of users. It performs well in large shopping malls, where ambient radio is quite noisy and causes existing algorithms to fail easily. In addition, we propose a causality-based visit prediction model to capture the causality in sequential visit patterns for effective prediction. We have developed a VisitSense prototype system and a visit-pattern-aware mobile advertising application based on it. Furthermore, we deployed the system in the COEX Mall, one of the largest shopping malls in Korea, and conducted diverse experiments to show the effectiveness of VisitSense. PMID:26193275

  7. Prediction of porosity of food materials during drying: Current challenges and directions.

    PubMed

    Joardder, Mohammad U H; Kumar, C; Karim, M A

    2017-07-18

    Pore formation in food samples is a common physical phenomenon observed during dehydration processes. The pore evolution during drying significantly affects the physical properties and quality of dried foods. Therefore, it should be taken into consideration when predicting transport processes in the drying sample. Characteristics of pore formation depend on the drying process parameters, product properties and processing time. Understanding the physics of pore formation and evolution during drying will assist in accurately predicting the drying kinetics and quality of food materials. Researchers have been trying to develop mathematical models to describe the pore formation and evolution during drying. In this study, existing porosity models are critically analysed and limitations are identified. Better insight into the factors affecting porosity is provided, and suggestions are proposed to overcome the limitations. These include considerations of process parameters such as glass transition temperature, sample temperature, and variable material properties in the porosity models. Several researchers have proposed models for porosity prediction of food materials during drying. However, these models are either very simplistic or empirical in nature and failed to consider relevant significant factors that influence porosity. In-depth understanding of characteristics of the pore is required for developing a generic model of porosity. A micro-level analysis of pore formation is presented for better understanding, which will help in developing an accurate and generic porosity model.

  8. Non-Markovian closure models for large eddy simulations using the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric J.; Duraisamy, Karthik

    2017-01-01

    This work uses the Mori-Zwanzig (M-Z) formalism, a concept originating from nonequilibrium statistical mechanics, as a basis for the development of coarse-grained models of turbulence. The mechanics of the generalized Langevin equation (GLE) are considered, and insight gained from the orthogonal dynamics equation is used as a starting point for model development. A class of subgrid models is considered which represent nonlocal behavior via a finite memory approximation [Stinis, arXiv:1211.4285 (2012)], the length of which is determined using a heuristic that is related to the spectral radius of the Jacobian of the resolved variables. The resulting models are intimately tied to the underlying numerical resolution and are capable of approximating non-Markovian effects. Numerical experiments on the Burgers equation demonstrate that the M-Z-based models can accurately predict the temporal evolution of the total kinetic energy and the total dissipation rate at varying mesh resolutions. The trajectory of each resolved mode in phase space is accurately predicted for cases where the coarse graining is moderate. Large eddy simulations (LESs) of homogeneous isotropic turbulence and the Taylor-Green Vortex show that the M-Z-based models are able to provide excellent predictions, accurately capturing the subgrid contribution to energy transfer. Last, LESs of fully developed channel flow demonstrate the applicability of M-Z-based models to nondecaying problems. It is notable that the form of the closure is not imposed by the modeler, but is rather derived from the mathematics of the coarse graining, highlighting the potential of M-Z-based techniques to define LES closures.

  9. Optical glucose monitoring using vertical cavity surface emitting lasers (VCSELs)

    NASA Astrophysics Data System (ADS)

    Talebi Fard, Sahba; Hofmann, Werner; Talebi Fard, Pouria; Kwok, Ezra; Amann, Markus-Christian; Chrostowski, Lukas

    2009-08-01

    Diabetes mellitus is a common chronic disease that has become a public health issue. Continuous glucose monitoring improves patient health by stabilizing glucose levels. Optical methods are among the painless and promising approaches that can be used for blood glucose prediction. However, prediction accuracies below what is clinically acceptable have been a major concern. Using lasers along with multivariate techniques such as partial least squares (PLS) regression can improve glucose predictions. This research involves investigations toward a novel optical system for accurate glucose prediction, leading to the development of a small, low-power, implantable optical sensor for diabetes patients.

  10. Development of a cryogenic mixed fluid J-T cooling computer code, 'JTMIX'

    NASA Technical Reports Server (NTRS)

    Jones, Jack A.

    1991-01-01

    An initial study was performed for analyzing and predicting the temperatures and cooling capacities when mixtures of fluids are used in Joule-Thomson coolers and in heat pipes. A computer code, JTMIX, was developed for mixed gas J-T analysis for any fluid combination of neon, nitrogen, various hydrocarbons, argon, oxygen, carbon monoxide, carbon dioxide, and hydrogen sulfide. When used in conjunction with the NIST computer code, DDMIX, it has accurately predicted order-of-magnitude increases in J-T cooling capacities when various hydrocarbons are added to nitrogen, and it predicts nitrogen normal boiling point depressions to as low as 60 K when neon is added.

  11. Ceramic Matrix Composites (CMC) Life Prediction Development

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R.; Verrilli, Michael J.; Thomas, David J.; Halbig, Michael C.; Calomino, Anthony M.; Ellis, John R.; Opila, Elizabeth J.

    1990-01-01

    Advanced launch systems will very likely incorporate fiber reinforced ceramic matrix composites (CMC) in critical propulsion and airframe components. The use of CMC will save weight, increase operating margin, safety and performance, and improve reuse capability. For reusable and single mission use, accurate life prediction is critical to success. The tools to accomplish this are immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for many applications. This paper describes an approach and progress made to satisfy the need to develop an integrated life prediction system that addresses mechanical durability and environmental degradation.

  12. Predictive equations for the estimation of body size in seals and sea lions (Carnivora: Pinnipedia)

    PubMed Central

    Churchill, Morgan; Clementz, Mark T; Kohno, Naoki

    2014-01-01

    Body size plays an important role in pinniped ecology and life history. However, body size data is often absent for historical, archaeological, and fossil specimens. To estimate the body size of pinnipeds (seals, sea lions, and walruses) for today and the past, we used 14 commonly preserved cranial measurements to develop sets of single variable and multivariate predictive equations for pinniped body mass and total length. Principal components analysis (PCA) was used to test whether separate family specific regressions were more appropriate than single predictive equations for Pinnipedia. The influence of phylogeny was tested with phylogenetic independent contrasts (PIC). The accuracy of these regressions was then assessed using a combination of coefficient of determination, percent prediction error, and standard error of estimation. Three different methods of multivariate analysis were examined: bidirectional stepwise model selection using Akaike information criteria; all-subsets model selection using Bayesian information criteria (BIC); and partial least squares regression. The PCA showed clear discrimination between Otariidae (fur seals and sea lions) and Phocidae (earless seals) for the 14 measurements, indicating the need for family-specific regression equations. The PIC analysis found that phylogeny had a minor influence on relationship between morphological variables and body size. The regressions for total length were more accurate than those for body mass, and equations specific to Otariidae were more accurate than those for Phocidae. Of the three multivariate methods, the all-subsets approach required the fewest number of variables to estimate body size accurately. We then used the single variable predictive equations and the all-subsets approach to estimate the body size of two recently extinct pinniped taxa, the Caribbean monk seal (Monachus tropicalis) and the Japanese sea lion (Zalophus japonicus). Body size estimates using single variable regressions generally under or over-estimated body size; however, the all-subset regression produced body size estimates that were close to historically recorded body length for these two species. This indicates that the all-subset regression equations developed in this study can estimate body size accurately. PMID:24916814

  13. A Simple Plasma Retinol Isotope Ratio Method for Estimating β-Carotene Relative Bioefficacy in Humans: Validation with the Use of Model-Based Compartmental Analysis.

    PubMed

    Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H

    2017-09-01

    Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [(RIR), labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. The method also provides information about the contributions of absorptive and postabsorptive conversion to total bioefficacy if an additional sample is taken at 1 d. © 2017 American Society for Nutrition.
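
    The plasma retinol isotope ratio itself is a simple quotient; the sketch below shows a dose-normalized form with hypothetical tracer doses and plasma concentrations. The exact dose normalization and any molar-equivalence factors used in the study should be taken from the paper, not from this illustration.

```python
def relative_bioefficacy(ret_from_bc, ret_from_ref, dose_bc, dose_ref):
    """Relative bioefficacy from one plasma sample taken >= 14 d after dosing.

    ret_from_bc  : labeled beta-carotene-derived retinol in plasma (nmol/L)
    ret_from_ref : labeled retinyl-acetate-derived retinol in plasma (nmol/L)
    dose_bc, dose_ref : administered tracer doses (nmol retinol equivalents)
    All inputs here are hypothetical placeholders.
    """
    # ratio of tracer-derived retinol, normalized to the administered doses
    return (ret_from_bc / dose_bc) / (ret_from_ref / dose_ref)

# illustrative numbers only
print(relative_bioefficacy(ret_from_bc=1.2, ret_from_ref=4.0,
                           dose_bc=3500.0, dose_ref=3500.0))
```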

  14. Fundamental Algorithms of the Goddard Battery Model

    NASA Technical Reports Server (NTRS)

    Jagielski, J. M.

    1985-01-01

    The Goddard Space Flight Center (GSFC) is currently producing a computer model to predict nickel-cadmium (NiCd) battery performance in a Low Earth Orbit (LEO) cycling regime. The model proper is still in development, but its inherent, fundamental algorithms (or methodologies) are defined. At present, the model is closely dependent on empirical data, and the database currently used is of questionable accuracy. Even so, very good correlations have been found between model predictions and actual cycling data. A more accurate and comprehensive database has been generated to serve dual functions: to show the limitations of the current database, and to be embedded in the model proper for more accurate predictions. The fundamental algorithms of the model and the present database and its limitations are described, and a brief preliminary analysis of the new database and its verification of the model's methodology are presented.

  15. Spectroscopy of organic semiconductors from first principles

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, Sahar; Biller, Ariel; Kronik, Leeor; Neaton, Jeffery

    2011-03-01

    Advances in organic optoelectronic materials rely on an accurate understanding of their spectroscopy, motivating the development of predictive theoretical methods that accurately describe the excited states of organic semiconductors. In this work, we use density functional theory and many-body perturbation theory (GW/BSE) to compute the electronic and optical properties of two well-studied organic semiconductors, pentacene and PTCDA. We carefully compare our calculations of the bulk density of states with available photoemission spectra, accounting for the role of finite temperature and surface effects in experiment, and examining the influence of our main approximations -- e.g., the GW starting point and the application of the generalized plasmon-pole model -- on the predicted electronic structure. Moreover, our predictions for the nature of the exciton and its binding energy are discussed and compared against optical absorption data. We acknowledge DOE, NSF, and BASF for financial support and NERSC for computational resources.

  16. Automated adaptive inference of phenomenological dynamical models.

    PubMed

    Daniels, Bryan C; Nemenman, Ilya

    2015-08-21

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.

  17. Remote sensing techniques for prediction of watershed runoff

    NASA Technical Reports Server (NTRS)

    Blanchard, B. J.

    1975-01-01

    Hydrologic parameters of watersheds, for use in mathematical models and as design criteria for flood detention structures, are sometimes difficult to quantify using conventional measuring systems. The advent of remote sensing devices developed in the past decade offers the possibility that watershed characteristics such as vegetative cover, soils, and soil moisture may be quantified rapidly and economically. Experiments with visible and near infrared data from the LANDSAT-1 multispectral scanner indicate that a simple technique for calibrating runoff equation coefficients is feasible. The technique was tested on 10 watersheds in the Chickasha area, and test results show that more accurate runoff coefficients were obtained than with conventional methods. The technique worked equally well using a dry fall scene. The runoff equation coefficients were then predicted for 22 subwatersheds with flood detention structures. Predicted values were again more accurate than coefficients produced by conventional methods.

  18. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508

  19. Epidemic predictions in an imperfect world: modelling disease spread with partial data

    PubMed Central

    Dawson, Peter M.; Werkman, Marleen; Brooks-Pollock, Ellen; Tildesley, Michael J.

    2015-01-01

    ‘Big-data’ epidemic models are being increasingly used to influence government policy to help with control and eradication of infectious diseases. In the case of livestock, detailed movement records have been used to parametrize realistic transmission models. While livestock movement data are readily available in the UK and other countries in the EU, in many countries around the world, such detailed data are not available. By using a comprehensive database of the UK cattle trade network, we implement various sampling strategies to determine the quantity of network data required to give accurate epidemiological predictions. It is found that by targeting nodes with the highest number of movements, accurate predictions on the size and spatial spread of epidemics can be made. This work has implications for countries such as the USA, where access to data is limited, and developing countries that may lack the resources to collect a full dataset on livestock movements. PMID:25948687
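
    A minimal sketch of the "target the most active nodes" sampling strategy described above, assuming the movement records are available as a simple edge list; the column names and the toy data are made up.

```python
import pandas as pd

def sample_top_movers(movements: pd.DataFrame, fraction: float) -> pd.DataFrame:
    """Keep only movements that touch the most active farms.

    movements : one row per recorded movement, columns 'src' and 'dst' (hypothetical)
    fraction  : fraction of nodes to retain, ranked by total movement count
    """
    counts = pd.concat([movements["src"], movements["dst"]]).value_counts()
    keep = set(counts.head(int(len(counts) * fraction)).index)
    return movements[movements["src"].isin(keep) | movements["dst"].isin(keep)]

# toy edge list standing in for a national livestock movement database
moves = pd.DataFrame({"src": [1, 1, 2, 3, 3, 3], "dst": [2, 3, 3, 4, 5, 1]})
print(sample_top_movers(moves, fraction=0.5))
```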

  20. Teachers' Acceptance of Absenteeism: Towards Developing a Specific Scale

    ERIC Educational Resources Information Center

    Shapira-Lishchinsky, Orly; Ishan, Gamal

    2013-01-01

    Purpose: This study aims to develop and validate a measure of a specific attitude toward teachers' absenteeism that predicts this behavior more accurately than other general measures of job attitudes. Design/methodology/approach: Participants were 443 teachers from 21 secondary schools in Israel. In the first phase, the teachers answered anonymous…

  1. A PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODEL FOR TRICHLOROETHYLENE WITH SPECIFICITY FOR THE LONG EVANS RAT

    EPA Science Inventory

    A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...
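
    The abstract is truncated and gives no equations, so the sketch below only illustrates the generic structure such a model typically takes: a flow-limited tissue compartment whose concentration relaxes toward the arterial concentration scaled by a partition coefficient. All parameter values are hypothetical and are not taken from the EPA model.

```python
from scipy.integrate import solve_ivp

# hypothetical flow-limited parameters for a single tissue compartment (fat)
Q_fat, V_fat, P_fat = 0.4, 0.35, 25.0   # blood flow (L/h), volume (L), partition coeff.

def fat_compartment(t, y, c_arterial):
    """dC/dt for a flow-limited tissue: (Q/V) * (C_art - C_tissue / P)."""
    c_fat = y[0]
    return [Q_fat / V_fat * (c_arterial - c_fat / P_fat)]

# constant arterial concentration for illustration; a full PBPK model couples
# several such compartments through the arterial and venous blood
sol = solve_ivp(fat_compartment, (0.0, 24.0), [0.0], args=(1.0,), max_step=0.1)
print(sol.y[0][-1])
```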

  2. The Juvenile Addiction Risk Rating: Development and Initial Psychometrics

    ERIC Educational Resources Information Center

    Powell, Michael; Newgent, Rebecca A.

    2016-01-01

    This article describes the development and psychometrics of the Juvenile Addiction Risk Rating. The Juvenile Addiction Risk Rating is a brief screening of addiction potential based on 10 risk factors predictive of youth alcohol and drug-related problems that assists examiners in more accurate treatment planning when self-report information is…

  3. Correlation of electron and proton irradiation-induced damage in InP solar cells

    NASA Technical Reports Server (NTRS)

    Walters, Robert J.; Summers, Geoffrey P.; Messenger, Scott R.; Burke, Edward A.

    1995-01-01

    When determining the best solar cell technology for a particular space flight mission, accurate prediction of solar cell performance in a space radiation environment is essential. The current methodology used to make such predictions requires extensive experimental data measured under both electron and proton irradiation. Due to the rising cost of accelerators and irradiation facilities, such extensive data sets are expensive to obtain. Moreover, with the rapid development of novel cell designs, the necessary data are often not available. Therefore, a method for predicting cell degradation based on limited data is needed. Such a method has been developed at the Naval Research Laboratory based on damage correlation using 'displacement damage dose' which is the product of the non-ionizing energy loss (NIEL) and the particle fluence. Displacement damage dose is a direct analog of the ionization dose used to correlate the effects of ionizing radiations. In this method, the performance of a solar cell in a complex radiation environment can be predicted from data on a single proton energy and two electron energies, or one proton energy, one electron energy, and Co-60 gammas. This method has been used to accurately predict the extensive data set measured by Anspaugh on GaAs/Ge solar cells under a wide range of electron and proton energies. In this paper, the method is applied to InP solar cells using data measured under 1 MeV electron and 3 MeV proton irradiations, and the calculations are shown to agree well with the measured data. In addition to providing accurate damage predictions, this method also provides a basis for quantitative comparisons of the performance of different cell technologies. The performance of the present InP cells is compared to that published for GaAs/Ge cells. The results show InP to be inherently more resistant to displacement energy deposition than GaAs/Ge.
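
    Displacement damage dose is defined above as the product of NIEL and particle fluence; a trivial sketch follows, with a NIEL lookup table whose numbers are placeholders rather than published values.

```python
import numpy as np

def displacement_damage_dose(fluence, energy, niel_energies, niel_values):
    """D_d = NIEL(E) * fluence, interpolating NIEL from a lookup table.

    fluence       : particle fluence (particles/cm^2)
    energy        : particle energy (MeV)
    niel_energies : tabulated energies (MeV)          -- placeholder values below
    niel_values   : NIEL at those energies (MeV*cm^2/g)
    """
    niel = np.interp(energy, niel_energies, niel_values)
    return niel * fluence   # MeV/g

# hypothetical proton NIEL table for illustration only (not published data)
energies = np.array([1.0, 3.0, 10.0, 30.0])
niel = np.array([2e-2, 8e-3, 3e-3, 1.5e-3])
print(displacement_damage_dose(fluence=1e11, energy=3.0,
                               niel_energies=energies, niel_values=niel))
```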

  4. Evaluation of wave runup predictions from numerical and parametric models

    USGS Publications Warehouse

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
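
    The abstract does not restate the parameterization; as an illustration of the kind of empirical formula involved, the sketch below implements the widely cited Stockdon et al. (2006) form for the 2% exceedance runup. The coefficients follow that commonly quoted form and should be treated as an assumption to be checked against the original before use.

```python
import numpy as np

def runup_2pct(H0, T0, beta_f, g=9.81):
    """2% exceedance runup from offshore wave height, period, and foreshore slope.

    Form and coefficients follow the commonly cited Stockdon et al. (2006)
    parameterization (setup plus combined incident/infragravity swash).
    """
    L0 = g * T0**2 / (2.0 * np.pi)                           # deep-water wavelength
    setup = 0.35 * beta_f * np.sqrt(H0 * L0)                 # wave setup
    swash = np.sqrt(H0 * L0 * (0.563 * beta_f**2 + 0.004))   # incident + infragravity
    return 1.1 * (setup + swash / 2.0)

print(runup_2pct(H0=2.0, T0=10.0, beta_f=0.08))   # metres, illustrative inputs only
```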

  5. ESTIMATING CONTAMINANT DOSE FOR INTERMITTENT DERMAL CONTACT: MODEL DEVELOPMENT, TESTING, AND APPLICATION

    EPA Science Inventory

    Assessments of aggregate exposure to pesticides and other surface contamination in residential environments are often driven by assumptions about dermal contacts. Accurately predicting cumulative doses from realistic skin contact scenarios requires characterization of exposure sc...

  6. Improved analysis tool for concrete pavement : [project summary].

    DOT National Transportation Integrated Search

    2017-10-01

    University of Florida researchers developed 3D-FE models to more accurately predict the behavior of concrete slabs. They also followed up on a project to characterize strain gauge performance for a Florida Department of Transportation (FDOT) concrete...

  7. A Crowdsourcing Approach to Developing and Assessing Prediction Algorithms for AML Prognosis

    PubMed Central

    Noren, David P.; Long, Byron L.; Norel, Raquel; Rrhissorrakrai, Kahn; Hess, Kenneth; Hu, Chenyue Wendy; Bisberg, Alex J.; Schultz, Andre; Engquist, Erik; Liu, Li; Lin, Xihui; Chen, Gregory M.; Xie, Honglei; Hunter, Geoffrey A. M.; Norman, Thea; Friend, Stephen H.; Stolovitzky, Gustavo; Kornblau, Steven; Qutub, Amina A.

    2016-01-01

    Acute Myeloid Leukemia (AML) is a fatal hematological cancer. The genetic abnormalities underlying AML are extremely heterogeneous among patients, making prognosis and treatment selection very difficult. While clinical proteomics data have the potential to improve prognosis accuracy, thus far, the quantitative means to do so have yet to be developed. Here we report the results and insights gained from the DREAM 9 Acute Myeloid Leukemia Outcome Prediction Challenge (AML-OPC), a crowdsourcing effort designed to promote the development of quantitative methods for AML prognosis prediction. We identify the most accurate and robust models in predicting patient response to therapy, remission duration, and overall survival. We further investigate patient response to therapy, a clinically actionable prediction, and find that patients who are classified as resistant to therapy are harder to predict than responsive patients across the 31 models submitted to the challenge. The top two performing models, which held a high sensitivity to these patients, substantially utilized the proteomics data to make predictions. Using these models, we also identify which signaling proteins were useful in predicting patient therapeutic response. PMID:27351836

  8. New Objective Refraction Metric Based on Sphere Fitting to the Wavefront

    PubMed Central

    Martínez-Finkelshtein, Andreí

    2017-01-01

    Purpose: To develop an objective refraction formula based on the ocular wavefront error (WFE) expressed in terms of Zernike coefficients and pupil radius, which would be an accurate predictor of subjective spherical equivalent (SE) for different pupil sizes. Methods: A sphere is fitted to the ocular wavefront at the center and at a variable distance, t. The optimal fitting distance, t_opt, is obtained empirically from a dataset of 308 eyes as a function of objective refraction pupil radius, r0, and used to define the formula of a new wavefront refraction metric (MTR). The metric is tested in another, independent dataset of 200 eyes. Results: For pupil radii r0 ≤ 2 mm, the new metric predicts the equivalent sphere with similar accuracy (<0.1D), however, for r0 > 2 mm, the mean error of traditional metrics can increase beyond 0.25D, and the MTR remains accurate. The proposed metric allows clinicians to obtain an accurate clinical spherical equivalent value without rescaling/refitting of the wavefront coefficients. It has the potential to be developed into a metric which will be able to predict full spherocylindrical refraction for the desired illumination conditions and corresponding pupil size. PMID:29104804

  9. New Objective Refraction Metric Based on Sphere Fitting to the Wavefront.

    PubMed

    Jaskulski, Mateusz; Martínez-Finkelshtein, Andreí; López-Gil, Norberto

    2017-01-01

    To develop an objective refraction formula based on the ocular wavefront error (WFE) expressed in terms of Zernike coefficients and pupil radius, which would be an accurate predictor of subjective spherical equivalent (SE) for different pupil sizes. A sphere is fitted to the ocular wavefront at the center and at a variable distance, t. The optimal fitting distance, t_opt, is obtained empirically from a dataset of 308 eyes as a function of objective refraction pupil radius, r0, and used to define the formula of a new wavefront refraction metric (MTR). The metric is tested in another, independent dataset of 200 eyes. For pupil radii r0 ≤ 2 mm, the new metric predicts the equivalent sphere with similar accuracy (<0.1D), however, for r0 > 2 mm, the mean error of traditional metrics can increase beyond 0.25D, and the MTR remains accurate. The proposed metric allows clinicians to obtain an accurate clinical spherical equivalent value without rescaling/refitting of the wavefront coefficients. It has the potential to be developed into a metric which will be able to predict full spherocylindrical refraction for the desired illumination conditions and corresponding pupil size.
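
    For context, the traditional metrics that the MTR is compared against include the paraxial spherical equivalent computed from the Zernike defocus term; a minimal sketch of that baseline is given below. It is not the MTR itself, whose fitting-distance formula is defined in the paper, and the optional spherical-aberration correction term is included only as a common variant.

```python
import numpy as np

def paraxial_spherical_equivalent(c20, r0_mm, c40=0.0):
    """Paraxial SE (diopters) from Zernike defocus over a pupil of radius r0.

    c20   : Zernike defocus coefficient Z(2,0) in micrometres
    c40   : optional primary spherical aberration Z(4,0) correction term
    r0_mm : pupil radius in millimetres
    Standard paraxial form: SE = (-4*sqrt(3)*c20 + 12*sqrt(5)*c40) / r0^2.
    """
    return (-4.0 * np.sqrt(3.0) * c20 + 12.0 * np.sqrt(5.0) * c40) / r0_mm**2

print(paraxial_spherical_equivalent(c20=0.8, r0_mm=2.5))   # illustrative values
```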

  10. Accounting for receptor flexibility and enhanced sampling methods in computer-aided drug design.

    PubMed

    Sinko, William; Lindert, Steffen; McCammon, J Andrew

    2013-01-01

    Protein flexibility plays a major role in biomolecular recognition. In many cases, it is not obvious how molecular structure will change upon association with other molecules. In proteins, these changes can be major, with large deviations in overall backbone structure, or they can be more subtle as in a side-chain rotation. Either way the algorithms that predict the favorability of biomolecular association require relatively accurate predictions of the bound structure to give an accurate assessment of the energy involved in association. Here, we review a number of techniques that have been proposed to accommodate receptor flexibility in the simulation of small molecules binding to protein receptors. We investigate modifications to standard rigid receptor docking algorithms and also explore enhanced sampling techniques, and the combination of free energy calculations and enhanced sampling techniques. The understanding and allowance for receptor flexibility are helping to make computer simulations of ligand protein binding more accurate. These developments may help improve the efficiency of drug discovery and development. Efficiency will be essential as we begin to see personalized medicine tailored to individual patients, which means specific drugs are needed for each patient's genetic makeup. © 2012 John Wiley & Sons A/S.

  11. Predictive and Experimental Approaches for Elucidating Protein–Protein Interactions and Quaternary Structures

    PubMed Central

    Nealon, John Oliver; Philomina, Limcy Seby

    2017-01-01

    The elucidation of protein–protein interactions is vital for determining the function and action of quaternary protein structures. Here, we discuss the difficulty and importance of establishing protein quaternary structure and review in vitro and in silico methods for doing so. Determining the interacting partner proteins of predicted protein structures is very time-consuming when using in vitro methods, this can be somewhat alleviated by use of predictive methods. However, developing reliably accurate predictive tools has proved to be difficult. We review the current state of the art in predictive protein interaction software and discuss the problem of scoring and therefore ranking predictions. Current community-based predictive exercises are discussed in relation to the growth of protein interaction prediction as an area within these exercises. We suggest a fusion of experimental and predictive methods that make use of sparse experimental data to determine higher resolution predicted protein interactions as being necessary to drive forward development. PMID:29206185

  12. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    USGS Publications Warehouse

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.
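
    A minimal sketch of the Bayesian-update idea behind such a network, using a single discretized surf-zone wave-height node conditioned on a discretized offshore boundary condition; all probabilities are made up, and the real network has more nodes and conditional tables learned from model runs and observations.

```python
import numpy as np

# P(offshore wave-height bin): uncertain boundary condition (prior), placeholder values
p_offshore = np.array([0.2, 0.5, 0.3])            # low, medium, high

# P(surf-zone wave-height bin | offshore bin): hypothetical conditional table
p_surf_given_offshore = np.array([
    [0.70, 0.25, 0.05],   # offshore low
    [0.20, 0.60, 0.20],   # offshore medium
    [0.05, 0.30, 0.65],   # offshore high
])

# forward prediction: marginalize over the uncertain boundary condition
p_surf = p_offshore @ p_surf_given_offshore
print("predicted surf-zone distribution:", p_surf)

# assimilation: observe a 'medium' surf-zone wave and update the offshore belief
likelihood = p_surf_given_offshore[:, 1]
posterior = likelihood * p_offshore
posterior /= posterior.sum()
print("updated offshore distribution:", posterior)
```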

  13. An Economical Semi-Analytical Orbit Theory for Retarded Satellite Motion About an Oblate Planet

    NASA Technical Reports Server (NTRS)

    Gordon, R. A.

    1980-01-01

    Brouwer's and Brouwer-Lyddane's use of the von Zeipel-Delaunay method is employed to develop an efficient analytical orbit theory suitable for microcomputers. A simple, pseudo-phenomenological algorithm is introduced that accurately and economically synthesizes the modeling of drag effects and lends itself to efficient computer mechanization. Simulated trajectory data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects for microcomputer ground-based or onboard predicted orbital representation. Real tracking data are used to demonstrate that the theory's orbit determination and orbit prediction capabilities adapt favorably to, and are comparable with, results obtained using complex definitive Cowell method solutions on satellites experiencing significant drag effects.

  14. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Avik; Sun, Xin; Sundaresan, Sankaran

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.

  15. Obtaining Accurate Probabilities Using Classifier Calibration

    ERIC Educational Resources Information Center

    Pakdaman Naeini, Mahdi

    2016-01-01

    Learning probabilistic classification and prediction models that generate accurate probabilities is essential in many prediction and decision-making tasks in machine learning and data mining. One way to achieve this goal is to post-process the output of classification models to obtain more accurate probabilities. These post-processing methods are…
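
    One common post-processing route of the kind described is sigmoid (Platt) calibration of a classifier's scores; a minimal sketch using scikit-learn's calibration wrapper on synthetic data follows. This illustrates the general approach rather than the specific methods developed in the dissertation.

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# wrap the base model so its scores are post-processed into calibrated probabilities
calibrated = CalibratedClassifierCV(
    RandomForestClassifier(n_estimators=100, random_state=0),
    method="sigmoid", cv=5)
calibrated.fit(X_tr, y_tr)
print(calibrated.predict_proba(X_te[:5])[:, 1])   # calibrated P(class = 1)
```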

  16. ROI on yield data analysis systems through a business process management strategy

    NASA Astrophysics Data System (ADS)

    Rehani, Manu; Strader, Nathan; Hanson, Jeff

    2005-05-01

    The overriding motivation for yield engineering is profitability, which is achieved through yield management. Its first application is to continually reduce waste in the form of yield loss. New products, new technologies, and the dynamic state of the process and equipment keep introducing new ways to cause yield loss; in response, the yield management effort has to continually come up with new solutions to minimize it. The second application of yield engineering is to aid in accurate product pricing, which is achieved by predicting the future results of the yield engineering effort. The more accurate the yield prediction, the more accurate the wafer start volume and the wafer pricing. Another aspect of yield prediction pertains to gauging the impact of a yield problem and predicting how long it will last; the ability to predict such impacts again feeds into wafer start calculations and wafer pricing. The question, then, is this: if the stakes in yield management are so high, why are most yield management efforts run like science and engineering projects rather than like manufacturing? In the eighties, manufacturing put the theory of constraints into practice and put a premium on stability and predictability in manufacturing activities; why can't the same be done for yield management activities? This line of introspection led us to define and implement a business process to manage yield engineering activities. We analyzed the best known methods (BKM) and deployed a workflow tool to make them the standard operating procedure (SOP) for yield management. We present a case study in deploying a Business Process Management solution for semiconductor yield engineering in a high-mix ASIC environment. We will present a description of the situation prior to deployment, a window into the development process, and a valuation of the benefits.

  17. Analysis of Flight Management System Predictions of Idle-Thrust Descents

    NASA Technical Reports Server (NTRS)

    Stell, Laurel

    2010-01-01

    To enable arriving aircraft to fly optimized descents computed by the flight management system (FMS) in congested airspace, ground automation must accurately predict descent trajectories. To support development of the predictor and its uncertainty models, descents from cruise to the meter fix were executed using vertical navigation in a B737-700 simulator and a B777-200 simulator, both with commercial FMSs. For both aircraft types, the FMS computed the intended descent path for a specified speed profile assuming idle thrust after top of descent (TOD), and then it controlled the avionics without human intervention. The test matrix varied aircraft weight, descent speed, and wind conditions. The first analysis in this paper determined the effect of the test matrix parameters on the FMS computation of TOD location, and it compared the results to those for the current ground predictor in the Efficient Descent Advisor (EDA). The second analysis was similar but considered the time to fly a specified distance to the meter fix. The effects of the test matrix variables together with the accuracy requirements for the predictor will determine the allowable error for the predictor inputs. For the B737, the EDA prediction of meter fix crossing time agreed well with the FMS; but its prediction of TOD location probably was not sufficiently accurate to enable idle-thrust descents in congested airspace, even though the FMS and EDA gave similar shapes for TOD location as a function of the test matrix variables. For the B777, the FMS and EDA gave different shapes for the TOD location function, and the EDA prediction of the TOD location is not accurate enough to fully enable the concept. Furthermore, the differences between the FMS and EDA predictions of meter fix crossing time for the B777 indicated that at least one of them was not sufficiently accurate.

  18. Rapid and Accurate Evaluation of the Quality of Commercial Organic Fertilizers Using Near Infrared Spectroscopy

    PubMed Central

    Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui

    2014-01-01

    The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing the quality of commercial organic fertilizers. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique showed accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggested that the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers. PMID:24586313
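
    As a rough illustration of the NIR-PLS idea described above, the sketch below fits a partial least squares regression to a matrix of NIR spectra and cross-validates the prediction of one quality attribute. It is a minimal sketch assuming scikit-learn; the array names (spectra, organic_matter), the synthetic data and the choice of 10 latent variables are illustrative assumptions, not values from the study.

        # Minimal NIR-PLS sketch (placeholder data; not the authors' calibration pipeline).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(104, 700))      # 104 samples x 700 NIR wavelengths (placeholder)
        organic_matter = rng.normal(size=104)      # reference values from standard lab analysis (placeholder)

        pls = PLSRegression(n_components=10)       # number of latent variables is an assumption
        predicted = cross_val_predict(pls, spectra, organic_matter, cv=10).ravel()
        print("cross-validated R^2:", round(r2_score(organic_matter, predicted), 3))

    In practice the spectra would first be preprocessed (e.g. scatter correction or derivatives) and the number of latent variables chosen by cross-validation.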

  19. Rapid and accurate evaluation of the quality of commercial organic fertilizers using near infrared spectroscopy.

    PubMed

    Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui

    2014-01-01

    The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing the quality of commercial organic fertilizers. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique showed accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggested that the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers.

  20. Can Mathematical Models Predict the Outcomes of Prostate Cancer Patients Undergoing Intermittent Androgen Deprivation Therapy?

    NASA Astrophysics Data System (ADS)

    Everett, R. A.; Packer, A. M.; Kuang, Y.

    Androgen deprivation therapy is a common treatment for advanced or metastatic prostate cancer. Like the normal prostate, most tumors depend on androgens for proliferation and survival but often develop treatment resistance. Hormonal treatment causes many undesirable side effects which significantly decrease the quality of life for patients. Intermittently applying androgen deprivation in cycles reduces the total duration with these negative effects and may reduce selective pressure for resistance. We extend an existing model which used measurements of patient testosterone levels to accurately fit measured serum prostate specific antigen (PSA) levels. We test the model's predictive accuracy, using only a subset of the data to find parameter values. The results are compared with those of an existing piecewise linear model which does not use testosterone as an input. Since actual treatment protocol is to re-apply therapy when PSA levels recover beyond some threshold value, we develop a second method for predicting the PSA levels. Based on a small set of data from seven patients, our results showed that the piecewise linear model produced slightly more accurate results while the two predictive methods are comparable. This suggests that a simpler model may be more beneficial for a predictive use compared to a more biologically insightful model, although further research is needed in this field prior to implementing mathematical models as a predictive method in a clinical setting. Nevertheless, both models are an important step in this direction.

  1. Can Mathematical Models Predict the Outcomes of Prostate Cancer Patients Undergoing Intermittent Androgen Deprivation Therapy?

    NASA Astrophysics Data System (ADS)

    Everett, R. A.; Packer, A. M.; Kuang, Y.

    2014-04-01

    Androgen deprivation therapy is a common treatment for advanced or metastatic prostate cancer. Like the normal prostate, most tumors depend on androgens for proliferation and survival but often develop treatment resistance. Hormonal treatment causes many undesirable side effects which significantly decrease the quality of life for patients. Intermittently applying androgen deprivation in cycles reduces the total duration with these negative effects and may reduce selective pressure for resistance. We extend an existing model which used measurements of patient testosterone levels to accurately fit measured serum prostate specific antigen (PSA) levels. We test the model's predictive accuracy, using only a subset of the data to find parameter values. The results are compared with those of an existing piecewise linear model which does not use testosterone as an input. Since actual treatment protocol is to re-apply therapy when PSA levels recover beyond some threshold value, we develop a second method for predicting the PSA levels. Based on a small set of data from seven patients, our results showed that the piecewise linear model produced slightly more accurate results while the two predictive methods are comparable. This suggests that a simpler model may be more beneficial for a predictive use compared to a more biologically insightful model, although further research is needed in this field prior to implementing mathematical models as a predictive method in a clinical setting. Nevertheless, both models are an important step in this direction.

  2. Electrochemical carbon dioxide concentrator subsystem math model. [for manned space station

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Carlson, J. N.; Schubert, F. H.

    1974-01-01

    A steady state computer simulation model has been developed to describe the performance of a total six man, self-contained electrochemical carbon dioxide concentrator subsystem built for the space station prototype. The math model combines expressions describing the performance of the electrochemical depolarized carbon dioxide concentrator cells and modules previously developed with expressions describing the performance of the other major CS-6 components. The model is capable of accurately predicting CS-6 performance over EDC operating ranges and the computer simulation results agree with experimental data obtained over the prediction range.

  3. Transcriptomics in cancer diagnostics: developments in technology, clinical research and commercialization.

    PubMed

    Sager, Monica; Yeat, Nai Chien; Pajaro-Van der Stadt, Stefan; Lin, Charlotte; Ren, Qiuyin; Lin, Jimmy

    2015-01-01

    Transcriptomic technologies are evolving to diagnose cancer earlier and more accurately to provide greater predictive and prognostic utility to oncologists and patients. Digital techniques such as RNA sequencing are replacing still-imaging techniques to provide more detailed analysis of the transcriptome and aberrant expression that causes oncogenesis, while companion diagnostics are developing to determine the likely effectiveness of targeted treatments. This article examines recent advancements in molecular profiling research and technology as applied to cancer diagnosis, clinical applications and predictions for the future of personalized medicine in oncology.

  4. FINDSITE-metal: Integrating evolutionary information and machine learning for structure-based metal binding site prediction at the proteome level

    PubMed Central

    Brylinski, Michal; Skolnick, Jeffrey

    2010-01-01

    The rapid accumulation of gene sequences, many of which are hypothetical proteins with unknown function, has stimulated the development of accurate computational tools for protein function prediction with evolution/structure-based approaches showing considerable promise. In this paper, we present FINDSITE-metal, a new threading-based method designed specifically to detect metal binding sites in modeled protein structures. Comprehensive benchmarks using different quality protein structures show that weakly homologous protein models provide sufficient structural information for quite accurate annotation by FINDSITE-metal. Combining structure/evolutionary information with machine learning results in highly accurate metal binding annotations; for protein models constructed by TASSER, whose average Cα RMSD from the native structure is 8.9 Å, 59.5% (71.9%) of the best of top five predicted metal locations are within 4 Å (8 Å) from a bound metal in the crystal structure. For most of the targets, multiple metal binding sites are detected with the best predicted binding site at rank 1 and within the top 2 ranks in 65.6% and 83.1% of the cases, respectively. Furthermore, for iron, copper, zinc, calcium and magnesium ions, the binding metal can be predicted with high, typically 70-90%, accuracy. FINDSITE-metal also provides a set of confidence indexes that help assess the reliability of predictions. Finally, we describe the proteome-wide application of FINDSITE-metal that quantifies the metal binding complement of the human proteome. FINDSITE-metal is freely available to the academic community at http://cssb.biology.gatech.edu/findsite-metal/. PMID:21287609

  5. The Gastric/Pancreatic Amylase Ratio Predicts Postoperative Pancreatic Fistula With High Sensitivity and Specificity

    PubMed Central

    Jin, Shuo; Shi, Xiao-Ju; Sun, Xiao-Dong; Zhang, Ping; Lv, Guo-Yue; Du, Xiao-Hong; Wang, Si-Yuan; Wang, Guang-Yi

    2015-01-01

    Abstract This article aims to identify risk factors for postoperative pancreatic fistula (POPF) and evaluate the gastric/pancreatic amylase ratio (GPAR) on postoperative day (POD) 3 as a POPF predictor in patients who undergo pancreaticoduodenectomy (PD). POPF significantly contributes to mortality and morbidity in patients who undergo PD. Previously identified predictors for POPF often have low predictive accuracy. Therefore, accurate POPF predictors are needed. In this prospective cohort study, we measured the clinical and biochemical factors of 61 patients who underwent PD and diagnosed POPF according to the definition of the International Study Group of Pancreatic Fistula. We analyzed the association between POPF and various factors, identified POPF risk factors, and evaluated the predictive power of the GPAR on POD3 and the levels of serum and ascites amylase. Of the 61 patients, 21 developed POPF. The color of the pancreatic drain fluid, POD1 serum, POD1 median output of pancreatic drain fluid volume, and GPAR were significantly associated with POPF. The color of the pancreatic drain fluid and high GPAR were independent risk factors. Although serum and ascites amylase did not predict POPF accurately, the cutoff value was 1.24, and GPAR predicted POPF with high sensitivity and specificity. This is the first report demonstrating that high GPAR on POD3 is a risk factor for POPF and showing that GPAR is a more accurate predictor of POPF than the previously reported amylase markers. PMID:25621676

  6. The gastric/pancreatic amylase ratio predicts postoperative pancreatic fistula with high sensitivity and specificity.

    PubMed

    Jin, Shuo; Shi, Xiao-Ju; Sun, Xiao-Dong; Zhang, Ping; Lv, Guo-Yue; Du, Xiao-Hong; Wang, Si-Yuan; Wang, Guang-Yi

    2015-01-01

    This article aims to identify risk factors for postoperative pancreatic fistula (POPF) and evaluate the gastric/pancreatic amylase ratio (GPAR) on postoperative day (POD) 3 as a POPF predictor in patients who undergo pancreaticoduodenectomy (PD). POPF significantly contributes to mortality and morbidity in patients who undergo PD. Previously identified predictors for POPF often have low predictive accuracy. Therefore, accurate POPF predictors are needed. In this prospective cohort study, we measured the clinical and biochemical factors of 61 patients who underwent PD and diagnosed POPF according to the definition of the International Study Group of Pancreatic Fistula. We analyzed the association between POPF and various factors, identified POPF risk factors, and evaluated the predictive power of the GPAR on POD3 and the levels of serum and ascites amylase. Of the 61 patients, 21 developed POPF. The color of the pancreatic drain fluid, POD1 serum, POD1 median output of pancreatic drain fluid volume, and GPAR were significantly associated with POPF. The color of the pancreatic drain fluid and high GPAR were independent risk factors. Although serum and ascites amylase did not predict POPF accurately, the cutoff value was 1.24, and GPAR predicted POPF with high sensitivity and specificity. This is the first report demonstrating that high GPAR on POD3 is a risk factor for POPF and showing that GPAR is a more accurate predictor of POPF than the previously reported amylase markers.

  7. Summary of Data from the First AIAA CFD Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Levy, David W.; Zickuhr, Tom; Vassberg, John; Agrawal, Shreekant; Wahls, Richard A.; Pirzadeh, Shahyar; Hemsch, Michael J.

    2002-01-01

    The results from the first AIAA CFD Drag Prediction Workshop are summarized. The workshop was designed specifically to assess the state-of-the-art of computational fluid dynamics methods for force and moment prediction. An impartial forum was provided to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify areas needing additional research and development. The subject of the study was the DLR-F4 wing-body configuration, which is representative of transport aircraft designed for transonic flight. Specific test cases were required so that valid comparisons could be made. Optional test cases included constant-C_L drag-rise predictions typically used in airplane design by industry. Results are compared to experimental data from three wind tunnel tests. A total of 18 international participants using 14 different codes submitted data to the workshop. No single grid type or turbulence model was clearly more accurate than the others when compared either with each other or with the wind tunnel data. Most of the results overpredicted C_L0 and C_D0, but induced drag (dC_D/dC_L^2) agreed fairly well. Drag rise at high Mach number was underpredicted, however, especially at high C_L. On average, the drag data were fairly accurate, but the scatter was greater than desired. The results show that well-validated Reynolds-Averaged Navier-Stokes CFD methods are sufficiently accurate to make design decisions based on predicted drag.

  8. Predictive Monitoring for Improved Management of Glucose Levels

    PubMed Central

    Reifman, Jaques; Rajaraman, Srinivasan; Gribok, Andrei; Ward, W. Kenneth

    2007-01-01

    Background Recent developments and expected near-future improvements in continuous glucose monitoring (CGM) devices provide opportunities to couple them with mathematical forecasting models to produce predictive monitoring systems for early, proactive glycemia management of diabetes mellitus patients before glucose levels drift to undesirable levels. This article assesses the feasibility of data-driven models to serve as the forecasting engine of predictive monitoring systems. Methods We investigated the capabilities of data-driven autoregressive (AR) models to (1) capture the correlations in glucose time-series data, (2) make accurate predictions as a function of prediction horizon, and (3) be made portable from individual to individual without any need for model tuning. The investigation is performed by employing CGM data from nine type 1 diabetic subjects collected over a continuous 5-day period. Results With CGM data serving as the gold standard, AR model-based predictions of glucose levels assessed over nine subjects with Clarke error grid analysis indicated that, for a 30-minute prediction horizon, individually tuned models yield 97.6 to 100.0% of data in the clinically acceptable zones A and B, whereas cross-subject, portable models yield 95.8 to 99.7% of data in zones A and B. Conclusions This study shows that, for a 30-minute prediction horizon, data-driven AR models provide sufficiently-accurate and clinically-acceptable estimates of glucose levels for timely, proactive therapy and should be considered as the modeling engine for predictive monitoring of patients with type 1 diabetes mellitus. It also suggests that AR models can be made portable from individual to individual with minor performance penalties, while greatly reducing the burden associated with model tuning and data collection for model development. PMID:19885110
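
    To make the AR modelling step concrete, the sketch below fits an autoregressive model to a CGM series and issues a 30-minute-ahead forecast (six 5-minute steps). It is a hedged illustration using statsmodels; the synthetic series, the AR order of 30 and the sampling interval are assumptions, not the study's settings.

        # Sketch of a data-driven AR glucose predictor (illustrative; not the study's tuned model).
        import numpy as np
        from statsmodels.tsa.ar_model import AutoReg

        rng = np.random.default_rng(1)
        cgm = 120 + np.cumsum(rng.normal(0, 1, size=1440))   # placeholder CGM series, 5-min samples over 5 days

        order, horizon = 30, 6                                # 6 steps x 5 min = 30-minute prediction horizon
        fitted = AutoReg(cgm, lags=order).fit()
        forecast = fitted.predict(start=len(cgm), end=len(cgm) + horizon - 1)
        print("predicted glucose 30 minutes ahead:", round(float(forecast[-1]), 1))

    A cross-subject ("portable") variant would simply reuse coefficients fitted on other individuals rather than refitting per patient.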

  9. Modeling Clinical Outcomes in Prostate Cancer: Application and Validation of the Discrete Event Simulation Approach.

    PubMed

    Pan, Feng; Reifsnider, Odette; Zheng, Ying; Proskorovsky, Irina; Li, Tracy; He, Jianming; Sorensen, Sonja V

    2018-04-01

    Treatment landscape in prostate cancer has changed dramatically with the emergence of new medicines in the past few years. The traditional survival partition model (SPM) cannot accurately predict long-term clinical outcomes because it is limited by its ability to capture the key consequences associated with this changing treatment paradigm. The objective of this study was to introduce and validate a discrete-event simulation (DES) model for prostate cancer. A DES model was developed to simulate overall survival (OS) and other clinical outcomes based on patient characteristics, treatment received, and disease progression history. We tested and validated this model with clinical trial data from the abiraterone acetate phase III trial (COU-AA-302). The model was constructed with interim data (55% death) and validated with the final data (96% death). Predicted OS values were also compared with those from the SPM. The DES model's predicted time to chemotherapy and OS are highly consistent with the final observed data. The model accurately predicts the OS hazard ratio from the final data cut (predicted: 0.74; 95% confidence interval [CI] 0.64-0.85 and final actual: 0.74; 95% CI 0.6-0.88). The log-rank test to compare the observed and predicted OS curves indicated no statistically significant difference between observed and predicted curves. However, the predictions from the SPM based on interim data deviated significantly from the final data. Our study showed that a DES model with properly developed risk equations presents considerable improvements to the more traditional SPM in flexibility and predictive accuracy of long-term outcomes. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
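
    The following toy sketch shows the mechanics of a discrete-event simulation of the kind described above: event times are sampled per patient, a clock advances to the earliest event, and outcomes are tallied across many simulated patients. The exponential event-time draws stand in for the study's fitted risk equations and are purely illustrative.

        # Toy discrete-event simulation of progression and death times (mechanics only;
        # the exponential draws stand in for properly developed risk equations).
        import heapq
        import random

        def simulate_patient(rng, rate_progress=1.0 / 12.0, rate_death=1.0 / 30.0):
            """Return (time of progression or None, time of death), in months."""
            events = [(rng.expovariate(rate_progress), "progression"),
                      (rng.expovariate(rate_death), "death")]
            heapq.heapify(events)
            progression_time = None
            while events:
                t, kind = heapq.heappop(events)
                if kind == "death":
                    return progression_time, t
                progression_time = t              # progression occurred before death

        rng = random.Random(42)
        cohort = [simulate_patient(rng) for _ in range(10000)]
        mean_os = sum(death for _, death in cohort) / len(cohort)
        print("simulated mean overall survival (months):", round(mean_os, 1))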

  10. Electrochemical carbon dioxide concentrator: Math model

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Schubert, F. H.; Carlson, J. N.

    1973-01-01

    A steady state computer simulation model of an Electrochemical Depolarized Carbon Dioxide Concentrator (EDC) has been developed. The mathematical model combines EDC heat and mass balance equations with empirical correlations derived from experimental data to describe EDC performance as a function of the operating parameters involved. The model is capable of accurately predicting performance over EDC operating ranges. Model simulation results agree with the experimental data obtained over the prediction range.

  11. Hyperspectral scattering profiles for prediction of the microbial spoilage of beef

    NASA Astrophysics Data System (ADS)

    Peng, Yankun; Zhang, Jing; Wu, Jianhu; Hang, Hui

    2009-05-01

    Spoilage in beef is the result of decomposition and the formation of metabolites caused by the growth and enzymatic activity of microorganisms. There is still no technology for the rapid, accurate and non-destructive detection of bacterially spoiled or contaminated beef. In this study, a hyperspectral imaging technique was used to measure biochemical changes within fresh beef. Fresh beef rump steaks were purchased from a commercial plant and left to spoil in a refrigerator at 8°C. Every 12 hours, hyperspectral scattering profiles over the spectral region between 400 nm and 1100 nm were collected directly from the sample surface in reflection mode in order to develop an optimal model for prediction of beef spoilage; in parallel, the total viable count (TVC) per gram of beef was obtained by classical microbiological plating methods. The spectral scattering profiles at individual wavelengths were fitted accurately by a two-parameter Lorentzian distribution function. TVC prediction models were developed, using multi-linear regression, by relating individual Lorentzian parameters and their combinations at different wavelengths to the log10(TVC) value. The best predictions were obtained with r2 = 0.96 and SEP = 0.23 for log10(TVC). The research demonstrated that the hyperspectral imaging technique is a valid tool for real-time and non-destructive detection of bacterial spoilage in beef.
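
    The curve-fitting step can be illustrated with a few lines of SciPy: a two-parameter Lorentzian is fitted to one scattering profile, and the fitted parameters (collected across wavelengths) would then serve as regressors against log10(TVC). The functional form and the synthetic profile below are assumptions; the study's exact parameterisation may differ.

        # Fit a two-parameter Lorentzian to one scattering profile (assumed functional form, synthetic data).
        import numpy as np
        from scipy.optimize import curve_fit

        def lorentzian(d, a, b):
            # a: peak value at the incident point, b: half-width of the profile
            return a / (1.0 + (d / b) ** 2)

        distance = np.linspace(0.0, 10.0, 200)                   # mm from the incident point
        rng = np.random.default_rng(0)
        profile = lorentzian(distance, 5.0, 2.0) + rng.normal(0, 0.05, 200)

        (a_hat, b_hat), _ = curve_fit(lorentzian, distance, profile, p0=(1.0, 1.0))
        print("fitted Lorentzian parameters a, b:", round(a_hat, 2), round(b_hat, 2))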

  12. Evaluation of a Linear Cumulative Damage Failure Model for Epoxy Adhesive

    NASA Technical Reports Server (NTRS)

    Richardson, David E.; Batista-Rodriquez, Alicia; Macon, David; Totman, Peter; McCool, Alex (Technical Monitor)

    2001-01-01

    Recently a significant amount of work has been conducted to provide more complex and accurate material models for use in the evaluation of adhesive bondlines. Some of this has been prompted by recent studies into the effects of residual stresses on the integrity of bondlines. Several techniques have been developed for the analysis of bondline residual stresses. Key to these analyses is the criterion that is used for predicting failure. Residual stress loading of an adhesive bondline can occur over the life of the component. For many bonded systems, this can be several years. It is impractical to directly characterize failure of adhesive bondlines under a constant load for several years. Therefore, alternative approaches for predictions of bondline failures are required. In the past, cumulative damage failure models have been developed. These models have ranged from very simple to very complex. This paper documents the generation and evaluation of some of the most simple linear damage accumulation tensile failure models for an epoxy adhesive. This paper shows how several variations on the failure model were generated and presents an evaluation of the accuracy of these failure models in predicting creep failure of the adhesive. The paper shows that a simple failure model can be generated from short-term failure data for accurate predictions of long-term adhesive performance.
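
    The linear damage-accumulation idea can be stated in a few lines: short-term constant-load tests define a time-to-failure curve, each increment of time under a given stress consumes a proportional fraction of life, and failure is predicted when the summed fractions reach one. The power-law stress-life curve and the stress history below are illustrative assumptions, not the paper's characterisation of the epoxy.

        # Minimal linear cumulative-damage sketch (Miner-type rule; the stress-life curve is assumed).
        import numpy as np

        def time_to_failure(stress_mpa):
            # Assumed short-term characterisation: time to failure under constant stress (power law).
            return 1.0e6 * (10.0 / stress_mpa) ** 8      # hours

        def accumulated_damage(stress_history_mpa, dt_hours):
            # Linear accumulation: each interval contributes dt / t_f(stress); failure predicted at 1.0.
            return sum(dt_hours / time_to_failure(s) for s in stress_history_mpa)

        months = np.arange(120)                           # ten years, sampled monthly
        stress = 12.0 - 0.01 * months                     # placeholder slowly relaxing residual stress (MPa)
        damage = accumulated_damage(stress, dt_hours=730.0)
        print("accumulated damage fraction:", round(damage, 3), "(failure predicted at 1.0)")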

  13. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307
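
    The averaged-variable-importance style of feature selection can be sketched as follows: fit the random forest several times, average the importance rankings, keep the top-ranked predictors and refit. This is schematic only; the paper's AVI, KIAVI and Boruta procedures differ in detail, and the synthetic data stand in for the backscatter predictors.

        # Schematic random-forest feature selection by averaged variable importance (illustrative only).
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=500, n_features=30, n_informative=8, random_state=0)

        def averaged_importance(X, y, repeats=10):
            imps = []
            for seed in range(repeats):
                rf = RandomForestClassifier(n_estimators=300, random_state=seed).fit(X, y)
                imps.append(rf.feature_importances_)
            return np.mean(imps, axis=0)

        keep = np.argsort(averaged_importance(X, y))[-10:]       # retain the 10 highest-ranked predictors
        acc_all = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
        acc_sel = cross_val_score(RandomForestClassifier(random_state=0), X[:, keep], y, cv=5).mean()
        print("accuracy, all predictors:", round(acc_all, 3), "| selected predictors:", round(acc_sel, 3))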

  14. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models.

  15. Monitoring of Batch Industrial Crystallization with Growth, Nucleation, and Agglomeration. Part 1: Modeling with Method of Characteristics.

    PubMed

    Porru, Marcella; Özkan, Leyla

    2017-05-24

    This paper develops a new simulation model for crystal size distribution dynamics in industrial batch crystallization. The work is motivated by the necessity of accurate prediction models for online monitoring purposes. The proposed numerical scheme is able to handle growth, nucleation, and agglomeration kinetics by means of the population balance equation and the method of characteristics. The former offers a detailed description of the solid phase evolution, while the latter provides an accurate and efficient numerical solution. In particular, the accuracy of the prediction of the agglomeration kinetics, which cannot be ignored in industrial crystallization, has been assessed by comparing it with solutions in the literature. The efficiency of the solution has been tested on a simulation of a seeded flash cooling batch process. Since the proposed numerical scheme can accurately simulate the system behavior more than a hundred times faster than the batch duration, it is suitable for online applications such as process monitoring tools based on state estimators.
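
    A minimal method-of-characteristics sketch for a growth-plus-nucleation population balance is given below: each characteristic carries a size class whose size evolves along dL/dt = G(t), and nucleation is represented by injecting a new characteristic at the nucleus size every step. Agglomeration, which the paper does handle, is deliberately omitted here, and the kinetic expressions are placeholders.

        # Method-of-characteristics sketch for growth + nucleation (agglomeration omitted; placeholder kinetics).
        import numpy as np

        def growth_rate(t):
            return 1.0e-8 * (1.0 + 0.5 * np.sin(t / 600.0))   # m/s, stand-in for supersaturation-driven G(t)

        def nucleation_rate(t):
            return 1.0e6 * np.exp(-t / 3600.0)                # 1/(m^3 s), stand-in for B(t)

        dt, t_end = 5.0, 3600.0
        sizes = list(np.full(50, 1.0e-5))                     # seed characteristics (10 micron crystals)
        numbers = list(np.full(50, 1.0e8))                    # number density carried by each characteristic

        t = 0.0
        while t < t_end:
            sizes = [L + growth_rate(t) * dt for L in sizes]  # dL/dt = G along every characteristic
            sizes.append(1.0e-7)                              # new characteristic born at the nucleus size
            numbers.append(nucleation_rate(t) * dt)
            t += dt

        mean_size = sum(L * n for L, n in zip(sizes, numbers)) / sum(numbers)
        print("number-weighted mean crystal size (m):", mean_size)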

  16. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  17. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  18. Monitoring of Batch Industrial Crystallization with Growth, Nucleation, and Agglomeration. Part 1: Modeling with Method of Characteristics

    PubMed Central

    2017-01-01

    This paper develops a new simulation model for crystal size distribution dynamics in industrial batch crystallization. The work is motivated by the necessity of accurate prediction models for online monitoring purposes. The proposed numerical scheme is able to handle growth, nucleation, and agglomeration kinetics by means of the population balance equation and the method of characteristics. The former offers a detailed description of the solid phase evolution, while the latter provides an accurate and efficient numerical solution. In particular, the accuracy of the prediction of the agglomeration kinetics, which cannot be ignored in industrial crystallization, has been assessed by comparing it with solutions in the literature. The efficiency of the solution has been tested on a simulation of a seeded flash cooling batch process. Since the proposed numerical scheme can accurately simulate the system behavior more than a hundred times faster than the batch duration, it is suitable for online applications such as process monitoring tools based on state estimators. PMID:28603342

  19. Nucleon-Nucleon Total Cross Section

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    2008-01-01

    The total proton-proton and neutron-proton cross sections currently used in the transport code HZETRN show significant disagreement with experiment in the GeV and EeV energy ranges. The GeV range is near the region of maximum cosmic ray intensity. It is therefore important to correct these cross sections, so that predictions of space radiation environments will be accurate. Parameterizations of nucleon-nucleon total cross sections are developed which are accurate over the entire energy range of the cosmic ray spectrum.

  20. Quantification of observed flare parameters in relation to a shear-index and verification of MHD models for flare prediction

    NASA Technical Reports Server (NTRS)

    Wu, S. T.

    1987-01-01

    The goal for the SAMEX magnetograph's optical system is to accurately measure the polarization state of sunlight in a narrow spectral bandwidth over the field of view of an active region to make an accurate determination of the magnetic field in that region. The instrumental polarization is characterized. The optics and coatings were designed to minimize this spurious polarization introduced by foreoptics. The method developed to calculate the instrumental polarization of the SAMEX optics is described.

  1. A Novel Grading Biomarker for the Prediction of Conversion From Mild Cognitive Impairment to Alzheimer's Disease.

    PubMed

    Tong, Tong; Gao, Qinquan; Guerrero, Ricardo; Ledig, Christian; Chen, Liang; Rueckert, Daniel; Initiative, Alzheimer's Disease Neuroimaging

    2017-01-01

    Identifying mild cognitive impairment (MCI) subjects who will progress to Alzheimer's disease (AD) is not only crucial in clinical practice, but also has a significant potential to enrich clinical trials. The purpose of this study is to develop an effective biomarker for an accurate prediction of MCI-to-AD conversion from magnetic resonance images. We propose a novel grading biomarker for the prediction of MCI-to-AD conversion. First, we comprehensively study the effects of several important factors on the performance in the prediction task including registration accuracy, age correction, feature selection, and the selection of training data. Based on the studies of these factors, a grading biomarker is then calculated for each MCI subject using sparse representation techniques. Finally, the grading biomarker is combined with age and cognitive measures to provide a more accurate prediction of MCI-to-AD conversion. Using the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset, the proposed global grading biomarker achieved an area under the receiver operating characteristic curve (AUC) in the range of 79-81% for the prediction of MCI-to-AD conversion within three years in tenfold cross validations. The classification AUC further increases to 84-92% when age and cognitive measures are combined with the proposed grading biomarker. The obtained accuracy of the proposed biomarker benefits from the contributions of different factors: a tradeoff registration level to align images to the template space, the removal of the normal aging effect, selection of discriminative voxels, the calculation of the grading biomarker using AD and normal control groups, and the integration of sparse representation technique and the combination of cognitive measures. The evaluation on the ADNI dataset shows the efficacy of the proposed biomarker and demonstrates a significant contribution in accurate prediction of MCI-to-AD conversion.
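
    The sparse-representation grading step can be sketched as follows: a test subject's feature vector is coded as a sparse non-negative combination of AD and normal-control training subjects, and the grade is the label-weighted sum of the coding coefficients. The Lasso-based coding, the feature dimensionality and all data below are illustrative stand-ins, not the paper's exact formulation.

        # Rough sketch of a sparse-representation "grading" score (illustrative stand-in only).
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n_features = 200
        ad_templates = rng.normal(1.0, 1.0, size=(40, n_features))    # AD training subjects (placeholder)
        nc_templates = rng.normal(-1.0, 1.0, size=(40, n_features))   # normal-control training subjects
        dictionary = np.vstack([ad_templates, nc_templates]).T        # columns = training subjects
        labels = np.array([1.0] * 40 + [-1.0] * 40)                   # +1 for AD columns, -1 for NC columns

        def grading_score(test_vector, alpha=0.05):
            coder = Lasso(alpha=alpha, positive=True, max_iter=5000)
            coder.fit(dictionary, test_vector)                        # sparse non-negative coding
            w = coder.coef_
            return float(np.dot(w, labels) / (w.sum() + 1e-12))       # in [-1, 1]: +1 ~ AD-like, -1 ~ NC-like

        test_subject = rng.normal(0.8, 1.0, size=n_features)          # an MCI subject's features (placeholder)
        print("grading biomarker:", round(grading_score(test_subject), 3))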

  2. Radiomics biomarkers for accurate tumor progression prediction of oropharyngeal cancer

    NASA Astrophysics Data System (ADS)

    Hadjiiski, Lubomir; Chan, Heang-Ping; Cha, Kenny H.; Srinivasan, Ashok; Wei, Jun; Zhou, Chuan; Prince, Mark; Papagerakis, Silvana

    2017-03-01

    Accurate tumor progression prediction for oropharyngeal cancers is crucial for identifying patients who would best be treated with optimized treatment and therefore minimize the risk of under- or over-treatment. An objective decision support system that can merge the available radiomics, histopathologic and molecular biomarkers in a predictive model based on statistical outcomes of previous cases and machine learning may assist clinicians in making more accurate assessment of oropharyngeal tumor progression. In this study, we evaluated the feasibility of developing individual and combined predictive models based on quantitative image analysis from radiomics, histopathology and molecular biomarkers for oropharyngeal tumor progression prediction. With IRB approval, 31, 84, and 127 patients with head and neck CT (CT-HN), tumor tissue microarrays (TMAs) and molecular biomarker expressions, respectively, were collected. For 8 of the patients all 3 types of biomarkers were available and they were sequestered in a test set. The CT-HN lesions were automatically segmented using our level sets based method. Morphological, texture and molecular based features were extracted from CT-HN and TMA images, and selected features were merged by a neural network. The classification accuracy was quantified using the area under the ROC curve (AUC). Test AUCs of 0.87, 0.74, and 0.71 were obtained with the individual predictive models based on radiomics, histopathologic, and molecular features, respectively. Combining the radiomics and molecular models increased the test AUC to 0.90. Combining all 3 models increased the test AUC further to 0.94. This preliminary study demonstrates that the individual domains of biomarkers are useful and the integrated multi-domain approach is most promising for tumor progression prediction.

  3. Pattern recognition and functional neuroimaging help to discriminate healthy adolescents at risk for mood disorders from low risk adolescents.

    PubMed

    Mourão-Miranda, Janaina; Oliveira, Leticia; Ladouceur, Cecile D; Marquand, Andre; Brammer, Michael; Birmaher, Boris; Axelson, David; Phillips, Mary L

    2012-01-01

    There are no known biological measures that accurately predict future development of psychiatric disorders in individual at-risk adolescents. We investigated whether machine learning and fMRI could help to: 1. differentiate healthy adolescents genetically at-risk for bipolar disorder and other Axis I psychiatric disorders from healthy adolescents at low risk of developing these disorders; 2. identify those healthy genetically at-risk adolescents who were most likely to develop future Axis I disorders. 16 healthy offspring genetically at risk for bipolar disorder and other Axis I disorders by virtue of having a parent with bipolar disorder and 16 healthy, age- and gender-matched low-risk offspring of healthy parents with no history of psychiatric disorders (12-17 year-olds) performed two emotional face gender-labeling tasks (happy/neutral; fearful/neutral) during fMRI. We used Gaussian Process Classifiers (GPC), a machine learning approach that assigns a predictive probability of group membership to an individual person, to differentiate groups and to identify those at-risk adolescents most likely to develop future Axis I disorders. Using GPC, activity to neutral faces presented during the happy experiment accurately and significantly differentiated groups, achieving 75% accuracy (sensitivity = 75%, specificity = 75%). Furthermore, predictive probabilities were significantly higher for those at-risk adolescents who subsequently developed an Axis I disorder than for those at-risk adolescents remaining healthy at follow-up. We show that a combination of two promising techniques, machine learning and neuroimaging, not only discriminates healthy low-risk from healthy adolescents genetically at-risk for Axis I disorders, but may ultimately help to predict which at-risk adolescents subsequently develop these disorders.
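
    A compact illustration of the classification step is shown below: a Gaussian process classifier is trained and evaluated with leave-one-out cross-validation, and its predictive probability of group membership is recorded for each held-out subject, mirroring the way predictive probabilities are used above. scikit-learn's GaussianProcessClassifier and the synthetic features are stand-ins for the fMRI-pattern GPC used in the study.

        # Gaussian-process classification with per-subject predictive probabilities (illustrative stand-in).
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.gaussian_process.kernels import RBF
        from sklearn.model_selection import LeaveOneOut

        X, y = make_classification(n_samples=32, n_features=50, n_informative=10, random_state=0)

        probs = np.zeros(len(y))
        for train_idx, test_idx in LeaveOneOut().split(X):
            gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=10.0), random_state=0)
            gpc.fit(X[train_idx], y[train_idx])
            probs[test_idx] = gpc.predict_proba(X[test_idx])[:, 1]    # predictive probability of the "at-risk" class

        accuracy = np.mean((probs > 0.5) == y)
        print("leave-one-out accuracy:", round(accuracy, 2))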

  4. Accurate Prediction of Motor Failures by Application of Multi CBM Tools: A Case Study

    NASA Astrophysics Data System (ADS)

    Dutta, Rana; Singh, Veerendra Pratap; Dwivedi, Jai Prakash

    2018-02-01

    Motor failures are very difficult to predict accurately with a single condition-monitoring tool because the electrical and mechanical systems are closely related. Electrical problems, such as phase unbalance and stator winding insulation failures, can at times lead to vibration problems, while mechanical failures such as bearing failure lead to rotor eccentricity. In this case study of a 550 kW blower motor, a rotor bar crack was detected by current signature analysis and vibration monitoring confirmed the finding. In later months, in a similar motor, vibration monitoring predicted a bearing failure and current signature analysis confirmed it. In both cases, after dismantling the motor, the predictions were found to be accurate. In this paper we discuss the accurate prediction of motor failures through the use of multiple condition-monitoring tools, illustrated with two case studies.

  5. Aggregation Trade Offs in Family Based Recommendations

    NASA Astrophysics Data System (ADS)

    Berkovsky, Shlomo; Freyne, Jill; Coombe, Mac

    Personalized information access tools are frequently based on collaborative filtering recommendation algorithms. Collaborative filtering recommender systems typically suffer from a data sparsity problem, where systems do not have sufficient user data to generate accurate and reliable predictions. Prior research suggested using group-based user data in the collaborative filtering recommendation process to generate group-based predictions and partially resolve the sparsity problem. Although group recommendations are less accurate than personalized recommendations, they are more accurate than general non-personalized recommendations, which are the natural fall back when personalized recommendations cannot be generated. In this work we present initial results of a study that exploits the browsing logs of real families of users gathered in an eHealth portal. The browsing logs allowed us to experimentally compare the accuracy of two group-based recommendation strategies: aggregated group models and aggregated predictions. Our results showed that aggregating individual models into group models resulted in more accurate predictions than aggregating individual predictions into group predictions.
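
    The two aggregation strategies compared above can be contrasted with a deliberately trivial predictor: strategy one merges the members' profiles into a single group model before predicting, while strategy two predicts for each member and then merges the predictions. The mean-rating "recommender" and the toy family below are placeholders; the study used collaborative filtering over real browsing logs.

        # Toy contrast of the two group-recommendation strategies (trivially simple predictor).
        import numpy as np

        ratings = {                                     # per-member item ratings (NaN = unseen), placeholder data
            "parent": np.array([5.0, np.nan, 3.0, 4.0]),
            "child1": np.array([np.nan, 2.0, 4.0, np.nan]),
            "child2": np.array([4.0, 1.0, np.nan, 5.0]),
        }

        def predict_from_profile(profile):
            # Stand-in "recommender": predict every item as the profile's mean observed rating.
            return np.full(profile.shape, np.nanmean(profile))

        # Strategy 1: aggregate the individual models (profiles) into one group profile, then predict.
        group_profile = np.nanmean(np.vstack(list(ratings.values())), axis=0)
        pred_group_model = predict_from_profile(group_profile)

        # Strategy 2: predict for each member separately, then aggregate the predictions.
        pred_aggregated = np.nanmean(np.vstack([predict_from_profile(p) for p in ratings.values()]), axis=0)

        print("aggregated group model:", np.round(pred_group_model, 2))
        print("aggregated predictions:", np.round(pred_aggregated, 2))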

  6. Numerical simulation of dune-flat bed transition and stage‐discharge relationship with hysteresis effect

    USGS Publications Warehouse

    Shimizu, Yasuyuki; Giri, Sanjay; Yamaguchi, Satomi; Nelson, Jonathan M.

    2009-01-01

    This work presents recent advances on morphodynamic modeling of bed forms under unsteady discharge. This paper includes further development of a morphodynamic model proposed earlier by Giri and Shimizu (2006a). This model reproduces the temporal development of river dunes and accurately replicates the physical properties associated with bed form evolution. Model results appear to provide accurate predictions of bed form geometry and form drag over bed forms for arbitrary steady flows. However, accurate predictions of temporal changes of form drag are key to the prediction of stage‐discharge relation during flood events. Herein, the model capability is extended to replicate the dune–flat bed transition, and in turn, the variation of form drag produced by the temporal growth or decay of bed forms under unsteady flow conditions. Some numerical experiments are performed to analyze hysteresis of the stage‐discharge relationship caused by the transition between dune and flat bed regimes during rising and falling stages of varying flows. The numerical model successfully simulates dune–flat bed transition and the associated hysteresis of the stage‐discharge relationship; this is in good agreement with physical observations but has been treated in the past only using empirical methods. A hypothetical relationship for a sediment parameter (the mean step length) is proposed to a first level of approximation that enables reproduction of the dune–flat bed transition. The proposed numerical model demonstrates its ability to address an important practical problem associated with bed form evolution and flow resistance in varying flows.

  7. Practical theories for service life prediction of critical aerospace structural components

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Monaghan, Richard C.; Jackson, Raymond H.

    1992-01-01

    A new second-order theory was developed for predicting the service lives of aerospace structural components. The predictions based on this new theory were compared with those based on the Ko first-order theory and the classical theory of service life predictions. The new theory gives very accurate service life predictions. An equivalent constant-amplitude stress cycle method was proposed for representing the random load spectrum for crack growth calculations. This method predicts the most conservative service life. The proposed use of minimum detectable crack size, instead of proof load established crack size as an initial crack size for crack growth calculations, could give a more realistic service life.

  8. Risk prediction for chronic kidney disease progression using heterogeneous electronic health record data and time series analysis.

    PubMed

    Perotte, Adler; Ranganath, Rajesh; Hirsch, Jamie S; Blei, David; Elhadad, Noémie

    2015-07-01

    As adoption of electronic health records continues to increase, there is an opportunity to incorporate clinical documentation as well as laboratory values and demographics into risk prediction modeling. The authors develop a risk prediction model for chronic kidney disease (CKD) progression from stage III to stage IV that includes longitudinal data and features drawn from clinical documentation. The study cohort consisted of 2908 primary-care clinic patients who had at least three visits prior to January 1, 2013 and developed CKD stage III during their documented history. Development and validation cohorts were randomly selected from this cohort and the study datasets included longitudinal inpatient and outpatient data from these populations. Time series analysis (Kalman filter) and survival analysis (Cox proportional hazards) were combined to produce a range of risk models. These models were evaluated using concordance, a discriminatory statistic. A risk model incorporating longitudinal data on clinical documentation and laboratory test results (concordance 0.849) predicts progression from stage III CKD to stage IV CKD more accurately when compared to a similar model without laboratory test results (concordance 0.733, P<.001), a model that only considers the most recent laboratory test results (concordance 0.819, P < .031) and a model based on estimated glomerular filtration rate (concordance 0.779, P < .001). A risk prediction model that takes longitudinal laboratory test results and clinical documentation into consideration can predict CKD progression from stage III to stage IV more accurately than three models that do not take all of these variables into consideration. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
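
    A condensed sketch of the survival-analysis half of such a model is shown below: features summarising the longitudinal record enter a Cox proportional hazards model, and discrimination is reported as concordance. The lifelines library is used as a stand-in, the Kalman-filter smoothing of the laboratory trajectories is omitted, and all variables and data are synthetic placeholders.

        # Cox proportional-hazards risk model with concordance evaluation (synthetic placeholder data).
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "egfr_slope": rng.normal(-2.0, 1.0, n),          # smoothed lab-trajectory feature (placeholder)
            "note_topic_score": rng.normal(0.0, 1.0, n),     # feature derived from clinical documentation
            "age": rng.normal(65.0, 10.0, n),
        })
        risk = 0.04 * df["age"] - 0.5 * df["egfr_slope"] + 0.3 * df["note_topic_score"]
        df["T"] = rng.exponential(scale=np.exp(-risk / 2.0) * 60.0)   # months to stage IV (synthetic)
        df["E"] = rng.random(n) < 0.7                                  # 1 = progression observed, 0 = censored

        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        print("concordance:", round(cph.concordance_index_, 3))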

  9. Comparison of techniques for correction of magnification of pelvic X-rays for hip surgery planning.

    PubMed

    The, Bertram; Kootstra, Johan W J; Hosman, Anton H; Verdonschot, Nico; Gerritsma, Carina L E; Diercks, Ron L

    2007-12-01

    The aim of this study was to develop an accurate method for correction of magnification of pelvic x-rays to enhance the accuracy of hip surgery planning. All investigated methods aim at estimating the anteroposterior location of the hip joint in the supine position to correctly position a reference object for correction of magnification. An existing method, currently used in clinical practice in our clinics, is based on estimating the position of the hip joint by palpation of the greater trochanter. It is only moderately accurate and difficult to execute reliably in clinical practice. To develop a new method, 99 patients who already had a hip implant in situ were included; this enabled the true location of the hip joint to be deduced from the magnification of the prosthesis. Physical examination was used to obtain predictor variables possibly associated with the height of the hip joint. This included a simple dynamic hip joint examination to estimate the position of the center of rotation. Prediction equations were then constructed using regression analysis. The performance of these prediction equations was compared with the performance of the existing protocol. The mean absolute error in predicting the height of the hip joint center using the old method was 20 mm (range -79 mm to +46 mm). This was 11 mm for the new method (-32 mm to +39 mm). The prediction equation is: height (mm) = 34 + 1/2 abdominal circumference (cm). The newly developed prediction equation is a superior method for predicting the height of the hip joint center for correction of magnification of pelvic x-rays. We recommend its implementation in the departments of radiology and orthopedic surgery.
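
    For clarity, the reported prediction equation and its use in positioning the calibration object can be written out directly. The film-focus distance and the simple projective magnification formula below are illustrative assumptions, not values taken from the paper.

        # The reported prediction equation applied to position the calibration marker (illustrative names).
        def hip_joint_height_mm(abdominal_circumference_cm):
            """Predicted anteroposterior height of the hip joint centre above the table."""
            return 34.0 + 0.5 * abdominal_circumference_cm

        def magnification_factor(film_focus_distance_mm, object_height_mm):
            # Simple projective magnification: M = FFD / (FFD - object-to-film distance).
            return film_focus_distance_mm / (film_focus_distance_mm - object_height_mm)

        h = hip_joint_height_mm(100.0)                 # e.g. a 100 cm abdominal circumference -> 84 mm
        print("predicted hip-centre height:", h, "mm")
        print("magnification at an assumed 1150 mm FFD:", round(magnification_factor(1150.0, h), 3))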

  10. Application of the Spectral Element Method to Interior Noise Problems

    NASA Technical Reports Server (NTRS)

    Doyle, James F.

    1998-01-01

    The primary effort of this research project was focused on the development of analytical methods for the accurate prediction of structural acoustic noise and response. Of particular interest was the development of curved frame and shell spectral elements for the efficient computation of structural response, and of schemes to match this response to the surrounding fluid.

  11. Future-Orientated Approaches to Curriculum Development: Fictive Scripting

    ERIC Educational Resources Information Center

    Garraway, James

    2017-01-01

    Though the future cannot be accurately predicted, it is possible to envisage a number of probable developments which can promote thinking about the future and so promote a more informed stance about what should or should not be done. Studies in technology and society have claimed that the use of a type of forecasting using plausible but imaginary…

  12. Development of a Skin Burn Predictive Model adapted to Laser Irradiation

    NASA Astrophysics Data System (ADS)

    Sonneck-Museux, N.; Scheer, E.; Perez, L.; Agay, D.; Autrique, L.

    2016-12-01

    Laser technology is increasingly used, and it is crucial for both safety and medical reasons that the impact of laser irradiation on human skin can be accurately predicted. This study is mainly focused on laser-skin interactions and potential lesions (burns). A mathematical model dedicated to heat transfers in skin exposed to infrared laser radiation has been developed. The model is validated by studying heat transfers in human skin and simultaneously performing experiments on an animal model (pig). For all experimental tests, the pig's skin surface temperature is recorded. Three laser wavelengths have been tested: 808 nm, 1940 nm and 10 600 nm. The first is a diode laser producing radiation absorbed deep within the skin. The second wavelength has a more superficial effect. For the third wavelength, skin is an opaque material. The validity of the developed models is verified by comparison with experimental results (in vivo tests) and the results of previous studies reported in the literature. The comparison shows that the models accurately predict the burn degree caused by laser radiation over a wide range of conditions. The results show that the important parameter for burn prediction is the extinction coefficient. For the 1940 nm wavelength especially, significant differences between modeling results and the literature have been observed, mainly due to this coefficient's value. This new model can be used as a predictive tool in order to estimate the amount of injury induced by several types (power-duration couples) of laser exposure on the arm, the face and the palm of the hand.
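
    A drastically simplified, one-dimensional version of such a model is sketched below: heat conduction with a Beer-Lambert volumetric source whose decay is set by the extinction coefficient highlighted above, solved with explicit finite differences. Constant tissue properties, the absence of blood perfusion, the boundary conditions and all numerical values are simplifying assumptions, not the authors' model.

        # 1-D heat conduction with a Beer-Lambert laser source term (constant properties, no perfusion).
        import numpy as np

        k, rho, cp = 0.5, 1100.0, 3600.0        # skin conductivity (W/m/K), density, heat capacity (assumed)
        mu = 2000.0                             # extinction coefficient (1/m) -- the key wavelength-dependent input
        irradiance = 2.0e4                      # laser irradiance at the surface (W/m^2), placeholder
        depth, nz = 5.0e-3, 200
        dz = depth / nz
        dt = 0.2 * rho * cp * dz**2 / k         # stable explicit time step
        z = (np.arange(nz) + 0.5) * dz
        T = np.full(nz, 37.0)

        source = mu * irradiance * np.exp(-mu * z) / (rho * cp)       # heating rate (K/s) deposited by the beam

        t, exposure = 0.0, 2.0                  # 2 s exposure (placeholder)
        while t < exposure:
            lap = np.empty(nz)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
            lap[0] = (T[1] - T[0]) / dz**2      # crude insulated surface boundary
            lap[-1] = (T[-2] - T[-1]) / dz**2   # crude zero-flux deep-tissue boundary
            T = T + dt * (k / (rho * cp) * lap + source)
            t += dt

        print("peak temperature after exposure (deg C):", round(T.max(), 1))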

  13. Accuracy of a nomogram for prediction of lymph-node metastasis detected with conventional histopathology and ultrastaging in endometrial cancer

    PubMed Central

    Koskas, M; Chereau, E; Ballester, M; Dubernard, G; Lécuru, F; Heitz, D; Mathevet, P; Marret, H; Querleu, D; Golfier, F; Leblanc, E; Luton, D; Rouzier, R; Daraï, E

    2013-01-01

    Background: We developed a nomogram based on five clinical and pathological characteristics to predict lymph-node (LN) metastasis with a high concordance probability in endometrial cancer. Sentinel LN (SLN) biopsy has been suggested as a compromise between systematic lymphadenectomy and no dissection in patients with low-risk endometrial cancer. Methods: Patients with stage I–II endometrial cancer had pelvic SLN and systematic pelvic-node dissection. All LNs were histopathologically examined, and the SLNs were examined by immunohistochemistry. We compared the accuracy of the nomogram at predicting LN detected with conventional histopathology (macrometastasis) and ultrastaging procedure using SLN (micrometastasis). Results: Thirty-eight of the 187 patients (20%) had pelvic LN metastases, 20 had macrometastases and 18 had micrometastases. For the prediction of macrometastases, the nomogram showed good discrimination, with an area under the receiver operating characteristic curve (AUC) of 0.76, and was well calibrated (average error =2.1%). For the prediction of micro- and macrometastases, the nomogram showed poorer discrimination, with an AUC of 0.67, and was less well calibrated (average error =10.9%). Conclusion: Our nomogram is accurate at predicting LN macrometastases but less accurate at predicting micrometastases. Our results suggest that micrometastases are an ‘intermediate state' between disease-free LN and macrometastasis. PMID:23481184

  14. Can We Predict Patient Wait Time?

    PubMed

    Pianykh, Oleg S; Rosenthal, Daniel I

    2015-10-01

    The importance of patient wait-time management and predictability can hardly be overestimated: For most hospitals, it is the patient queues that drive and define every bit of clinical workflow. The objective of this work was to study the predictability of patient wait time and identify its most influential predictors. To solve this problem, we developed a comprehensive list of 25 wait-related parameters, suggested in earlier work and observed in our own experiments. All parameters were chosen as derivable from a typical Hospital Information System dataset. The parameters were fed into several time-predicting models, and the best parameter subsets, discovered through exhaustive model search, were applied to a large sample of actual patient wait data. We were able to discover the most efficient wait-time prediction factors and models, such as the line-size models introduced in this work. Moreover, these models proved to be equally accurate and computationally efficient. Finally, the selected models were implemented in our patient waiting areas, displaying predicted wait times on the monitors located at the front desks. The limitations of these models are also discussed. Optimal regression models based on wait-line sizes can provide accurate and efficient predictions for patient wait time. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
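
    The "line-size" family of models reduces, in its simplest form, to a regression of observed waits on the number of patients already in the queue, optionally with calendar features. The sketch below fits and evaluates such a model on synthetic data; the feature set, coefficients and data are illustrative assumptions.

        # Line-size wait-time model: regress observed waits on queue length (synthetic data).
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(0)
        n = 2000
        line_size = rng.integers(0, 15, n)                   # patients already waiting at check-in
        hour = rng.integers(8, 18, n)                        # hour of day
        wait_min = 5 + 6.5 * line_size + 0.8 * (hour - 8) + rng.normal(0, 4, n)   # synthetic ground truth

        X = np.column_stack([line_size, hour])
        model = LinearRegression().fit(X[:1500], wait_min[:1500])
        pred = model.predict(X[1500:])
        print("held-out mean absolute error (min):", round(mean_absolute_error(wait_min[1500:], pred), 1))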

  15. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on the adaptive model switching approach that at any point in time selects the most promising time series model out of the pool of many possible models, and consequently, combines advantages of the population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to supporting personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
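
    The switching idea itself is simple and is illustrated below with three toy candidate models (a population constant, a patient-specific running mean and a last-value model): at each step the framework forecasts with whichever candidate has had the smallest recent error for this patient. The candidate models, window length and data are placeholders for the richer model pool described above.

        # Toy adaptive model switching: forecast with the candidate that has the lowest recent error.
        import numpy as np

        rng = np.random.default_rng(0)
        series = np.sin(np.arange(200) / 10.0) * 2 + rng.normal(0, 0.3, 200) + 7.0   # one patient's lab values

        models = {
            "population_mean": lambda hist: 7.0,                    # population-level constant
            "patient_mean":    lambda hist: float(np.mean(hist)),   # patient-specific running mean
            "last_value":      lambda hist: float(hist[-1]),        # short-term individualized model
        }

        window, errors = 10, {name: [] for name in models}
        predictions = []
        for t in range(20, len(series)):
            hist = series[:t]
            recent = {n: np.mean(np.abs(e[-window:])) if e else np.inf for n, e in errors.items()}
            best = min(recent, key=recent.get)                       # model with the smallest recent error
            predictions.append(models[best](hist))
            for name, f in models.items():                           # update every model's error history
                errors[name].append(series[t] - f(hist))

        mae = float(np.mean(np.abs(np.array(predictions) - series[20:])))
        print("switching forecaster MAE:", round(mae, 3))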

  16. Stochastic Short-term High-resolution Prediction of Solar Irradiance and Photovoltaic Power Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Olama, Mohammed M.; Dong, Jin

    The increased penetration of solar photovoltaic (PV) energy sources into electric grids has increased the need for accurate modeling and prediction of solar irradiance and power production. Existing modeling and prediction techniques focus on long-term low-resolution prediction over minutes to years. This paper examines the stochastic modeling and short-term high-resolution prediction of solar irradiance and PV power output. We propose a stochastic state-space model to characterize the behaviors of solar irradiance and PV power output. This prediction model is suitable for the development of optimal power controllers for PV sources. A filter-based expectation-maximization and Kalman filtering mechanism is employed to estimate the parameters and states in the state-space model. The mechanism results in a finite-dimensional filter which only uses the first- and second-order statistics. The structure of the scheme contributes to a direct prediction of the solar irradiance and PV power output without any linearization process or simplifying assumptions of the signal's model. This enables the system to accurately predict small as well as large fluctuations of the solar signals. The mechanism is recursive, allowing the solar irradiance and PV power to be predicted online from measurements. The mechanism is tested using solar irradiance and PV power measurement data collected locally in our lab.
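
    A minimal sketch of one-step-ahead Kalman prediction for a scalar random-walk state-space model is shown below; the noise variances are placeholders, and the paper's filter-based expectation-maximization step for estimating them is not reproduced here.

```python
# One-step-ahead prediction with a scalar random-walk model and Kalman filter.
import numpy as np

def kalman_one_step(measurements, q=1.0, r=4.0):
    """Return one-step-ahead predictions for a scalar random-walk model.

    x_k = x_{k-1} + w_k,  w_k ~ N(0, q)   (state: underlying irradiance)
    z_k = x_k + v_k,      v_k ~ N(0, r)   (measurement)
    """
    x, p = measurements[0], 1.0
    preds = []
    for z in measurements[1:]:
        # Predict the next state before seeing the measurement.
        x_pred, p_pred = x, p + q
        preds.append(x_pred)
        # Update with the new measurement.
        k = p_pred / (p_pred + r)
        x = x_pred + k * (z - x_pred)
        p = (1 - k) * p_pred
    return np.array(preds)

irradiance = np.array([612.0, 618.0, 605.0, 640.0, 655.0, 630.0])  # W/m^2, synthetic
print(kalman_one_step(irradiance))
```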

  17. Forecasting municipal solid waste generation using artificial intelligence modelling approaches.

    PubMed

    Abbasi, Maryam; El Hanandeh, Ali

    2016-10-01

    Municipal solid waste (MSW) management is a major concern to local governments to protect human health, the environment and to preserve natural resources. The design and operation of an effective MSW management system requires accurate estimation of future waste generation quantities. The main objective of this study was to develop a model for accurate forecasting of MSW generation that helps waste related organizations to better design and operate effective MSW management systems. Four intelligent system algorithms including support vector machine (SVM), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and k-nearest neighbours (kNN) were tested for their ability to predict monthly waste generation in the Logan City Council region in Queensland, Australia. Results showed artificial intelligence models have good prediction performance and could be successfully applied to establish municipal solid waste forecasting models. Using machine learning algorithms can reliably predict monthly MSW generation by training with waste generation time series. In addition, results suggest that the ANFIS system produced the most accurate forecasts of the peaks while kNN was successful in predicting the monthly averages of waste quantities. Based on the results, the total annual MSW generated in Logan City will reach 9.4×10⁷ kg by 2020 while the peak monthly waste will reach 9.37×10⁶ kg. Copyright © 2016 Elsevier Ltd. All rights reserved.
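
    Of the four algorithms, kNN is the simplest to sketch: lagged monthly totals form the feature vector for predicting the next month. The synthetic series, lag length, and neighbour count below are illustrative assumptions only, not the study's configuration.

```python
# Hedged sketch of kNN forecasting on a monthly waste-generation series.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def make_lagged(series, lag=12):
    """Turn a 1-D series into (lagged window, next value) training pairs."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = np.array(series[lag:])
    return X, y

rng = np.random.default_rng(0)
monthly_waste = (8e6 + 5e5 * np.sin(np.arange(60) * 2 * np.pi / 12)
                 + rng.normal(0, 1e5, 60))           # synthetic kg/month

X, y = make_lagged(monthly_waste, lag=12)
knn = KNeighborsRegressor(n_neighbors=3).fit(X, y)
next_month = knn.predict(monthly_waste[-12:].reshape(1, -1))[0]
print(f"Predicted next-month MSW: {next_month:.3e} kg")
```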

  18. A multisensor evaluation of the asymmetric convective model, version 2, in southeast Texas.

    PubMed

    Kolling, Jenna S; Pleim, Jonathan E; Jeffries, Harvey E; Vizuete, William

    2013-01-01

    There currently exist a number of planetary boundary layer (PBL) schemes that can represent the effects of turbulence in daytime convective conditions, although these schemes remain a large source of uncertainty in meteorology and air quality model simulations. This study evaluates a recently developed combined local and nonlocal closure PBL scheme, the Asymmetric Convective Model, version 2 (ACM2), against PBL observations taken from radar wind profilers, a ground-based lidar, and multiple daytime radiosonde balloon launches. These observations were compared against predictions of PBLs from the Weather Research and Forecasting (WRF) model version 3.1 with the ACM2 PBL scheme option, and the Fifth-Generation Meteorological Model (MM5) version 3.7.3 with the Eta PBL scheme option that is currently being used to develop ozone control strategies in southeast Texas. MM5 and WRF predictions during the regulatory modeling episode were evaluated on their ability to predict the rise and fall of the PBL during daytime convective conditions across southeastern Texas. The MM5-predicted PBLs consistently underpredicted observations, and were also less than the WRF PBL predictions. The analysis reveals that the MM5 predicted a slower rising and shallower PBL not representative of the daytime urban boundary layer. In contrast, the WRF model predicted a more accurate PBL evolution, improving the root mean square error (RMSE) both temporally and spatially. The WRF model also more accurately predicted vertical profiles of temperature and moisture in the lowest 3 km of the atmosphere. Inspection of median surface temperature and moisture time-series plots revealed higher predicted surface temperatures in WRF and more surface moisture in MM5. These could not be attributed to surface heat fluxes, and thus the differences in performance of the WRF and MM5 models are likely due to the PBL schemes. An accurate depiction of the diurnal evolution of the planetary boundary layer (PBL) is necessary for realistic air quality simulations, and for formulating effective policy. The meteorological model used to support the southeast Texas O3 attainment demonstration made predictions of the PBL that were consistently less than those found in observations. The use of the Asymmetric Convective Model, version 2 (ACM2), predicted taller PBL heights and improved model predictions. A lower predicted PBL height in an air quality model would increase precursor concentrations and change the chemical production of O3 and possibly the response to control strategies.

  19. IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.

    PubMed

    Huang, Lihan

    2017-12-04

    The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for determination of kinetic parameters in predictive microbiology. The algorithm is incorporated with user-friendly graphical interfaces (GUIs) to develop a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide the users to easily navigate through the data analysis process and properly select the initial parameters for different combinations of mathematical models. The software is developed for one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and mathematical models. The current version of the software is specifically designed for constructing tertiary models with time and temperature as the independent model parameters in the package. The software is tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving the inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
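
    The one-step (global) idea can be illustrated by fitting a primary survival model and a secondary model simultaneously to all isothermal curves, minimizing a single global residual. The log-linear/Bigelow model pair and the data below are assumptions chosen for illustration; the actual software supports several primary/secondary combinations.

```python
# Illustrative one-step global fit of a log-linear primary survival model with
# a Bigelow-type secondary model across all isothermal curves at once.
import numpy as np
from scipy.optimize import least_squares

# (temperature C, time min, log10 CFU/g) for three synthetic isothermal curves
data = np.array([
    (55, 0, 7.0), (55, 30, 6.1), (55, 60, 5.2),
    (60, 0, 7.0), (60, 10, 5.9), (60, 20, 4.8),
    (65, 0, 7.0), (65, 3, 5.8), (65, 6, 4.7),
])
T, t, logN = data[:, 0], data[:, 1], data[:, 2]
Tref = 60.0

def residuals(params):
    logN0, logDref, z = params
    logD = logDref - (T - Tref) / z          # secondary model (Bigelow)
    return logN0 - t / 10 ** logD - logN     # primary-model residuals, all curves

fit = least_squares(residuals, x0=[7.0, 1.0, 6.0])
print(dict(zip(["log10 N0", "log10 Dref", "z"], fit.x)))
```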

  20. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, after the number of notified cases in 2006 was the highest among the developed countries. Thus, there is a need to develop a model or tool that accurately predicts the number of campylobacteriosis cases, because the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual case numbers accurately. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data has additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction by the additive ARIMA with intervention was slightly better than the prediction by the Holt-Winters multiplicative method for the annual total in 2010, the former predicting only 23 fewer cases than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic. © 2013 Blackwell Verlag GmbH.
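
    Both candidate approaches are available in statsmodels; the sketch below fits a Holt-Winters (multiplicative seasonal) model and a seasonal ARIMA to a synthetic monthly series. The series and the ARIMA orders are placeholders, not the intervention models identified for the New Zealand data.

```python
# Sketch of the two candidate time-series approaches on a synthetic series.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
months = pd.date_range("1998-01", periods=132, freq="MS")
cases = pd.Series(800 + 300 * np.sin(np.arange(132) * 2 * np.pi / 12)
                  + rng.normal(0, 60, 132), index=months)

# Holt-Winters with multiplicative seasonality
hw = ExponentialSmoothing(cases, trend="add", seasonal="mul",
                          seasonal_periods=12).fit()
hw_forecast = hw.forecast(12)

# Seasonal ARIMA (orders are illustrative placeholders)
arima = SARIMAX(cases, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
arima_forecast = arima.forecast(12)

print(hw_forecast.sum(), arima_forecast.sum())  # compare predicted annual totals
```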

  1. Developing a dengue forecast model using machine learning: A case study in China

    PubMed Central

    Zhang, Qin; Wang, Li; Xiao, Jianpeng; Zhang, Qingying; Luo, Ganfeng; Li, Zhihao; He, Jianfeng; Zhang, Yonghui; Ma, Wenjun

    2017-01-01

    Background In China, dengue remains an important public health issue with expanded areas and increased incidence recently. Accurate and timely forecasts of dengue incidence in China are still lacking. We aimed to use the state-of-the-art machine learning algorithms to develop an accurate predictive model of dengue. Methodology/Principal findings Weekly dengue cases, Baidu search queries and climate factors (mean temperature, relative humidity and rainfall) during 2011–2014 in Guangdong were gathered. A dengue search index was constructed for developing the predictive models in combination with climate factors. The observed year and week were also included in the models to control for the long-term trend and seasonality. Several machine learning algorithms, including the support vector regression (SVR) algorithm, step-down linear regression model, gradient boosted regression tree algorithm (GBM), negative binomial regression model (NBM), least absolute shrinkage and selection operator (LASSO) linear regression model and generalized additive model (GAM), were used as candidate models to predict dengue incidence. Performance and goodness of fit of the models were assessed using the root-mean-square error (RMSE) and R-squared measures. The residuals of the models were examined using the autocorrelation and partial autocorrelation function analyses to check the validity of the models. The models were further validated using dengue surveillance data from five other provinces. The epidemics during the last 12 weeks and the peak of the 2014 large outbreak were accurately forecasted by the SVR model selected by a cross-validation technique. Moreover, the SVR model had the consistently smallest prediction error rates for tracking the dynamics of dengue and forecasting the outbreaks in other areas in China. Conclusion and significance The proposed SVR model achieved a superior performance in comparison with other forecasting techniques assessed in this study. The findings can help the government and community respond early to dengue epidemics. PMID:29036169
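
    A hedged sketch of an SVR forecaster with cross-validated hyperparameter selection is shown below; the features stand in for the lagged search index and climate variables described above, and all data and parameter grids are illustrative assumptions.

```python
# SVR regression with time-series cross-validation for hyperparameter choice.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import TimeSeriesSplit, GridSearchCV

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([
    rng.gamma(2.0, 1.0, n),      # dengue search index (lagged), synthetic
    rng.normal(25, 3, n),        # mean temperature
    rng.uniform(60, 95, n),      # relative humidity
    rng.exponential(30, n),      # rainfall
])
y = 5 + 8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 2, n)  # weekly cases, synthetic

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(pipe,
                    {"svr__C": [1, 10, 100], "svr__epsilon": [0.1, 1.0]},
                    cv=TimeSeriesSplit(n_splits=5),
                    scoring="neg_root_mean_squared_error")
grid.fit(X, y)
print(grid.best_params_, -grid.best_score_)  # selected SVR and its RMSE
```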

  2. Less is more: Sampling chemical space with active learning

    NASA Astrophysics Data System (ADS)

    Smith, Justin S.; Nebgen, Ben; Lubbers, Nicholas; Isayev, Olexandr; Roitberg, Adrian E.

    2018-06-01

    The development of accurate and transferable machine learning (ML) potentials for predicting molecular energetics is a challenging task. The process of data generation to train such ML potentials is a task neither well understood nor researched in detail. In this work, we present a fully automated approach for the generation of datasets with the intent of training universal ML potentials. It is based on the concept of active learning (AL) via Query by Committee (QBC), which uses the disagreement between an ensemble of ML potentials to infer the reliability of the ensemble's prediction. QBC allows the presented AL algorithm to automatically sample regions of chemical space where the ML potential fails to accurately predict the potential energy. AL improves the overall fitness of ANAKIN-ME (ANI) deep learning potentials in rigorous test cases by mitigating human biases in deciding what new training data to use. AL also reduces the training set size to a fraction of the data required when using naive random sampling techniques. To provide validation of our AL approach, we develop the COmprehensive Machine-learning Potential (COMP6) benchmark (publicly available on GitHub) which contains a diverse set of organic molecules. Active learning-based ANI potentials outperform the original random sampled ANI-1 potential with only 10% of the data, while the final active learning-based model vastly outperforms ANI-1 on the COMP6 benchmark after training to only 25% of the data. Finally, we show that our proposed AL technique develops a universal ANI potential (ANI-1x) that provides accurate energy and force predictions on the entire COMP6 benchmark. This universal ML potential achieves a level of accuracy on par with the best ML potentials for single molecules or materials, while remaining applicable to the general class of organic molecules composed of the elements CHNO.
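
    Query by Committee can be sketched generically: train an ensemble, score each unlabeled candidate by the spread of the ensemble's predictions, and query the most contentious ones. The small gradient-boosting committee below is a stand-in for the ANI neural-network ensemble, and all data are synthetic.

```python
# Conceptual Query-by-Committee active learning loop (single selection step).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def committee_disagreement(models, X):
    """Standard deviation of the committee's predictions for each sample."""
    preds = np.stack([m.predict(X) for m in models])
    return preds.std(axis=0)

rng = np.random.default_rng(3)
X_train = rng.uniform(-3, 3, (200, 5))           # stand-in molecular descriptors
y_train = np.sin(X_train).sum(axis=1)            # stand-in reference energies
X_pool = rng.uniform(-4, 4, (1000, 5))           # unlabeled candidate pool

# Subsampling makes the committee members differ from one another.
committee = [GradientBoostingRegressor(subsample=0.7, random_state=s)
             .fit(X_train, y_train) for s in range(4)]
sigma = committee_disagreement(committee, X_pool)
query_idx = np.argsort(sigma)[-10:]              # most uncertain candidates
print(query_idx)                                 # send these for new reference labels
```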

  3. The Environmental Action Internal Control Index.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.; Fortner, Rosanne W.

    1994-01-01

    Reports research designed to develop a reliable and valid instrument to assess the relationship between locus of control of reinforcement and environmentally responsible behavior in undergraduate students (n=853). Results suggest that the Environmental Action Internal Control Index can accurately predict environmentally responsible behavior.…

  4. Finite element estimation of the residual stresses in roller-straightened rail

    DOT National Transportation Integrated Search

    2004-11-13

    The purpose of this paper is to develop models to accurately predict the residual stresses due to the roller straightening of railroad rails. Several aspects of residual stress creation in rail due to roller straightening are addressed. The eff...

  5. Dosimetric verification of radiation therapy including intensity modulated treatments, using an amorphous-silicon electronic portal imaging device

    NASA Astrophysics Data System (ADS)

    Chytyk-Praznik, Krista Joy

    Radiation therapy is continuously increasing in complexity due to technological innovation in delivery techniques, necessitating thorough dosimetric verification. Comparing accurately predicted portal dose images to measured images obtained during patient treatment can determine if a particular treatment was delivered correctly. The goal of this thesis was to create a method to predict portal dose images that was versatile and accurate enough to use in a clinical setting. All measured images in this work were obtained with an amorphous silicon electronic portal imaging device (a-Si EPID), but the technique is applicable to any planar imager. A detailed, physics-motivated fluence model was developed to characterize fluence exiting the linear accelerator head. The model was further refined using results from Monte Carlo simulations and schematics of the linear accelerator. The fluence incident on the EPID was converted to a portal dose image through a superposition of Monte Carlo-generated, monoenergetic dose kernels specific to the a-Si EPID. Predictions of clinical IMRT fields with no patient present agreed with measured portal dose images within 3% and 3 mm. The dose kernels were applied ignoring the geometrically divergent nature of incident fluence on the EPID. A computational investigation into this parallel dose kernel assumption determined its validity under clinically relevant situations. Introducing a patient or phantom into the beam required the portal image prediction algorithm to account for patient scatter and attenuation. Primary fluence was calculated by attenuating raylines cast through the patient CT dataset, while scatter fluence was determined through the superposition of pre-calculated scatter fluence kernels. Total dose in the EPID was calculated by convolving the total predicted incident fluence with the EPID-specific dose kernels. The algorithm was tested on water slabs with square fields, agreeing with measurement within 3% and 3 mm. The method was then applied to five prostate and six head-and-neck IMRT treatment courses (˜1900 clinical images). Deviations between the predicted and measured images were quantified. The portal dose image prediction model developed in this thesis work has been shown to be accurate, and it was demonstrated to be able to verify patients' delivered radiation treatments.

  6. Moving to the Beat and Singing are Linked in Humans

    PubMed Central

    Dalla Bella, Simone; Berkowska, Magdalena; Sowiński, Jakub

    2015-01-01

    The abilities to sing and to move to the beat of a rhythmic auditory stimulus emerge early during development, and both engage perceptual, motor, and sensorimotor processes. These similarities between singing and synchronization to a beat may be rooted in biology. Patel (2008) has suggested that motor synchronization to auditory rhythms may have emerged during evolution as a byproduct of selection for vocal learning (“vocal learning and synchronization hypothesis”). This view predicts a strong link between vocal performance and synchronization skills in humans. Here, we tested this prediction by asking occasional singers to tap along with auditory pulse trains and to imitate familiar melodies. Both vocal imitation and synchronization skills were measured in terms of accuracy and precision or consistency. Accurate and precise singers tapped more in the vicinity of the pacing stimuli (i.e., they were more accurate) than less accurate and less precise singers. Moreover, accurate singers were more consistent when tapping to the beat. These differences cannot be ascribed to basic motor skills or to motivational factors. Individual differences in terms of singing proficiency and synchronization skills may reflect the variability of a shared sensorimotor translation mechanism. PMID:26733370

  7. Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models

    PubMed Central

    Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.

    2013-01-01

    In the context of limiting the environmental impact of transportation, this paper reviews new directions which are being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the next parts, recent methods and ways to improve these models are described. An emphasis is given on the development of detailed models based on elementary reactions, on the production of the related thermochemical and kinetic parameters, and on the experimental techniques available to produce the data necessary to evaluate model predictions under well defined conditions. PMID:21597604

  8. Superensemble forecasts of dengue outbreaks

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2016-01-01

    In recent years, a number of systems capable of predicting future infectious disease incidence have been developed. As more of these systems are operationalized, it is important that the forecasts generated by these different approaches be formally reconciled so that individual forecast error and bias are reduced. Here we present a first example of such multi-system, or superensemble, forecast. We develop three distinct systems for predicting dengue, which are applied retrospectively to forecast outbreak characteristics in San Juan, Puerto Rico. We then use Bayesian averaging methods to combine the predictions from these systems and create superensemble forecasts. We demonstrate that on average, the superensemble approach produces more accurate forecasts than those made from any of the individual forecasting systems. PMID:27733698
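
    A minimal stand-in for the combination step is skill-weighted averaging of the component forecasts. The study used Bayesian model averaging, so the inverse-MSE weights below are an illustrative simplification with synthetic numbers.

```python
# Skill-weighted superensemble of several component forecasting systems.
import numpy as np

def superensemble(forecasts, past_forecasts, past_observations):
    """Combine current forecasts (one per system) using skill-based weights."""
    mse = np.mean((past_forecasts - past_observations) ** 2, axis=1)
    weights = (1.0 / mse) / np.sum(1.0 / mse)
    return np.dot(weights, forecasts), weights

# three systems, five past verification weeks (synthetic case counts)
past_forecasts = np.array([[10, 14, 20, 31, 44],
                           [ 9, 15, 22, 28, 40],
                           [12, 18, 25, 35, 50]], dtype=float)
past_observations = np.array([11, 15, 21, 30, 43], dtype=float)

current = np.array([52.0, 48.0, 60.0])           # this week's three forecasts
combined, w = superensemble(current, past_forecasts, past_observations)
print(combined, w)
```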

  9. Prediction of essential oil content of oregano by hand-held and Fourier transform NIR spectroscopy.

    PubMed

    Camps, Cédric; Gérard, Marianne; Quennoz, Mélanie; Brabant, Cécile; Oberson, Carine; Simonnet, Xavier

    2014-05-01

    In the framework of a breeding programme, the analysis of hundreds of oregano samples to determine their essential oil content (EOC) is time-consuming and expensive in terms of labour. Therefore, developing a new method that is rapid, accurate and less expensive to use would be an asset to breeders. The aim of the present study was to develop a method based on near-infrared (NIR) spectroscopy to determine the EOC of oregano dried powder. Two spectroscopic approaches were compared, the first using a hand-held NIR device and the second a Fourier transform (FT) NIR spectrometer. Hand-held NIR (1000-1800 nm) measurements and partial least squares regression allowed the determination of EOC with R² and SEP values of 0.58 and 0.81 mL per 100 g dry matter (DM) respectively. Measurements with FT-NIR (1000-2500 nm) allowed the determination of EOC with R² and SEP values of 0.91 and 0.68 mL per 100 g DM respectively. RPD, RER and RPIQ values for the model implemented with FT-NIR data were satisfactory for screening applications, while those obtained with hand-held NIR data were below the level required to consider the model accurate enough for screening applications. The FT-NIR approach allowed the development of an accurate model for EOC prediction. Although the hand-held NIR approach is promising, it needs additional development before it can be used in practice. © 2013 Society of Chemical Industry.
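
    The chemometric workflow summarized above (PLS regression on NIR spectra, evaluated by R² and SEP) can be sketched as follows; the spectra are synthetic and the number of latent components is an assumption for illustration.

```python
# PLS regression on spectra with hold-out evaluation by R2 and SEP.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 120, 300
spectra = rng.normal(size=(n_samples, n_wavelengths))            # synthetic spectra
eoc = spectra[:, 50] * 0.8 + spectra[:, 120] * 0.5 + rng.normal(0, 0.3, n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, eoc, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

r2 = r2_score(y_te, y_hat)
sep = np.std(y_te - y_hat, ddof=1)   # standard error of prediction
print(f"R2={r2:.2f}, SEP={sep:.2f} mL per 100 g DM")
```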

  10. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs

    PubMed Central

    2017-01-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package. PMID:29107980

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Partridge Jr, William P.; Choi, Jae-Soon

    By directly resolving spatial and temporal species distributions within operating honeycomb monolith catalysts, spatially resolved capillary inlet mass spectrometry (SpaciMS) provides a uniquely enabling perspective for advancing automotive catalysis. Specifically, the ability to follow the spatiotemporal evolution of reactions throughout the catalyst is a significant advantage over inlet-and-effluent-limited analysis. Intracatalyst resolution elucidates numerous catalyst details including the network and sequence of reactions, clarifying reaction pathways; the relative rates of different reactions and impacts of operating conditions and catalyst state; and reaction dynamics and intermediate species that exist only within the catalyst. These details provide a better understanding of how the catalyst functions and have basic and practical benefits; e.g., catalyst system design; strategies for on-road catalyst state assessment, control, and on-board diagnostics; and creating robust and accurate predictive catalyst models. Moreover, such spatiotemporally distributed data provide for critical model assessment, and identification of improvement opportunities that might not be apparent from effluent assessment; i.e., while an incorrectly formulated model may provide correct effluent predictions, one that can accurately predict the spatiotemporal evolution of reactions along the catalyst channels will be more robust, accurate, and reliable. In such ways, intracatalyst diagnostics comprehensively enable improved design and development tools, and faster and lower-cost development of more efficient and durable automotive catalyst systems. Beyond these direct contributions, SpaciMS has spawned and been applied to enable other analytical techniques for resolving transient distributed intracatalyst performance. This chapter focuses on SpaciMS applications and associated catalyst insights and improvements, with specific sections related to lean NOx traps, selective catalytic reduction catalysts, oxidation catalysts, and particulate filters. The objective is to promote broader use and development of intracatalyst analytical methods, and thereby expand the insights resulting from this detailed perspective for advancing automotive catalyst technologies.

  12. A Simple Iterative Model Accurately Captures Complex Trapline Formation by Bumblebees Across Spatial Scales and Flower Arrangements

    PubMed Central

    Reynolds, Andrew M.; Lihoreau, Mathieu; Chittka, Lars

    2013-01-01

    Pollinating bees develop foraging circuits (traplines) to visit multiple flowers in a manner that minimizes overall travel distance, a task analogous to the travelling salesman problem. We report on an in-depth exploration of an iterative improvement heuristic model of bumblebee traplining previously found to accurately replicate the establishment of stable routes by bees between flowers distributed over several hectares. The critical test for a model is its predictive power for empirical data for which the model has not been specifically developed, and here the model is shown to be consistent with observations from different research groups made at several spatial scales and using multiple configurations of flowers. We refine the model to account for the spatial search strategy of bees exploring their environment, and test several previously unexplored predictions. We find that the model predicts accurately 1) the increasing propensity of bees to optimize their foraging routes with increasing spatial scale; 2) that bees cannot establish stable optimal traplines for all spatial configurations of rewarding flowers; 3) the observed trade-off between travel distance and prioritization of high-reward sites (with a slight modification of the model); 4) the temporal pattern with which bees acquire approximate solutions to travelling salesman-like problems over several dozen foraging bouts; 5) the instability of visitation schedules in some spatial configurations of flowers; 6) the observation that in some flower arrays, bees' visitation schedules are highly individually different; 7) the searching behaviour that leads to efficient location of flowers and routes between them. Our model constitutes a robust theoretical platform to generate novel hypotheses and refine our understanding about how small-brained insects develop a representation of space and use it to navigate in complex and dynamic environments. PMID:23505353

  13. Development of a Land Use Mapping and Monitoring Protocol for the High Plains Region: A Multitemporal Remote Sensing Application

    NASA Technical Reports Server (NTRS)

    Price, Kevin P.; Nellis, M. Duane

    1996-01-01

    The purpose of this project was to develop a practical protocol that employs multitemporal remotely sensed imagery, integrated with environmental parameters to model and monitor agricultural and natural resources in the High Plains Region of the United States. The value of this project would be extended throughout the region via workshops targeted at carefully selected audiences and designed to transfer remote sensing technology and the methods and applications developed. Implementation of such a protocol using remotely sensed satellite imagery is critical for addressing many issues of regional importance, including: (1) Prediction of rural land use/land cover (LULC) categories within a region; (2) Use of rural LULC maps for successive years to monitor change; (3) Crop types derived from LULC maps as important inputs to water consumption models; (4) Early prediction of crop yields; (5) Multi-date maps of crop types to monitor patterns related to crop change; (6) Knowledge of crop types to monitor condition and improve prediction of crop yield; (7) More precise models of crop types and conditions to improve agricultural economic forecasts; (8) Prediction of biomass for estimating vegetation production, soil protection from erosion forces, nonpoint source pollution, wildlife habitat quality and other related factors; (9) Crop type and condition information to more accurately predict production of biogeochemicals such as CO2, CH4, and other greenhouse gases that are inputs to global climate models; (10) Provide information regarding limiting factors (i.e., economic constraints of pumping, fertilizing, etc.) used in conjunction with other factors, such as changes in climate for predicting changes in rural LULC; (11) Accurate prediction of rural LULC used to assess the effectiveness of government programs such as the U.S. Soil Conservation Service (SCS) Conservation Reserve Program; and (12) Prediction of water demand based on rural LULC that can be related to rates of draw-down of underground water supplies.

  14. Predicting survival of Escherichia coli O157:H7 in dry fermented sausage using artificial neural networks.

    PubMed

    Palanichamy, A; Jayas, D S; Holley, R A

    2008-01-01

    The Canadian Food Inspection Agency required the meat industry to ensure Escherichia coli O157:H7 does not survive (experiences a ≥5 log CFU/g reduction) in dry fermented sausage (salami) during processing after a series of foodborne illness outbreaks resulting from this pathogenic bacterium occurred. The industry is in need of an effective technique, such as predictive modeling, for estimating bacterial viability, because traditional microbiological enumeration is a time-consuming and laborious method. The accuracy and speed of artificial neural networks (ANNs), a technique developed within predictive microbiology, make them an attractive alternative for this purpose, especially for on-line processing in industry. Data were collected from a study of the interactive effects of different levels of pH, water activity, and allyl isothiocyanate concentration, at various times during sausage manufacture, in reducing numbers of E. coli O157:H7. Data were used to develop predictive models using a general regression neural network (GRNN), a form of ANN, and a statistical linear polynomial regression technique. Both models were compared for their predictive error, using various statistical indices. GRNN predictions for training and test data sets had less serious errors when compared with the statistical model predictions. GRNN models were better for the training set and slightly better for the test set than was the statistical model. Also, the GRNN accurately predicted the level of allyl isothiocyanate required to ensure a 5-log reduction when an appropriate production set was created by interpolation. Because they are simple to generate, fast, and accurate, ANN models may be of value for industrial use in dry fermented sausage manufacture to reduce the hazard associated with E. coli O157:H7 in fresh beef and permit production of consistently safe products from this raw material.
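
    A GRNN is essentially Gaussian kernel-weighted averaging of the training targets, so a minimal version can be written in a few lines. The input columns and values below are hypothetical stand-ins for the pH, water activity, allyl isothiocyanate, and time factors used in the study.

```python
# Minimal general regression neural network (GRNN) as kernel-weighted averaging.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.3):
    """Predict targets for X_query by Gaussian distance-weighted averaging."""
    X_query = np.atleast_2d(X_query)
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# columns: pH, water activity, AIT level (scaled), time (scaled) -- hypothetical
X_train = np.array([[5.0, 0.92, 0.0, 0.5],
                    [4.8, 0.90, 0.5, 1.0],
                    [4.6, 0.88, 1.0, 1.5],
                    [4.6, 0.86, 1.0, 2.0]])
y_train = np.array([1.2, 2.5, 4.1, 5.3])          # log CFU/g reduction, synthetic
print(grnn_predict(X_train, y_train, [4.7, 0.89, 0.8, 1.2]))
```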

  15. Characterization of particulate emissions from Australian open-cut coal mines: Toward improved emission estimates.

    PubMed

    Richardson, Claire; Rutherford, Shannon; Agranovski, Igor

    2018-06-01

    Given the significance of mining as a source of particulates, accurate characterization of emissions is important for the development of appropriate emission estimation techniques for use in modeling predictions and to inform regulatory decisions. The currently available emission estimation methods for Australian open-cut coal mines relate primarily to total suspended particulates and PM10 (particulate matter with an aerodynamic diameter <10 μm), and limited data are available relating to the PM2.5 (<2.5 μm) size fraction. To provide an initial analysis of the appropriateness of the currently available emission estimation techniques, this paper presents results of sampling completed at three open-cut coal mines in Australia. The monitoring data demonstrate that the particulate size fraction varies for different mining activities, and that the region in which the mine is located influences the characteristics of the particulates emitted to the atmosphere. The proportion of fine particulates in the sample increased with distance from the source, with the coarse fraction being a more significant proportion of total suspended particulates close to the source of emissions. In terms of particulate composition, the results demonstrate that the particulate emissions are predominantly sourced from naturally occurring geological material, and coal comprises less than 13% of the overall emissions. The size fractionation exhibited by the sampling data sets is similar to that adopted in current Australian emission estimation methods but differs from the size fractionation presented in the U.S. Environmental Protection Agency methodology. Development of region-specific emission estimation techniques for PM10 and PM2.5 from open-cut coal mines is necessary to allow accurate prediction of particulate emissions to inform regulatory decisions and for use in modeling predictions. Comprehensive air quality monitoring was undertaken, and corresponding recommendations were provided.

  16. Multidisciplinary Modeling Software for Analysis, Design, and Optimization of HRRLS Vehicles

    NASA Technical Reports Server (NTRS)

    Spradley, Lawrence W.; Lohner, Rainald; Hunt, James L.

    2011-01-01

    The concept for Highly Reliable Reusable Launch Systems (HRRLS) under the NASA Hypersonics project is a two-stage-to-orbit, horizontal-take-off/horizontal-landing (HTHL) architecture with an air-breathing first stage. The first stage vehicle is a slender body with an air-breathing propulsion system that is highly integrated with the airframe. The lightweight slender body will deflect significantly during flight. This global deflection affects the flow over the vehicle and into the engine and thus the loads and moments on the vehicle. High-fidelity multi-disciplinary analyses that account for these fluid-structure-thermal interactions are required to accurately predict the vehicle loads and resultant response. These predictions of vehicle response to multi-physics loads, calculated with fluid-structural-thermal interaction, are required in order to optimize the vehicle design over its full operating range. This contract with ResearchSouth addresses one of the primary objectives of the Vehicle Technology Integration (VTI) discipline: the development of high-fidelity multi-disciplinary analysis and optimization methods and tools for HRRLS vehicles. The primary goal of this effort is the development of an integrated software system that can be used for full-vehicle optimization. This goal was accomplished by: 1) integrating the master code, FEMAP, into the multidiscipline software network to direct the coupling to assure accurate fluid-structure-thermal interaction solutions; 2) loosely coupling the Euler flow solver FEFLO to the available and proven aeroelasticity and large deformation (FEAP) code; 3) providing a coupled Euler-boundary layer capability for rapid viscous flow simulation; 4) developing and implementing improved Euler/RANS algorithms into the FEFLO CFD code to provide accurate shock capturing, skin friction, and heat-transfer predictions for HRRLS vehicles in hypersonic flow; 5) performing a Reynolds-averaged Navier-Stokes computation on an HRRLS configuration; 6) integrating the RANS solver with the FEAP code for coupled fluid-structure-thermal capability; 7) integrating the existing NASA SRGULL propulsion flow path prediction software with the FEFLO software for quasi-3D propulsion flow path predictions; and 8) improving and integrating into the network an existing adjoint-based design optimization code.

  17. Using a prescribed fire to test custom and standard fuel models for fire behaviour prediction in a non-native, grass-invaded tropical dry shrubland

    Treesearch

    Andrew D. Pierce; Sierra McDaniel; Mark Wasser; Alison Ainsworth; Creighton M. Litton; Christian P. Giardina; Susan Cordell; Ralf Ohlemuller

    2014-01-01

    Questions: Do fuel models developed for North American fuel types accurately represent fuel beds found in grass-invaded tropical shrublands? Do standard or custom fuel models for fire behavior models, with in situ or RAWS-measured fuel moistures, affect the accuracy of predicted fire behavior in grass-invaded tropical shrublands? Location: Hawai’i Volcanoes National...

  18. A Comparison of Classical Force-Fields for Molecular Dynamics Simulations of Lubricants

    PubMed Central

    Ewen, James P.; Gattinoni, Chiara; Thakkar, Foram M.; Morgan, Neal; Spikes, Hugh A.; Dini, Daniele

    2016-01-01

    For the successful development and application of lubricants, a full understanding of their complex nanoscale behavior under a wide range of external conditions is required, but this is difficult to obtain experimentally. Nonequilibrium molecular dynamics (NEMD) simulations can be used to yield unique insights into the atomic-scale structure and friction of lubricants and additives; however, the accuracy of the results depends on the chosen force-field. In this study, we demonstrate that the use of an accurate, all-atom force-field is critical in order to: (i) accurately predict important properties of long-chain, linear molecules; and (ii) reproduce experimental friction behavior of multi-component tribological systems. In particular, we focus on n-hexadecane, an important model lubricant with a wide range of industrial applications. Moreover, simulating conditions common in tribological systems, i.e., high temperatures and pressures (HTHP), allows the limits of the selected force-fields to be tested. In the first section, a large number of united-atom and all-atom force-fields are benchmarked in terms of their density and viscosity prediction accuracy of n-hexadecane using equilibrium molecular dynamics (EMD) simulations at ambient and HTHP conditions. Whilst united-atom force-fields accurately reproduce experimental density, the viscosity is significantly under-predicted compared to all-atom force-fields and experiments. Moreover, some all-atom force-fields yield elevated melting points, leading to significant overestimation of both the density and viscosity. In the second section, the most accurate united-atom and all-atom force-fields are compared in confined NEMD simulations which probe the structure and friction of stearic acid adsorbed on iron oxide and separated by a thin layer of n-hexadecane. The united-atom force-field provides an accurate representation of the structure of the confined stearic acid film; however, friction coefficients are consistently under-predicted and the friction-coverage and friction-velocity behavior deviates from that observed using all-atom force-fields and experimentally. This has important implications regarding force-field selection for NEMD simulations of systems containing long-chain, linear molecules; specifically, it is recommended that accurate all-atom potentials, such as L-OPLS-AA, are employed. PMID:28773773

  19. A Comparison of Classical Force-Fields for Molecular Dynamics Simulations of Lubricants.

    PubMed

    Ewen, James P; Gattinoni, Chiara; Thakkar, Foram M; Morgan, Neal; Spikes, Hugh A; Dini, Daniele

    2016-08-02

    For the successful development and application of lubricants, a full understanding of their complex nanoscale behavior under a wide range of external conditions is required, but this is difficult to obtain experimentally. Nonequilibrium molecular dynamics (NEMD) simulations can be used to yield unique insights into the atomic-scale structure and friction of lubricants and additives; however, the accuracy of the results depends on the chosen force-field. In this study, we demonstrate that the use of an accurate, all-atom force-field is critical in order to: (i) accurately predict important properties of long-chain, linear molecules; and (ii) reproduce experimental friction behavior of multi-component tribological systems. In particular, we focus on n-hexadecane, an important model lubricant with a wide range of industrial applications. Moreover, simulating conditions common in tribological systems, i.e., high temperatures and pressures (HTHP), allows the limits of the selected force-fields to be tested. In the first section, a large number of united-atom and all-atom force-fields are benchmarked in terms of their density and viscosity prediction accuracy of n-hexadecane using equilibrium molecular dynamics (EMD) simulations at ambient and HTHP conditions. Whilst united-atom force-fields accurately reproduce experimental density, the viscosity is significantly under-predicted compared to all-atom force-fields and experiments. Moreover, some all-atom force-fields yield elevated melting points, leading to significant overestimation of both the density and viscosity. In the second section, the most accurate united-atom and all-atom force-fields are compared in confined NEMD simulations which probe the structure and friction of stearic acid adsorbed on iron oxide and separated by a thin layer of n-hexadecane. The united-atom force-field provides an accurate representation of the structure of the confined stearic acid film; however, friction coefficients are consistently under-predicted and the friction-coverage and friction-velocity behavior deviates from that observed using all-atom force-fields and experimentally. This has important implications regarding force-field selection for NEMD simulations of systems containing long-chain, linear molecules; specifically, it is recommended that accurate all-atom potentials, such as L-OPLS-AA, are employed.

  20. Computation of turbulent boundary layers on curved surfaces, 1 June 1975 - 31 January 1976

    NASA Technical Reports Server (NTRS)

    Wilcox, D. C.; Chambers, T. L.

    1976-01-01

    An accurate method was developed for predicting effects of streamline curvature and coordinate system rotation on turbulent boundary layers. A new two-equation model of turbulence was developed which serves as the basis of the study. In developing the new model, physical reasoning is combined with singular perturbation methods to develop a rational, physically-based set of equations which are, on the one hand, as accurate as mixing-length theory for equilibrium boundary layers and, on the other hand, suitable for computing effects of curvature and rotation. The equations are solved numerically for several boundary layer flows over plane and curved surfaces. For incompressible boundary layers, results of the computations are generally within 10% of corresponding experimental data. Somewhat larger discrepancies are noted for compressible applications.

  1. A novel knowledge-based potential for RNA 3D structure evaluation

    NASA Astrophysics Data System (ADS)

    Yang, Yi; Gu, Qi; Zhang, Ben-Gong; Shi, Ya-Zhou; Shao, Zhi-Gang

    2018-03-01

    Ribonucleic acids (RNAs) play a vital role in biology, and knowledge of their three-dimensional (3D) structure is required to understand their biological functions. Recently structural prediction methods have been developed to address this issue, but a series of RNA 3D structures are generally predicted by most existing methods. Therefore, the evaluation of the predicted structures is generally indispensable. Although several methods have been proposed to assess RNA 3D structures, the existing methods are not precise enough. In this work, a new all-atom knowledge-based potential is developed for more accurately evaluating RNA 3D structures. The potential not only includes local and nonlocal interactions but also fully considers the specificity of each RNA by introducing a retraining mechanism. Based on extensive test sets generated from independent methods, the proposed potential correctly distinguished the native state and ranked near-native conformations to effectively select the best. Furthermore, the proposed potential precisely captured RNA structural features such as base-stacking and base-pairing. Comparisons with existing potential methods show that the proposed potential is very reliable and accurate in RNA 3D structure evaluation. Project supported by the National Science Foundation of China (Grants Nos. 11605125, 11105054, 11274124, and 11401448).

  2. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  3. On the critical temperature, normal boiling point, and vapor pressure of ionic liquids.

    PubMed

    Rebelo, Luis P N; Canongia Lopes, José N; Esperança, José M S S; Filipe, Eduardo

    2005-04-07

    One-stage, reduced-pressure distillations at moderate temperature of 1-decyl- and 1-dodecyl-3-methylimidazolium bistriflilamide ([NTf₂]⁻) ionic liquids (ILs) have been performed. These liquid-vapor equilibria can be understood in light of predictions for normal boiling points of ILs. The predictions are based on experimental surface tension and density data, which are used to estimate the critical points of several ILs and their corresponding normal boiling temperatures. In contrast to the situation found for relatively unstable ILs at high temperature, such as those containing [BF₄]⁻ or [PF₆]⁻ anions, [NTf₂]⁻-based ILs constitute a promising class in which reliable, accurate vapor pressure measurements can in principle be performed. This property is paramount for assisting in the development and testing of accurate molecular models.

  4. Can the electronegativity equalization method predict spectroscopic properties?

    PubMed

    Verstraelen, T; Bultinck, P

    2015-02-05

    The electronegativity equalization method is classically used as a method allowing the fast generation of atomic charges using a set of calibrated parameters and provided knowledge of the molecular structure. Recently, it has started being used for the calculation of other reactivity descriptors and for the development of polarizable and reactive force fields. For such applications, it is of interest to know whether the method, through the inclusion of the molecular geometry in the Taylor expansion of the energy, would also allow sufficiently accurate predictions of spectroscopic data. In this work, relevant quantities for IR spectroscopy are considered, namely the dipole derivatives and the Cartesian Hessian. Despite careful calibration of parameters for this specific task, it is shown that the current models yield insufficiently accurate results. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Expanded prediction equations of human sweat loss and water needs.

    PubMed

    Gonzalez, R R; Cheuvront, S N; Montain, S J; Goodman, D A; Blanchard, L A; Berglund, L G; Sawka, M N

    2009-08-01

    The Institute of Medicine expressed a need for improved sweating rate (msw) prediction models that calculate hourly and daily water needs based on metabolic rate, clothing, and environment. More than 25 years ago, the original Shapiro prediction equation (OSE) was formulated as msw (g·m⁻²·h⁻¹) = 27.9 · Ereq · Emax^(−0.455), where Ereq is the required evaporative heat loss and Emax is the maximum evaporative power of the environment; OSE was developed for a limited set of environments, exposure times, and clothing systems. Recent evidence shows that OSE often overpredicts fluid needs. Our study developed a corrected OSE and a new msw prediction equation by using independent data sets (500 observations) covering metabolic rates from rest upward, a range of environmental conditions (ambient temperature, 15-46 degrees C; water vapor pressure, 0.27-4.45 kPa; wind speed, 0.4-2.5 m/s), and various clothing and equipment combinations and exposure durations (2-8 h). Data are expressed as grams per square meter per hour and were analyzed using fuzzy piecewise regression. OSE overpredicted sweating rates (P<0.003) compared with observed msw. Both the correction equation (OSEC), msw = 147 · exp(0.0012 · OSE), and a new piecewise (PW) equation, msw = 147 + 1.527 · Ereq − 0.87 · Emax, were derived, compared with OSE, and then cross-validated against independent data (21 males and 9 females; >200 observations). OSEC and PW were more accurate predictors of sweating rate (58 and 65% more accurate, P<0.01) and produced minimal error (standard error estimate <100 g·m⁻²·h⁻¹) for conditions both within and outside the original OSE domain of validity. The new equations provide more accurate sweat predictions over a broader range of conditions, with applications to public health, military, occupational, and sports medicine settings.
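
    The three equations quoted above translate directly into code, which makes it easy to compare the original and corrected predictions for a given Ereq/Emax pair; the example values are illustrative, and units follow the abstract (msw in g·m⁻²·h⁻¹).

```python
# The OSE, corrected OSE, and piecewise sweating-rate equations from the abstract.
import math

def ose(ereq, emax):
    """Original Shapiro equation: msw = 27.9 * Ereq * Emax^(-0.455)."""
    return 27.9 * ereq * emax ** -0.455

def ose_corrected(ereq, emax):
    """Corrected OSE: msw = 147 * exp(0.0012 * OSE)."""
    return 147.0 * math.exp(0.0012 * ose(ereq, emax))

def piecewise(ereq, emax):
    """New piecewise equation: msw = 147 + 1.527*Ereq - 0.87*Emax."""
    return 147.0 + 1.527 * ereq - 0.87 * emax

ereq, emax = 200.0, 400.0   # illustrative heat-balance values
print(ose(ereq, emax), ose_corrected(ereq, emax), piecewise(ereq, emax))
```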

  6. Capabilities of current wildfire models when simulating topographical flow

    NASA Astrophysics Data System (ADS)

    Kochanski, A.; Jenkins, M.; Krueger, S. K.; McDermott, R.; Mell, W.

    2009-12-01

    Accurate predictions of the growth, spread and suppression of wild fires rely heavily on the correct prediction of the local wind conditions and the interactions between the fire and the local ambient airflow. Resolving local flows, often strongly affected by topographical features like hills, canyons and ridges, is a prerequisite for accurate simulation and prediction of fire behaviors. In this study, we present the results of high-resolution numerical simulations of the flow over a smooth hill, performed using (1) the NIST WFDS (WUI or Wildland-Urban-Interface version of the FDS or Fire Dynamic Simulator), and (2) the LES version of the NCAR Weather Research and Forecasting (WRF-LES) model. The WFDS model is in the initial stages of development for application to wind flow and fire spread over complex terrain. The focus of the talk is to assess how well simple topographical flow is represented by WRF-LES and the current version of WFDS. If sufficient progress has been made prior to the meeting then the importance of the discrepancies between the predicted and measured winds, in terms of simulated fire behavior, will be examined.

  7. Predictive Model and Software for Inbreeding-Purging Analysis of Pedigreed Populations

    PubMed Central

    García-Dorado, Aurora; Wang, Jinliang; López-Cortegano, Eugenio

    2016-01-01

    The inbreeding depression of fitness traits can be a major threat to the survival of populations experiencing inbreeding. However, its accurate prediction requires taking into account the genetic purging induced by inbreeding, which can be achieved using a “purged inbreeding coefficient”. We have developed a method to compute purged inbreeding at the individual level in pedigreed populations with overlapping generations. Furthermore, we derive the inbreeding depression slope for individual logarithmic fitness, which is larger than that for the logarithm of the population fitness average. In addition, we provide a new software, PURGd, based on these theoretical results that allows analyzing pedigree data to detect purging, and to estimate the purging coefficient, which is the parameter necessary to predict the joint consequences of inbreeding and purging. The software also calculates the purged inbreeding coefficient for each individual, as well as standard and ancestral inbreeding. Analysis of simulation data show that this software produces reasonably accurate estimates for the inbreeding depression rate and for the purging coefficient that are useful for predictive purposes. PMID:27605515

  8. A Research Program for Improving Heat Transfer Prediction Capability for the Laminar to Turbulent Transition Region of Turbine Vanes/Blades

    NASA Technical Reports Server (NTRS)

    Simon, Frederick F.

    2007-01-01

    A program sponsored by the National Aeronautics and Space Administration (NASA) for the investigation of the heat transfer in the transition region of turbine vanes and blades, with the object of improving the capability for predicting heat transfer, is described. The accurate prediction of gas-side heat transfer is important to the determination of turbine longevity, engine performance and developmental costs. The need for accurate predictions will become greater as the operating temperatures and stage loading levels of advanced turbine engines increase. The present methods for predicting transition shear stress and heat transfer on turbine blades are based on incomplete knowledge and are largely empirical. To meet the objectives of the NASA program, a team approach consisting of researchers from government, universities, a research institute, and a small business is presented. The research is divided into areas of experimentation, direct numerical simulation (DNS) and turbulence modeling. A summary of the results to date is given for the above research areas in a high-disturbance environment (bypass transition) with a discussion of the model development necessary for use in numerical codes.

  9. Sex-specific lean body mass predictive equations are accurate in the obese paediatric population

    PubMed Central

    Jackson, Lanier B.; Henshaw, Melissa H.; Carter, Janet; Chowdhury, Shahryar M.

    2015-01-01

    Background The clinical assessment of lean body mass (LBM) is challenging in obese children. A sex-specific predictive equation for LBM derived from anthropometric data was recently validated in children. Aim The purpose of this study was to independently validate these predictive equations in the obese paediatric population. Subjects and methods Obese subjects aged 4–21 were analysed retrospectively. Predicted LBM (LBMp) was calculated using equations previously developed in children. Measured LBM (LBMm) was derived from dual-energy x-ray absorptiometry. Agreement was expressed as [(LBMm-LBMp)/LBMm] with 95% limits of agreement. Results Of 310 enrolled patients, 195 (63%) were females. The mean age was 11.8 ± 3.4 years and mean BMI Z-score was 2.3 ± 0.4. The average difference between LBMm and LBMp was −0.6% (−17.0%, 15.8%). Pearson’s correlation revealed a strong linear relationship between LBMm and LBMp (r=0.97, p<0.01). Conclusion This study validates the use of these clinically-derived sex-specific LBM predictive equations in the obese paediatric population. Future studies should use these equations to improve the ability to accurately classify LBM in obese children. PMID:26287383
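
    The agreement statistic quoted above, [(LBMm − LBMp)/LBMm] with 95% limits, can be computed as in the sketch below; the values are synthetic, not the study's measurements.

```python
# Percentage-difference agreement with 95% limits of agreement (mean ± 1.96 SD).
import numpy as np

lbm_measured  = np.array([32.1, 41.5, 27.8, 55.2, 48.0])   # kg, from DXA (synthetic)
lbm_predicted = np.array([31.0, 43.2, 27.1, 53.8, 49.5])   # kg, from equations

pct_diff = (lbm_measured - lbm_predicted) / lbm_measured * 100
bias = pct_diff.mean()
spread = 1.96 * pct_diff.std(ddof=1)
print(f"bias = {bias:.1f}%, 95% limits of agreement = "
      f"{bias - spread:.1f}% to {bias + spread:.1f}%")
```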

  10. Developing hybrid approaches to predict pKa values of ionizable groups

    PubMed Central

    Witham, Shawn; Talley, Kemper; Wang, Lin; Zhang, Zhe; Sarkar, Subhra; Gao, Daquan; Yang, Wei

    2011-01-01

    Accurate predictions of pKa values of titratable groups require taking into account all relevant processes associated with ionization/deionization. Frequently, however, the ionization does not involve significant structural changes and the dominant effects are purely electrostatic in origin, allowing accurate predictions to be made based on the electrostatic energy difference between the ionized and neutral forms alone, using a static structure. On the other hand, if the change of charge state is accompanied by a structural reorganization of the target protein, then the relevant conformational changes have to be taken into account in the pKa calculations. Here we report a hybrid approach that first identifies the titratable groups whose ionization is expected to cause conformational changes, termed “problematic” residues, and then applies a special protocol to them, while the rest of the pKa values are predicted with the rigid-backbone approach implemented in the multi-conformation continuum electrostatics (MCCE) method. The backbone representative conformations for “problematic” groups are generated either with molecular dynamics simulations with the charged and uncharged amino acid or with ab initio local segment modeling. The corresponding ensembles are then used to calculate the pKa values of the “problematic” residues, and the results are averaged. PMID:21744395

  11. Ligand Binding Site Detection by Local Structure Alignment and Its Performance Complementarity

    PubMed Central

    Lee, Hui Sun; Im, Wonpil

    2013-01-01

    Accurate determination of potential ligand binding sites (BS) is a key step for protein function characterization and structure-based drug design. Despite promising results of template-based BS prediction methods using global structure alignment (GSA), there is room to improve the performance by properly incorporating local structure alignment (LSA), because BS are local structures and are often similar for proteins with dissimilar global folds. We present a template-based ligand BS prediction method using G-LoSA, our LSA tool. A large benchmark set validation shows that G-LoSA predicts drug-like ligands’ positions in single-chain protein targets more precisely than TM-align, a GSA-based method, while the overall success rate of TM-align is better. G-LoSA is particularly efficient for accurate detection of local structures conserved across proteins with diverse global topologies. Recognizing the performance complementarity of G-LoSA to TM-align and a non-template geometry-based method, fpocket, a robust consensus scoring method, CMCS-BSP (Complementary Methods and Consensus Scoring for ligand Binding Site Prediction), is developed and shows improved prediction accuracy. The G-LoSA source code is freely available at http://im.bioinformatics.ku.edu/GLoSA. PMID:23957286

  12. Accuracy of genomic breeding values in multibreed beef cattle populations derived from deregressed breeding values and phenotypes.

    PubMed

    Weber, K L; Thallman, R M; Keele, J W; Snelling, W M; Bennett, G L; Smith, T P L; McDaneld, T G; Allan, M F; Van Eenennaam, A L; Kuehn, L A

    2012-12-01

    Genomic selection involves the assessment of genetic merit through prediction equations that allocate genetic variation with dense marker genotypes. It has the potential to provide accurate breeding values for selection candidates at an early age and facilitate selection for expensive or difficult-to-measure traits. Accurate across-breed prediction would allow genomic selection to be applied on a larger scale in the beef industry, but the limited availability of large populations for the development of prediction equations has so far prevented researchers from providing genomic predictions that are accurate across multiple beef breeds. In this study, the accuracy of genomic predictions for 6 growth and carcass traits was derived and evaluated using 2 multibreed beef cattle populations: 3,358 crossbred cattle of the U.S. Meat Animal Research Center Germplasm Evaluation Program (USMARC_GPE) and 1,834 high-accuracy bull sires of the 2,000 Bull Project (2000_BULL) representing influential breeds in the U.S. beef cattle industry. The 2000_BULL EPD were deregressed, scaled, and weighted to adjust for between- and within-breed heterogeneous variance before use in training and validation. Molecular breeding values (MBV) trained in each multibreed population and in Angus and Hereford purebred sires of 2000_BULL were derived using the GenSel BayesCπ function (Fernando and Garrick, 2009) and cross-validated. Less than 10% of large-effect loci were shared between prediction equations trained on USMARC_GPE relative to 2000_BULL, although locus effects were moderately to highly correlated for most traits and the traits themselves were highly correlated between populations. MBV prediction accuracy was low and variable between populations. For growth traits, MBV accounted for up to 18% of genetic variation in a pooled, multibreed analysis and up to 28% in single breeds. For carcass traits, MBV explained up to 8% of genetic variation in a pooled, multibreed analysis and up to 42% in single breeds. Prediction equations trained in multibreed populations were more accurate for the Angus and Hereford subpopulations because those were the breeds most highly represented in the training populations. Accuracies were lower for prediction equations trained in a single breed due to the smaller number of records derived from a single breed in the training populations.

  13. A narrow-band k-distribution model with single mixture gas assumption for radiative flows

    NASA Astrophysics Data System (ADS)

    Jo, Sung Min; Kim, Jae Won; Kwon, Oh Joon

    2018-06-01

    In the present study, the narrow-band k-distribution (NBK) model parameters for mixtures of H2O, CO2, and CO are proposed by utilizing the line-by-line (LBL) calculations with a single mixture gas assumption. For the application of the NBK model to radiative flows, a radiative transfer equation (RTE) solver based on a finite-volume method on unstructured meshes was developed. The NBK model and the RTE solver were verified by solving two benchmark problems including the spectral radiance distribution emitted from one-dimensional slabs and the radiative heat transfer in a truncated conical enclosure. It was shown that the results are accurate and physically reliable by comparing with available data. To examine the applicability of the methods to realistic multi-dimensional problems in non-isothermal and non-homogeneous conditions, radiation in an axisymmetric combustion chamber was analyzed, and then the infrared signature emitted from an aircraft exhaust plume was predicted. For modeling the plume flow involving radiative cooling, a flow-radiation coupled procedure was devised in a loosely coupled manner by adopting a Navier-Stokes flow solver based on unstructured meshes. It was shown that the predicted radiative cooling for the combustion chamber is physically more accurate than other predictions, and is as accurate as that by the LBL calculations. It was found that the infrared signature of aircraft exhaust plume can also be obtained accurately, equivalent to the LBL calculations, by using the present narrow-band approach with a much improved numerical efficiency.

  14. Development and Validation of the Texas Best Management Practice Evaluation Tool (TBET)

    USDA-ARS?s Scientific Manuscript database

    Conservation planners need simple yet accurate tools to predict sediment and nutrient losses from agricultural fields to guide conservation practice implementation and increase cost-effectiveness. The Texas Best management practice Evaluation Tool (TBET), which serves as an input/output interpreter...

  15. A comparison of major petroleum life cycle models

    EPA Science Inventory

    Many organizations have attempted to develop an accurate well-to-pump life cycle model of petroleum products in order to inform decision makers of the consequences of its use. Our paper studies five of these models, demonstrating the differences in their predictions and attemptin...

  16. Impacts of Earth rotation parameters on GNSS ultra-rapid orbit prediction: Derivation and real-time correction

    NASA Astrophysics Data System (ADS)

    Wang, Qianxin; Hu, Chao; Xu, Tianhe; Chang, Guobin; Hernández Moraleda, Alberto

    2017-12-01

    Analysis centers (ACs) for global navigation satellite systems (GNSSs) cannot accurately obtain real-time Earth rotation parameters (ERPs). Thus, the prediction of ultra-rapid orbits in the international terrestrial reference system (ITRS) has to utilize the predicted ERPs issued by the International Earth Rotation and Reference Systems Service (IERS) or the International GNSS Service (IGS). In this study, the accuracy of ERPs predicted by IERS and IGS is analyzed. The error of the ERPs predicted for one day can reach 0.15 mas in polar motion and 0.053 ms in UT1-UTC. Then, the impact of ERP errors on GNSS ultra-rapid orbit prediction is studied. The methods for orbit integration and frame transformation in orbit prediction with introduced ERP errors dominate the accuracy of the predicted orbit. Experimental results show that the transformation from the geocentric celestial reference system (GCRS) to ITRS exerts the strongest effect on the accuracy of the predicted ultra-rapid orbit. To obtain the most accurate predicted ultra-rapid orbit, a corresponding real-time orbit correction method is developed. First, orbits without ERP-related errors are predicted on the basis of the observed part of the ultra-rapid orbit in ITRS, for use as a reference. Then, the corresponding predicted orbit is transformed from GCRS to ITRS to adjust for the predicted ERPs. Finally, the corrected ERPs with error slopes are re-introduced to correct the predicted orbit in ITRS. To validate the proposed method, three experimental schemes are designed: function extrapolation, simulation experiments, and experiments with predicted ultra-rapid orbits and international GNSS Monitoring and Assessment System (iGMAS) products. Experimental results show that using the proposed correction method with IERS products considerably improves the accuracy of ultra-rapid orbit prediction (except for the geosynchronous BeiDou orbits). The accuracy of orbit prediction is enhanced by at least 50% (in the ERP-related error) when a highly accurate observed orbit is used with the correction method. For iGMAS-predicted orbits, the accuracy improvement ranges from 8.5% for the inclined BeiDou orbits to 17.99% for the GPS orbits. This demonstrates that the correction method proposed in this study can optimize ultra-rapid orbit prediction.
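
    The GCRS-to-ITRS transformation identified above as the dominant error source follows the standard IERS convention; stated in its conventional general form (not an equation quoted from the paper):

        % Conventional IERS form of the frame transformation. Predicted-ERP errors
        % enter through R(t) (Earth rotation angle, driven by UT1-UTC) and
        % W(t) (polar motion x_p, y_p); Q(t) models precession-nutation.
        \[
          \mathbf{r}_{\mathrm{ITRS}} \;=\; W(t)\,R(t)\,Q(t)\,\mathbf{r}_{\mathrm{GCRS}}
        \]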

  17. Using Monte Carlo/Gaussian Based Small Area Estimates to Predict Where Medicaid Patients Reside.

    PubMed

    Behrens, Jess J; Wen, Xuejin; Goel, Satyender; Zhou, Jing; Fu, Lina; Kho, Abel N

    2016-01-01

    Electronic Health Records (EHR) are rapidly becoming accepted as tools for planning and population health.1,2 With the national dialogue around Medicaid expansion,12 the role of EHR data has become even more important. For their potential to be fully realized and contribute to these discussions, techniques for creating accurate small area estimates are vital. As such, we examined the efficacy of developing small area estimates for Medicaid patients in two locations, Albuquerque and Chicago, by using a Monte Carlo/Gaussian technique that has worked in accurately locating registered voters in North Carolina.11 The Albuquerque data, which include patient addresses, will first be used to assess the accuracy of the methodology. Subsequently, they will be combined with the EHR data from Chicago to develop a regression that predicts Medicaid patients by US Block Group. We seek to create a tool that is effective in translating EHR data's potential for population health studies.

  18. Multiphysics Thermal-Fluid Design Analysis of a Non-Nuclear Tester for Hot-Hydrogen Materials and Component Development

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Foote, John; Litchford, Ron

    2006-01-01

    The objective of this effort is to perform design analyses for a non-nuclear hot-hydrogen materials tester, as a first step towards developing efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for hypothetical solid-core, nuclear thermal engine thrust chamber design and analysis. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, convective, and thermal radiative heat transfers. The goals of the design analyses are to maintain maximum hot-hydrogen jet impingement energy and to minimize chamber wall heating. The results of analyses on three test fixture configurations and the rationale for final selection are presented. The interrogation of physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.

  19. Numerical Simulation of a High Mach Number Jet Flow

    NASA Technical Reports Server (NTRS)

    Hayder, M. Ehtesham; Turkel, Eli; Mankbadi, Reda R.

    1993-01-01

    The recent efforts to develop accurate numerical schemes for transition and turbulent flows are motivated, among other factors, by the need for accurate prediction of flow noise. The success of developing a high speed civil transport plane (HSCT) is contingent upon our understanding and suppression of the jet exhaust noise. The radiated sound can be directly obtained by solving the full (time-dependent) compressible Navier-Stokes equations. However, this requires computational storage that is beyond currently available machines. This difficulty can be overcome by limiting the solution domain to the near field, where the jet is nonlinear, and then using an acoustic analogy (e.g., Lighthill) to relate the far-field noise to the near-field sources. The latter requires obtaining the time-dependent flow field. The other difficulty in aeroacoustics computations is that at high Reynolds numbers the turbulent flow has a large range of scales. Direct numerical simulations (DNS) cannot obtain all the scales of motion at the high Reynolds numbers of technological interest. However, it is believed that the large-scale structure is more efficient than the small-scale structure in radiating noise. Thus, one can model the small scales and calculate the acoustically active scales. Because the large-scale structure in the noise-producing initial region of the jet is wavelike in nature, the net radiated sound is the result of cancellation after integration over space. As such, aeroacoustics computations are highly sensitive to errors in computing the sound sources. It is therefore essential to use a high-order numerical scheme to predict the flow field. The present paper presents the first step in an ongoing effort to predict jet noise. The emphasis here is on accurate prediction of the unsteady flow field. We solve the full time-dependent Navier-Stokes equations by a high-order finite difference method. Time-accurate spatial simulations of both plane and axisymmetric jets are presented. Jet Mach numbers of 1.5 and 2.1 are considered. The Reynolds number in the simulations was about one million. Our numerical model is based on the 2-4 scheme of Gottlieb and Turkel. Bayliss et al. applied the 2-4 scheme in boundary layer computations. This scheme was also used by Ragab and Sheen to study the nonlinear development of supersonic instability waves in a mixing layer. In this study, we present two-dimensional direct simulation results for both plane and axisymmetric jets. These results are compared with linear theory predictions. The computations were made for the near-nozzle-exit region, and the velocity in the spanwise/azimuthal direction was assumed to be zero.

  20. The Prediction of Noise Due to Jet Turbulence Convecting Past Flight Vehicle Trailing Edges

    NASA Technical Reports Server (NTRS)

    Miller, Steven A. E.

    2014-01-01

    High intensity acoustic radiation occurs when turbulence convects past airframe trailing edges. A mathematical model is developed to predict this acoustic radiation. The model is dependent on the local flow and turbulent statistics above the trailing edge of the flight vehicle airframe. These quantities are dependent on the jet and flight vehicle Mach numbers and jet temperature. A term in the model approximates the turbulent statistics of single-stream heated jet flows and is developed based upon measurement. The developed model is valid for a wide range of jet Mach numbers, jet temperature ratios, and flight vehicle Mach numbers. The model predicts traditional trailing edge noise if the jet is not interacting with the airframe. Predictions of mean-flow quantities and the cross-spectrum of static pressure near the airframe trailing edge are compared with measurement. Finally, predictions of acoustic intensity are compared with measurement and the model is shown to accurately capture the phenomenon.

  1. Third molar development: measurements versus scores as age predictor.

    PubMed

    Thevissen, P W; Fieuws, S; Willems, G

    2011-10-01

    Human third molar development is widely used to predict the chronological age of subadult individuals with unknown or doubted age. For these predictions, classically, the radiologically observed third molar growth and maturation is registered using a staging and related scoring technique. Measurements of the lengths and widths of the developing wisdom tooth and its adjacent second molar can be considered as an alternative registration. The aim of this study was to verify the relations between mandibular third molar developmental stages, or measurements of mandibular second and third molars, and age. The age-related performance of stages and measurements was compared to assess whether measurements added information to age predictions based on third molar formation stage. The sample was 340 orthopantomograms (170 females, 170 males) of individuals homogeneously distributed in age between 7 and 24 years. The mandibular lower right third and second molars were staged following Gleiser and Hunt, length and width measurements were registered, and various ratios of these measurements were calculated. Univariable regression models with age as response and third molar stage, measurements, and ratios of second and third molars as predictors were considered. Multivariable regression models assessed whether measurements or ratios added information to age prediction from third molar stage. Coefficients of determination (R(2)) and root mean squared errors (RMSE) obtained from all regression models were compared. The univariable regression model using stages as predictor yielded the most accurate age predictions (males: R(2) 0.85, RMSE between 0.85 and 1.22 years; females: R(2) 0.77, RMSE between 1.19 and 2.11 years) compared with all models including measurements and ratios. The multivariable regression models indicated that measurements and ratios added no clinically relevant information to the age prediction from third molar stage. Ratios and measurements of second and third molars are less accurate age predictors than stages of developing third molars. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Short Term Single Station GNSS TEC Prediction Using Radial Basis Function Neural Network

    NASA Astrophysics Data System (ADS)

    Muslim, Buldan; Husin, Asnawi; Efendy, Joni

    2018-04-01

    TEC prediction models for 24 hours ahead have been developed from JOG2 GPS TEC data recorded during 2016. Eleven months of TEC data were used to train the radial basis function neural network (RBFNN), and the last month of data (December 2016) was used for testing the RBFNN model. The RBFNN inputs are the previous 24 hours of TEC data and the minimum Dst index during the previous 24 hours. The outputs of the model are the TEC predictions for the next 24 hours. Comparison of the model predictions shows that the RBFNN model predicts the next 24 hours of TEC more accurately than the GIM TEC model.
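
    A hedged, minimal RBF-network sketch in the spirit of the setup above (not the authors' code): k-means centres, Gaussian hidden units, and linear output weights fitted by ridge regression. The input and output shapes follow the abstract (24 hourly TEC lags plus the minimum Dst mapped to the next 24 hourly TEC values); the data are synthetic placeholders.

        # Minimal RBFNN: k-means centres + Gaussian features + ridge output layer.
        # X = previous 24 hourly TEC values + min Dst; Y = next 24 hourly TEC values.
        # All numbers are synthetic stand-ins for the JOG2 dataset.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(0)
        X = rng.random((300, 25))          # 24 TEC lags + min Dst (synthetic)
        Y = rng.random((300, 24))          # next-24-hour TEC (synthetic)

        centres = KMeans(n_clusters=40, n_init=10, random_state=0).fit(X).cluster_centers_
        width = np.median(np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2))

        def rbf_features(x):
            d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2)
            return np.exp(-(d / width) ** 2)

        model = Ridge(alpha=1e-3).fit(rbf_features(X), Y)
        print(model.predict(rbf_features(X[:1])).shape)   # -> (1, 24)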

  3. Accurate force field for molybdenum by machine learning large materials data

    NASA Astrophysics Data System (ADS)

    Chen, Chi; Deng, Zhi; Tran, Richard; Tang, Hanmei; Chu, Iek-Heng; Ong, Shyue Ping

    2017-09-01

    In this work, we present a highly accurate spectral neighbor analysis potential (SNAP) model for molybdenum (Mo) developed through the rigorous application of machine learning techniques on large materials data sets. Despite Mo's importance as a structural metal, existing force fields for Mo based on the embedded atom and modified embedded atom methods do not provide satisfactory accuracy on many properties. We will show that by fitting to the energies, forces, and stress tensors of a large density functional theory (DFT)-computed dataset on a diverse set of Mo structures, a Mo SNAP model can be developed that achieves close to DFT accuracy in the prediction of a broad range of properties, including elastic constants, melting point, phonon spectra, surface energies, grain boundary energies, etc. We will outline a systematic model development process, which includes a rigorous approach to structural selection based on principal component analysis, as well as a differential evolution algorithm for optimizing the hyperparameters in the model fitting so that both the model error and the property prediction error can be simultaneously lowered. We expect that this newly developed Mo SNAP model will find broad applications in large and long-time scale simulations.
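
    The abstract mentions tuning hyperparameters with a differential evolution algorithm so that model error and property error are lowered together. A loose sketch of that outer-loop idea under simplifying assumptions (a ridge fit to synthetic descriptor/energy data stands in for the SNAP fit; nothing here reproduces the actual Mo model):

        # Outer differential-evolution loop tuning a hyperparameter of an inner
        # linear fit to energies. Descriptors and energies are synthetic; this
        # only illustrates the optimisation pattern, not SNAP itself.
        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(1)
        descriptors = rng.random((200, 10))      # bispectrum-like features (fake)
        energies = descriptors @ rng.random(10) + 0.05 * rng.standard_normal(200)

        def holdout_error(hyper):
            log_alpha, = hyper
            model = Ridge(alpha=10.0 ** log_alpha)
            split = 150
            model.fit(descriptors[:split], energies[:split])
            resid = model.predict(descriptors[split:]) - energies[split:]
            return float(np.sqrt(np.mean(resid ** 2)))

        best = differential_evolution(holdout_error, bounds=[(-6.0, 2.0)], seed=0, maxiter=30)
        print(best.x, best.fun)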

  4. Methods for estimating magnitude and frequency of peak flows for natural streams in Utah

    USGS Publications Warehouse

    Kenney, Terry A.; Wilkowske, Chris D.; Wright, Shane J.

    2007-01-01

    Estimates of the magnitude and frequency of peak streamflows are critical for the safe and cost-effective design of hydraulic structures and stream crossings, and for accurate delineation of flood plains. Engineers, planners, resource managers, and scientists need accurate estimates of peak-flow return frequencies for locations on streams with and without streamflow-gaging stations. The 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence-interval flows were estimated for 344 unregulated U.S. Geological Survey streamflow-gaging stations in Utah and nearby in bordering states. These data, along with 23 basin and climatic characteristics computed for each station, were used to develop regional peak-flow frequency and magnitude regression equations for 7 geohydrologic regions of Utah. These regression equations can be used to estimate the magnitude and frequency of peak flows for natural streams in Utah within the presented range of predictor variables. Uncertainty, presented as the average standard error of prediction, was computed for each developed equation. Equations developed using data from more than 35 gaging stations had standard errors of prediction that ranged from 35 to 108 percent, and errors for equations developed using data from fewer than 35 gaging stations ranged from 50 to 357 percent.

  5. PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for the analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicates, inconsistencies and mutations previously used in the training of the evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into the consensus classifier PredictSNP, resulting in significantly improved prediction performance; at the same time, the consensus returned results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
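
    A hedged illustration of the consensus idea (a simple confidence-weighted vote over six tools' calls; the published PredictSNP weighting scheme may differ). Scores and tool outputs below are invented: +1 marks a deleterious call, -1 a neutral call.

        # Confidence-weighted consensus over per-tool calls (illustration only).
        def consensus(predictions):
            """predictions: list of (call, confidence) pairs, call in {+1, -1}."""
            total = sum(call * conf for call, conf in predictions)
            weight = sum(conf for _, conf in predictions)
            return ("deleterious" if total > 0 else "neutral", abs(total) / weight)

        tools = [(+1, 0.80), (+1, 0.65), (-1, 0.55), (+1, 0.70), (+1, 0.60), (-1, 0.50)]
        print(consensus(tools))   # -> ('deleterious', ~0.45)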

  6. Experimental Evaluation of Acoustic Engine Liner Models Developed with COMSOL Multiphysics

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Jones, Michael G.; Bertolucci, Brandon

    2017-01-01

    Accurate modeling tools are needed to design new engine liners capable of reducing aircraft noise. The purpose of this study is to determine if a commercially-available finite element package, COMSOL Multiphysics, can be used to accurately model a range of different acoustic engine liner designs, and in the process, collect and document a benchmark dataset that can be used in both current and future code evaluation activities. To achieve these goals, a variety of liner samples, ranging from conventional perforate-over-honeycomb to extended-reaction designs, were installed in one wall of the grazing flow impedance tube at the NASA Langley Research Center. The liners were exposed to high sound pressure levels and grazing flow, and the effect of the liner on the sound field in the flow duct was measured. These measurements were then compared with predictions. While this report only includes comparisons for a subset of the configurations, the full database of all measurements and predictions is available in electronic format upon request. The results demonstrate that both conventional perforate-over-honeycomb and extended-reaction liners can be accurately modeled using COMSOL. Therefore, this modeling tool can be used with confidence to supplement the current suite of acoustic propagation codes, and ultimately develop new acoustic engine liners designed to reduce aircraft noise.

  7. The mortality risk score and the ADG score: two points-based scoring systems for the Johns Hopkins aggregated diagnosis groups to predict mortality in a general adult population cohort in Ontario, Canada.

    PubMed

    Austin, Peter C; Walraven, Carl van

    2011-10-01

    Logistic regression models that incorporated age, sex, and indicator variables for the Johns Hopkins' Aggregated Diagnosis Groups (ADGs) categories have been shown to accurately predict all-cause mortality in adults. The objective was to develop 2 different point-scoring systems using the ADGs. The Mortality Risk Score (MRS) collapses age, sex, and the ADGs to a single summary score that predicts the annual risk of all-cause death in adults. The ADG Score derives weights for the individual ADG diagnosis groups. The design was a retrospective cohort constructed using population-based administrative data. All 10,498,413 residents of Ontario, Canada, between the ages of 20 and 100 years who were alive on their birthday in 2007 participated in this study. Participants were randomly divided into derivation and validation samples. The outcome was death within 1 year. In the derivation cohort, the MRS ranged from -21 to 139 (median value 29, IQR 17 to 44). In the validation group, a logistic regression model with the MRS as the sole predictor significantly predicted the risk of 1-year mortality with a c-statistic of 0.917. A regression model with age, sex, and the ADG Score had similar performance. Both methods accurately predicted the risk of 1-year mortality across the 20 vigintiles of risk. The MRS combines values for a person's age, sex, and the Johns Hopkins ADGs to accurately predict 1-year mortality in adults. The ADG Score is a weighted score representing the presence or absence of the 32 ADG diagnosis groups. These scores will facilitate health services researchers conducting risk adjustment using administrative health care databases.
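
    A sketch of the general points-score pattern described above: regression coefficients are scaled to integer points, summed per person, and discrimination is checked with a c-statistic (ROC AUC). The point values, ADG names, and cohort records below are invented, not the published MRS weights.

        # Hypothetical point weights and toy cohort; only the scoring/evaluation
        # pattern is illustrated here.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        points = {"age_per_decade": 4, "male": 1,
                  "ADG_congestive_heart_failure": 6, "ADG_malignancy": 9}

        def mortality_risk_score(person):
            score = points["age_per_decade"] * (person["age"] // 10)
            score += points["male"] * person["male"]
            score += sum(points[adg] for adg in person["adgs"] if adg in points)
            return score

        cohort = [
            {"age": 82, "male": 1, "adgs": ["ADG_malignancy"], "died": 1},
            {"age": 45, "male": 0, "adgs": [], "died": 0},
            {"age": 67, "male": 1, "adgs": ["ADG_congestive_heart_failure"], "died": 1},
            {"age": 30, "male": 0, "adgs": [], "died": 0},
        ]
        scores = np.array([mortality_risk_score(p) for p in cohort])
        deaths = np.array([p["died"] for p in cohort])
        print("c-statistic:", roc_auc_score(deaths, scores))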

  8. Identification of the high risk emergency surgical patient: Which risk prediction model should be used?

    PubMed Central

    Stonelake, Stephen; Thomson, Peter; Suggett, Nigel

    2015-01-01

    Introduction National guidance states that all patients having emergency surgery should have a mortality risk assessment calculated on admission so that the ‘high risk’ patient can receive the appropriate seniority and level of care. We aimed to assess whether peri-operative risk scoring tools could accurately calculate mortality and morbidity risk. Methods Mortality risk scores for 86 consecutive emergency laparotomies were calculated using pre-operative (ASA, Lee index) and post-operative (POSSUM, P-POSSUM and CR-POSSUM) risk calculation tools. Morbidity risk scores were calculated using the POSSUM predicted morbidity and compared against actual morbidity according to the Clavien–Dindo classification. Results The actual mortality was 10.5%. The average predicted risk scores for all laparotomies were: ASA 26.5%, Lee Index 2.5%, POSSUM 29.5%, P-POSSUM 18.5%, CR-POSSUM 10.5%. Complications occurred following 67 laparotomies (78%). The majority (51%) of complications were classified as Clavien–Dindo grade 2–3 (non-life-threatening). Patients with a POSSUM morbidity risk of greater than 50% developed significantly more life-threatening complications (CD 4–5) than those with a predicted morbidity risk of 50% or less (P = 0.01). Discussion Pre-operative risk stratification remains a challenge because the Lee Index under-predicts and the ASA over-predicts mortality risk. Post-operative risk scoring using the CR-POSSUM is more accurate and, we suggest, can be used to identify patients who require intensive care post-operatively. Conclusions In the absence of accurate risk scoring tools that can be used on admission to hospital, it is not possible to reliably audit the achievement of national standards of care for the ‘high-risk’ patient. PMID:26468369

  9. Identification of the high risk emergency surgical patient: Which risk prediction model should be used?

    PubMed

    Stonelake, Stephen; Thomson, Peter; Suggett, Nigel

    2015-09-01

    National guidance states that all patients having emergency surgery should have a mortality risk assessment calculated on admission so that the 'high risk' patient can receive the appropriate seniority and level of care. We aimed to assess whether peri-operative risk scoring tools could accurately calculate mortality and morbidity risk. Mortality risk scores for 86 consecutive emergency laparotomies were calculated using pre-operative (ASA, Lee index) and post-operative (POSSUM, P-POSSUM and CR-POSSUM) risk calculation tools. Morbidity risk scores were calculated using the POSSUM predicted morbidity and compared against actual morbidity according to the Clavien-Dindo classification. The actual mortality was 10.5%. The average predicted risk scores for all laparotomies were: ASA 26.5%, Lee Index 2.5%, POSSUM 29.5%, P-POSSUM 18.5%, CR-POSSUM 10.5%. Complications occurred following 67 laparotomies (78%). The majority (51%) of complications were classified as Clavien-Dindo grade 2-3 (non-life-threatening). Patients with a POSSUM morbidity risk of greater than 50% developed significantly more life-threatening complications (CD 4-5) than those with a predicted morbidity risk of 50% or less (P = 0.01). Pre-operative risk stratification remains a challenge because the Lee Index under-predicts and the ASA over-predicts mortality risk. Post-operative risk scoring using the CR-POSSUM is more accurate and, we suggest, can be used to identify patients who require intensive care post-operatively. In the absence of accurate risk scoring tools that can be used on admission to hospital, it is not possible to reliably audit the achievement of national standards of care for the 'high-risk' patient.

  10. 2018 update to the HIV-TRePS system: the development of new computational models to predict HIV treatment outcomes, with or without a genotype, with enhanced usability for low-income settings.

    PubMed

    Revell, Andrew D; Wang, Dechao; Perez-Elias, Maria-Jesus; Wood, Robin; Cogill, Dolphina; Tempelman, Hugo; Hamers, Raph L; Reiss, Peter; van Sighem, Ard I; Rehm, Catherine A; Pozniak, Anton; Montaner, Julio S G; Lane, H Clifford; Larder, Brendan A

    2018-06-08

    Optimizing antiretroviral drug combination on an individual basis can be challenging, particularly in settings with limited access to drugs and genotypic resistance testing. Here we describe our latest computational models to predict treatment responses, with or without a genotype, and compare their predictive accuracy with that of genotyping. Random forest models were trained to predict the probability of virological response to a new therapy introduced following virological failure using up to 50 000 treatment change episodes (TCEs) without a genotype and 18 000 TCEs including genotypes. Independent data sets were used to evaluate the models. This study tested the effects on model accuracy of relaxing the baseline data timing windows, the use of a new filter to exclude probable non-adherent cases and the addition of maraviroc, tipranavir and elvitegravir to the system. The no-genotype models achieved area under the receiver operator characteristic curve (AUC) values of 0.82 and 0.81 using the standard and relaxed baseline data windows, respectively. The genotype models achieved AUC values of 0.86 with the new non-adherence filter and 0.84 without. Both sets of models were significantly more accurate than genotyping with rules-based interpretation, which achieved AUC values of only 0.55-0.63, and were marginally more accurate than previous models. The models were able to identify alternative regimens that were predicted to be effective for the vast majority of cases in which the new regimen prescribed in the clinic failed. These latest global models predict treatment responses accurately even without a genotype and have the potential to help optimize therapy, particularly in resource-limited settings.
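
    A generic sketch of the modelling step described above: a random forest trained on treatment-change episodes and scored by AUC on held-out cases. The feature matrix here is a random placeholder, not the actual TCE inputs (baseline viral load, CD4, drug history, optional genotype, and so on).

        # Random-forest response prediction with AUC evaluation (placeholder data).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        X = rng.random((5000, 60))                 # synthetic TCE feature vectors
        y = (X[:, 0] + 0.3 * rng.standard_normal(5000) > 0.5).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]))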

  11. Current and Developing Technologies for Monitoring Agents of Bioterrorism and Biowarfare

    PubMed Central

    Lim, Daniel V.; Simpson, Joyce M.; Kearns, Elizabeth A.; Kramer, Marianne F.

    2005-01-01

    Recent events have made public health officials acutely aware of the importance of rapidly and accurately detecting acts of bioterrorism. Because bioterrorism is difficult to predict or prevent, reliable platforms to rapidly detect and identify biothreat agents are important to minimize the spread of these agents and to protect the public health. These platforms must not only be sensitive and specific, but must also be able to accurately detect a variety of pathogens, including modified or previously uncharacterized agents, directly from complex sample matrices. Various commercial tests utilizing biochemical, immunological, nucleic acid, and bioluminescence procedures are currently available to identify biological threat agents. Newer tests have also been developed to identify such agents using aptamers, biochips, evanescent wave biosensors, cantilevers, living cells, and other innovative technologies. This review describes these current and developing technologies and considers challenges to rapid, accurate detection of biothreat agents. Although there is no ideal platform, many of these technologies have proved invaluable for the detection and identification of biothreat agents. PMID:16223949

  12. A dual-process account of auditory change detection.

    PubMed

    McAnally, Ken I; Martin, Russell L; Eramudugolla, Ranmalee; Stuart, Geoffrey W; Irvine, Dexter R F; Mattingley, Jason B

    2010-08-01

    Listeners can be "deaf" to a substantial change in a scene comprising multiple auditory objects unless their attention has been directed to the changed object. It is unclear whether auditory change detection relies on identification of the objects in pre- and post-change scenes. We compared the rates at which listeners correctly identify changed objects with those predicted by change-detection models based on signal detection theory (SDT) and high-threshold theory (HTT). Detected changes were not identified as accurately as predicted by models based on either theory, suggesting that some changes are detected by a process that does not support change identification. Undetected changes were identified as accurately as predicted by the HTT model but much less accurately than predicted by the SDT models. The process underlying change detection was investigated further by determining receiver-operating characteristics (ROCs). ROCs did not conform to those predicted by either a SDT or a HTT model but were well modeled by a dual-process model that incorporated HTT and SDT components. The dual-process model also accurately predicted the rates at which detected and undetected changes were correctly identified.

  13. A deep learning-based multi-model ensemble method for cancer prediction.

    PubMed

    Xiao, Yawen; Wu, Jun; Lin, Zongli; Zhao, Xiaodong

    2018-01-01

    Cancer is a complex worldwide health problem associated with high mortality. With the rapid development of the high-throughput sequencing technology and the application of various machine learning methods that have emerged in recent years, progress in cancer prediction has been increasingly made based on gene expression, providing insight into effective and accurate treatment decision making. Thus, developing machine learning methods, which can successfully distinguish cancer patients from healthy persons, is of great current interest. However, among the classification methods applied to cancer prediction so far, no one method outperforms all the others. In this paper, we demonstrate a new strategy, which applies deep learning to an ensemble approach that incorporates multiple different machine learning models. We supply informative gene data selected by differential gene expression analysis to five different classification models. Then, a deep learning method is employed to ensemble the outputs of the five classifiers. The proposed deep learning-based multi-model ensemble method was tested on three public RNA-seq data sets of three kinds of cancers, Lung Adenocarcinoma, Stomach Adenocarcinoma and Breast Invasive Carcinoma. The test results indicate that it increases the prediction accuracy of cancer for all the tested RNA-seq data sets as compared to using a single classifier or the majority voting algorithm. By taking full advantage of different classifiers, the proposed deep learning-based multi-model ensemble method is shown to be accurate and effective for cancer prediction. Copyright © 2017 Elsevier B.V. All rights reserved.
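
    A hedged sketch of the stacking idea described above: five base classifiers feed a small neural-network meta-learner. Scikit-learn's StackingClassifier with an MLP final estimator stands in for the paper's deep-learning ensemble; the gene-expression matrix is synthetic.

        # Five base models stacked under a neural-network meta-learner
        # (illustration of the ensemble pattern, not the published pipeline).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, StackingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.random((400, 200))                    # synthetic expression of 200 genes
        y = (X[:, :5].sum(axis=1) > 2.5).astype(int)  # synthetic tumour/normal label

        base = [("lr", LogisticRegression(max_iter=2000)),
                ("svm", SVC(probability=True)),
                ("rf", RandomForestClassifier(n_estimators=200)),
                ("gb", GradientBoostingClassifier()),
                ("knn", KNeighborsClassifier())]
        ensemble = StackingClassifier(estimators=base,
                                      final_estimator=MLPClassifier(hidden_layer_sizes=(32,),
                                                                    max_iter=2000),
                                      stack_method="predict_proba", cv=5)
        print("accuracy:", cross_val_score(ensemble, X, y, cv=3).mean())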

  14. Predicting survival time in noncurative patients with advanced cancer: a prospective study in China.

    PubMed

    Cui, Jing; Zhou, Lingjun; Wee, B; Shen, Fengping; Ma, Xiuqiang; Zhao, Jijun

    2014-05-01

    Accurate prediction of prognosis for cancer patients is important for good clinical decision making in therapeutic and care strategies. The application of prognostic tools and indicators could improve prediction accuracy. This study aimed to develop a new prognostic scale to predict survival time of advanced cancer patients in China. We prospectively collected items that we anticipated might influence survival time of advanced cancer patients. Participants were recruited from 12 hospitals in Shanghai, China. We collected data including demographic information, clinical symptoms and signs, and biochemical test results. Log-rank tests, Cox regression, and linear regression were performed to develop a prognostic scale. Three hundred twenty patients with advanced cancer were recruited. Fourteen prognostic factors were included in the prognostic scale: Karnofsky Performance Scale (KPS) score, pain, ascites, hydrothorax, edema, delirium, cachexia, white blood cell (WBC) count, hemoglobin, sodium, total bilirubin, direct bilirubin, aspartate aminotransferase (AST), and alkaline phosphatase (ALP) values. The score was calculated by summing the partial scores, ranging from 0 to 30. When using the cutoff points of 7-day, 30-day, 90-day, and 180-day survival time, the scores were calculated as 12, 10, 8, and 6, respectively. We propose a new prognostic scale including KPS, pain, ascites, hydrothorax, edema, delirium, cachexia, WBC count, hemoglobin, sodium, total bilirubin, direct bilirubin, AST, and ALP values, which may help guide physicians in predicting the likely survival time of cancer patients more accurately. More studies are needed to validate this scale in the future.

  15. Variant effect prediction tools assessed using independent, functional assay-based datasets: implications for discovery and diagnostics.

    PubMed

    Mahmood, Khalid; Jung, Chol-Hee; Philip, Gayle; Georgeson, Peter; Chung, Jessica; Pope, Bernard J; Park, Daniel J

    2017-05-16

    Genetic variant effect prediction algorithms are used extensively in clinical genomics and research to determine the likely consequences of amino acid substitutions on protein function. It is vital that we better understand their accuracies and limitations because published performance metrics are confounded by serious problems of circularity and error propagation. Here, we derive three independent, functionally determined human mutation datasets, UniFun, BRCA1-DMS and TP53-TA, and employ them, alongside previously described datasets, to assess the pre-eminent variant effect prediction tools. Apparent accuracies of variant effect prediction tools were influenced significantly by the benchmarking dataset. Benchmarking with the assay-determined datasets UniFun and BRCA1-DMS yielded areas under the receiver operating characteristic curve in the modest ranges of 0.52 to 0.63 and 0.54 to 0.75, respectively, considerably lower than observed for other, potentially more conflicted datasets. These results raise concerns about how such algorithms should be employed, particularly in a clinical setting. Contemporary variant effect prediction tools are unlikely to be as accurate at the general prediction of functional impacts on proteins as previously reported. Use of functional assay-based datasets that avoid prior dependencies promises to be valuable for the ongoing development and accurate benchmarking of such tools.

  16. Improved prediction of antibody VL–VH orientation

    PubMed Central

    Marze, Nicholas A.; Lyskov, Sergey; Gray, Jeffrey J.

    2016-01-01

    Antibodies are important immune molecules with high commercial value and therapeutic interest because of their ability to bind diverse antigens. Computational prediction of antibody structure can quickly reveal valuable information about the nature of these antigen-binding interactions, but only if the models are of sufficient quality. To achieve high model quality during complementarity-determining region (CDR) structural prediction, one must account for the VL–VH orientation. We developed a novel four-metric VL–VH orientation coordinate frame. Additionally, we extended the CDR grafting protocol in RosettaAntibody with a new method that diversifies VL–VH orientation by using 10 VL–VH orientation templates rather than a single one. We tested the multiple-template grafting protocol on two datasets of known antibody crystal structures. During the template-grafting phase, the new protocol improved the fraction of accurate VL–VH orientation predictions from only 26% (12/46) to 72% (33/46) of targets. After the full RosettaAntibody protocol, including CDR H3 remodeling and VL–VH re-orientation, the new protocol produced more candidate structures with accurate VL–VH orientation than the standard protocol in 43/46 targets (93%). The improved ability to predict VL–VH orientation will bolster predictions of other parts of the paratope, including the conformation of CDR H3, a grand challenge of antibody homology modeling. PMID:27276984

  17. Predicting the stability of nanodevices

    NASA Astrophysics Data System (ADS)

    Lin, Z. Z.; Yu, W. F.; Wang, Y.; Ning, X. J.

    2011-05-01

    A simple model based on the statistics of single atoms is developed to predict the stability or lifetime of nanodevices without empirical parameters. Under certain conditions, the model reproduces the Arrhenius law and the Meyer-Neldel compensation rule. Compared with classical molecular-dynamics simulations for predicting the stability of a monatomic carbon chain at high temperature, the model is shown to be much more accurate than transition state theory. Based on ab initio calculation of the static potential, the model gives a corrected lifetime for monatomic carbon and gold chains at higher temperatures, and predicts that the monatomic chains are very stable at room temperature.
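
    The two relations referenced above, written in their standard general forms (conventional statements, not equations quoted from the paper):

        % Arrhenius law for the lifetime, with activation energy E_a and prefactor tau_0:
        \[
          \tau \;=\; \tau_0 \, e^{E_a / k_B T}
        \]
        % Meyer-Neldel compensation rule: the attempt frequency 1/tau_0 itself grows
        % exponentially with E_a, with a characteristic energy E_{MN}:
        \[
          \frac{1}{\tau_0} \;=\; \nu_{00}\, e^{E_a / E_{MN}}
        \]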

  18. Global vision of druggability issues: applications and perspectives.

    PubMed

    Abi Hussein, Hiba; Geneix, Colette; Petitjean, Michel; Borrel, Alexandre; Flatters, Delphine; Camproux, Anne-Claude

    2017-02-01

    During the preliminary stage of a drug discovery project, the lack of druggability information and poor target selection are the main causes of frequent failures. Elaborating on accurate computational druggability prediction methods is a requirement for prioritizing target selection, designing new drugs and avoiding side effects. In this review, we describe a survey of recently reported druggability prediction methods mainly based on networks, statistical pocket druggability predictions and virtual screening. An application for a frequent mutation of p53 tumor suppressor is presented, illustrating the complementarity of druggability prediction approaches, the remaining challenges and potential new drug development perspectives. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Computer-Assisted Decision Support for Student Admissions Based on Their Predicted Academic Performance.

    PubMed

    Muratov, Eugene; Lewis, Margaret; Fourches, Denis; Tropsha, Alexander; Cox, Wendy C

    2017-04-01

    Objective. To develop predictive computational models forecasting the academic performance of students in the didactic-rich portion of a doctor of pharmacy (PharmD) curriculum as admission-assisting tools. Methods. All PharmD candidates over three admission cycles were divided into two groups: those who completed the PharmD program with a GPA ≥ 3, and the remaining candidates. The Random Forest machine learning technique was used to develop a binary classification model based on 11 pre-admission parameters. Results. Robust and externally predictive models were developed that had a particularly high overall accuracy of 77% for candidates with high or low academic performance. These multivariate models were more accurate in predicting these groups than those obtained using undergraduate GPA and composite PCAT scores only. Conclusion. The models developed in this study can be used to improve the admission process as preliminary filters and thus quickly identify candidates who are likely to be successful in the PharmD curriculum.

  20. Life prediction technologies for aeronautical propulsion systems

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.

    1987-01-01

    Fatigue and fracture problems continue to occur in aeronautical gas turbine engines. Components whose useful life is limited by these failure modes include turbine hot-section blades, vanes and disks. Safety considerations dictate that catastrophic failures be avoided, while economic considerations dictate that noncatastrophic failures occur as infrequently as possible. The design challenge therefore lies in making the tradeoff between engine performance and durability. The NASA Lewis Research Center has contributed to the aeropropulsion industry in the areas of life prediction technology for 30 years, developing creep and fatigue life prediction methodologies for hot-section materials. Emphasis is placed on the development of methods capable of handling both thermal and mechanical fatigue under severe environments. Recent accomplishments include the development of more accurate creep-fatigue life prediction methods such as the total strain version of Lewis' Strainrange Partitioning (SRP) and the HOST-developed Cyclic Damage Accumulation (CDA) model. Other examples include the Double Damage Curve Approach (DDCA), which provides greatly improved accuracy for cumulative fatigue design rules.

  1. Predicting the Risk of Clostridium difficile Infection upon Admission: A Score to Identify Patients for Antimicrobial Stewardship Efforts.

    PubMed

    Kuntz, Jennifer L; Smith, David H; Petrik, Amanda F; Yang, Xiuhai; Thorp, Micah L; Barton, Tracy; Barton, Karen; Labreche, Matthew; Spindel, Steven J; Johnson, Eric S

    2016-01-01

    Increasing morbidity and health care costs related to Clostridium difficile infection (CDI) have heightened interest in methods to identify patients who would most benefit from interventions to mitigate the likelihood of CDI. The objective was to develop a risk score that can be calculated upon hospital admission and used by antimicrobial stewards, including pharmacists and clinicians, to identify patients at risk for CDI who would benefit from enhanced antibiotic review and patient education. We assembled a cohort of Kaiser Permanente Northwest patients with a hospital admission from July 1, 2005, through December 30, 2012, and identified CDI in the six months following hospital admission. Using Cox regression, we constructed a score to identify patients at high risk for CDI on the basis of preadmission characteristics. We calculated and plotted the observed six-month CDI risk for each decile of predicted risk. We identified 721 CDIs following 54,186 hospital admissions, a 6-month incidence of 13.3 CDIs/1000 patient admissions. Patients with the highest predicted risk of CDI had an observed incidence of 53 CDIs/1000 patient admissions. The score differentiated between patients who do and do not develop CDI, with a value for the extended C-statistic of 0.75. Predicted risk for CDI agreed closely with observed risk. Our risk score accurately predicted six-month risk for CDI using preadmission characteristics. Accurate predictions among the highest-risk patient subgroups allow for the identification of patients who could be targeted for, and who would likely benefit from, review of inpatient antibiotic use or enhanced educational efforts at the time of discharge planning.

  2. Accurate prediction of bacterial type IV secreted effectors using amino acid composition and PSSM profiles.

    PubMed

    Zou, Lingyun; Nan, Chonghan; Hu, Fuquan

    2013-12-15

    Various human pathogens secrete effector proteins into host cells via the type IV secretion system (T4SS). These proteins play important roles in the interaction between bacteria and hosts. Computational methods for T4SS effector prediction have been developed for screening experimental targets in several isolated bacterial species; however, widely applicable prediction approaches are still unavailable. In this work, four types of distinctive features, namely, amino acid composition, dipeptide composition, position-specific scoring matrix composition, and auto covariance transformation of the position-specific scoring matrix, were calculated from primary sequences. A classifier, T4EffPred, was developed using the support vector machine with these features and their different combinations for effector prediction. Various theoretical tests were performed in a newly established dataset, and the results were measured with four indexes. We demonstrated that T4EffPred can discriminate IVA and IVB effectors in benchmark datasets with positive rates of 76.7% and 89.7%, respectively. The overall accuracy of 95.9% shows that the present method is accurate for distinguishing T4SS effectors in unidentified sequences. A classifier ensemble was designed to synthesize all single classifiers. Notable performance improvement was observed using this ensemble system in benchmark tests. To demonstrate the model's application, a genome-scale prediction of effectors was performed in Bartonella henselae, an important zoonotic pathogen. A number of putative candidates were distinguished. A web server implementing the prediction method and the source code are both available at http://bioinfo.tmmu.edu.cn/T4EffPred.
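
    A hedged sketch of the simplest feature type named above (amino acid composition) feeding a support vector machine. The PSSM-based features and the actual T4EffPred training data are not reproduced here; the sequences and labels are toy examples.

        # 20-dimensional amino acid composition + SVM (illustration only).
        import numpy as np
        from sklearn.svm import SVC

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

        def aac(seq):
            """Amino acid composition of a protein sequence."""
            seq = seq.upper()
            return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

        train_seqs = ["MKKLLSIAVLAGLLS", "MDEIDRRLAAAGGGS", "MKTAYIAKQRQISFV", "MSDNEEDLLSMFDEL"]
        labels = [1, 0, 1, 0]            # 1 = effector, 0 = non-effector (toy labels)

        clf = SVC(kernel="rbf").fit([aac(s) for s in train_seqs], labels)
        print(clf.predict([aac("MKNLLTAVLLSAGSV")]))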

  3. A probabilistic and adaptive approach to modeling performance of pavement infrastructure

    DOT National Transportation Integrated Search

    2007-08-01

    Accurate prediction of pavement performance is critical to pavement management agencies. Reliable and accurate predictions of pavement infrastructure performance can save significant amounts of money for pavement infrastructure management agencies th...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livescu, Veronica; Bronkhorst, Curt Allan; Vander Wiel, Scott Alan

    Many challenges exist with regard to understanding and representing complex physical processes involved with ductile damage and failure in polycrystalline metallic materials. Currently, the ability to accurately predict the macroscale ductile damage and failure response of metallic materials is lacking. Research at Los Alamos National Laboratory (LANL) is aimed at building a coupled experimental and computational methodology that supports the development of predictive damage capabilities by: capturing real distributions of microstructural features from real material and implementing them as digitally generated microstructures in damage model development; and distilling structure-property information to link microstructural details to damage evolution under a multitude of loading states.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, Michael L.

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  6. High performance computation of residual stress and distortion in laser welded 301L stainless sheets

    DOE PAGES

    Huang, Hui; Tsutsumi, Seiichiro; Wang, Jiandong; ...

    2017-07-11

    Transient thermo-mechanical simulation of a stainless plate laser welding process was performed by a highly efficient and accurate approach: a hybrid iterative substructure and adaptive mesh method. In particular, residual stress prediction was enhanced by considering various heat effects in the numerical model. The influence of laser welding heat input on residual stress and welding distortion of stainless thin sheets was investigated by experiment and simulation. X-ray diffraction (XRD) and the contour method were used to measure the surface and internal residual stress, respectively. The effect of strain hardening, annealing, and melting on residual stress prediction was clarified through a parametric study. It was shown that these heat effects must be taken into account for accurate prediction of residual stresses in laser welded stainless sheets. Reasonable agreement among the residual stresses obtained by the numerical method, XRD, and the contour method was found. Buckling-type welding distortion was also well reproduced by the developed thermo-mechanical FEM.

  7. High performance computation of residual stress and distortion in laser welded 301L stainless sheets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Hui; Tsutsumi, Seiichiro; Wang, Jiandong

    Transient thermo-mechanical simulation of a stainless steel plate laser welding process was performed by a highly efficient and accurate approach: a hybrid iterative substructure and adaptive mesh method. In particular, residual stress prediction was enhanced by considering various heat effects in the numerical model. The influence of laser welding heat input on the residual stress and welding distortion of stainless thin sheets was investigated by experiment and simulation. X-ray diffraction (XRD) and the contour method were used to measure the surface and internal residual stress, respectively. The effect of strain hardening, annealing and melting on residual stress prediction was clarified through a parametric study. It was shown that these heat effects must be taken into account for accurate prediction of residual stresses in laser welded stainless sheets. Reasonable agreement was found among the residual stresses given by the numerical method, XRD and the contour method. Buckling-type welding distortion was also well reproduced by the developed thermo-mechanical FEM.

  8. Forecasting the spatial transmission of influenza in the United States.

    PubMed

    Pei, Sen; Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey

    2018-03-13

    Recurrent outbreaks of seasonal and pandemic influenza create a need for forecasts of the geographic spread of this pathogen. Although it is well established that the spatial progression of infection is largely attributable to human mobility, difficulty obtaining real-time information on human movement has limited its incorporation into existing infectious disease forecasting techniques. In this study, we develop and validate an ensemble forecast system for predicting the spatiotemporal spread of influenza that uses readily accessible human mobility data and a metapopulation model. In retrospective state-level forecasts for 35 US states, the system accurately predicts local influenza outbreak onset (i.e., spatial spread, defined as the week that local incidence increases above a baseline threshold) up to 6 wk in advance of this event. In addition, the metapopulation prediction system forecasts influenza outbreak onset, peak timing, and peak intensity more accurately than isolated location-specific forecasts. The proposed framework could be applied to emergent respiratory viruses and, with appropriate modifications, other infectious diseases.

  9. Predicting Multicomponent Adsorption Isotherms in Open-Metal Site Materials Using Force Field Calculations Based on Energy Decomposed Density Functional Theory.

    PubMed

    Heinen, Jurn; Burtch, Nicholas C; Walton, Krista S; Fonseca Guerra, Célia; Dubbeldam, David

    2016-12-12

    For the design of adsorptive-separation units, knowledge is required of the multicomponent adsorption behavior. Ideal adsorbed solution theory (IAST) breaks down for olefin adsorption in open-metal site (OMS) materials due to non-ideal donor-acceptor interactions. Using a density-functional-theory-based energy decomposition scheme, we develop a physically justifiable classical force field that incorporates the missing orbital interactions using an appropriate functional form. Our first-principles derived force field shows greatly improved quantitative agreement with the inflection points, initial uptake, saturation capacity, and enthalpies of adsorption obtained from our in-house adsorption experiments. While IAST fails to make accurate predictions, our improved force field model is able to correctly predict the multicomponent behavior. Our approach is also transferable to other OMS structures, allowing the accurate study of their separation performances for olefins/paraffins and further mixtures involving complex donor-acceptor interactions. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Theory of mind selectively predicts preschoolers’ knowledge-based selective word learning

    PubMed Central

    Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane

    2015-01-01

    Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory of mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children’s preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children’s developing social cognition and early learning. PMID:26211504

  11. Theory of mind selectively predicts preschoolers' knowledge-based selective word learning.

    PubMed

    Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane

    2015-11-01

    Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory-of-mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children's preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children's developing social cognition and early learning. © 2015 The British Psychological Society.

  12. Method for Accurately Calibrating a Spectrometer Using Broadband Light

    NASA Technical Reports Server (NTRS)

    Simmons, Stephen; Youngquist, Robert

    2011-01-01

    A novel method has been developed for performing very fine calibration of a spectrometer. This process is particularly useful for modern miniature charge-coupled device (CCD) spectrometers where a typical factory wavelength calibration has been performed and a finer, more accurate calibration is desired. Typically, the factory calibration is done with a spectral line source that generates light at known wavelengths, allowing specific pixels in the CCD array to be assigned wavelength values. This method is good to about 1 nm across the spectrometer's wavelength range. This new method appears to be accurate to about 0.1 nm, a factor of ten improvement. White light is passed through an unbalanced Michelson interferometer, producing an optical signal with significant spectral variation. A simple theory can be developed to describe this spectral pattern, so by comparing the actual spectrometer output against this predicted pattern, errors in the wavelength assignment made by the spectrometer can be determined.

  13. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
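
    As a minimal illustration of the kind of model described above, the sketch below fits a Weibull-type hydrolysis curve with SciPy; the exact parameterisation in the paper may differ, and the time/yield data are invented for demonstration only.

      # Hedged sketch: fitting a Weibull-type saccharification curve (hypothetical data).
      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_yield(t, y_max, lam, n):
          """Yield approaches y_max; lam plays the role of the characteristic time."""
          return y_max * (1.0 - np.exp(-(t / lam) ** n))

      t_h = np.array([2, 4, 8, 12, 24, 48, 72.0])       # hours (hypothetical)
      y_pct = np.array([8, 15, 27, 35, 52, 68, 74.0])   # % sugar yield (hypothetical)

      (y_max, lam, n), _ = curve_fit(weibull_yield, t_h, y_pct, p0=[80.0, 24.0, 1.0])
      print(f"y_max={y_max:.1f}%  lambda={lam:.1f} h  n={n:.2f}")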

  14. Estimation of basal metabolic rate in Chinese: are the current prediction equations applicable?

    PubMed

    Camps, Stefan G; Wang, Nan Xin; Tan, Wei Shuan Kimberly; Henry, C Jeyakumar

    2016-08-31

    Measurement of basal metabolic rate (BMR) is suggested as a tool to estimate energy requirements. Therefore, BMR prediction equations have been developed in multiple populations because indirect calorimetry is not always feasible. However, there is a paucity of data on BMR measured in overweight and obese adults living in Asia and of equations developed for this group of interest. The aim of this study was to develop a new BMR prediction equation for Chinese adults applicable over a large BMI range and compare it with commonly used prediction equations. Subjects were 121 men and 111 women (age: 21-67 years, BMI: 16-41 kg/m2). Height, weight, and BMR were measured. Continuous open-circuit indirect calorimetry using a ventilated hood system for 30 min was used to measure BMR. A regression equation was derived using stepwise regression and its accuracy was compared to 6 existing equations (Harris-Benedict, Henry, Liu, Yang, Owen and Mifflin). Additionally, the newly derived equation was cross-validated in a separate group of 70 Chinese subjects (26 men and 44 women, age: 21-69 years, BMI: 17-39 kg/m2). The equation developed from our data was: BMR (kJ/d) = 52.6 x weight (kg) + 828 x gender + 1960 (women = 0, men = 1; R2 = 0.81). The accuracy rate (predictions within 10% of measured BMR) was 78%, which compared well to Owen (70%), Henry (67%), Mifflin (67%), Liu (58%), Harris-Benedict (45%) and Yang (37%) for the whole range of BMI. For a BMI greater than 23, the Singapore equation reached an accuracy rate of 76%. Cross-validation showed an accuracy rate of 80%. To date, the newly developed Singapore equation is the most accurate BMR prediction equation for Chinese adults and is applicable over a large BMI range, including those who are overweight and obese.
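
    The Singapore equation quoted above is simple enough to apply directly; the short worked example below evaluates it for a hypothetical 70 kg man (the kJ-to-kcal conversion uses the standard factor of 4.184).

      # Worked example of the equation reported in the abstract (input values are hypothetical).
      def bmr_singapore_kj_per_day(weight_kg, is_male):
          """BMR (kJ/d) = 52.6 x weight (kg) + 828 x gender + 1960, with men = 1, women = 0."""
          return 52.6 * weight_kg + 828 * (1 if is_male else 0) + 1960

      bmr_kj = bmr_singapore_kj_per_day(70, True)
      print(f"{bmr_kj:.0f} kJ/d  (~{bmr_kj / 4.184:.0f} kcal/d)")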

  15. MetaPSICOV: combining coevolution methods for accurate prediction of contacts and long range hydrogen bonding in proteins.

    PubMed

    Jones, David T; Singh, Tanya; Kosciolek, Tomasz; Tetchner, Stuart

    2015-04-01

    Recent developments of statistical techniques to infer direct evolutionary couplings between residue pairs have rendered covariation-based contact prediction a viable means for accurate 3D modelling of proteins, with no information other than the sequence required. To extend the usefulness of contact prediction, we have designed a new meta-predictor (MetaPSICOV) which combines three distinct approaches for inferring covariation signals from multiple sequence alignments, considers a broad range of other sequence-derived features and, uniquely, a range of metrics which describe both the local and global quality of the input multiple sequence alignment. Finally, we use a two-stage predictor, where the second stage filters the output of the first stage. This two-stage predictor is additionally evaluated on its ability to accurately predict the long range network of hydrogen bonds, including correctly assigning the donor and acceptor residues. Using the original PSICOV benchmark set of 150 protein families, MetaPSICOV achieves a mean precision of 0.54 for top-L predicted long range contacts, around 60% higher than PSICOV and around 40% better than CCMpred. In de novo protein structure prediction using FRAGFOLD, MetaPSICOV is able to improve the TM-scores of models by a median of 0.05 compared with PSICOV. Lastly, for predicting long range hydrogen bonding, MetaPSICOV-HB achieves a precision of 0.69 for the top-L/10 hydrogen bonds compared with just 0.26 for the baseline MetaPSICOV. MetaPSICOV is freely available as a web server at http://bioinf.cs.ucl.ac.uk/MetaPSICOV. Raw data (predicted contact lists and 3D models) and source code can be downloaded from http://bioinf.cs.ucl.ac.uk/downloads/MetaPSICOV. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
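
    The top-L precision metric quoted above is straightforward to compute once predicted contact scores and a native contact map are available; the sketch below shows one common formulation (long-range pairs only, sequence separation of at least 24), using random arrays as stand-ins for real predictions.

      # Hedged sketch: top-L long-range contact precision (random stand-in data).
      import numpy as np

      def top_l_precision(scores, native, L, min_sep=24):
          """Precision of the L highest-scoring residue pairs with sequence separation >= min_sep."""
          n = scores.shape[0]
          pairs = [(i, j) for i in range(n) for j in range(i + min_sep, n)]
          pairs.sort(key=lambda p: scores[p], reverse=True)
          top = pairs[:L]
          return sum(bool(native[p]) for p in top) / len(top)

      rng = np.random.default_rng(0)
      n_res = 120                                 # hypothetical protein length
      scores = rng.random((n_res, n_res))         # predicted contact scores (stand-in)
      native = rng.random((n_res, n_res)) < 0.02  # hypothetical native contact map
      print(top_l_precision(scores, native, L=n_res))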

  16. Accurate prediction of protein-protein interactions by integrating potential evolutionary information embedded in PSSM profile and discriminative vector machine classifier.

    PubMed

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Li, Li-Ping; Huang, De-Shuang; Yan, Gui-Ying; Nie, Ru; Huang, Yu-An

    2017-04-04

    Identification of protein-protein interactions (PPIs) is of critical importance for deciphering the underlying mechanisms of almost all biological processes of the cell and for providing great insight into the study of human disease. Although much effort has been devoted to identifying PPIs from various organisms, existing high-throughput biological techniques are time-consuming, expensive, and have high false positive and false negative rates. Thus it is highly urgent to develop in silico methods to predict PPIs efficiently and accurately in this post-genomic era. In this article, we report a novel computational model combining our newly developed discriminative vector machine classifier (DVM) and an improved Weber local descriptor (IWLD) for the prediction of PPIs. Two components, differential excitation and orientation, are exploited to build evolutionary features for each protein sequence. The main characteristic of the proposed method lies in introducing an effective feature descriptor, IWLD, which can capture highly discriminative evolutionary information from position-specific scoring matrices (PSSM) of protein data, and in employing the powerful and robust DVM classifier. When applying the proposed method to Yeast and H. pylori data sets, we obtained excellent prediction accuracies as high as 96.52% and 91.80%, respectively, which are significantly better than those of previous methods. Extensive experiments were then performed for predicting cross-species PPIs and the predictive results were also promising. To further validate the performance of the proposed method, we compared it with the state-of-the-art support vector machine (SVM) classifier on a Human data set. The experimental results indicate that our method is highly effective for PPI prediction and can be taken as a supplementary tool for future proteomics research.

  17. Predictive validity of four bioelectrical impedance equations in determining percent fat mass in overweight and obese children.

    PubMed

    Cleary, Jane; Daniells, Suzie; Okely, Anthony D; Batterham, Marijka; Nicholls, Jessie

    2008-01-01

    Bioelectrical impedance equations are frequently used by food and nutrition professionals to estimate percent fat mass in overweight and obese children. However, it is not known whether they are accurate for such children, as they have been primarily developed for children of varying body weights. The aim of this cross-sectional study was to evaluate the predictive validity of four previously published prediction equations developed for the pediatric population, among a sample of overweight and obese children. Thirty overweight or obese children (mean age=7.57+/-1.28 years) underwent measurement of fat mass, percent fat mass, and fat-free mass using dual-energy x-ray absorptiometry (DEXA) and bioelectrical impedance analysis (BIA). Impedance values from the BIA were entered into the four prediction equations and Pearson correlations used to determine the significance of associations between each of the BIA prediction equations and DEXA for percent fat mass, fat mass, and fat-free mass. For percent fat mass, paired t tests were used to assess differences between the methods and the technique of Bland and Altman was used to determine bias and error. Results showed that the mean percent fat mass as determined by DEXA for this age group was 40.79%. In comparison with other BIA prediction equations, the Schaefer equation had the closest mean value of 41.98%, and was the only equation not to significantly differ from the DEXA (P=0.121). This study suggests that the Schaefer equation is the only accurate BIA prediction equation for assessing percent fat mass in this sample of overweight and obese children from primarily white backgrounds.
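
    The Bland-Altman comparison used above reduces to a mean difference (bias) and 95% limits of agreement; the sketch below computes both for a handful of hypothetical DEXA/BIA pairs.

      # Hedged sketch: Bland-Altman bias and limits of agreement (hypothetical paired values).
      import numpy as np

      def bland_altman(reference, test):
          """Return the bias (mean difference) and the 95% limits of agreement (bias +/- 1.96 SD)."""
          diff = np.asarray(test, dtype=float) - np.asarray(reference, dtype=float)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      dexa_pct_fat = [38.2, 41.5, 44.0, 39.8, 42.3]   # hypothetical DEXA percent fat mass
      bia_pct_fat = [36.9, 42.8, 41.7, 40.5, 43.0]    # hypothetical BIA-equation percent fat mass
      bias, (lo, hi) = bland_altman(dexa_pct_fat, bia_pct_fat)
      print(f"bias={bias:.2f}  limits of agreement=({lo:.2f}, {hi:.2f})")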

  18. [How exactly can we predict the prognosis of COPD].

    PubMed

    Atiş, Sibel; Kanik, Arzu; Ozgür, Eylem Sercan; Eker, Suzan; Tümkaya, Münir; Ozge, Cengiz

    2009-01-01

    Predictive models play a pivotal role in providing accurate and useful probabilistic assessments of clinical outcomes in chronic diseases. This study aimed to develop a dedicated prognostic index for quantifying progression risk in chronic obstructive pulmonary disease (COPD). Data were collected prospectively from 75 COPD patients over a three-year period. A predictive model of COPD progression risk was developed using Bayesian logistic regression analysis with the Markov chain Monte Carlo method. One-year cycles were used for disease progression in this model. Primary end points for progression were impairment in baseline dyspnea index (BDI) score, FEV1 decline, and exacerbation frequency over the last three years. The time-varying covariates age, smoking, body mass index (BMI), disease severity according to GOLD, PaO2, PaCO2, IC, RV/TLC and DLCO were used in the study. The mean age was 57.1 ± 8.1 years. BDI was strongly correlated with exacerbation frequency (p= 0.001) but not with FEV1 decline. BMI was found to be a predictive factor for impairment in BDI (p= 0.03). The following independent risk factors were significant in predicting exacerbation frequency: GOLD staging (OR for GOLD I vs. II and III = 2.3 and 4.0), hypoxemia (OR for mild vs. moderate and severe = 2.1 and 5.1) and hyperinflation (OR= 1.6). PaO2 (p= 0.026), IC (p= 0.02) and RV/TLC (p= 0.03) were found to be predictive factors for FEV1 decline. The model estimated BDI, lung function and exacerbation frequency at the final time point from the initial data of the three-year period with 95% reliability (p< 0.001). Accordingly, this model was judged to be 95% reliable for assessing the future status of COPD patients. Using Bayesian predictive models, it was possible to develop a risk-stratification index that accurately predicted progression of COPD. This model can support decision-making about the future course of COPD patients with high reliability, using clinical data available at baseline.

  19. Towards A Predictive First Principles Understanding Of Molecular Adsorption On Graphene

    DTIC Science & Technology

    2016-10-05

    used and developed state-of-the-art quantum mechanical methods to make accurate predictions about the interaction strength and adsorption structure... density functional theory, ab initio methods... important physical properties for a whole class of systems with weak non-covalent interactions, for example those involving the binding between water

  20. Early Diagnosis and Intervention Strategies for Post-Traumatic Heterotopic Ossification in Severely Injured Extremities

    DTIC Science & Technology

    2013-10-01

    study will recruit wounded warriors with severe extremity trauma, which places them at high risk for heterotopic ossification (HO); bone formation at...involved in HO; 2) to define accurate and practical methods to predict where HO will develop; and 3) to define potential therapies for prevention or...elicit HO. These tools also need to provide effective methods for early diagnosis or risk assessment (prediction) so that therapies for prevention or

  1. Free surface profiles in river flows: Can standard energy-based gradually-varied flow computations be pursued?

    NASA Astrophysics Data System (ADS)

    Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish

    2015-10-01

    Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, such as HEC-RAS, which adopt a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not especially accurate in light of the advancements made over the last three decades. This paper first presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, in which both the energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations make it possible to describe the flow profiles with more generality than the gradually-varied flow computations. As an outcome, the gradually-varied flow results lead to realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate, whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
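
    For readers unfamiliar with the standard energy-based computation questioned above, the sketch below integrates the classical gradually-varied flow equation dy/dx = (S0 - Sf)/(1 - Fr^2) for a simple rectangular channel with Manning friction. It is only a textbook illustration with hypothetical channel parameters, not the compound-channel or higher-order formulations developed in the paper.

      # Hedged sketch: classical GVF profile for a rectangular channel (hypothetical parameters).
      g = 9.81
      Q, b, n_man, S0 = 20.0, 5.0, 0.03, 0.001   # discharge (m3/s), width (m), Manning n, bed slope

      def dydx(y):
          A = b * y                                       # flow area
          R = A / (b + 2 * y)                             # hydraulic radius
          Sf = (n_man * Q / (A * R ** (2.0 / 3.0))) ** 2  # friction slope (Manning)
          Fr2 = Q ** 2 * b / (g * A ** 3)                 # Froude number squared
          return (S0 - Sf) / (1 - Fr2)

      # March the water surface upstream from a known control depth (explicit Euler).
      y, dx = 2.5, -10.0
      for _ in range(200):
          y += dydx(y) * dx
      print(f"depth about 2 km upstream: {y:.2f} m")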

  2. A simple equation to estimate body fat percentage in children with overweightness or obesity: a retrospective study.

    PubMed

    Cortés-Castell, Ernesto; Juste, Mercedes; Palazón-Bru, Antonio; Monge, Laura; Sánchez-Ferrer, Francisco; Rizo-Baeza, María Mercedes

    2017-01-01

    Dual-energy X-ray absorptiometry (DXA) provides separate measurements of fat mass, fat-free mass and bone mass, and is a quick, accurate, and safe technique, yet one that is not readily available in routine clinical practice. Consequently, we aimed to develop statistical formulas to predict fat mass (%) and fat mass index (FMI) with simple parameters (age, sex, weight and height). We conducted a retrospective observational cross-sectional study in 416 overweight or obese patients aged 4-18 years that involved assessing adiposity by DXA (fat mass percentage and FMI), body mass index (BMI), sex and age. We randomly divided the sample into two parts (construction and validation). In the construction sample, we developed formulas to predict fat mass and FMI using linear multiple regression models. The formulas were validated in the other sample, calculating the intraclass correlation coefficient via bootstrapping. The fat mass percentage formula had a coefficient of determination of 0.65. This value was 0.86 for FMI. In the validation, the constructed formulas had an intraclass correlation coefficient of 0.77 for fat mass percentage and 0.92 for FMI. Our predictive formulas accurately predicted fat mass and FMI with simple parameters (BMI, sex and age) in children with overweight and obesity. The proposed methodology could be applied in other fields. Further studies are needed to externally validate these formulas.

  3. Fraction of organic carbon predicts labile desorption rates of chlorinated organic pollutants in laboratory-spiked geosorbents.

    PubMed

    Ginsbach, Jake W; Killops, Kato L; Olsen, Robert M; Peterson, Brittney; Dunnivant, Frank M

    2010-05-01

    The resuspension of large volumes of sediments that are contaminated with chlorinated pollutants continues to threaten environmental quality and human health. Whereas kinetic models are more accurate for estimating the environmental impact of these events, their widespread use is substantially hampered by the need for costly, time-consuming, site-specific kinetics experiments. The present study investigated the development of a predictive model for desorption rates from easily measurable sorbent and pollutant properties by examining the relationship between the fraction of organic carbon (fOC) and labile release rates. Duplicate desorption measurements were performed on 46 unique combinations of pollutants and sorbents with fOC values ranging from 0.001 to 0.150. Labile desorption rate constants indicate that release rates predominantly depend upon the fOC in the geosorbent. Previous theoretical models, such as the macro-mesopore and organic matter (MOM) diffusion model, have predicted such a relationship but could not accurately predict the experimental rate constants collected in the present study. An empirical model was successfully developed to correlate the labile desorption rate constant (krap) to the fraction of organic material, where log(krap) = 0.291 - 0.785 · log(fOC). These results provide the first experimental evidence that kinetic pollution releases during resuspension events are governed by the fOC content in natural geosorbents. Copyright (c) 2010 SETAC.
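
    The reported relationship is easy to apply once the organic carbon fraction of a sediment is known; the short worked example below evaluates it for a few hypothetical fOC values inside the 0.001-0.150 range studied (log here is the base-10 logarithm, matching the abstract).

      # Worked example of the empirical model log(krap) = 0.291 - 0.785 x log(fOC).
      import math

      def labile_desorption_rate(f_oc):
          """Predicted labile desorption rate constant krap from the fraction of organic carbon."""
          return 10 ** (0.291 - 0.785 * math.log10(f_oc))

      for f_oc in (0.001, 0.01, 0.15):   # hypothetical sediment fOC values
          print(f"fOC={f_oc:.3f}  krap={labile_desorption_rate(f_oc):.1f}")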

  4. Genome-Scale Screening of Drug-Target Associations Relevant to Ki Using a Chemogenomics Approach

    PubMed Central

    Cao, Dong-Sheng; Liang, Yi-Zeng; Deng, Zhe; Hu, Qian-Nan; He, Min; Xu, Qing-Song; Zhou, Guang-Hua; Zhang, Liu-Xia; Deng, Zi-xin; Liu, Shao

    2013-01-01

    The identification of interactions between drugs and target proteins plays a key role in genomic drug discovery. In the present study, the quantitative binding affinities of drug-target pairs are differentiated as a measurement to define whether a drug interacts with a protein or not, and then a chemogenomics framework using an unbiased set of general integrated features and random forest (RF) is employed to construct a predictive model which can accurately classify drug-target pairs. The predictability of the model is further investigated and validated by several independent validation sets. The built model is used to predict drug-target associations, some of which were confirmed by comparing experimental data from public biological resources. A drug-target interaction network with high confidence drug-target pairs was also reconstructed. This network provides further insight for the action of drugs and targets. Finally, a web-based server called PreDPI-Ki was developed to predict drug-target interactions for drug discovery. In addition to providing a high-confidence list of drug-target associations for subsequent experimental investigation guidance, these results also contribute to the understanding of drug-target interactions. We can also see that quantitative information of drug-target associations could greatly promote the development of more accurate models. The PreDPI-Ki server is freely available via: http://sdd.whu.edu.cn/dpiki. PMID:23577055
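
    The chemogenomics framework above amounts to describing each drug-target pair with a joint feature vector and classifying it with a random forest; the sketch below shows that pattern on synthetic data. The feature dimensions and labels are arbitrary stand-ins, not the PreDPI-Ki descriptors.

      # Hedged sketch: random-forest classification of drug-target pairs (synthetic features).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_pairs, n_drug_feats, n_target_feats = 500, 32, 64
      X = np.hstack([rng.random((n_pairs, n_drug_feats)),     # drug descriptors (stand-in)
                     rng.random((n_pairs, n_target_feats))])  # protein descriptors (stand-in)
      y = (rng.random(n_pairs) < 0.3).astype(int)             # 1 = interacting pair (by a Ki cutoff)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())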

  5. Development and evaluation of height diameter at breast models for native Chinese Metasequoia.

    PubMed

    Liu, Mu; Feng, Zhongke; Zhang, Zhixiang; Ma, Chenghui; Wang, Mingming; Lian, Bo-Ling; Sun, Renjie; Zhang, Li

    2017-01-01

    Accurate tree height and diameter at breast height (dbh) are important input variables for growth and yield models. A total of 5503 Chinese Metasequoia trees were used in this study. We studied 53 fitted models, of which 7 were linear models and 46 were non-linear models. These models were divided into two groups, single-variable models and multivariate models, according to the number of independent variables. The results show that an allometric equation for tree height with diameter at breast height as the independent variable can better reflect the variation in tree height; in addition, the prediction accuracy of the multivariate composite models is higher than that of the single-variable models. Although tree age is not the most important variable in the study of the relationship between tree height and dbh, considering tree age when choosing models and parameters during model selection can make the prediction of tree height more accurate. The amount of data is also an important factor that can improve the reliability of models. Other variables such as tree height, main dbh and altitude can also affect the models. In this study, the method of developing the recommended models for predicting the tree height of native Metasequoias aged 50-485 years is statistically reliable and can be used as a reference in predicting the growth and production of mature native Metasequoia.

  6. Development and evaluation of height diameter at breast models for native Chinese Metasequoia

    PubMed Central

    Feng, Zhongke; Zhang, Zhixiang; Ma, Chenghui; Wang, Mingming; Lian, Bo-ling; Sun, Renjie; Zhang, Li

    2017-01-01

    Accurate tree height and diameter at breast height (dbh) are important input variables for growth and yield models. A total of 5503 Chinese Metasequoia trees were used in this study. We studied 53 fitted models, of which 7 were linear models and 46 were non-linear models. These models were divided into two groups, single-variable models and multivariate models, according to the number of independent variables. The results show that an allometric equation for tree height with diameter at breast height as the independent variable can better reflect the variation in tree height; in addition, the prediction accuracy of the multivariate composite models is higher than that of the single-variable models. Although tree age is not the most important variable in the study of the relationship between tree height and dbh, considering tree age when choosing models and parameters during model selection can make the prediction of tree height more accurate. The amount of data is also an important factor that can improve the reliability of models. Other variables such as tree height, main dbh and altitude can also affect the models. In this study, the method of developing the recommended models for predicting the tree height of native Metasequoias aged 50–485 years is statistically reliable and can be used as a reference in predicting the growth and production of mature native Metasequoia. PMID:28817600
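
    As a minimal illustration of the single-variable case discussed in the two records above, the sketch below fits an allometric height-dbh curve of the general form H = a * dbh^b; the data points are hypothetical, and the paper's recommended multivariate models add further predictors such as age and altitude.

      # Hedged sketch: single-variable allometric height-dbh fit (hypothetical measurements).
      import numpy as np
      from scipy.optimize import curve_fit

      def allometric_height(dbh_cm, a, b):
          return a * dbh_cm ** b

      dbh = np.array([12, 18, 25, 34, 45, 60.0])     # diameter at breast height, cm
      height = np.array([9, 13, 17, 21, 26, 31.0])   # tree height, m

      (a, b), _ = curve_fit(allometric_height, dbh, height, p0=[1.0, 0.8])
      print(f"H = {a:.2f} * dbh^{b:.2f}")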

  7. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    PubMed

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry and provides numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  8. Global Aerodynamic Modeling for Stall/Upset Recovery Training Using Efficient Piloted Flight Test Techniques

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Cunningham, Kevin; Hill, Melissa A.

    2013-01-01

    Flight test and modeling techniques were developed for efficiently identifying global aerodynamic models that can be used to accurately simulate stall, upset, and recovery on large transport airplanes. The techniques were developed and validated in a high-fidelity fixed-base flight simulator using a wind-tunnel aerodynamic database, realistic sensor characteristics, and a realistic flight deck representative of a large transport aircraft. Results demonstrated that aerodynamic models for stall, upset, and recovery can be identified rapidly and accurately using relatively simple piloted flight test maneuvers. Stall maneuver predictions and comparisons of identified aerodynamic models with data from the underlying simulation aerodynamic database were used to validate the techniques.

  9. Validation of a remote sensing model to identify Simulium damnosum s.l. breeding sites in Sub-Saharan Africa.

    PubMed

    Jacob, Benjamin G; Novak, Robert J; Toe, Laurent D; Sanfo, Moussa; Griffith, Daniel A; Lakwo, Thomson L; Habomugisha, Peace; Katabarwa, Moses N; Unnasch, Thomas R

    2013-01-01

    Recently, most onchocerciasis control programs have begun to focus on elimination. Developing an effective elimination strategy relies upon accurately mapping the extent of endemic foci. In areas of Africa that suffer from a lack of infrastructure and/or political instability, developing such accurate maps has been difficult. Onchocerciasis foci are localized near breeding sites for the black fly vectors of the infection. The goal of this study was to conduct ground validation studies to evaluate the sensitivity and specificity of a remote sensing model developed to predict S. damnosum s.l. breeding sites. Remote sensing images from Togo were analyzed to identify areas containing signature characteristics of S. damnosum s.l. breeding habitat. All 30 sites with the spectral signature were found to contain S. damnosum larvae, while 0/52 other sites judged as likely to contain larvae were found to contain larvae. The model was then used to predict breeding sites in Northern Uganda. This area is hyper-endemic for onchocerciasis, but political instability had precluded mass distribution of ivermectin until 2009. Ground validation revealed that 23/25 sites with the signature contained S. damnosum larvae, while 8/10 sites examined lacking the signature were larvae free. Sites predicted to have larvae contained significantly more larvae than those that lacked the signature. This study suggests that a signature extracted from remote sensing images may be used to predict the location of S. damnosum s.l. breeding sites with a high degree of accuracy. This method should be of assistance in predicting communities at risk for onchocerciasis in areas of Africa where ground-based epidemiological surveys are difficult to implement.

  10. Validation of a Remote Sensing Model to Identify Simulium damnosum s.l. Breeding Sites in Sub-Saharan Africa

    PubMed Central

    Jacob, Benjamin G.; Novak, Robert J.; Toe, Laurent D.; Sanfo, Moussa; Griffith, Daniel A.; Lakwo, Thomson L.; Habomugisha, Peace; Katabarwa, Moses N.; Unnasch, Thomas R.

    2013-01-01

    Background Recently, most onchocerciasis control programs have begun to focus on elimination. Developing an effective elimination strategy relies upon accurately mapping the extent of endemic foci. In areas of Africa that suffer from a lack of infrastructure and/or political instability, developing such accurate maps has been difficult. Onchocerciasis foci are localized near breeding sites for the black fly vectors of the infection. The goal of this study was to conduct ground validation studies to evaluate the sensitivity and specificity of a remote sensing model developed to predict S. damnosum s.l. breeding sites. Methodology/Principal Findings Remote sensing images from Togo were analyzed to identify areas containing signature characteristics of S. damnosum s.l. breeding habitat. All 30 sites with the spectral signature were found to contain S. damnosum larvae, while 0/52 other sites judged as likely to contain larvae were found to contain larvae. The model was then used to predict breeding sites in Northern Uganda. This area is hyper-endemic for onchocerciasis, but political instability had precluded mass distribution of ivermectin until 2009. Ground validation revealed that 23/25 sites with the signature contained S. damnosum larvae, while 8/10 sites examined lacking the signature were larvae free. Sites predicted to have larvae contained significantly more larvae than those that lacked the signature. Conclusions/Significance This study suggests that a signature extracted from remote sensing images may be used to predict the location of S. damnosum s.l. breeding sites with a high degree of accuracy. This method should be of assistance in predicting communities at risk for onchocerciasis in areas of Africa where ground-based epidemiological surveys are difficult to implement. PMID:23936571

  11. Analysing the accuracy of machine learning techniques to develop an integrated influent time series model: case study of a sewage treatment plant, Malaysia.

    PubMed

    Ansari, Mozafar; Othman, Faridah; Abunama, Taher; El-Shafie, Ahmed

    2018-04-01

    The function of a sewage treatment plant is to treat the sewage to acceptable standards before being discharged into the receiving waters. To design and operate such plants, it is necessary to measure and predict the influent flow rate. In this research, the influent flow rate of a sewage treatment plant (STP) was modelled and predicted by autoregressive integrated moving average (ARIMA), nonlinear autoregressive network (NAR) and support vector machine (SVM) regression time series algorithms. To evaluate the models' accuracy, the root mean square error (RMSE) and coefficient of determination (R2) were calculated as initial assessment measures, while relative error (RE), peak flow criterion (PFC) and low flow criterion (LFC) were calculated as final evaluation measures to demonstrate the detailed accuracy of the selected models. An integrated model was developed based on the individual models' prediction ability for low, average and peak flow. An initial assessment of the results showed that the ARIMA model was the least accurate and the NAR model was the most accurate. The RE results also prove that the SVM model's frequency of errors above 10% or below -10% was greater than the NAR model's. The influent was also forecasted up to 44 weeks ahead by both models. The graphical results indicate that the NAR model made better predictions than the SVM model. The final evaluation of NAR and SVM demonstrated that SVM made better predictions at peak flow and NAR fit well for low and average inflow ranges. The integrated model developed includes the NAR model for low and average influent and the SVM model for peak inflow.
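
    As a small illustration of the support-vector-regression component of the comparison above, the sketch below predicts weekly influent flow from its own lagged values and reports RMSE and R2 on a hold-out period. The flow series is synthetic, and the full study additionally fits ARIMA and NAR models before combining them into the integrated model.

      # Hedged sketch: SVM regression on lagged influent-flow values (synthetic series).
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.metrics import mean_squared_error, r2_score

      rng = np.random.default_rng(2)
      weeks = np.arange(200)
      flow = 1000 + 150 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 40, weeks.size)

      lags = 4
      X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
      y = flow[lags:]
      X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

      pred = SVR(C=100.0, epsilon=5.0).fit(X_train, y_train).predict(X_test)
      rmse = np.sqrt(mean_squared_error(y_test, pred))
      print(f"RMSE={rmse:.1f}  R2={r2_score(y_test, pred):.2f}")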

  12. An Experimental and Computational Investigation of Oscillating Airfoil Unsteady Aerodynamics at Large Mean Incidence

    NASA Technical Reports Server (NTRS)

    Capece, Vincent R.; Platzer, Max F.

    2003-01-01

    A major challenge in the design and development of turbomachine airfoils for gas turbine engines is high cycle fatigue failures due to flutter and aerodynamically induced forced vibrations. In order to predict the aeroelastic response of gas turbine airfoils early in the design phase, accurate unsteady aerodynamic models are required. However, accurate predictions of flutter and forced vibration stress at all operating conditions have remained elusive. The overall objectives of this research program are to develop a transition model suitable for unsteady separated flow and quantify the effects of transition on airfoil steady and unsteady aerodynamics for attached and separated flow using this model. Furthermore, the capability of current state-of-the-art unsteady aerodynamic models to predict the oscillating airfoil response of compressor airfoils over a range of realistic reduced frequencies, Mach numbers, and loading levels will be evaluated through correlation with benchmark data. This comprehensive evaluation will assess the assumptions used in unsteady aerodynamic models. The results of this evaluation can be used to direct improvement of current models and the development of future models. The transition modeling effort will also make strides in improving predictions of steady flow performance of fan and compressor blades at off-design conditions. This report summarizes the progress and results obtained in the first year of this program. These include: installation and verification of the operation of the parallel version of TURBO; the grid generation and initiation of steady flow simulations of the NASA/Pratt&Whitney airfoil at a Mach number of 0.5 and chordal incidence angles of 0 and 10 deg.; and the investigation of the prediction of laminar separation bubbles on a NACA 0012 airfoil.

  13. The use of machine learning for the identification of peripheral artery disease and future mortality risk.

    PubMed

    Ross, Elsie Gyang; Shah, Nigam H; Dalman, Ronald L; Nead, Kevin T; Cooke, John P; Leeper, Nicholas J

    2016-11-01

    A key aspect of the precision medicine effort is the development of informatics tools that can analyze and interpret "big data" sets in an automated and adaptive fashion while providing accurate and actionable clinical information. The aims of this study were to develop machine learning algorithms for the identification of disease and the prognostication of mortality risk and to determine whether such models perform better than classical statistical analyses. Focusing on peripheral artery disease (PAD), patient data were derived from a prospective, observational study of 1755 patients who presented for elective coronary angiography. We employed multiple supervised machine learning algorithms and used diverse clinical, demographic, imaging, and genomic information in a hypothesis-free manner to build models that could identify patients with PAD and predict future mortality. Comparison was made to standard stepwise linear regression models. Our machine-learned models outperformed stepwise logistic regression models both for the identification of patients with PAD (area under the curve, 0.87 vs 0.76, respectively; P = .03) and for the prediction of future mortality (area under the curve, 0.76 vs 0.65, respectively; P = .10). Both machine-learned models were markedly better calibrated than the stepwise logistic regression models, thus providing more accurate disease and mortality risk estimates. Machine learning approaches can produce more accurate disease classification and prediction models. These tools may prove clinically useful for the automated identification of patients with highly morbid diseases for which aggressive risk factor management can improve outcomes. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
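
    The headline comparison above (a machine-learned model versus stepwise logistic regression, judged by area under the ROC curve) can be reproduced in outline on synthetic data, as in the sketch below; the generated dataset is only a stand-in for the clinical, imaging and genomic features used in the study, and gradient boosting stands in for the study's supervised learners.

      # Hedged sketch: AUC comparison of logistic regression vs. a tree ensemble (synthetic data).
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=1500, n_features=40, n_informative=8, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                        ("gradient boosting", GradientBoostingClassifier())]:
          clf.fit(X_tr, y_tr)
          print(name, "AUC =", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 2))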

  14. Accuracy of episodic autobiographical memory in children with early thyroid hormone deficiency using a staged event.

    PubMed

    Willoughby, Karen A; McAndrews, Mary Pat; Rovet, Joanne F

    2014-07-01

    Autobiographical memory (AM) is a highly constructive cognitive process that often contains memory errors. No study has specifically examined AM accuracy in children with abnormal development of the hippocampus, a crucial brain region for AM retrieval. Thus, the present study investigated AM accuracy in 68 typically and atypically developing children using a staged autobiographical event, the Children's Autobiographical Interview, and structural magnetic resonance imaging. The atypically developing group consisted of 17 children (HYPO) exposed during gestation to insufficient maternal thyroid hormone (TH), a critical substrate for hippocampal development, and 25 children with congenital hypothyroidism (CH), who were compared to 26 controls. Groups differed significantly in the number of accurate episodic details recalled and proportion accuracy scores, with controls having more accurate recollections of the staged event than both TH-deficient groups. Total hippocampal volumes and anterior hippocampal volumes were positively correlated with proportion accuracy scores, but not total accurate episodic details, in HYPO and CH. In addition, greater severity of TH deficiency predicted lower proportion accuracy scores in both HYPO and CH. Overall, these results indicate that children with early TH deficiency have deficits in AM accuracy and that the anterior hippocampus may play a particularly important role in accurate AM retrieval. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Comparison of prediction models for use of medical resources at urban auto-racing events.

    PubMed

    Nable, Jose V; Margolis, Asa M; Lawner, Benjamin J; Hirshon, Jon Mark; Perricone, Alexander J; Galvagno, Samuel M; Lee, Debra; Millin, Michael G; Bissell, Richard A; Alcorta, Richard L

    2014-12-01

    INTRODUCTION Predicting the number of patient encounters and transports during mass gatherings can be challenging. The nature of these events necessitates that proper resources are available to meet the needs that arise. Several prediction models to assist event planners in forecasting medical utilization have been proposed in the literature. The objective of this study was to determine the accuracy of the Arbon and Hartman models in predicting the number of patient encounters and transportations from the Baltimore Grand Prix (BGP), held in 2011 and 2012. It was hypothesized that the Arbon method, which utilizes regression model-derived equations to estimate, would be more accurate than the Hartman model, which categorizes events into only three discreet severity types. This retrospective analysis of the BGP utilized data collected from an electronic patient tracker system. The actual number of patients evaluated and transported at the BGP was tabulated and compared to the numbers predicted by the two studied models. Several environmental features including weather, crowd attendance, and presence of alcohol were used in the Arbon and Hartman models. Approximately 130,000 spectators attended the first event, and approximately 131,000 attended the second. The number of patient encounters per day ranged from 19 to 57 in 2011, and the number of transports from the scene ranged from two to nine. In 2012, the number of patients ranged from 19 to 44 per day, and the number of transports to emergency departments ranged from four to nine. With the exception of one day in 2011, the Arbon model over predicted the number of encounters. For both events, the Hartman model over predicted the number of patient encounters. In regard to hospital transports, the Arbon model under predicted the actual numbers whereas the Hartman model both over predicted and under predicted the number of transports from both events, varying by day. These findings call attention to the need for the development of a versatile and accurate model that can more accurately predict the number of patient encounters and transports associated with mass-gathering events so that medical needs can be anticipated and sufficient resources can be provided.

  16. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge the metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns. These patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict the electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies. It may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely actionable decisions such as rework, scrap, feedforward or feedback of predicted information, or information derived from prediction, to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.

  17. Failure of Colorectal Surgical Site Infection Predictive Models Applied to an Independent Dataset: Do They Add Value or Just Confusion?

    PubMed

    Bergquist, John R; Thiels, Cornelius A; Etzioni, David A; Habermann, Elizabeth B; Cima, Robert R

    2016-04-01

    Colorectal surgical site infections (C-SSIs) are a major source of postoperative morbidity. Institutional C-SSI rates are modeled and scrutinized, and there is increasing movement in the direction of public reporting. External validation of C-SSI risk prediction models is lacking. Factors governing C-SSI occurrence are complicated and multifactorial. We hypothesized that existing C-SSI prediction models have limited ability to accurately predict C-SSI in independent data. Colorectal resections identified from our institutional ACS-NSQIP dataset (2006 to 2014) were reviewed. The primary outcome was any C-SSI according to the ACS-NSQIP definition. Emergency cases were excluded. Published C-SSI risk scores: the National Nosocomial Infection Surveillance (NNIS), Contamination, Obesity, Laparotomy, and American Society of Anesthesiologists (ASA) class (COLA), Preventie Ziekenhuisinfecties door Surveillance (PREZIES), and NSQIP-based models were compared with receiver operating characteristic (ROC) analysis to evaluate discriminatory quality. There were 2,376 cases included, with an overall C-SSI rate of 9% (213 cases). None of the models produced reliable and high quality C-SSI predictions. For any C-SSI, the NNIS c-index was 0.57 vs 0.61 for COLA, 0.58 for PREZIES, and 0.62 for NSQIP: all well below the minimum "reasonably" predictive c-index of 0.7. Predictions for superficial, deep, and organ space SSI were similarly poor. Published C-SSI risk prediction models do not accurately predict C-SSI in our independent institutional dataset. Application of externally developed prediction models to any individual practice must be validated or modified to account for institution and case-mix specific factors. This questions the validity of using externally or nationally developed models for "expected" outcomes and interhospital comparisons. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Trip-oriented travel time prediction (TOTTP) with historical vehicle trajectories

    NASA Astrophysics Data System (ADS)

    Xu, Tao; Li, Xiang; Claramunt, Christophe

    2018-06-01

    Accurate travel time prediction is undoubtedly of importance to both traffic managers and travelers. In highly urbanized areas, trip-oriented travel time prediction (TOTTP) is more valuable to travelers than to traffic managers, as the former usually expect to know the travel time of a trip which may cross over multiple road sections. There are two obstacles to the development of TOTTP: traffic complexity and traffic data coverage. With large-scale historical vehicle trajectory data and meteorology data, this research develops a BPNN-based approach that integrates multiple factors affecting trip travel time into a BPNN model to predict trip-oriented travel time for OD pairs in an urban network. Results of experiments demonstrate that it helps discover the dominant trends of travel time changes daily and weekly, and that the impact of weather conditions is non-trivial.

  19. Evaluation of Turbulence-Model Performance as Applied to Jet-Noise Prediction

    NASA Technical Reports Server (NTRS)

    Woodruff, S. L.; Seiner, J. M.; Hussaini, M. Y.; Erlebacher, G.

    1998-01-01

    The accurate prediction of jet noise is possible only if the jet flow field can be predicted accurately. Predictions for the mean velocity and turbulence quantities in the jet flowfield are typically the product of a Reynolds-averaged Navier-Stokes solver coupled with a turbulence model. To evaluate the effectiveness of solvers and turbulence models in predicting those quantities most important to jet noise prediction, two CFD codes and several turbulence models were applied to a jet configuration over a range of jet temperatures for which experimental data is available.

  20. Validity of the BodyGem calorimeter and prediction equations for the assessment of resting energy expenditure in overweight and obese Saudi males.

    PubMed

    Almajwal, Ali M; Williams, Peter G; Batterham, Marijka J

    2011-07-01

    To assess the accuracy of resting energy expenditure (REE) measurement in a sample of overweight and obese Saudi males, using the BodyGem device (BG) with whole room calorimetry (WRC) as a reference, and to evaluate the accuracy of predictive equations. Thirty-eight subjects (mean +/- SD, age 26.8+/- 3.7 years, body mass index 31.0+/- 4.8) were recruited during the period from 5 February 2007 to 28 March 2008. Resting energy expenditure was measured using a WRC and BG device, and also calculated using 7 prediction equations. Mean differences, bias, percent of bias (%bias), accurate estimation, underestimation and overestimation were calculated. Repeated measures with the BG were not significantly different (accurate prediction: 81.6%; %bias 1.1+/- 6.3, p>0.24) with limits of agreement ranging from +242 to -200 kcal. Resting energy expenditure measured by BG was significantly less than WRC values (accurate prediction: 47.4%; %bias: 11.0+/- 14.6, p = 0.0001) with unacceptably wide limits of agreement. Harris-Benedict, Schofield and World Health Organization equations were the most accurate, estimating REE within 10% of measured REE, but none seem appropriate to predict the REE of individuals. There was a poor agreement between the REE measured by WRC compared to BG or predictive equations. The BG assessed REE accurately in 47.4% of the subjects on an individual level.

  1. Vesicular stomatitis forecasting based on Google Trends

    PubMed Central

    Lu, Yi; Zhou, GuangYa; Chen, Qin

    2018-01-01

    Background Vesicular stomatitis (VS) is an important viral disease of livestock. The main feature of VS is irregular blisters that occur on the lips, tongue, oral mucosa, hoof crown and nipple. Humans can also be infected with vesicular stomatitis and develop meningitis. This study analyses the 2014 American VS outbreaks in order to accurately predict vesicular stomatitis outbreak trends. Methods American VS outbreak data were collected from the OIE. The data for VS keywords were obtained by inputting 24 disease-related keywords into Google Trends. After calculating the Pearson and Spearman correlation coefficients, it was found that there was a relationship between outbreaks and keywords derived from Google Trends. Finally, the prediction model was constructed based on qualitative classification and quantitative regression. Results For the regression model, the Pearson correlation coefficients between the predicted outbreaks and actual outbreaks are 0.953 and 0.948, respectively. For the qualitative classification model, we constructed five classification predictive models and chose the best one as the result. The SN (sensitivity), SP (specificity) and ACC (prediction accuracy) values of the best classification predictive model are 78.52%, 72.5% and 77.14%, respectively. Conclusion This study applied Google search data to construct a qualitative classification model and a quantitative regression model. The results show that the method is effective and that these two models produce more accurate forecasts. PMID:29385198
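
    The correlation step described above is simple to reproduce once weekly outbreak counts and a Google Trends keyword series are aligned; the sketch below computes Pearson and Spearman coefficients for two short hypothetical series.

      # Hedged sketch: correlating outbreak counts with a search-volume series (hypothetical data).
      from scipy.stats import pearsonr, spearmanr

      outbreaks = [0, 2, 5, 9, 14, 11, 7, 3, 1, 0]            # weekly VS outbreak counts
      search_volume = [4, 9, 15, 22, 30, 27, 18, 10, 6, 3]    # Google Trends keyword volume

      r_pearson, _ = pearsonr(outbreaks, search_volume)
      r_spearman, _ = spearmanr(outbreaks, search_volume)
      print(f"Pearson r={r_pearson:.3f}  Spearman rho={r_spearman:.3f}")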

  2. Prediction of gradient retention data for hydrophilic interaction liquid chromatographic separation of native and fluorescently labeled oligosaccharides.

    PubMed

    Vaňková, Nikola; Česla, Petr

    2017-02-17

    In this work, we have investigated the predictive properties of a mixed-mode retention model and an oligomeric mixed-mode model, taking into account the contribution of monomeric units to the retention, in hydrophilic interaction liquid chromatography. The gradient retention times of native maltooligosaccharides and their fluorescent derivatives were predicted in the oligomeric series with the number of monomeric glucose units ranging from two to seven. The maltooligosaccharides were separated on a packed column with a carbamoyl-bonded silica stationary phase, and 15 gradient profiles with different initial and final mobile phase compositions were used with gradient times of 5, 7.5 and 10 min. The predicted gradient retention times were compared for calculations based on isocratic retention data and gradient retention data, the latter providing better accuracy of the results. Comparing two different mobile phase additives, the more accurate retention times were predicted in mobile phases containing ammonium acetate. The acidic derivatives, prepared by reaction of an oligosaccharide with 2-aminobenzoic acid or 8-aminonaphthalene-1,3,6-trisulfonic acid, provided more accurate predictions of the retention data in comparison to native oligosaccharides or their neutral derivatives. The oligomeric mixed-mode model allowed prediction of gradient retention times using only one gradient profile, which significantly speeded up method development. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Toward Fully in Silico Melting Point Prediction Using Molecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y; Maginn, EJ

    2013-03-01

    Melting point is one of the most fundamental and practically important properties of a compound. Molecular simulation methods have been developed for the accurate computation of melting points. However, all of these methods need an experimental crystal structure as input, which means that such calculations are not really predictive, since the melting point can be measured easily in experiments once a crystal structure is known. On the other hand, crystal structure prediction (CSP) has become an active field and significant progress has been made, although challenges still exist. One of the main challenges is the existence of many crystal structures (polymorphs) that are very close in energy. Thermal effects and kinetic factors make the situation even more complicated, such that it is still not trivial to predict experimental crystal structures. In this work, we exploit the fact that free energy differences are often small between crystal structures. We show that accurate melting point predictions can be made by using a reasonable crystal structure from CSP as a starting point for a free energy-based melting point calculation. The key is that most crystal structures predicted by CSP have free energies that are close to that of the experimental structure. The proposed method was tested on two rigid molecules and the results suggest that a fully in silico melting point prediction method is possible.

  4. The Use of the STAGS Finite Element Code in Stitched Structures Development

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C.; Lovejoy, Andrew E.

    2014-01-01

    In the last 30 years NASA has worked in collaboration with industry to develop enabling technologies needed to make aircraft more fuel-efficient and more affordable. The focus on the airframe has been to reduce weight, improve damage tolerance and better understand structural behavior under realistic flight and ground loading conditions. Stitched structure is a technology that can address the weight savings, cost reduction, and damage tolerance goals, but only if it is supported by accurate analytical techniques. Development of stitched technology began in the 1990s as a partnership between NASA and Boeing (McDonnell Douglas at the time) under the Advanced Composites Technology Program and has continued under various titles and programs into the Environmentally Responsible Aviation Project today. These programs contained development efforts involving manufacturing development, design, detailed analysis, and testing. Each phase of development, from coupons to large aircraft components, was supported by detailed analysis to prove that the behavior of these structures was well understood and predictable. The Structural Analysis of General Shells (STAGS) computer code was a critical tool used in the development of many stitched structures. As a key developer of STAGS, Charles Rankin made quite significant contributions to these programs. Key features of STAGS used in these analyses and discussed in this paper include its accurate nonlinear and post-buckling capabilities, its ability to predict damage growth, and the use of Lagrange constraints and follower forces.

  5. ASTRAL, DRAGON and SEDAN scores predict stroke outcome more accurately than physicians.

    PubMed

    Ntaios, G; Gioulekas, F; Papavasileiou, V; Strbian, D; Michel, P

    2016-11-01

    ASTRAL, SEDAN and DRAGON scores are three well-validated scores for stroke outcome prediction. Whether these scores predict stroke outcome more accurately than physicians interested in stroke was investigated. Physicians interested in stroke were invited to an online anonymous survey to provide outcome estimates in randomly allocated structured scenarios of recent real-life stroke patients. Their estimates were compared to the scores' predictions in the same scenarios. An estimate was considered accurate if it was within the 95% confidence intervals of the actual outcome. In all, 244 participants from 32 different countries responded, assessing 720 real scenarios and 2636 outcomes. The majority of physicians' estimates were inaccurate (1422/2636, 53.9%). 400 (56.8%) of physicians' estimates about the percentage probability of 3-month modified Rankin score (mRS) > 2 were accurate compared with 609 (86.5%) of ASTRAL score estimates (P < 0.0001). 394 (61.2%) of physicians' estimates about the percentage probability of post-thrombolysis symptomatic intracranial haemorrhage were accurate compared with 583 (90.5%) of SEDAN score estimates (P < 0.0001). 160 (24.8%) of physicians' estimates about post-thrombolysis 3-month percentage probability of mRS 0-2 were accurate compared with 240 (37.3%) of DRAGON score estimates (P < 0.0001). 260 (40.4%) of physicians' estimates about the percentage probability of post-thrombolysis mRS 5-6 were accurate compared with 518 (80.4%) of DRAGON score estimates (P < 0.0001). ASTRAL, DRAGON and SEDAN scores predict the outcome of acute ischaemic stroke patients with higher accuracy than physicians interested in stroke. © 2016 EAN.

  6. Monte Carlo method for photon heating using temperature-dependent optical properties.

    PubMed

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will vary greatly, such as in the case of laser thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system with temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogeneous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations using constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties can vary with temperature. The difference in results between variable-property and constant-property methods for the representative system of laser-heated silicon can become larger than 100 K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy in simulated results leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
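
    The feedback-loop idea described above can be illustrated with a deliberately simplified 1-D sketch: each photon step samples a path length from the local, temperature-dependent attenuation coefficient and deposits absorbed weight into a heating map that a separate heat-transfer step would then use to update the temperature field. The coefficients, geometry and sampling below are assumptions for illustration, not the study's implementation.

    # Minimal 1-D sketch of Monte Carlo photon heating with temperature-dependent
    # optical properties (illustrative only; the study's coefficients, geometry
    # and heat-transfer solver are not reproduced here).
    import numpy as np

    rng = np.random.default_rng(0)

    def mu_a(T):            # absorption coefficient [1/cm], hypothetical T-dependence
        return 0.5 + 0.002 * (T - 300.0)

    def mu_s(T):            # scattering coefficient [1/cm], hypothetical T-dependence
        return 10.0 - 0.01 * (T - 300.0)

    nz, dz = 100, 0.01                      # 1 cm slab discretized into 100 bins
    T = np.full(nz, 300.0)                  # current temperature field [K]
    absorbed = np.zeros(nz)                 # absorbed photon weight per bin

    def launch_photon(weight=1.0):
        z = 0.0
        while 0.0 <= z < nz * dz and weight > 1e-4:
            i = int(z / dz)
            mt = mu_a(T[i]) + mu_s(T[i])    # local properties from the *current* T
            z += -np.log(rng.random()) / mt # step length sampled from local attenuation
            if not (0.0 <= z < nz * dz):
                break                       # photon leaves the slab
            i = int(z / dz)
            frac = mu_a(T[i]) / (mu_a(T[i]) + mu_s(T[i]))
            absorbed[i] += weight * frac    # deposit absorbed energy locally
            weight *= (1.0 - frac)          # remaining weight continues (scattering
                                            # direction is ignored in this 1-D sketch)

    for _ in range(10000):
        launch_photon()
    # A heat-transfer step would convert `absorbed` into a temperature update,
    # after which the next photon batch sees the new temperature-dependent properties.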

  7. Evaluating the Zebrafish Embryo Toxicity Test for Pesticide Hazard Screening

    EPA Science Inventory

    Given the numerous chemicals used in society, it is critical to develop tools for accurate and efficient evaluation of potential risks to human and ecological receptors. Fish embryo acute toxicity tests are 1 tool that has been shown to be highly predictive of standard, more reso...

  8. Minimal Pair Distinctions and Intelligibility in Preschool Children with and without Speech Sound Disorders

    ERIC Educational Resources Information Center

    Hodge, Megan M.; Gotzke, Carrie L.

    2011-01-01

    Listeners' identification of young children's productions of minimally contrastive words and predictive relationships between accurately identified words and intelligibility scores obtained from a 100-word spontaneous speech sample were determined for 36 children with typically developing speech (TDS) and 36 children with speech sound disorders…

  9. Development of a Field-Portable Near-Infrared Water-Cement Ratio Meter. Phase I: Investigation of Spectral Feasibility

    DOT National Transportation Integrated Search

    1999-06-25

    In order to accurately predict the strength and durability of an adequately mixed and compacted concrete mix it is useful, if not necessary, to know the water-cement ratio of the fresh concrete mix. In this work near-infrared reflection spectroscopy ...

  10. Wheat mill stream properties for discrete element method modeling

    USDA-ARS?s Scientific Manuscript database

    A discrete phase approach based on individual wheat kernel characteristics is needed to overcome the limitations of previous statistical models and accurately predict the milling behavior of wheat. As a first step to develop a discrete element method (DEM) model for the wheat milling process, this s...

  11. Using Gene Expression Biomarkers to Identify Chemicals that Induce Key Events in Cancer and Endocrine Disruption AOPs: Androgen Receptor as an Example

    EPA Science Inventory

    High-throughput transcriptomic (HTTr) technologies are increasingly being used to screen environmental chemicals in vitro to provide mechanistic context for regulatory testing. The development of gene expression biomarkers that accurately predict molecular and toxicological effec...

  12. Modelling C3 photosynthesis from the chloroplast to the ecosystem

    USDA-ARS?s Scientific Manuscript database

    Globally, photosynthesis accounts for the largest flux of CO2 from the atmosphere into ecosystems and is the driving process for terrestrial ecosystem function. The importance of accurate predictions of photosynthesis over a range of plant growth conditions led to the development of a C3 photosynthe...

  13. A Simple Close Range Photogrammetry Technique to Assess Soil Erosion in the Field

    USDA-ARS?s Scientific Manuscript database

    Evaluating the performance of a soil erosion prediction model depends on the ability to accurately measure the gain or loss of sediment in an area. Recent development in acquiring detailed surface elevation data (DEM) makes it feasible to assess soil erosion and deposition spatially. Digital photogr...

  14. Ensemble forecast of human West Nile virus cases and mosquito infection rates

    NASA Astrophysics Data System (ADS)

    Defelice, Nicholas B.; Little, Eliza; Campbell, Scott R.; Shaman, Jeffrey

    2017-02-01

    West Nile virus (WNV) is now endemic in the continental United States; however, our ability to predict spillover transmission risk and human WNV cases remains limited. Here we develop a model depicting WNV transmission dynamics, which we optimize using a data assimilation method and two observed data streams, mosquito infection rates and reported human WNV cases. The coupled model-inference framework is then used to generate retrospective ensemble forecasts of historical WNV outbreaks in Long Island, New York for 2001-2014. Accurate forecasts of mosquito infection rates are generated before peak infection, and >65% of forecasts accurately predict seasonal total human WNV cases up to 9 weeks before the past reported case. This work provides the foundation for implementation of a statistically rigorous system for real-time forecast of seasonal outbreaks of WNV.

  15. Ensemble forecast of human West Nile virus cases and mosquito infection rates.

    PubMed

    DeFelice, Nicholas B; Little, Eliza; Campbell, Scott R; Shaman, Jeffrey

    2017-02-24

    West Nile virus (WNV) is now endemic in the continental United States; however, our ability to predict spillover transmission risk and human WNV cases remains limited. Here we develop a model depicting WNV transmission dynamics, which we optimize using a data assimilation method and two observed data streams, mosquito infection rates and reported human WNV cases. The coupled model-inference framework is then used to generate retrospective ensemble forecasts of historical WNV outbreaks in Long Island, New York for 2001-2014. Accurate forecasts of mosquito infection rates are generated before peak infection, and >65% of forecasts accurately predict seasonal total human WNV cases up to 9 weeks before the past reported case. This work provides the foundation for implementation of a statistically rigorous system for real-time forecast of seasonal outbreaks of WNV.

  16. Predicting poverty and wealth from mobile phone metadata.

    PubMed

    Blumenstock, Joshua; Cadamuro, Gabriel; On, Robert

    2015-11-27

    Accurate and timely estimates of population characteristics are a critical input to social and economic research and policy. In industrialized economies, novel sources of data are enabling new approaches to demographic profiling, but in developing countries, fewer sources of big data exist. We show that an individual's past history of mobile phone use can be used to infer his or her socioeconomic status. Furthermore, we demonstrate that the predicted attributes of millions of individuals can, in turn, accurately reconstruct the distribution of wealth of an entire nation or infer the asset distribution of microregions composed of just a few households. In resource-constrained environments where censuses and household surveys are rare, this approach creates an option for gathering localized and timely information at a fraction of the cost of traditional methods. Copyright © 2015, American Association for the Advancement of Science.

  17. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    PubMed Central

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research. PMID:29599739

  18. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    PubMed

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  19. Development and fabrication of improved Schottky power diodes, phases I and II

    NASA Technical Reports Server (NTRS)

    Cordes, L. F.; Garfinkle, M.; Taft, E. A.

    1974-01-01

    Reproducible methods for the fabrication of silicon Schottky diodes were developed for the metals tungsten, aluminum, conventional platinum silicide and low temperature platinum silicide. Barrier heights and barrier lowering were measured permitting the accurate prediction of ideal forward and reverse diode performance. Processing procedures were developed which permit the fabrication of large area (approximately 1 sqcm) mesa-geometry power Schottky diodes with forward and reverse characteristics that approach theoretical values.
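
    The ideal forward and reverse behavior that a measured barrier height permits one to predict follows the standard thermionic-emission relation; the sketch below illustrates it with placeholder parameter values (the report's own measured barrier heights and Richardson constant are not reproduced here).

    # Hedged sketch: ideal Schottky-diode I-V from standard thermionic emission,
    # showing how a measured barrier height feeds a forward/reverse prediction.
    # Parameter values are placeholders, not data from the report.
    import numpy as np

    q, kB = 1.602e-19, 1.381e-23          # C, J/K

    def schottky_current(V, phi_b_eV, area_cm2, T=300.0, n=1.05, A_star=110.0):
        # A_star: effective Richardson constant [A cm^-2 K^-2], material-dependent
        I_s = area_cm2 * A_star * T**2 * np.exp(-q * phi_b_eV / (kB * T))
        return I_s * (np.exp(q * V / (n * kB * T)) - 1.0)

    V = np.linspace(-5.0, 0.4, 9)
    print(schottky_current(V, phi_b_eV=0.67, area_cm2=1.0))   # ~1 sq cm mesa diode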

  20. Morphological Awareness and Children's Writing: Accuracy, Error, and Invention

    PubMed Central

    McCutchen, Deborah; Stull, Sara

    2014-01-01

    This study examined the relationship between children's morphological awareness and their ability to produce accurate morphological derivations in writing. Fifth-grade U.S. students (n = 175) completed two writing tasks that invited or required morphological manipulation of words. We examined both accuracy and error, specifically errors in spelling and errors of the sort we termed morphological inventions, which entailed inappropriate, novel pairings of stems and suffixes. Regressions were used to determine the relationship between morphological awareness, morphological accuracy, and spelling accuracy, as well as between morphological awareness and morphological inventions. Linear regressions revealed that morphological awareness uniquely predicted children's generation of accurate morphological derivations, regardless of whether or not accurate spelling was required. A logistic regression indicated that morphological awareness was also uniquely predictive of morphological invention, with higher morphological awareness increasing the probability of morphological invention. These findings suggest that morphological knowledge may not only assist children with spelling during writing, but may also assist with word production via generative experimentation with morphological rules during sentence generation. Implications are discussed for the development of children's morphological knowledge and relationships with writing. PMID:25663748

  1. A bio-optical model for integration into ecosystem models for the Ligurian Sea

    NASA Astrophysics Data System (ADS)

    Bengil, Fethi; McKee, David; Beşiktepe, Sükrü T.; Sanjuan Calzado, Violeta; Trees, Charles

    2016-12-01

    A bio-optical model has been developed for the Ligurian Sea which encompasses both deep, oceanic Case 1 waters and shallow, coastal Case 2 waters. The model builds on earlier Case 1 models for the region and uses field data collected on the BP09 research cruise to establish new relationships for non-biogenic particles and CDOM. The bio-optical model reproduces in situ IOPs accurately and is used to parameterize radiative transfer simulations which demonstrate its utility for modeling underwater light levels and above surface remote sensing reflectance. Prediction of euphotic depth is found to be accurate to within ∼3.2 m (RMSE). Previously published light field models work well for deep oceanic parts of the Ligurian Sea that fit the Case 1 classification. However, they are found to significantly over-estimate euphotic depth in optically complex coastal waters where the influence of non-biogenic materials is strongest. For these coastal waters, the combination of the bio-optical model proposed here and full radiative transfer simulations provides significantly more accurate predictions of euphotic depth.
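
    For reference, the euphotic depth accuracy quoted above refers to the standard 1%-of-surface-PAR definition; a minimal sketch of that definition, assuming a single depth-averaged diffuse attenuation coefficient, is given below. The study's bio-optical model and radiative transfer runs are far more detailed than this.

    # Minimal sketch: euphotic depth under the 1%-of-surface-PAR definition,
    # assuming a depth-averaged attenuation coefficient Kd (illustrative only).
    import math

    def euphotic_depth(kd_par_per_m):
        # E(z) = E(0) * exp(-Kd * z); z_eu is the depth where E(z)/E(0) = 0.01
        return math.log(100.0) / kd_par_per_m

    print(round(euphotic_depth(0.12), 1), "m")   # hypothetical Kd(PAR) = 0.12 m^-1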

  2. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensh, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  3. An improved correlation to predict molecular weight between crosslinks based on equilibrium degree of swelling of hydrogel networks.

    PubMed

    Jimenez-Vergara, Andrea C; Lewis, John; Hahn, Mariah S; Munoz-Pinto, Dany J

    2018-04-01

    Accurate characterization of hydrogel diffusional properties is of substantial importance for a range of biotechnological applications. The diffusional capacity of hydrogels has commonly been estimated using the average molecular weight between crosslinks (Mc), which is calculated based on the equilibrium degree of swelling. However, the existing correlation linking Mc and equilibrium swelling fails to accurately reflect the diffusional properties of highly crosslinked hydrogel networks. Also, as demonstrated herein, the current model fails to accurately predict the diffusional properties of hydrogels when polymer concentration and molecular weight are varied simultaneously. To address these limitations, we evaluated the diffusional properties of 48 distinct hydrogel formulations using two different photoinitiator systems, employing molecular size exclusion as an alternative methodology to calculate average hydrogel mesh size. The resulting data were then utilized to develop a revised correlation between Mc and hydrogel equilibrium swelling that substantially reduces the limitations associated with the current correlation. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1339-1348, 2018.
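
    For orientation, the "existing correlation" the abstract refers to is of the classic Flory-Rehner type linking Mc to the equilibrium polymer volume fraction; a hedged sketch of that commonly cited form is shown below. It is not the revised correlation developed in the paper, and the parameter values are placeholders.

    # Hedged sketch of the commonly cited Flory-Rehner-type correlation between
    # the molecular weight between crosslinks (Mc) and equilibrium swelling.
    # This illustrates the "existing" correlation, not the paper's revised one;
    # parameter values are placeholders.
    import math

    def mc_flory_rehner(v2s, Mn, v_bar, V1, chi):
        """v2s  : equilibrium polymer volume fraction in the swollen gel
           Mn   : number-average MW of the uncrosslinked polymer [g/mol]
           v_bar: specific volume of the polymer [cm^3/g]
           V1   : molar volume of the solvent [cm^3/mol]
           chi  : polymer-solvent interaction parameter"""
        numerator = (v_bar / V1) * (math.log(1.0 - v2s) + v2s + chi * v2s**2)
        denominator = v2s ** (1.0 / 3.0) - v2s / 2.0
        inv_mc = 2.0 / Mn - numerator / denominator
        return 1.0 / inv_mc

    # Hypothetical PEG-like gel swollen in water:
    print(round(mc_flory_rehner(v2s=0.07, Mn=10000.0, v_bar=0.84, V1=18.1, chi=0.45)))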

  4. The prediction of drug metabolism, tissue distribution, and bioavailability of 50 structurally diverse compounds in rat using mechanism-based absorption, distribution, and metabolism prediction tools.

    PubMed

    De Buck, Stefan S; Sinha, Vikash K; Fenu, Luca A; Gilissen, Ron A; Mackie, Claire E; Nijsen, Marjoleen J

    2007-04-01

    The aim of this study was to assess a physiologically based modeling approach for predicting drug metabolism, tissue distribution, and bioavailability in rat for a structurally diverse set of neutral and moderate-to-strong basic compounds (n = 50). Hepatic blood clearance (CL(h)) was projected using microsomal data and shown to be well predicted, irrespective of the type of hepatic extraction model (80% within 2-fold). Best predictions of CL(h) were obtained disregarding both plasma and microsomal protein binding, whereas strong bias was seen using either blood binding only or both plasma and microsomal protein binding. Two mechanistic tissue composition-based equations were evaluated for predicting volume of distribution (V(dss)) and tissue-to-plasma partitioning (P(tp)). A first approach, which accounted for ionic interactions with acidic phospholipids, resulted in accurate predictions of V(dss) (80% within 2-fold). In contrast, a second approach, which disregarded ionic interactions, was a poor predictor of V(dss) (60% within 2-fold). The first approach also yielded accurate predictions of P(tp) in muscle, heart, and kidney (80% within 3-fold), whereas in lung, liver, and brain, predictions ranged from 47% to 62% within 3-fold. Using the second approach, P(tp) prediction accuracy in muscle, heart, and kidney was on average 70% within 3-fold, and ranged from 24% to 54% in all other tissues. Combining all methods for predicting V(dss) and CL(h) resulted in accurate predictions of the in vivo half-life (70% within 2-fold). Oral bioavailability was well predicted using CL(h) data and Gastroplus Software (80% within 2-fold). These results illustrate that physiologically based prediction tools can provide accurate predictions of rat pharmacokinetics.
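
    One common form of the hepatic extraction models alluded to above is the well-stirred liver model for scaling intrinsic clearance to hepatic blood clearance; the sketch below shows that standard relation with illustrative values, not the scaling factors or parameters actually used in the study.

    # Hedged sketch: the standard "well-stirred" liver model for projecting
    # hepatic blood clearance (CLh) from scaled microsomal intrinsic clearance.
    # Values are illustrative; rat hepatic blood flow of roughly 55-70 mL/min/kg
    # is the commonly quoted range.
    def well_stirred_clh(cl_int_ml_min_kg, q_h_ml_min_kg=55.0, fu_b=1.0):
        """cl_int : scaled intrinsic clearance [mL/min/kg]
           q_h    : hepatic blood flow [mL/min/kg]
           fu_b   : unbound fraction in blood (1.0 disregards binding, mirroring
                    the best-performing approach described in the abstract)"""
        return (q_h_ml_min_kg * fu_b * cl_int_ml_min_kg /
                (q_h_ml_min_kg + fu_b * cl_int_ml_min_kg))

    print(round(well_stirred_clh(cl_int_ml_min_kg=120.0), 1), "mL/min/kg")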

  5. Predicting human olfactory perception from chemical features of odor molecules.

    PubMed

    Keller, Andreas; Gerkin, Richard C; Guan, Yuanfang; Dhurandhar, Amit; Turu, Gabor; Szalai, Bence; Mainland, Joel D; Ihara, Yusuke; Yu, Chung Wen; Wolfinger, Russ; Vens, Celine; Schietgat, Leander; De Grave, Kurt; Norel, Raquel; Stolovitzky, Gustavo; Cecchi, Guillermo A; Vosshall, Leslie B; Meyer, Pablo

    2017-02-24

    It is still not possible to predict whether a given molecule will have a perceived odor or what olfactory percept it will produce. We therefore organized the crowd-sourced DREAM Olfaction Prediction Challenge. Using a large olfactory psychophysical data set, teams developed machine-learning algorithms to predict sensory attributes of molecules based on their chemoinformatic features. The resulting models accurately predicted odor intensity and pleasantness and also successfully predicted 8 among 19 rated semantic descriptors ("garlic," "fish," "sweet," "fruit," "burnt," "spices," "flower," and "sour"). Regularized linear models performed nearly as well as random forest-based ones, with a predictive accuracy that closely approaches a key theoretical limit. These models help to predict the perceptual qualities of virtually any molecule with high accuracy and also reverse-engineer the smell of a molecule. Copyright © 2017, American Association for the Advancement of Science.
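
    The general modelling pattern described above, fitting a tree ensemble to chemoinformatic descriptors to predict a rated perceptual attribute, can be sketched as follows; the data and feature set are random placeholders, not the challenge data or the winning models.

    # Hedged sketch of the modelling pattern: a random forest regressor mapping
    # chemoinformatic descriptors to a rated attribute such as pleasantness.
    # Data below are random placeholders, not the DREAM challenge set.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 50))              # 50 hypothetical molecular descriptors
    y = X[:, 0] * 2.0 - X[:, 3] + rng.normal(scale=0.5, size=400)  # surrogate rating

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print("held-out R^2:", round(model.score(X_te, y_te), 3))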

  6. Bankruptcy prediction based on financial ratios using Jordan Recurrent Neural Networks: a case study in Polish companies

    NASA Astrophysics Data System (ADS)

    Hardinata, Lingga; Warsito, Budi; Suparti

    2018-05-01

    The complexity of bankruptcy makes accurate bankruptcy prediction models difficult to achieve. Various prediction models have been developed to improve the accuracy of bankruptcy predictions. Machine learning has been widely used for prediction because of its adaptive capabilities. Artificial Neural Networks (ANNs) are one machine learning approach that has proved able to perform inference tasks such as prediction and classification, especially in data mining. In this paper, we propose the implementation of Jordan Recurrent Neural Networks (JRNN) to classify and predict corporate bankruptcy based on financial ratios. The feedback interconnections in JRNN allow the network to retain important information, enabling it to work more effectively. The result analysis showed that JRNN works very well in bankruptcy prediction, with an average success rate of 81.3785%.
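
    The defining feature of a Jordan network is that the previous output is fed back as a context input alongside the current inputs; the forward-pass sketch below illustrates this with financial ratios as inputs. Layer sizes, weights and data are placeholders, not the network used in the study.

    # Hedged sketch of a Jordan recurrent network's forward pass: the previous
    # output is fed back as context alongside the current financial ratios.
    # Sizes, weights and data are placeholders, not the study's configuration.
    import numpy as np

    rng = np.random.default_rng(0)
    n_ratios, n_hidden = 5, 8

    W_in  = rng.normal(scale=0.3, size=(n_hidden, n_ratios))   # ratios -> hidden
    W_ctx = rng.normal(scale=0.3, size=(n_hidden, 1))          # context -> hidden
    W_out = rng.normal(scale=0.3, size=(1, n_hidden))          # hidden -> output

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def jordan_forward(ratio_sequence):
        """ratio_sequence: (T, n_ratios) financial ratios over T reporting periods.
        Returns a bankruptcy probability after the last period."""
        context = np.zeros((1, 1))                  # previous output, initially zero
        y = np.zeros((1, 1))
        for x in ratio_sequence:
            h = np.tanh(W_in @ x.reshape(-1, 1) + W_ctx @ context)
            y = sigmoid(W_out @ h)                  # P(bankrupt) at this period
            context = y                             # Jordan feedback loop
        return float(y[0, 0])

    print(jordan_forward(rng.normal(size=(4, n_ratios))))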

  7. Quantum chemical calculations of interatomic potentials for computer simulation of solids

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A comprehensive mathematical model by which the collective behavior of a very large number of atoms within a metal or alloy can accurately be simulated was developed. Work was done in order to predict and modify the strength of materials to suit our technological needs. The method developed is useful in studying atomic interactions related to dislocation motion and crack extension.

  8. Parental perception of child’s weight status and subsequent BMIz change: the KOALA birth cohort study

    PubMed Central

    2014-01-01

    Background Parents often fail to correctly perceive their children’s weight status, but no studies have examined the association between parental weight status perception and longitudinal BMIz change (BMI standardized to a reference population) at various ages. We investigated whether parents are able to accurately perceive their child’s weight status at age 5. We also investigated predictors of accurate weight status perception. Finally, we investigated the predictive value of accurate weight status perception in explaining children’s longitudinal weight development up to the age of 9, in children who were overweight at the age of 5. Methods We used longitudinal data from the KOALA Birth Cohort Study. At the child’s age of 5 years, parents filled out a questionnaire regarding child and parent characteristics and their perception of their child’s weight status. We calculated the children’s actual weight status from parental reports of weight and height at ages 2, 5, 6, 7, 8, and 9 years. Regression analyses were used to identify factors predicting which parents accurately perceived their child’s weight status. Finally, regression analyses were used to predict subsequent longitudinal BMIz change in overweight children. Results Eighty-five percent of the parents of overweight children underestimated their child’s weight status at age 5. The child’s BMIz at age 2 and 5 were significant positive predictors of accurate weight status perception (vs. underestimation) in normal weight and overweight children. Accurate weight status perception was a predictor of higher future BMI in overweight children, corrected for actual BMI at baseline. Conclusions Children of parents who accurately perceived their child’s weight status had a higher BMI over time, probably making it easier for parents to correctly perceive their child’s overweight. Parental awareness of the child’s overweight as such may not be sufficient for subsequent weight management by the parents, implying that parents who recognize their child’s overweight may not be able or willing to adequately manage the overweight. PMID:24678601

  9. Augmented Method to Improve Thermal Data for the Figure Drift Thermal Distortion Predictions of the JWST OTIS Cryogenic Vacuum Test

    NASA Technical Reports Server (NTRS)

    Park, Sang C.; Carnahan, Timothy M.; Cohen, Lester M.; Congedo, Cherie B.; Eisenhower, Michael J.; Ousley, Wes; Weaver, Andrew; Yang, Kan

    2017-01-01

    The JWST Optical Telescope Element (OTE) assembly is the largest optically stable infrared-optimized telescope currently being manufactured and assembled, and is scheduled for launch in 2018. The JWST OTE, including the 18-segment primary mirror, the secondary mirror, and the Aft Optics Subsystem (AOS), is designed to be passively cooled and to operate near 45 K. These optical elements are supported by a complex composite backplane structure. As part of the structural distortion model validation efforts, a series of tests is planned during the cryogenic vacuum test of the fully integrated flight hardware at NASA JSC Chamber A. Successful completion of the thermal-distortion test phases depends heavily on accurate knowledge of the temperatures of the OTE structural members. However, the current temperature sensor allocations for the cryo-vac test may not have sufficient fidelity to provide accurate knowledge of the temperature distributions within the composite structure. A method based on an inverse distance relationship among the sensors and thermal model nodes was developed to improve the thermal data provided for the nanometer-scale WaveFront Error (WFE) predictions. The Linear Distance Weighted Interpolation (LDWI) method was developed to augment the thermal model predictions based on the sparse sensor information. This paper covers the development of the LDWI method using test data from the earlier pathfinder cryo-vac tests, and presents notional and as-tested WFE predictions from the structural finite element model cases to characterize the accuracy of the LDWI method.
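
    The inverse distance relationship the method is described as being based on can be illustrated with plain inverse-distance weighting of nearby sensor readings onto a structural model node; the exact LDWI weighting and sensor-selection rules are not given in the abstract, so the sketch below, with placeholder coordinates and readings, is only indicative.

    # Hedged sketch of inverse-distance-weighted interpolation of sensor
    # temperatures onto a thermal/structural model node (the flight method's
    # exact weighting and selection rules are not reproduced here).
    import numpy as np

    def idw_temperature(node_xyz, sensor_xyz, sensor_T, power=1.0, eps=1e-9):
        """node_xyz   : (3,) node coordinates
           sensor_xyz : (n, 3) sensor coordinates
           sensor_T   : (n,) sensor temperatures [K]"""
        d = np.linalg.norm(sensor_xyz - node_xyz, axis=1)
        if np.any(d < eps):                      # node coincides with a sensor
            return float(sensor_T[np.argmin(d)])
        w = 1.0 / d**power
        return float(np.sum(w * sensor_T) / np.sum(w))

    sensors = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    temps   = np.array([45.2, 44.8, 45.6])       # hypothetical cryo-vac readings [K]
    print(round(idw_temperature(np.array([0.4, 0.3, 0.0]), sensors, temps), 2), "K")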

  10. Energy prediction equations are inadequate for obese Hispanic youth.

    PubMed

    Klein, Catherine J; Villavicencio, Stephan A; Schweitzer, Amy; Bethepu, Joel S; Hoffman, Heather J; Mirza, Nazrat M

    2011-08-01

    Assessing energy requirements is a fundamental activity in clinical dietetics practice. A study was designed to determine whether published linear regression equations were accurate for predicting resting energy expenditure (REE) in fasted Hispanic children with obesity (aged 7 to 15 years). REE was measured using indirect calorimetry; body composition was estimated with whole-body air displacement plethysmography. REE was predicted using four equations: Institute of Medicine for healthy-weight children (IOM-HW), IOM for overweight and obese children (IOM-OS), Harris-Benedict, and Schofield. Accuracy of the prediction was calculated as the absolute value of the difference between the measured and predicted REE divided by the measured REE, expressed as a percentage. Predicted values within 85% to 115% of measured were defined as accurate. Participants (n=58; 53% boys) had a mean age of 11.8±2.1 years, 43.5%±5.1% body fat, and a body mass index of 31.5±5.8 (98.6±1.1 body mass index percentile). Measured REE was 2,339±680 kcal/day; predicted REE was 1,815±401 kcal/day (IOM-HW), 1,794±311 kcal/day (IOM-OS), 1,151±300 kcal/day (Harris-Benedict), and 1,771±316 kcal/day (Schofield). Measured REE adjusted for body weight averaged 32.0±8.4 kcal/kg/day (95% confidence interval 29.8 to 34.2). Published equations predicted REE within 15% accuracy for only 36% to 40% of 58 participants, except for Harris-Benedict, which did not achieve accuracy for any participant. The most frequently accurate values were obtained using IOM-HW, which predicted REE within 15% accuracy for 55% (17/31) of boys. Published equations did not accurately predict REE for youth in the study sample. Further studies are warranted to formulate accurate energy prediction equations for this population. Copyright © 2011 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  11. Using beta binomials to estimate classification uncertainty for ensemble models.

    PubMed

    Clark, Robert D; Liang, Wenkel; Lee, Adam C; Lawless, Michael S; Fraczkiewicz, Robert; Waldman, Marvin

    2014-01-01

    Quantitative structure-activity relationship (QSAR) models have enormous potential for reducing drug discovery and development costs as well as the need for animal testing. Great strides have been made in estimating their overall reliability, but to fully realize that potential, researchers and regulators need to know how confident they can be in individual predictions. Submodels in an ensemble model that have been trained on different subsets of a shared training pool represent multiple samples of the model space, and the degree of agreement among them contains information on the reliability of ensemble predictions. For artificial neural network ensembles (ANNEs) using two different methods for determining ensemble classification - one using vote tallies and the other averaging individual network outputs - we have found that the distribution of predictions across positive vote tallies can be reasonably well-modeled as a beta binomial distribution, as can the distribution of errors. Together, these two distributions can be used to estimate the probability that a given predictive classification will be in error. Large data sets comprising logP, Ames mutagenicity, and CYP2D6 inhibition data are used to illustrate and validate the method. The distributions of predictions and errors for the training pool accurately predicted the distribution of predictions and errors for large external validation sets, even when the numbers of positive and negative examples in the training pool were not balanced. Moreover, the likelihood of a given compound being prospectively misclassified as a function of the degree of consensus between networks in the ensemble could in most cases be estimated accurately from the fitted beta binomial distributions for the training pool. Confidence in an individual predictive classification by an ensemble model can be accurately assessed by examining the distributions of predictions and errors as a function of the degree of agreement among the constituent submodels. Further, ensemble uncertainty estimation can often be improved by adjusting the voting or classification threshold based on the parameters of the error distribution. Finally, the profiles for models whose predictive uncertainty estimates are not reliable provide clues to that effect without the need for comparison to an external test set.
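
    Once beta-binomial parameters have been fitted to the vote-tally distributions for errors and for all predictions, the two distributions can be combined into an error probability at each tally; the sketch below illustrates that step with placeholder parameter values (the paper's fitting procedure and fitted values are not reproduced).

    # Hedged sketch: turning fitted beta-binomial vote-tally distributions into
    # an estimated error probability at each tally. Parameter values below are
    # placeholders, not fitted values from the paper.
    import numpy as np
    from scipy.stats import betabinom

    n_networks = 30                      # submodels voting "positive" out of 30

    a_err, b_err = 2.0, 2.5              # hypothetical fit: tallies among misclassified compounds
    a_all, b_all = 0.8, 1.0              # hypothetical fit: tallies among all compounds
    error_rate = 0.12                    # overall misclassification rate (placeholder)

    k = np.arange(n_networks + 1)
    p_tally_given_error = betabinom.pmf(k, n_networks, a_err, b_err)
    p_tally = betabinom.pmf(k, n_networks, a_all, b_all)

    # Bayes: P(error | tally) = P(tally | error) * P(error) / P(tally)
    p_error_given_tally = error_rate * p_tally_given_error / p_tally
    for tally in (5, 15, 25):
        print(tally, round(float(p_error_given_tally[tally]), 3))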

  12. GMAT versus Alternatives: Predictive Validity Evidence from Central Europe and the Middle East

    ERIC Educational Resources Information Center

    Koys, Daniel

    2010-01-01

    The author found that the GPA at the end of the MBA program is most accurately predicted by the Graduate Management Admission Test (GMAT) and the Test of English as a Foreign Language (TOEFL). MBA GPA is also predicted, though less accurately, by the Scholastic Level Exam, a mathematics test, undergraduate GPA, and previous career progression. If…

  13. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  14. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    1999-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25% of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust-drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  15. Protein docking prediction using predicted protein-protein interface.

    PubMed

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in the series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  16. Control surface hinge moment prediction using computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Simpson, Christopher David

    The following research determines the feasibility of predicting control surface hinge moments using various computational methods. A detailed analysis is conducted using a 2D GA(W)-1 airfoil with a 20% plain flap. Simple hinge moment prediction methods are tested, including empirical Datcom relations and XFOIL. Steady-state and time-accurate turbulent, viscous, Navier-Stokes solutions are computed using Fun3D. Hinge moment coefficients are computed. Mesh construction techniques are discussed. An adjoint-based mesh adaptation case is also evaluated. An NACA 0012 45-degree swept horizontal stabilizer with a 25% elevator is also evaluated using Fun3D. Results are compared with experimental wind-tunnel data obtained from references. Finally, the costs of various solution methods are estimated. Results indicate that while a steady-state Navier-Stokes solution can accurately predict control surface hinge moments for small angles of attack and deflection angles, a time-accurate solution is necessary to accurately predict hinge moments in the presence of flow separation. The ability to capture the unsteady vortex shedding behavior present in moderate to large control surface deflections is found to be critical to hinge moment prediction accuracy. Adjoint-based mesh adaptation is shown to give hinge moment predictions similar to a globally-refined mesh for a steady-state 2D simulation.
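
    For context, the hinge moment coefficients computed in these studies follow the standard nondimensionalization; the sketch below shows that definition with placeholder values (the CFD setup and actual results above are not reproduced).

    # Hedged sketch: the standard nondimensionalization used to report control
    # surface hinge moments (values below are placeholders, not study results).
    def hinge_moment_coefficient(hinge_moment, rho, V, flap_area, flap_chord):
        """C_h = H / (q * S_f * c_f), with q the freestream dynamic pressure,
        S_f the control-surface reference area and c_f its reference chord."""
        q = 0.5 * rho * V**2
        return hinge_moment / (q * flap_area * flap_chord)

    # e.g. a hypothetical 2D section result integrated over unit span:
    print(round(hinge_moment_coefficient(hinge_moment=-12.0, rho=1.225, V=60.0,
                                         flap_area=0.2, flap_chord=0.2), 4))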

  17. Physiologically Based Pharmacokinetic Model for Long-Circulating Inorganic Nanoparticles.

    PubMed

    Liang, Xiaowen; Wang, Haolu; Grice, Jeffrey E; Li, Li; Liu, Xin; Xu, Zhi Ping; Roberts, Michael S

    2016-02-10

    A physiologically based pharmacokinetic model was developed for accurately characterizing and predicting the in vivo fate of long-circulating inorganic nanoparticles (NPs). This model is built based on direct visualization of NP disposition details at the organ and cellular level. It was validated with multiple data sets, indicating robust inter-route and interspecies predictive capability. We suggest that the biodistribution of long-circulating inorganic NPs is determined by the uptake and release of NPs by phagocytic cells in target organs.

  18. Experimental and analytical studies of a model helicopter rotor in hover

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Tung, C.

    1981-01-01

    A benchmark test to aid the development of various rotor performance codes was conducted. Simultaneous blade pressure measurements and tip vortex surveys were made for a wide range of tip Mach numbers including the transonic flow regime. The measured tip vortex strength and geometry permit effective blade loading predictions when used as input to a prescribed wake lifting surface code. It is also shown that with proper inflow and boundary layer modeling, the supercritical flow regime can be accurately predicted.

  19. Monitoring and regulation of learning in medical education: the need for predictive cues.

    PubMed

    de Bruin, Anique B H; Dunlosky, John; Cavalcanti, Rodrigo B

    2017-06-01

    Being able to accurately monitor learning activities is a key element in self-regulated learning in all settings, including medical schools. Yet students' ability to monitor their progress is often limited, leading to inefficient use of study time. Interventions that improve the accuracy of students' monitoring can optimise self-regulated learning, leading to higher achievement. This paper reviews findings from cognitive psychology and explores potential applications in medical education, as well as areas for future research. Effective monitoring depends on students' ability to generate information ('cues') that accurately reflects their knowledge and skills. The ability of these 'cues' to predict achievement is referred to as 'cue diagnosticity'. Interventions that improve the ability of students to elicit predictive cues typically fall into two categories: (i) self-generation of cues and (ii) generation of cues that is delayed after self-study. Providing feedback and support is useful when cues are predictive but may be too complex to be readily used. Limited evidence exists about interventions to improve the accuracy of self-monitoring among medical students or trainees. Developing interventions that foster use of predictive cues can enhance the accuracy of self-monitoring, thereby improving self-study and clinical reasoning. First, insight should be gained into the characteristics of predictive cues used by medical students and trainees. Next, predictive cue prompts should be designed and tested to improve monitoring and regulation of learning. Finally, the use of predictive cues should be explored in relation to teaching and learning clinical reasoning. Improving self-regulated learning is important to help medical students and trainees efficiently acquire knowledge and skills necessary for clinical practice. Interventions that help students generate and use predictive cues hold the promise of improved self-regulated learning and achievement. This framework is applicable to learning in several areas, including the development of clinical reasoning. © 2017 The Authors Medical Education published by Association for the Study of Medical Education and John Wiley & Sons Ltd.

  20. Predicting Time to Hospital Discharge for Extremely Preterm Infants

    PubMed Central

    Hintz, Susan R.; Bann, Carla M.; Ambalavanan, Namasivayam; Cotten, C. Michael; Das, Abhik; Higgins, Rosemary D.

    2010-01-01

    As extremely preterm infant mortality rates have decreased, concerns regarding resource utilization have intensified. Accurate models to predict time to hospital discharge could aid in resource planning and family counseling, and perhaps stimulate quality improvement initiatives. Objectives For infants <27 weeks estimated gestational age (EGA), to develop, validate and compare several models to predict time to hospital discharge based on time-dependent covariates, and based on the presence of 5 key risk factors as predictors. Patients and Methods This was a retrospective analysis of infants <27 weeks EGA, born 7/2002-12/2005 and surviving to discharge from a NICHD Neonatal Research Network site. Time to discharge was modeled as a continuous variable (postmenstrual age at discharge, PMAD) and as categorical variables (“Early” and “Late” discharge). Three linear and logistic regression models with time-dependent covariate inclusion were developed (perinatal factors only, perinatal+early neonatal factors, perinatal+early+later factors). Models for Early and Late discharge using the cumulative presence of 5 key risk factors as predictors were also evaluated. Predictive capabilities were compared using the coefficient of determination (R2) for linear models and the AUC of the ROC curve for logistic models. Results Data from 2254 infants were included. Prediction of PMAD was poor, with only 38% of variation explained by linear models. However, models incorporating later clinical characteristics were more accurate in predicting “Early” or “Late” discharge (full models: AUC 0.76-0.83 vs. perinatal factor models: AUC 0.56-0.69). In simplified key risk factor models, predicted probabilities for Early and Late discharge compared favorably with observed rates. Furthermore, the AUCs (0.75-0.77) were similar to those of models including the full factor set. Conclusions Prediction of Early or Late discharge is poor if only perinatal factors are considered, but improves substantially with knowledge of later-occurring morbidities. Prediction using a few key risk factors is comparable to full models, and may offer a clinically applicable strategy. PMID:20008430
