Sample records for predictive models needed

  1. Assessing the capacity of social determinants of health data to augment predictive models identifying patients in need of wraparound social services.

    PubMed

    Kasthurirathne, Suranga N; Vest, Joshua R; Menachemi, Nir; Halverson, Paul K; Grannis, Shaun J

    2018-01-01

    A growing variety of diverse data sources is emerging to better inform health care delivery and health outcomes. We sought to evaluate the capacity for clinical, socioeconomic, and public health data sources to predict the need for various social service referrals among patients at a safety-net hospital. We integrated patient clinical data and community-level data representing patients' social determinants of health (SDH) obtained from multiple sources to build random forest decision models to predict the need for any, mental health, dietitian, social work, or other SDH service referrals. To assess the impact of SDH on improving performance, we built separate decision models using clinical and SDH determinants and clinical data only. Decision models predicting the need for any, mental health, and dietitian referrals yielded sensitivity, specificity, and accuracy measures ranging between 60% and 75%. Specificity and accuracy scores for social work and other SDH services ranged between 67% and 77%, while sensitivity scores were between 50% and 63%. Area under the receiver operating characteristic curve values for the decision models ranged between 70% and 78%. Models for predicting the need for any services reported positive predictive values between 65% and 73%. Positive predictive values for predicting individual outcomes were below 40%. The need for various social service referrals can be predicted with considerable accuracy using a wide range of readily available clinical and community data that measure socioeconomic and public health conditions. While the use of SDH did not result in significant performance improvements, our approach represents a novel and important application of risk predictive modeling. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
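The modelling setup this abstract describes (a random-forest classifier scored by sensitivity, specificity, and AUC) can be sketched as below. The data, feature count, and signal strength are invented for illustration and bear no relation to the study's actual cohort or SDH variables:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for clinical + community-level features (illustrative only).
n = 2000
X = rng.normal(size=(n, 6))
# Referral need loosely driven by two features plus noise (invented relationship).
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=1.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]
pred = (prob >= 0.5).astype(int)

# Sensitivity, specificity, and AUC, the metrics reported in the abstract.
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
auc = roc_auc_score(y_te, prob)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} AUC={auc:.2f}")
```

The study's clinical-only vs. clinical-plus-SDH comparison would amount to repeating this fit on two column subsets of `X` and comparing the resulting metrics.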

  2. Estimating community health needs against a Triple Aim background: What can we learn from current predictive risk models?

    PubMed

    Elissen, Arianne M J; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2015-05-01

    To support providers and commissioners in accurately assessing their local populations' health needs, this study produces an overview of Dutch predictive risk models for health care, focusing specifically on the type, combination and relevance of included determinants for achieving the Triple Aim (improved health, better care experience, and lower costs). We conducted a mixed-methods study combining document analyses, interviews and a Delphi study. Predictive risk models were identified based on a web search and expert input. Participating in the study were Dutch experts in predictive risk modelling (interviews; n=11) and experts in healthcare delivery, insurance and/or funding methodology (Delphi panel; n=15). Ten predictive risk models were analysed, comprising 17 unique determinants. Twelve were considered relevant by experts for estimating community health needs. Although some compositional similarities were identified between models, the combination and operationalisation of determinants varied considerably. Existing predictive risk models provide a good starting point, but optimally balancing resources and targeting interventions on the community level will likely require a more holistic approach to health needs assessment. Development of additional determinants, such as measures of people's lifestyle and social network, may require policies pushing the integration of routine data from different (healthcare) sources. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Do We Know the Actual Magnetopause Position for Typical Solar Wind Conditions?

    NASA Technical Reports Server (NTRS)

    Samsonov, A. A.; Gordeev, E.; Tsyganenko, N. A.; Safrankova, J.; Nemecek, Z.; Simunek, J.; Sibeck, D. G.; Toth, G.; Merkin, V. G.; Raeder, J.

    2016-01-01

    We compare predicted magnetopause positions at the subsolar point and four reference points in the terminator plane obtained from several empirical and numerical MHD (magnetohydrodynamic) models. Empirical models using various sets of magnetopause crossings and making different assumptions about the magnetopause shape predict significantly different magnetopause positions, with a scatter greater than 1 Earth radius (R_E) even at the subsolar point. Axisymmetric magnetopause models cannot reproduce the cusp indentations or the changes related to the dipole tilt effect, and most of them predict the magnetopause closer to the Earth than non-axisymmetric models for typical solar wind conditions and zero tilt angle. Predictions of two global non-axisymmetric models do not match each other, and the models need additional verification. MHD models often predict the magnetopause closer to the Earth than the non-axisymmetric empirical models, but the predictions of MHD simulations may need corrections for the ring current effect and decreases of the solar wind pressure that occur in the foreshock. Comparing MHD models in which the ring current magnetic field is taken into account with the empirical Lin et al. model, we find that the differences in the reference point positions predicted by these models are relatively small for B_z = 0 (B_z is the north-south component of the interplanetary magnetic field). Therefore, we assume that these predictions indicate the actual magnetopause position, but future investigations are still needed.
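As a concrete example of the empirical-model class compared here, the widely used Shue et al. axisymmetric functional form gives the magnetopause radius as r(θ) = r0·(2/(1+cos θ))^α. The coefficient values below follow the published 1998 fit as best recalled and should be checked against the original paper before any real use:

```python
import math

def shue_magnetopause_r(theta_rad, bz_nt, pdyn_npa):
    """Approximate Shue et al. (1998)-style magnetopause radius in Earth radii.

    theta_rad: angle from the Sun-Earth line; bz_nt: IMF B_z (nT);
    pdyn_npa: solar wind dynamic pressure (nPa). Coefficients are quoted
    from memory and are illustrative, not authoritative.
    """
    r0 = (10.22 + 1.29 * math.tanh(0.184 * (bz_nt + 8.14))) * pdyn_npa ** (-1.0 / 6.6)
    alpha = (0.58 - 0.007 * bz_nt) * (1.0 + 0.024 * math.log(pdyn_npa))
    return r0 * (2.0 / (1.0 + math.cos(theta_rad))) ** alpha

# Subsolar point (theta = 0) for typical conditions: B_z = 0 nT, Pdyn = 2 nPa.
r_subsolar = shue_magnetopause_r(0.0, 0.0, 2.0)
# Terminator plane (theta = 90 deg): the magnetopause flares outward.
r_terminator = shue_magnetopause_r(math.pi / 2, 0.0, 2.0)
print(r_subsolar, r_terminator)
```

Forms like this are axisymmetric by construction, which is exactly why, as the abstract notes, they cannot represent cusp indentations or dipole-tilt asymmetries.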

  4. Sociodemographic, perceived and objective need indicators of mental health treatment use and treatment-seeking intentions among primary care medical patients.

    PubMed

    Elhai, Jon D; Voorhees, Summer; Ford, Julian D; Min, Kyeong Sam; Frueh, B Christopher

    2009-01-30

    We explored sociodemographic and illness/need associations with both recent mental healthcare utilization intensity and self-reported behavioral intentions to seek treatment. Data were examined from a community sample of 201 participants presenting for medical appointments at a Midwestern U.S. primary care clinic, in a cross-sectional survey study. Using non-linear regression analyses accounting for the excess of zero values in treatment visit counts, we found that both sociodemographic and illness/need models were significantly predictive of both recent treatment utilization intensity and intentions to seek treatment. Need models added substantial variance in prediction, above and beyond sociodemographic models. Variables with the greatest predictive role in explaining past treatment utilization intensity were greater depression severity, perceived need for treatment, older age, and lower income. Robust variables in predicting intentions to seek treatment were greater depression severity, perceived need for treatment, and more positive treatment attitudes. This study extends research findings on mental health treatment utilization, specifically addressing medical patients and using statistical methods appropriate to examining treatment visit counts, and demonstrates the importance of both objective and subjective illness/need variables in predicting recent service use intensity and intended future utilization.

  5. A Public-Private Partnership Develops and Externally Validates a 30-Day Hospital Readmission Risk Prediction Model

    PubMed Central

    Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat

    2013-01-01

    Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients who need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use rigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied to the patient cohort, which was then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models.
The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068
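The development pattern described above (derivation/validation split, logistic regression, discrimination measured by the C-statistic) can be sketched as follows. The features and outcome are synthetic placeholders, not the ACC cohort, and the stepwise variable-selection step is omitted because scikit-learn has no built-in stepwise procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in features (e.g. prior utilization, labs); illustrative only.
n = 3000
X = rng.normal(size=(n, 5))
logit = -1.0 + 1.2 * X[:, 0] + 0.7 * X[:, 1]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Split the cohort into derivation and internal-validation sets.
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

# For a binary outcome, the C-statistic equals the area under the ROC curve.
c_statistic = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation C-statistic: {c_statistic:.2f}")
```

External validation, as the abstract describes it, would repeat the last step on a cohort from a different institution, followed by recalibration of the intercept and coefficients if the model is poorly calibrated there.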

  6. Artificial intelligence: a new approach for prescription and monitoring of hemodialysis therapy.

    PubMed

    Akl, A I; Sobh, M A; Enab, Y M; Tattersall, J

    2001-12-01

    The effect of dialysis on patients is conventionally predicted using a formal mathematical model. This approach requires many assumptions about the processes involved, and validation of these may be difficult. The validity of dialysis urea modeling using a formal mathematical model has been challenged. Artificial intelligence using neural networks (NNs) has been used to solve complex problems without needing a mathematical model or an understanding of the mechanisms involved. In this study, we applied an NN model to study and predict concentrations of urea during a hemodialysis session. We measured blood concentrations of urea, patient weight, and total urea removal by direct dialysate quantification (DDQ) at 30-minute intervals during the session (in 15 chronic hemodialysis patients). The NN model was trained to recognize the evolution of measured urea concentrations and was subsequently able to predict the hemodialysis session time needed to reach a target solute removal index (SRI) in patients not previously studied by the NN model (in another 15 chronic hemodialysis patients). Comparing results of the NN model with the DDQ model, the prediction error was 10.9%, with no significant difference between predicted total urea nitrogen (UN) removal and measured UN removal by DDQ. NN model predictions of time showed no significant difference from the actual intervals needed to reach the same SRI level under the same patient conditions, except for the prediction of SRI at the first 30-minute interval, which showed a significant difference (P = 0.001). This indicates the sensitivity of the NN model to what is called patient clearance time; the prediction error was 8.3%. From our results, we conclude that artificial intelligence applications in urea kinetics can give an idea of intradialysis profiling according to individual clinical needs.
In theory, this approach can be extended easily to other solutes, making the NN model a step forward to achieving artificial-intelligent dialysis control.
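The idea of learning intradialytic urea evolution directly from measurements, rather than from a formal kinetic model, can be sketched with a small feed-forward network. The synthetic sessions below assume a simple exponential decay with noise; they stand in for (and are in no way derived from) the patient data used in the study:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

def make_sessions(n):
    """Synthetic hemodialysis sessions: urea falls roughly exponentially
    with time. All values are invented, not patient data."""
    c0 = rng.uniform(20.0, 40.0, size=n)   # initial urea concentration
    k = rng.uniform(0.3, 0.6, size=n)      # effective removal rate, 1/h
    t = rng.uniform(0.0, 4.0, size=n)      # time into the session, h
    c = c0 * np.exp(-k * t) + rng.normal(scale=0.3, size=n)
    return np.column_stack([c0, k, t]), c

X_train, y_train = make_sessions(4000)
X_test, y_test = make_sessions(500)

# A small network learns the concentration curve without an explicit model.
nn = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
).fit(X_train, y_train)

# Mean relative prediction error, analogous in spirit to the ~10% reported above.
rel_err = float(np.mean(np.abs(nn.predict(X_test) - y_test) / y_test))
print(f"mean relative error: {100 * rel_err:.1f}%")
```

Predicting the session time needed to reach a target SRI, as in the study, would invert this mapping: scan `t` for a given patient's inputs until the predicted concentration reaches the target.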

  7. A model to predict accommodations needed by disabled persons.

    PubMed

    Babski-Reeves, Kari; Williams, Sabrina; Waters, Tzer Nan; Crumpton-Young, Lesia L; McCauley-Bell, Pamela

    2005-09-01

    In this paper, several approaches to assist employers in the accommodation process for disabled employees are discussed and a mathematical model is proposed to assist employers in predicting the accommodation level needed by an individual with a mobility-related disability. This study investigates the validity and reliability of this model in assessing the accommodation level needed by individuals utilizing data collected from twelve individuals with mobility-related disabilities. Based on the results of the statistical analyses, this proposed model produces a feasible preliminary measure for assessing the accommodation level needed for persons with mobility-related disabilities. Suggestions for practical application of this model in an industrial setting are addressed.

  8. Incorporating uncertainty in predictive species distribution modelling.

    PubMed

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  9. Self-determination theory and diminished functioning: the role of interpersonal control and psychological need thwarting.

    PubMed

    Bartholomew, Kimberley J; Ntoumanis, Nikos; Ryan, Richard M; Bosch, Jos A; Thøgersen-Ntoumani, Cecilie

    2011-11-01

    Drawing from self-determination theory, three studies explored the social-environmental conditions that satisfy versus thwart psychological needs and, in turn, affect psychological functioning and well-being or ill-being. In cross-sectional Studies 1 and 2, structural equation modeling analyses supported latent factor models in which need satisfaction was predicted by athletes' perceptions of autonomy support, and need thwarting was better predicted by coach control. Athletes' perceptions of need satisfaction predicted positive outcomes associated with sport participation (vitality and positive affect), whereas need thwarting more consistently predicted maladaptive outcomes (disordered eating, burnout, depression, negative affect, and physical symptoms). In addition, athletes' perceptions of psychological need thwarting were significantly associated with perturbed physiological arousal (elevated levels of secretory immunoglobulin A) prior to training. The final study involved the completion of a diary and supported the relations observed in the cross-sectional studies at a daily level. These findings have important implications for the operationalization and measurement of interpersonal styles and psychological needs.

  10. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  11. Goal striving, goal attainment, and well-being: adapting and testing the self-concordance model in sport.

    PubMed

    Smith, Alison; Ntoumanis, Nikos; Duda, Joan

    2007-12-01

    Grounded in self-determination theory (Deci & Ryan, 1985) and the self-concordance model (Sheldon & Elliot, 1999), this study examined the motivational processes underlying goal striving in sport as well as the role of perceived coach autonomy support in the goal process. Structural equation modeling with a sample of 210 British athletes showed that autonomous goal motives positively predicted effort, which, in turn, predicted goal attainment. Goal attainment was positively linked to need satisfaction, which, in turn, predicted psychological well-being. Effort and need satisfaction were found to mediate the associations between autonomous motives and goal attainment and between attainment and well-being, respectively. Controlled motives negatively predicted well-being, and coach autonomy support positively predicted both autonomous motives and need satisfaction. Associations of autonomous motives with effort were not reducible to goal difficulty, goal specificity, or goal efficacy. These findings support the self-concordance model as a framework for further research on goal setting in sport.

  12. Mine Burial Assessment State-of the Art in Prediction and Modeling Workshop and Initiation of Technical Program

    DTIC Science & Technology

    2000-09-30

    Burial Assessment State-of-the-Art Science, Technology, and Modeling. A Review of Coastal Research, Modeling, and Naval Operational Needs in Shallow Water...the ONR Mine Burial Prediction Program are summarized below. 1) Completed comprehensive technical reports: a. Mine Burial Assessment, State-of-the-Art Science, Technology, and Modeling. A Review of Coastal Research, Modeling, and Naval Operational Needs in Shallow Water Environments with

  13. Web 2.0 Articles: Content Analysis and a Statistical Model to Predict Recognition of the Need for New Instructional Design Strategies

    ERIC Educational Resources Information Center

    Liu, Leping; Maddux, Cleborne D.

    2008-01-01

    This article presents a study of Web 2.0 articles intended to (a) analyze the content of what is written and (b) develop a statistical model to predict whether authors write about the need for new instructional design strategies and models. Eighty-eight technology articles were subjected to lexical analysis and a logistic regression model was…

  14. Implications of a Need-Press-Competence Model for Institutionalized Elderly.

    ERIC Educational Resources Information Center

    Wirzbicki, Philip J.; Smith, Barry D.

    The predictive utility of a proposed need-press-competence (NPC) model of satisfaction was compared with that of the traditional need-press fit model. Structured interviews with 30 residents from two nursing homes provided measures of needs, press, competence, and satisfaction. The NPC model was a better predictor of expressed satisfaction than…

  15. Assimilation of Satellite Data to Improve Cloud Simulation in WRF Model

    NASA Astrophysics Data System (ADS)

    Park, Y. H.; Pour Biazar, A.; McNider, R. T.

    2012-12-01

    A simple approach has been introduced to improve cloud simulation spatially and temporally in a meteorological model. The first step of this approach is to use Geostationary Operational Environmental Satellite (GOES) observations to identify clouds and estimate the cloud structure. Then, by comparing GOES observations to the model cloud field, we identify areas in which the model has under-predicted or over-predicted clouds. Next, by introducing subsidence in areas with over-prediction and lifting in areas with under-prediction, erroneous clouds are removed and new clouds are formed. The technique estimates a vertical velocity needed for the cloud correction and then uses a one-dimensional variational scheme (1D-Var) to calculate the horizontal divergence components and the consequent horizontal wind components needed to sustain such vertical velocity. Finally, the new horizontal winds are provided as a nudging field to the model. This nudging provides the dynamical support needed to create or clear clouds in a sustainable manner. The technique was implemented and tested in the Weather Research and Forecasting (WRF) Model and resulted in substantial improvement in model-simulated clouds. Some of the results are presented here.
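The mass-continuity relationship at the heart of this technique (a prescribed vertical velocity implies a horizontal divergence profile) can be illustrated in a simplified incompressible column, where div_h(V) = -∂w/∂z. The height grid and velocity profile below are invented examples, not WRF output:

```python
import numpy as np

# Height levels (m) and a target vertical-velocity profile w(z) that peaks
# mid-column, as might be prescribed to build a cloud (illustrative values).
z = np.linspace(0.0, 10_000.0, 101)
w = 0.5 * np.sin(np.pi * z / 10_000.0)   # m/s, upward in the cloud layer

# Incompressible continuity: horizontal divergence must balance -dw/dz.
div_h = -np.gradient(w, z)               # 1/s

# Convergence (negative divergence) below the w maximum feeds the updraft;
# divergence above it vents the column.
print(div_h[10], div_h[90])
```

A 1D-Var scheme as described in the abstract would go one step further, distributing this divergence between the two horizontal wind components to produce the nudging field.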

  16. The North American Multi-Model Ensemble (NMME): Phase-1 Seasonal to Interannual Prediction, Phase-2 Toward Developing Intra-Seasonal Prediction

    NASA Technical Reports Server (NTRS)

    Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily

    2013-01-01

    The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has proven to produce better prediction quality (on average) than any single-model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011), a collaborative and coordinated implementation strategy for an NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and discusses the complementary skill associated with individual models.
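The central claim above (the multi-model mean is, on average, better than any single model) is easy to demonstrate with synthetic forecasts whose biases and noise are independent across models; the numbers below are invented and illustrate the variance-cancellation mechanism only:

```python
import numpy as np

rng = np.random.default_rng(3)

truth = rng.normal(size=1000)                 # verifying observations
n_models = 6

# Each model = truth + its own bias + independent noise (illustrative).
biases = rng.normal(scale=0.3, size=n_models)
forecasts = truth + biases[:, None] + rng.normal(scale=1.0, size=(n_models, 1000))

def rmse(f):
    return float(np.sqrt(np.mean((f - truth) ** 2)))

single_rmses = [rmse(f) for f in forecasts]
ensemble_rmse = rmse(forecasts.mean(axis=0))

print(f"best single model RMSE: {min(single_rmses):.3f}")
print(f"multi-model mean RMSE:  {ensemble_rmse:.3f}")
```

Averaging cancels the independent error components, which is why the multi-model mean typically beats even the best individual member; real model errors are partially correlated, so the gain in practice is smaller than in this idealized sketch.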

  17. Bringing modeling to the masses: A web based system to predict potential species distributions

    USGS Publications Warehouse

    Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul

    2010-01-01

    Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.

  18. Towards more accurate and reliable predictions for nuclear applications

    NASA Astrophysics Data System (ADS)

    Goriely, Stephane; Hilaire, Stephane; Dubray, Noel; Lemaître, Jean-François

    2017-09-01

    The need for nuclear data far from the valley of stability, for applications such as nuclear astrophysics or future nuclear facilities, challenges the robustness as well as the predictive power of present nuclear models. Most of the nuclear data evaluation and prediction are still performed on the basis of phenomenological nuclear models. For the last decades, important progress has been achieved in fundamental nuclear physics, making it now feasible to use more reliable, but also more complex microscopic or semi-microscopic models in the evaluation and prediction of nuclear data for practical applications. Nowadays mean-field models can be tuned at the same level of accuracy as the phenomenological models, renormalized on experimental data if needed, and therefore can replace the phenomenological inputs in the evaluation of nuclear data. The latest achievements to determine nuclear masses within the non-relativistic HFB approach, including the related uncertainties in the model predictions, are discussed. Similarly, recent efforts to determine fission observables within the mean-field approach are described and compared with more traditional existing models.

  19. Automatically updating predictive modeling workflows support decision-making in drug design.

    PubMed

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.

  20. Comparison of prediction models for use of medical resources at urban auto-racing events.

    PubMed

    Nable, Jose V; Margolis, Asa M; Lawner, Benjamin J; Hirshon, Jon Mark; Perricone, Alexander J; Galvagno, Samuel M; Lee, Debra; Millin, Michael G; Bissell, Richard A; Alcorta, Richard L

    2014-12-01

    Introduction: Predicting the number of patient encounters and transports during mass gatherings can be challenging. The nature of these events necessitates that proper resources are available to meet the needs that arise. Several prediction models to assist event planners in forecasting medical utilization have been proposed in the literature. The objective of this study was to determine the accuracy of the Arbon and Hartman models in predicting the number of patient encounters and transports from the Baltimore Grand Prix (BGP), held in 2011 and 2012. It was hypothesized that the Arbon method, which utilizes regression-derived equations for estimation, would be more accurate than the Hartman model, which categorizes events into only three discrete severity types. This retrospective analysis of the BGP utilized data collected from an electronic patient tracker system. The actual number of patients evaluated and transported at the BGP was tabulated and compared to the numbers predicted by the two studied models. Several environmental features, including weather, crowd attendance, and presence of alcohol, were used in the Arbon and Hartman models. Approximately 130,000 spectators attended the first event, and approximately 131,000 attended the second. The number of patient encounters per day ranged from 19 to 57 in 2011, and the number of transports from the scene ranged from two to nine. In 2012, the number of patients ranged from 19 to 44 per day, and the number of transports to emergency departments ranged from four to nine. With the exception of one day in 2011, the Arbon model over-predicted the number of encounters. For both events, the Hartman model over-predicted the number of patient encounters. In regard to hospital transports, the Arbon model under-predicted the actual numbers, whereas the Hartman model both over-predicted and under-predicted the number of transports, varying by day.
    These findings call attention to the need for a versatile model that can more accurately predict the number of patient encounters and transports associated with mass-gathering events, so that medical needs can be anticipated and sufficient resources provided.
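The contrast between the two model families (a regression on event features vs. a small set of fixed per-category rates) can be sketched with invented event data. The coefficients and rates below bear no relation to the published Arbon or Hartman equations:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic events: attendance (units of 10,000), temperature, alcohol flag.
n = 200
attendance = rng.uniform(1.0, 15.0, size=n)
temperature = rng.uniform(10.0, 35.0, size=n)    # deg C
alcohol = rng.integers(0, 2, size=n)

# "True" patient-encounter counts (invented): grow with crowd size and heat.
encounters = rng.poisson(3.0 * attendance + 0.4 * temperature + 5.0 * alcohol)

# Regression-style model (Arbon-like): least-squares fit on event features.
X = np.column_stack([np.ones(n), attendance, temperature, alcohol])
coef, *_ = np.linalg.lstsq(X, encounters, rcond=None)
pred_regression = X @ coef

# Category-style model (Hartman-like): one fixed rate per event category.
pred_category = np.where(alcohol == 1,
                         encounters[alcohol == 1].mean(),
                         encounters[alcohol == 0].mean())

def mae(pred):
    return float(np.mean(np.abs(pred - encounters)))

print(f"regression MAE: {mae(pred_regression):.1f}")
print(f"category MAE:   {mae(pred_category):.1f}")
```

The regression model wins here because it uses continuous event features the categorical model ignores, which mirrors the study's hypothesis, even though at the BGP both published models missed in practice.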

  1. Prediction of air temperature for thermal comfort of people using sleeping bags: a review

    NASA Astrophysics Data System (ADS)

    Huang, Jianhua

    2008-11-01

    Six models for determining air temperatures for thermal comfort of people using sleeping bags were reviewed. These models were based on distinctive metabolic rates and mean skin temperatures. All model predictions of air temperatures are low when the insulation values of the sleeping bag are high. Nevertheless, prediction variations are greatest for the sleeping bags with high insulation values, and there is a high risk of hypothermia if an inappropriate sleeping bag is chosen for the intended conditions of use. There is, therefore, a pressing need to validate the models by wear trial and determine which one best reflects ordinary consumer needs.

  3. Overview of the 1986--1987 atomic mass predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haustein, P.E.

    1988-07-01

    The need for a comprehensive update of earlier sets of atomic mass predictions is documented. A project that grew from this need and which resulted in the preparation of the 1986--1987 Atomic Mass Predictions is summarized. Ten sets of new mass predictions and expository text from a variety of types of mass models are combined with the latest evaluation of experimentally determined atomic masses. The methodology employed in constructing these mass predictions is outlined. The models are compared with regard to their reproduction of the experimental mass surface and their use of varying numbers of adjustable parameters. Plots are presented, for each set of predictions, of differences between model calculations and the measured masses. These plots may be used to estimate the reliability of the new mass predictions in unmeasured regions that border the experimentally known mass surface. Copyright 1988 Academic Press, Inc.

  4. Toward Process-resolving Synthesis and Prediction of Arctic Climate Change Using the Regional Arctic System Model

    NASA Astrophysics Data System (ADS)

    Maslowski, W.

    2017-12-01

    The Regional Arctic System Model (RASM) has been developed to better understand the operation of the Arctic System at process scale and to improve prediction of its change across a spectrum of time scales. RASM is a pan-Arctic, fully coupled ice-ocean-atmosphere-land model with a marine biogeochemistry extension to the ocean and sea ice models. The main goal of our research is to advance a system-level understanding of critical processes and feedbacks in the Arctic and their links with the Earth System. The secondary, and equally important, objective is to identify model needs for new or additional observations to better understand such processes and to help constrain models. Finally, RASM has been used to produce sea ice forecasts for September 2016 and 2017, in contribution to the Sea Ice Outlook of the Sea Ice Prediction Network. Future RASM forecasts are likely to include increased resolution for model components and ecosystem predictions. Such research is in direct support of US environmental assessment and prediction needs, including those of the U.S. Navy, Department of Defense, and the recent IARPC Arctic Research Plan 2017-2021. In addition to an overview of RASM technical details, selected model results are presented from a hierarchy of climate models together with available observations in the region to better understand potential oceanic contributions to polar amplification. RASM simulations are analyzed to evaluate model skill in representing seasonal climatology as well as interannual and multi-decadal climate variability and predictions. Selected physical processes and resulting feedbacks are discussed to emphasize the need for fully coupled climate model simulations, high model resolution, and the sensitivity of simulated sea ice states to scale-dependent model parameterizations controlling ice dynamics, thermodynamics, and coupling with the atmosphere and ocean.

  5. A test of basic psychological needs theory in young soccer players: time-lagged design at the individual and team levels.

    PubMed

    González, L; Tomás, I; Castillo, I; Duda, J L; Balaguer, I

    2017-11-01

    Within the framework of basic psychological needs theory (Deci & Ryan, 2000), multilevel structural equation modeling (MSEM) with a time-lagged design was used to test a mediation model examining the relationship between perceptions of coaches' interpersonal styles (autonomy supportive and controlling), athletes' basic psychological needs (satisfaction and thwarting), and indicators of well-being (subjective vitality) and ill-being (burnout), estimating separately between and within effects. The participants were 597 Spanish male soccer players aged between 11 and 14 years (M = 12.57, SD = 0.54) from 40 teams who completed a questionnaire package at two time points in a competitive season. Results revealed that at the individual level, athletes' perceptions of autonomy support positively predicted athletes' need satisfaction (autonomy, competence, and relatedness), whereas athletes' perceptions of controlling style positively predicted athletes' need thwarting (autonomy, competence, and relatedness). In turn, all three athletes' need satisfaction dimensions predicted athletes' subjective vitality and burnout (positively and negatively, respectively), whereas competence thwarting negatively predicted subjective vitality and competence and relatedness positively predicted burnout. At the team level, team perceptions of autonomy supportive style positively predicted team autonomy and relatedness satisfaction. Mediation effects only appeared at the individual level. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Unhealthy weight control behaviours in adolescent girls: a process model based on self-determination theory.

    PubMed

    Thøgersen-Ntoumani, Cecilie; Ntoumanis, Nikos; Nikitaras, Nikitas

    2010-06-01

    This study used self-determination theory (Deci, E.L., & Ryan, R.M. (2000). The 'what' and 'why' of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11, 227-268.) to examine predictors of body image concerns and unhealthy weight control behaviours in a sample of 350 Greek adolescent girls. A process model was tested which proposed that perceptions of parental autonomy support and two life goals (health and image) would predict adolescents' degree of satisfaction of their basic psychological needs. In turn, psychological need satisfaction was hypothesised to negatively predict body image concerns (i.e. drive for thinness and body dissatisfaction) and, indirectly, unhealthy weight control behaviours. The predictions of the model were largely supported indicating that parental autonomy support and adaptive life goals can indirectly impact upon the extent to which female adolescents engage in unhealthy weight control behaviours via facilitating the latter's psychological need satisfaction.

  7. The need for data science in epidemic modelling. Comment on: "Mathematical models to characterize early epidemic growth: A review" by Gerardo Chowell et al.

    NASA Astrophysics Data System (ADS)

    Danon, Leon; Brooks-Pollock, Ellen

    2016-09-01

    In their review, Chowell et al. consider the ability of mathematical models to predict early epidemic growth [1]. In particular, they question the central prediction of classical differential equation models that the number of cases grows exponentially during the early stages of an epidemic. Using examples including HIV and Ebola, they argue that classical models fail to capture key qualitative features of early growth and describe a selection of models that do capture non-exponential epidemic growth. An implication of this failure is that predictions may be inaccurate and unusable, highlighting the need for care when embarking upon modelling using classical methodology. There remains a lack of understanding of the mechanisms driving many observed epidemic patterns; we argue that data science should form a fundamental component of epidemic modelling, providing a rigorous methodology for data-driven approaches, rather than trying to enforce established frameworks. The need for refinement of classical models provides a strong argument for the use of data science, to identify qualitative characteristics and pinpoint the mechanisms responsible for the observed epidemic patterns.
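
    The contrast between classical exponential growth and the sub-exponential patterns Chowell et al. describe can be sketched with the generalized-growth equation dC/dt = r·C^p, where p = 1 recovers the classical prediction and p < 1 decelerates growth. The following is an illustrative simulation only; the rate and deceleration values are arbitrary, not fitted to any outbreak:

    ```python
    def generalized_growth(r, p, c0=1.0, days=30, dt=0.1):
        """Integrate dC/dt = r * C**p by forward Euler.

        p = 1 recovers the exponential early growth of classical models;
        p < 1 yields the sub-exponential growth discussed by Chowell et al.
        """
        c = c0
        series = [c]
        for _ in range(int(days / dt)):
            c += dt * r * c ** p
            series.append(c)
        return series

    # Same growth rate, different deceleration: illustrative values only.
    exp_growth = generalized_growth(r=0.3, p=1.0)   # classical exponential
    sub_exp = generalized_growth(r=0.3, p=0.7)      # sub-exponential
    ```

    Plotting the two series side by side shows why fitting a classical (p = 1) model to sub-exponentially growing case counts over-predicts later incidence.
    
    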

  8. Within-person variation in security of attachment: a self-determination theory perspective on attachment, need fulfillment, and well-being.

    PubMed

    La Guardia, J G; Ryan, R M; Couchman, C E; Deci, E L

    2000-09-01

    Attachment research has traditionally focused on individual differences in global patterns of attachment to important others. The current research instead focuses primarily on within-person variability in attachments across relational partners. It was predicted that within-person variability would be substantial, even among primary attachment figures of mother, father, romantic partner, and best friend. The prediction was supported in three studies. Furthermore, in line with self-determination theory, multilevel modeling and regression analyses showed that, at the relationship level, individuals' experience of fulfillment of the basic needs for autonomy, competence, and relatedness positively predicted overall attachment security, model of self, and model of other. Relations of both attachment and need satisfaction to well-being were also explored.

  9. Profile and predictors of service needs for families of children with autism spectrum disorders

    PubMed Central

    Zwaigenbaum, Lonnie; Nicholas, David

    2015-01-01

    Purpose: Increasing demand for autism services is straining service systems. Tailoring services to best meet families’ needs could improve their quality of life and decrease burden on the system. We explored overall, best, and worst met service needs, and predictors of those needs, for families of children with autism spectrum disorders. Methods: Parents of 143 children with autism spectrum disorders (2–18 years) completed a survey including demographic and descriptive information, the Family Needs Survey–Revised, and an open-ended question about service needs. Descriptive statistics were used to characterize the sample and to determine the degree to which items were identified and met as needs. Predictors of total and unmet needs were modeled with regression or generalized linear models. Qualitative responses were thematically analyzed. Results: The most frequently identified overall and unmet service needs were information on services, family support, and respite care. The funding and quality of professional support available were viewed positively. Younger child age, lower income, and older maternal age predicted more total needs. Having an older child or mother, lower income, and disruptive behaviors predicted more total unmet needs, yet only disruptive behaviors predicted proportional unmet need. Child’s language or intellectual abilities did not predict needs. Conclusion: Findings can help professionals, funders, and policy-makers tailor services to best meet families’ needs. PMID:25073749

  10. Modeling moisture content of fine dead wildland fuels: Input to the BEHAVE fire prediction system

    Treesearch

    Richard C. Rothermel; Ralph A. Wilson; Glen A. Morris; Stephen S. Sackett

    1986-01-01

    Describes a model for predicting moisture content of fine fuels for use with the BEHAVE fire behavior and fuel modeling system. The model is intended to meet the need for more accurate predictions of fine fuel moisture, particularly in northern conifer stands and on days following rain. The model is based on the Canadian Fine Fuel Moisture Code (FFMC), modified to...

  11. A way forward for fire-caused tree mortality prediction: Modeling a physiological consequence of fire

    Treesearch

    Kathleen L. Kavanaugh; Matthew B. Dickinson; Anthony S. Bova

    2010-01-01

    Current operational methods for predicting tree mortality from fire injury are regression-based models that only indirectly consider underlying causes and, thus, have limited generality. A better understanding of the physiological consequences of tree heating and injury is needed to develop biophysical process models that can make predictions under changing or novel...

  12. Progress in space weather predictions and applications

    NASA Astrophysics Data System (ADS)

    Lundstedt, H.

    Today's methods for predicting space weather and its effects are far more advanced than before: yesterday's statistical methods have been replaced by integrated knowledge-based neuro-computing models and MHD methods. Within the ESA Space Weather Programme Study, a real-time forecast service has been developed for space weather and its effects. This prototype is now being implemented for specific users. Today's applications are not only far more numerous but also much more advanced and user-oriented. A scientist needs real-time predictions of a global index as input for an MHD model calculating the radiation dose for EVAs. A power company system operator needs a prediction of the local value of a geomagnetically induced current. A science tourist needs to know whether or not aurora will occur. Soon we might even be able to predict the tropospheric climate changes and weather caused by space weather.

  13. Quality metrics for sensor images

    NASA Technical Reports Server (NTRS)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftoning optimizing methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. 
Most image quality models are designed for static imagery. Watson has been developing a general spatial-temporal vision model to optimize video compression techniques. These models need to be adapted and calibrated for AVID applications.
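
    The discrimination model described above takes two arbitrary images A and B and returns an estimate of the probability that an observer reports them as different. A minimal sketch of that idea, assuming (hypothetically) Minkowski pooling of pixel differences fed through a Weibull psychometric function; the `sensitivity` and `beta` constants are placeholders, not the calibrated parameters of the NASA Ames models:

    ```python
    import numpy as np

    def discrimination_probability(a, b, sensitivity=0.02, beta=4.0):
        """Estimate P(observer reports A != B) for two grayscale images.

        Pools per-pixel differences with a Minkowski norm (beta = 4 is a
        common probability-summation exponent in vision models) and maps the
        pooled signal through a Weibull psychometric function. The constants
        are illustrative placeholders, not calibrated model parameters.
        """
        d = np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float))
        pooled = np.sum(d.ravel() ** beta) ** (1.0 / beta)
        return 1.0 - np.exp(-((sensitivity * pooled) ** beta))

    # Identical images -> probability 0; grossly different -> near 1.
    same = discrimination_probability(np.zeros((8, 8)), np.zeros((8, 8)))
    diff = discrimination_probability(np.zeros((8, 8)), np.full((8, 8), 255.0))
    ```

    For AVID use this per-image-pair estimate would still need the extensions the text calls for: signal-location uncertainty and a latency prediction, neither of which this sketch attempts.
    
    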

  14. Challenges facing developers of CAD/CAM models that seek to predict human working postures

    NASA Astrophysics Data System (ADS)

    Wiker, Steven F.

    2005-11-01

    This paper outlines the need for development of human posture prediction models for Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) design applications in product, facility and work design. Challenges facing developers of posture prediction algorithms are presented and discussed.

  15. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combination free replacement and pro rata warranty policy is analysed as warranty model for one type of light bulbs. Since operating conditions have a great impact on product reliability, they need to be considered in such analysis. A neural network model is used to predict light bulb reliability characteristics based on the data from the tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in Monte Carlo simulation for the prediction of times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In this way, the manufacturer can lower costs and increase profit.
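
    The simulation step can be sketched as follows, with Weibull lifetimes standing in for the reliability characteristics the paper derives from its neural network model; the policy limits, unit cost, and distribution parameters are illustrative assumptions:

    ```python
    import random

    def expected_warranty_cost(shape, scale, w_free, w_prorata, unit_cost,
                               n_sims=20_000, seed=1):
        """Monte Carlo estimate of expected per-unit warranty cost under a
        combination free-replacement / pro-rata policy.

        Failures before w_free cost the manufacturer the full unit_cost;
        failures between w_free and w_free + w_prorata are rebated on a
        linearly decreasing pro-rata basis; later failures cost nothing.
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_sims):
            t = rng.weibullvariate(scale, shape)    # simulated time to failure
            if t < w_free:
                total += unit_cost                  # free replacement
            elif t < w_free + w_prorata:
                total += unit_cost * (w_free + w_prorata - t) / w_prorata
        return total / n_sims
    ```

    Sweeping `w_free` and `w_prorata` over candidate policies and comparing the resulting expected costs is the optimisation step the abstract describes.
    
    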

  16. Goal striving within agentic and communal roles: separate but functionally similar pathways to enhanced well-being.

    PubMed

    Sheldon, Kennon M; Cooper, M Lynne

    2008-06-01

    Do agency and communion strivings provide functionally similar but predictively independent pathways to enhanced well-being? We tested this idea via a year-long study of 493 diverse community adults. Our process model, based on self-determination and motive disposition theories, fit the data well. First, the need for achievement predicted initial autonomous motivation for agentic (work and school) role-goals and the need for intimacy predicted felt autonomy for communal (relationship and parenting) goals. For both agentic and communal goals, autonomous motivation predicted corresponding initial expectancies that predicted later goal attainment. Finally, each type of attainment predicted improved adjustment or role-satisfaction over the year. Besides being similar across agency and communion, the model was also similar across race and gender, except that the beneficial effects of communal goal attainment were stronger for high need for intimacy women and Blacks. Implications for agency/communion theories, motivation theories, and theories of well-being are discussed.

  17. Perspectives on multifield models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, S.

    1997-07-01

    Multifield models for prediction of nuclear reactor thermalhydraulics are reviewed from the viewpoint of their structure and requirements for closure relationships. Their strengths and weaknesses are illustrated with examples, indicating that they are effective in predicting separated and distributed flow regimes, but have problems for flows with large oscillations. Needs for multifield models are also discussed in the context of reactor operations and accident simulations. The highest priorities for future developments appear to relate to closure relationships for three-dimensional multifield models with emphasis on those needed for calculations of phase separation and entrainment/de-entrainment in complex geometries.

  18. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Jason; Winkler, Jon

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.

  19. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE PAGES

    Woods, Jason; Winkler, Jon

    2018-01-31

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.
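
    The sensitivity-screening idea can be illustrated with a toy stand-in for the EMPD model: sample the uncertain inputs, evaluate the model, and rank the inputs by correlation with the output. The `toy_empd_humidity` response below is invented for this sketch and is not the published two-layer formulation:

    ```python
    import random

    def toy_empd_humidity(depth, diffusivity, area):
        """Hypothetical stand-in for a predicted indoor-humidity swing; NOT
        the published EMPD model. By construction, depth dominates the
        response, diffusivity matters moderately, and area barely at all."""
        return 10.0 / (1.0 + 5.0 * depth) + 0.5 * diffusivity + 0.002 * area

    def pearson(xs, ys):
        """Pearson correlation, inlined to keep the sketch self-contained."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    def rank_inputs(n=2000, seed=3):
        """Crude global sensitivity screen: sample the uncertain inputs
        uniformly, run the model, and rank inputs by |correlation|."""
        rng = random.Random(seed)
        samples = [(rng.uniform(0.001, 0.02),   # penetration depth, m
                    rng.uniform(0.0, 1.0),      # scaled moisture diffusivity
                    rng.uniform(5.0, 50.0))     # exposed material area, m^2
                   for _ in range(n)]
        outputs = [toy_empd_humidity(*s) for s in samples]
        names = ["depth", "diffusivity", "area"]
        return {name: abs(pearson([s[i] for s in samples], outputs))
                for i, name in enumerate(names)}
    ```

    A correlation screen like this only detects monotonic effects; variance-based methods of the kind the paper applies also capture interactions, at higher computational cost.
    
    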

  20. A model for prediction of STOVL ejector dynamics

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1989-01-01

    A semi-empirical control-volume approach to ejector modeling for transient performance prediction is presented. This new approach is motivated by the need for a predictive real-time ejector sub-system simulation for Short Take-Off and Vertical Landing (STOVL) integrated flight and propulsion controls design applications. Emphasis is placed on discussion of the approximate characterization of the mixing process central to thrust augmenting ejector operation. The proposed ejector model suggests transient flow predictions are possible with a model based on steady-flow data. A practical test case is presented to illustrate model calibration.

  1. Efficacy of the Supports Intensity Scale (SIS) to Predict Extraordinary Support Needs

    ERIC Educational Resources Information Center

    Wehmeyer, Michael; Chapman, Theodore E.; Little, Todd D.; Thompson, James R.; Schalock, Robert; Tasse, Marc J.

    2009-01-01

    Data were collected on 274 adults to investigate the efficacy of the Supports Intensity Scale (SIS) as a tool to measure the support needs of individuals with intellectual and related developmental disabilities. Findings showed that SIS scores contributed significantly to a model that predicted greater levels of support need. Moreover, scores from…

  2. Recent development of risk-prediction models for incident hypertension: An updated systematic review

    PubMed Central

    Xiao, Lei; Liu, Ya; Wang, Zuoguang; Li, Chuang; Jin, Yongxin; Zhao, Qiong

    2017-01-01

    Background Hypertension is a leading global health threat and a major cardiovascular disease. Since clinical interventions are effective in delaying the disease progression from prehypertension to hypertension, diagnostic prediction models to identify patient populations at high risk for hypertension are imperative. Methods Both PubMed and Embase databases were searched for eligible reports of either prediction models or risk scores of hypertension. The study data were collected, including risk factors, statistic methods, characteristics of study design and participants, performance measurement, etc. Results From the searched literature, 26 studies reporting 48 prediction models were selected. Among them, 20 reports studied the established models using traditional risk factors, such as body mass index (BMI), age, smoking, blood pressure (BP) level, parental history of hypertension, and biochemical factors, whereas 6 reports used genetic risk score (GRS) as the prediction factor. AUC ranged from 0.64 to 0.97, and C-statistic ranged from 60% to 90%. Conclusions The traditional models are still the predominant risk prediction models for hypertension, but recently, more models have begun to incorporate genetic factors as part of their model predictors. However, these genetic predictors need to be well selected. The current reported models have acceptable to good discrimination and calibration ability, but whether the models can be applied in clinical practice still needs more validation and adjustment. PMID:29084293
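
    The C-statistic these models report is the probability that a randomly chosen future hypertensive receives a higher risk score than a randomly chosen non-case. A self-contained sketch of that computation (the scores below are invented, not from any published risk equation):

    ```python
    def c_statistic(scores, labels):
        """C-statistic (equivalently, AUC): the probability that a randomly
        chosen case (label 1) gets a higher risk score than a randomly
        chosen non-case (label 0); tied scores count one half."""
        cases = [s for s, y in zip(scores, labels) if y == 1]
        controls = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum(1.0 if c > k else 0.5 if c == k else 0.0
                   for c in cases for k in controls)
        return wins / (len(cases) * len(controls))

    # Toy risk scores, e.g., as a logistic model on BMI, age, and baseline
    # blood pressure might produce; the numbers are purely illustrative.
    auc = c_statistic([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0, 0])
    ```

    Good discrimination (a high C-statistic) says nothing about calibration, which is why the review assesses the two properties separately.
    
    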

  3. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  4. Prediction of brittleness based on anisotropic rock physics model for kerogen-rich shale

    NASA Astrophysics Data System (ADS)

    Qian, Ke-Ran; He, Zhi-Liang; Chen, Ye-Quan; Liu, Xi-Wu; Li, Xiang-Yang

    2017-12-01

    The construction of a shale rock physics model and the selection of an appropriate brittleness index (BI) are two significant steps that can influence the accuracy of brittleness prediction. On one hand, the existing models of kerogen-rich shale are controversial, so a reasonable rock physics model needs to be built. On the other hand, several types of equations already exist for predicting the BI, whose feasibility needs to be carefully considered. This study constructed a kerogen-rich rock physics model by applying the self-consistent approximation and the differential effective medium theory to model intercoupled clay and kerogen mixtures. The feasibility of our model was confirmed by comparison with classical models, showing better accuracy. Templates were constructed based on our model to link physical properties and the BI. Different equations for the BI had different sensitivities, making them suitable for different types of formations. Equations based on Young's modulus were sensitive to variations in lithology, while those using Lamé's coefficients were sensitive to porosity and pore fluids. Physical information must be considered to improve brittleness prediction.

  5. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    PubMed

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have introduced new integrated operations and methods into all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve performance, this paper proposes a predictive model that classifies disease predictions into different categories. The model is evaluated on traumatic brain injury (TBI) data sets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impact on human life. The proposed model improves the predictive performance for TBI. The TBI data set was developed, and its features approved, by neurologists. The experimental results show that the proposed model achieves good accuracy, sensitivity, and specificity.
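
    The three reported measures follow directly from the confusion matrix of a binary classifier; a minimal sketch (the labels in the usage below are synthetic, not the TBI data set):

    ```python
    def classification_metrics(y_true, y_pred):
        """Sensitivity, specificity, and accuracy of binary predictions,
        the three measures the proposed TBI model reports (labels: 1 =
        positive class, 0 = negative class)."""
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        return {"sensitivity": tp / (tp + fn),   # true-positive rate
                "specificity": tn / (tn + fp),   # true-negative rate
                "accuracy": (tp + tn) / len(y_true)}
    ```

    For example, with four true positives of which one is missed and four true negatives of which one is misclassified, all three metrics come to 0.75.
    
    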

  6. Progress in Finite Element Modeling of the Lower Extremities

    DTIC Science & Technology

    2015-06-01

    bending and subsequent injury, e.g., the distal tibia motion results in bending of the tibia rather than the tibia rotating about the knee joint... layers, rich anisotropy, and wide variability. Developing a model for predictive injury capability, therefore, needs to be versatile and flexible to... injury capability presents many challenges, the first of which is identifying the types of conditions where injury prediction is needed. Our focus

  7. Broadening the trans-contextual model of motivation: A study with Spanish adolescents.

    PubMed

    González-Cutre, D; Sicilia, Á; Beas-Jiménez, M; Hagger, M S

    2014-08-01

    The original trans-contextual model of motivation proposed that autonomy support from teachers develops students' autonomous motivation in physical education (PE), and that autonomous motivation is transferred from PE contexts to physical activity leisure-time contexts, and predicts attitudes, perceived behavioral control and subjective norms, and forming intentions to participate in future physical activity behavior. The purpose of this study was to test an extended trans-contextual model of motivation including autonomy support from peers and parents and basic psychological needs in a Spanish sample. School students (n = 400) aged between 12 and 18 years completed measures of perceived autonomy support from three sources, autonomous motivation and constructs from the theory of planned behavior at three different points in time and in two contexts, PE and leisure-time. A path analysis controlling for past physical activity behavior supported the main postulates of the model. Autonomous motivation in a PE context predicted autonomous motivation in a leisure-time physical activity context, perceived autonomy support from teachers predicted satisfaction of basic psychological needs in PE, and perceived autonomy support from peers and parents predicted need satisfaction in leisure-time. This study provides a cross-cultural replication of the trans-contextual model of motivation and broadens it to encompass basic psychological needs. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta, and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. 
Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307
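
    The averaged-variable-importance idea described above can be sketched with scikit-learn. This is an illustrative reconstruction, not the authors' code: the synthetic data, the number of repeated fits, and the mean-importance cut-off are all assumptions, and the paper's AVI and KIAVI procedures are more elaborate.

```python
# Sketch of averaged variable importance (AVI) feature selection with a
# random forest. Data and thresholds are illustrative, not the authors'.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=4, random_state=0)

# Average importances over repeated fits to damp run-to-run variation.
n_repeats = 5
importances = np.zeros(X.shape[1])
for seed in range(n_repeats):
    rf = RandomForestClassifier(n_estimators=200, random_state=seed)
    rf.fit(X, y)
    importances += rf.feature_importances_
importances /= n_repeats

# Keep features whose averaged importance exceeds the mean importance.
selected = np.where(importances > importances.mean())[0]
print(sorted(selected))
```

    A backward-elimination loop (dropping the least important predictor and refitting) is a common next step when the goal is the most accurate, rather than the most parsimonious, model.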

  10. PRELIM: Predictive Relevance Estimation from Linked Models

    DTIC Science & Technology

    2014-10-14

    Final report, 11-07-2014 to 14-10-2014; contract N00014-14-P-1185; H. Van Dyke Parunak, Ph.D., Soar Technology, Inc. Executive Summary: PRELIM (Predictive Relevance Estimation from Linked Models) draws on semantic models... The central challenge in proactive decision support is to anticipate the decision and information needs of decision-makers, in the light of likely

  11. Suspended Sediment Modeling of Dredge-Disposal Effluent in the GREAT-II Study Reach,

    DTIC Science & Technology

    1980-03-01

    Excerpt from the report's list of figures: ...Illinois site, Ky = 1000 cm²/sec; 5-11, model prediction vs. field observation for centerline, Rock Island site; 5-12, model prediction vs. field observation for centerline, Keithsburg site; 5-13, deposition rate at all points in plume for Rock... The environmental effects of dredged material disposal to be examined were impairment of the water column and the covering of benthic communities. The need for mathematical models to predict

  12. Predictive models of long-term anatomic outcome in age-related macular degeneration treated with as-needed Ranibizumab.

    PubMed

    Gonzalez-Buendia, Lucia; Delgado-Tirado, Santiago; Sanabria, M Rosa; Fernandez, Itziar; Coco, Rosa M

    2017-08-18

    To analyze predictors and develop predictive models of anatomic outcome in neovascular age-related macular degeneration (AMD) treated with as-needed ranibizumab after 4 years of follow-up. A multicenter, non-interventional, consecutive case series was performed. Clinical, funduscopic, and OCT characteristics of 194 treatment-naïve patients with AMD treated with as-needed ranibizumab for at least 2 years and up to 4 years were analyzed at baseline, 3 months, and each year until the end of follow-up. Baseline demographic and angiographic characteristics were also evaluated. R statistical software was used for statistical analysis. The main outcome measure was final anatomic status. Factors associated with a lower probability of preserved macula were diagnosis in 2009, older age, worse vision, presence of atrophy/fibrosis, pigment epithelium detachment, and geographic atrophy (GA)/fibrotic scar/neovascular AMD in the fellow eye. Factors associated with a higher probability of GA were presence of atrophy and a greater number of injections, whereas male sex, worse vision, a smaller change in central macular thickness, and presence of fibrosis were associated with a lower probability of GA as final macular status. The predictive model of preserved macula vs. GA/fibrotic scar showed a sensitivity of 77.78% and a specificity of 69.09%. The predictive model of GA vs. fibrotic scar showed a sensitivity of 68.89% and a specificity of 72.22%. We identified predictors of final macular status and developed two predictive models. The predictive models we propose are based on easily collected variables and, if validated, could be a useful tool for individual patient management and clinical research studies.

  13. Investigating Some Technical Issues on Cohesive Zone Modeling of Fracture

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    2011-01-01

    This study investigates some technical issues related to the use of cohesive zone models (CZMs) in modeling fracture processes. These issues include: why cohesive laws of different shapes can produce similar fracture predictions; under what conditions CZM predictions have a high degree of agreement with linear elastic fracture mechanics (LEFM) analysis results; when the shape of cohesive laws becomes important in the fracture predictions; and why the opening profile along the cohesive zone length needs to be accurately predicted. Two cohesive models were used in this study to address these technical issues: the linear softening cohesive model and the Dugdale perfectly plastic cohesive model. Each cohesive model comprises five cohesive laws with different maximum tractions. All cohesive laws have the same cohesive work rate (CWR), which is defined as the area under the traction-separation curve. The effects of the maximum traction on the cohesive zone length and the critical remote applied stress are investigated for both models. For a CZM to predict a fracture load similar to that obtained by an LEFM analysis, the cohesive zone length needs to be much smaller than the crack length, which reflects the small-scale yielding condition required for LEFM analysis to be valid. For large-scale cohesive zone cases, the predicted critical remote applied stresses depend on the shape of the cohesive model used and can deviate significantly from LEFM results. Furthermore, this study also reveals the importance of accurately predicting the cohesive zone profile in determining the critical remote applied load.
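
    The fixed-CWR construction above can be made concrete with a small worked example (the numbers are illustrative, not taken from the study): for a linear-softening law the traction-separation curve is a triangle, so CWR = 0.5·T_max·δ_c, and holding CWR fixed forces the critical separation δ_c to scale inversely with the maximum traction T_max.

```python
# Linear-softening cohesive law: the traction-separation curve is a
# triangle, so the cohesive work rate is CWR = 0.5 * T_max * d_c.
# With CWR held fixed, the critical separation d_c scales as 1/T_max.
cwr = 100.0  # cohesive work rate, J/m^2 (illustrative value)
for t_max in (50e6, 100e6, 200e6):  # maximum traction, Pa
    d_c = 2.0 * cwr / t_max  # critical separation, m
    print(f"T_max = {t_max:.0e} Pa -> d_c = {d_c:.1e} m")
```

    Doubling the maximum traction halves the critical separation, which in turn shortens the cohesive zone; this is the lever behind the small-scale-yielding comparison in the abstract.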

  14. When relationships estimated in the past cannot be used to predict the future: using mechanistic models to predict landscape ecological dynamics in a changing world

    Treesearch

    Eric J. Gustafson

    2013-01-01

    Researchers and natural resource managers need predictions of how multiple global changes (e.g., climate change, rising levels of air pollutants, exotic invasions) will affect landscape composition and ecosystem function. Ecological predictive models used for this purpose are constructed using either a mechanistic (process-based) or a phenomenological (empirical)...

  15. Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell

    2011-01-01

    Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. Validation test hardware provided a direct measurement of net heat input for comparison to predicted values. The predicted value of net heat input was 1.7 percent less than the measured value, and initial calculations of measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.

  16. Predictive models for Escherichia coli concentrations at inland lake beaches and relationship of model variables to pathogen detection

    EPA Science Inventory

    Methods are needed to improve the timeliness and accuracy of recreational water-quality assessments. Traditional culture methods require 18-24 h to obtain results and may not reflect current conditions. Predictive models, based on environmental and water-quality variables, have been...

  17. A Common Core for Active Conceptual Modeling for Learning from Surprises

    NASA Astrophysics Data System (ADS)

    Liddle, Stephen W.; Embley, David W.

    The new field of active conceptual modeling for learning from surprises (ACM-L) may be helpful in preserving life, protecting property, and improving quality of life. The conceptual modeling community has developed sound theory and practices for conceptual modeling that, if properly applied, could help analysts model and predict more accurately. In particular, we need to associate more semantics with links, and we need fully reified high-level objects and relationships that have a clear, formal underlying semantics that follows a natural, ontological approach. We also need to capture more dynamic aspects in our conceptual models to more accurately model complex, dynamic systems. These concepts already exist, and the theory is well developed; what remains is to link them with the ideas needed to predict system evolution, thus enabling risk assessment and response planning. No single researcher or research group will be able to achieve this ambitious vision alone. As a starting point, we recommend that the nascent ACM-L community agree on a common core model that supports all aspects—static and dynamic—needed for active conceptual modeling in support of learning from surprises. A common core will more likely gain the traction needed to sustain the extended ACM-L research effort that will yield the advertised benefits of learning from surprises.

  18. A quantitative assessment of a terrestrial biosphere model's data needs across North American biomes

    NASA Astrophysics Data System (ADS)

    Dietze, Michael C.; Serbin, Shawn P.; Davidson, Carl; Desai, Ankur R.; Feng, Xiaohui; Kelly, Ryan; Kooper, Rob; LeBauer, David; Mantooth, Joshua; McHenry, Kenton; Wang, Dan

    2014-03-01

    Terrestrial biosphere models are designed to synthesize our current understanding of how ecosystems function, test competing hypotheses of ecosystem function against observations, and predict responses to novel conditions such as those expected under climate change. Reducing uncertainties in such models can improve both basic scientific understanding and our predictive capacity, but rarely are ecosystem models employed in the design of field campaigns. We provide a synthesis of carbon cycle uncertainty analyses conducted using the Predictive Ecosystem Analyzer ecoinformatics workflow with the Ecosystem Demography model v2. This work is a synthesis of multiple projects, using Bayesian data assimilation techniques to incorporate field data and trait databases across temperate forests, grasslands, agriculture, short rotation forestry, boreal forests, and tundra. We report on a number of data needs that span a wide array of diverse biomes, such as the need for better constraint on growth respiration, mortality, stomatal conductance, and water uptake. We also identify data needs that are biome specific, such as photosynthetic quantum efficiency at high latitudes. We recommend that future data collection efforts balance the bias of past measurements toward aboveground processes in temperate biomes with the sensitivities of different processes as represented by ecosystem models. ©2014. American Geophysical Union. All Rights Reserved.

  19. Estimating the Need for Medical Intervention due to Sleep Disruption on the International Space Station

    NASA Technical Reports Server (NTRS)

    Myers, Jerry G.; Lewandowski, Beth E.; Brooker, John E.; Hurst, S. R.; Mallis, Melissa M.; Caldwell, J. Lynn

    2008-01-01

    During ISS and shuttle missions, difficulties with sleep affect more than half of all US crews. Mitigation strategies to help astronauts cope with the challenges of disrupted sleep patterns can negatively impact both mission planning and vehicle design. The methods for addressing known detrimental impacts for some mission scenarios may have a substantial impact on vehicle-specific consumable mass or volume or on the mission timeline. As part of the Integrated Medical Model (IMM) task, NASA Glenn Research Center is leading the development of a Monte Carlo-based forecasting tool designed to determine the consumables required to address risks related to sleep disruption. The model currently focuses on the International Space Station and uses an algorithm that assembles representative mission schedules and feeds them into a well-validated model that predicts relative levels of performance and need for sleep (the SAFTE model, IBR Inc.). Correlation of the resulting output with self-diagnosed needs for hypnotics, stimulants, and other pharmaceutical countermeasures allows prediction of pharmaceutical use and the uncertainty of that prediction. This paper outlines a conceptual model for determining a rate of pharmaceutical utilization that can be used in the IMM for comparison and optimization of mitigation methods with respect to all other significant medical needs and interventions.
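
    A toy version of such a Monte Carlo consumables forecast is sketched below. The per-night dose probability, mission length, and supply percentile are invented for illustration; they are not IMM or SAFTE parameters.

```python
# Toy Monte Carlo estimate of sleep-medication doses consumed over a
# mission. All numbers are invented for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000          # simulated missions
mission_nights = 180       # assumed mission length
p_dose_per_night = 0.1     # assumed probability a dose is taken on a night

# Each simulated mission draws a total dose count.
doses = rng.binomial(mission_nights, p_dose_per_night, size=n_trials)

# Size the supply at the 95th percentile of simulated demand.
supply = int(np.percentile(doses, 95))
print(supply)
```

    A real forecast would replace the flat per-night probability with schedule-driven performance and sleep-need predictions, which is where the SAFTE-style model enters the workflow.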

  20. Task network models in the prediction of workload imposed by extravehicular activities during the Hubble Space Telescope servicing mission

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Takamoto, Neal; Woolford, Barbara

    1994-01-01

    In a joint effort with Brooks AFB, Texas, the Flight Crew Support Division at JSC has begun a computer simulation and performance modeling program directed at establishing the predictive validity of software tools for modeling human performance during spaceflight. This paper addresses the utility of task network modeling for predicting the workload that astronauts are likely to encounter in extravehicular activities (EVA) during the Hubble Space Telescope (HST) repair mission. The intent of the study was to determine whether two EVA crewmembers and one intravehicular activity (IVA) crewmember could reasonably be expected to complete HST Wide Field/Planetary Camera (WFPC) replacement in the allotted time. Ultimately, examination of the points during HST servicing that may result in excessive workload will lead to recommendations to the HST Flight Systems and Servicing Project concerning (1) expectation of degraded performance, (2) the need to change task allocation across crewmembers, (3) the need to expand the timeline, and (4) the need to increase the number of EVA's.

  1. Corneal cell culture models: a tool to study corneal drug absorption.

    PubMed

    Dey, Surajit

    2011-05-01

    In recent times, there has been an ever-increasing demand for ocular drugs to treat sight-threatening diseases such as glaucoma, age-related macular degeneration and diabetic retinopathy. As more drugs are developed, there is a great need to test the in vitro permeability of these drugs to predict their efficacy and bioavailability in vivo. Corneal cell culture models are the only tool that can predict drug absorption across ocular layers accurately and rapidly. Cell culture studies are also valuable in reducing the number of animals needed for in vivo studies, which can increase the cost of the drug development process. Currently, rabbit corneal cell culture models are used to predict human corneal absorption due to the difficulty of human corneal studies. More recently, a three-dimensional human corneal equivalent has been developed using three different cell types to mimic the human cornea. In the future, human corneal cell culture systems need to be developed to be used as a standardized model for drug permeation.

  2. Evaluation of scoring models for identifying the need for therapeutic intervention of upper gastrointestinal bleeding: A new prediction score model for Japanese patients.

    PubMed

    Iino, Chikara; Mikami, Tatsuya; Igarashi, Takasato; Aihara, Tomoyuki; Ishii, Kentaro; Sakamoto, Jyuichi; Tono, Hiroshi; Fukuda, Shinsaku

    2016-11-01

    Multiple scoring systems have been developed to predict outcomes in patients with upper gastrointestinal bleeding. We determined how well these and a newly established scoring model predict the need for therapeutic intervention, excluding transfusion, in Japanese patients with upper gastrointestinal bleeding. We reviewed data from 212 consecutive patients with upper gastrointestinal bleeding. Patients requiring endoscopic intervention, operation, or interventional radiology were allocated to the therapeutic intervention group. First, we compared areas under the curve for the Glasgow-Blatchford, Clinical Rockall, and AIMS65 scores. Second, the scores and factors likely associated with upper gastrointestinal bleeding were analyzed with a logistic regression analysis to form a new scoring model. Third, the new model and the existing models were investigated to evaluate their usefulness. Therapeutic intervention was required in 109 patients (51.4%). The Glasgow-Blatchford score was superior to both the Clinical Rockall and AIMS65 scores for predicting therapeutic intervention need (area under the curve, 0.75 [95% confidence interval, 0.69-0.81] vs 0.53 [0.46-0.61] and 0.52 [0.44-0.60], respectively). Multivariate logistic regression analysis retained seven significant predictors in the model: systolic blood pressure <100 mmHg, syncope, hematemesis, hemoglobin <10 g/dL, blood urea nitrogen ≥22.4 mg/dL, estimated glomerular filtration rate ≤60 mL/min per 1.73 m², and antiplatelet medication. Based on these variables, we established a new scoring model with superior discrimination to those of existing scoring systems (area under the curve, 0.85 [0.80-0.90]). We developed a superior scoring model for identifying therapeutic intervention need in Japanese patients with upper gastrointestinal bleeding. © 2016 Japan Gastroenterological Endoscopy Society.
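
    As an illustration of how a simple additive score over binary criteria (like the seven predictors above) is evaluated by area under the ROC curve, here is a minimal sketch on synthetic data; the predictors and outcome are synthetic and do not reproduce the study's model.

```python
# Evaluate a one-point-per-criterion clinical score by ROC AUC.
# Predictors and outcome below are synthetic, for illustration only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
# Seven synthetic binary criteria, standing in for items such as
# "systolic BP < 100 mmHg" or "hemoglobin < 10 g/dL".
X = rng.integers(0, 2, size=(n, 7))

# Synthetic outcome loosely driven by the criteria plus noise.
logit = X.sum(axis=1) - 3.5 + rng.normal(0.0, 1.5, n)
y = (logit > 0).astype(int)

# Simple additive score: one point per positive criterion.
score = X.sum(axis=1)
auc = roc_auc_score(y, score)
print(round(auc, 2))
```

    Competing scores (here, Glasgow-Blatchford vs. Clinical Rockall vs. AIMS65) are compared by computing this AUC for each on the same patients.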

  3. The effects of a confidant and a peer group on the well-being of single elders.

    PubMed

    Gupta, V; Korte, C

    1994-01-01

    A study of 100 elderly people was carried out to compare the predictions of well-being derived from the confidant model with those derived from the Weiss model. The confidant model predicts that the most important feature of a person's social network for the well-being of that person is whether or not the person has a confidant. The Weiss model states that different persons are needed to fulfill the different needs of the person and in particular that a confidant is important to the need for intimacy and emotional security while a peer group of social friends is needed to fulfill sociability and identity needs. The two models were evaluated by comparing the relative influence of the confidant variable with that of the peer group variable on subjects' well-being. Regression analysis was carried out on the well-being measure using as predictor variables the confidant variable, peer group variable, age, health, and financial status. The confidant and peer group variables were of equal importance to well-being, thus confirming the Weiss model.

  4. Using beta binomials to estimate classification uncertainty for ensemble models.

    PubMed

    Clark, Robert D; Liang, Wenkel; Lee, Adam C; Lawless, Michael S; Fraczkiewicz, Robert; Waldman, Marvin

    2014-01-01

    Quantitative structure-activity relationship (QSAR) models have enormous potential for reducing drug discovery and development costs as well as the need for animal testing. Great strides have been made in estimating their overall reliability, but to fully realize that potential, researchers and regulators need to know how confident they can be in individual predictions. Submodels in an ensemble model which have been trained on different subsets of a shared training pool represent multiple samples of the model space, and the degree of agreement among them contains information on the reliability of ensemble predictions. For artificial neural network ensembles (ANNEs) using two different methods for determining ensemble classification - one using vote tallies and the other averaging individual network outputs - we have found that the distribution of predictions across positive vote tallies can be reasonably well-modeled as a beta binomial distribution, as can the distribution of errors. Together, these two distributions can be used to estimate the probability that a given predictive classification will be in error. Large data sets comprising logP, Ames mutagenicity, and CYP2D6 inhibition data are used to illustrate and validate the method. The distributions of predictions and errors for the training pool accurately predicted the distribution of predictions and errors for large external validation sets, even when the numbers of positive and negative examples in the training pool were not balanced. Moreover, the likelihood of a given compound being prospectively misclassified as a function of the degree of consensus between networks in the ensemble could in most cases be estimated accurately from the fitted beta binomial distributions for the training pool. 
Confidence in an individual predictive classification by an ensemble model can be accurately assessed by examining the distributions of predictions and errors as a function of the degree of agreement among the constituent submodels. Further, ensemble uncertainty estimation can often be improved by adjusting the voting or classification threshold based on the parameters of the error distribution. Finally, the profiles for models whose predictive uncertainty estimates are not reliable provide clues to that effect without the need for comparison to an external test set.
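
    A minimal sketch of the beta-binomial machinery involved: positive vote tallies from a 10-member ensemble are simulated from known parameters and recovered by method-of-moments fitting. The data are synthetic and the estimator is a stand-in; the authors' fitting procedure may differ.

```python
# Model ensemble vote tallies (k positive votes out of n submodels) with
# a beta-binomial distribution. Tallies are simulated from known shape
# parameters, then recovered via method-of-moments estimation.
from scipy import stats

n = 10                       # submodels in the ensemble
true_a, true_b = 2.0, 3.0    # assumed beta-binomial shape parameters
tallies = stats.betabinom.rvs(n, true_a, true_b, size=2000,
                              random_state=1)

# Method-of-moments estimates of (a, b) given n, from the first two
# raw sample moments m1 = E[k] and m2 = E[k^2].
m1 = tallies.mean()
m2 = (tallies ** 2).mean()
denom = n * (m2 / m1 - m1 - 1) + m1
a_hat = (n * m1 - m2) / denom
b_hat = (n - m1) * (n - m2 / m1) / denom
print(round(a_hat, 2), round(b_hat, 2))
```

    With the fitted distribution in hand, the probability mass at each tally, combined with an error distribution fitted the same way, yields the per-tally misclassification probability the abstract describes.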

  5. Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions

    PubMed Central

    Fox, Naomi J.; Marion, Glenn; Davidson, Ross S.; White, Piran C. L.; Hutchings, Michael R.

    2012-01-01

    Simple Summary: Parasitic helminths represent one of the most pervasive challenges to livestock, and their intensity and distribution will be influenced by climate change. There is a need for long-term predictions to identify potential risks and highlight opportunities for control. We explore the approaches to modelling future helminth risk to livestock under climate change. One of the limitations to model creation is the lack of purpose-driven data collection. We also conclude that models need to include a broad view of the livestock system to generate meaningful predictions. Abstract: Climate change is a driving force for livestock parasite risk. This is especially true for helminths including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables. 
By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed. PMID:26486780

  6. Modeling the distribution of white spruce (Picea glauca) for Alaska with high accuracy: an open access role-model for predicting tree species in last remaining wilderness areas

    Treesearch

    Bettina Ohse; Falk Huettmann; Stefanie M. Ickert-Bond; Glenn P. Juday

    2009-01-01

    Most wilderness areas still lack accurate distribution information on tree species. We met this need with a predictive GIS modeling approach, using freely available digital data and computer programs to efficiently obtain high-quality species distribution maps. Here we present a digital map with the predicted distribution of white spruce (Picea glauca...

  7. Extrapolation of a predictive model for growth of a low inoculum size of Salmonella typhimurium DT104 on chicken skin to higher inoculum sizes

    USDA-ARS?s Scientific Manuscript database

    Validation of model predictions for independent variables not included in model development can save time and money by identifying conditions for which new models are not needed. A single strain of Salmonella Typhimurium DT104 was used to develop a general regression neural network model for growth...

  8. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany; Gerwin et al., 2009b) and analyse how their predictions improved over three steps, as information was added before each step. The modellers predicted the catchment's hydrological response in its initial phase without access to the observed records. They used conceptually different physically based models, and their modelling experience differed widely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction, the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction, they inspected the catchment on-site and discussed their first prediction attempt; (3) for the third prediction, they were offered additional data by charging them pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. 
For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of added information. In this qualitative analysis of a statistically small number of predictions we learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.

  9. Current status and future needs of the BehavePlus Fire Modeling System

    Treesearch

    Patricia L. Andrews

    2014-01-01

    The BehavePlus Fire Modeling System is among the most widely used systems for wildland fire prediction. It is designed for use in a range of tasks including wildfire behaviour prediction, prescribed fire planning, fire investigation, fuel hazard assessment, fire model understanding, communication and research. BehavePlus is based on mathematical models for fire...

  10. [Effects of sampling plot number on tree species distribution prediction under climate change].

    PubMed

    Liang, Yu; He, Hong-Shi; Wu, Zhi-Wei; Li, Xiao-Na; Luo, Xu

    2013-05-01

    Based on the neutral landscapes under different degrees of landscape fragmentation, this paper studied the effects of sampling plot number on the prediction of tree species distribution at landscape scale under climate change. The tree species distribution was predicted by the coupled modeling approach which linked an ecosystem process model with a forest landscape model, and three contingent scenarios and one reference scenario of sampling plot numbers were assumed. The differences between the three scenarios and the reference scenario under different degrees of landscape fragmentation were tested. The results indicated that the effects of sampling plot number on the prediction of tree species distribution depended on the tree species life history attributes. For the generalist species, the prediction of their distribution at landscape scale needed more plots. Except for the extreme specialist, landscape fragmentation degree also affected the effects of sampling plot number on the prediction. With the increase of simulation period, the effects of sampling plot number on the prediction of tree species distribution at landscape scale could be changed. For generalist species, more plots are needed for the long-term simulation.

  11. A novel model for estimating organic chemical bioconcentration in agricultural plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hung, H.; Mackay, D.; Di Guardo, A.

    1995-12-31

    There is increasing recognition that much human and wildlife exposure to organic contaminants can be traced through the food chain to bioconcentration in vegetation. For risk assessment, there is a need for an accurate model to predict organic chemical concentrations in plants. Existing models range from relatively simple correlations of concentrations using octanol-water or octanol-air partition coefficients, to complex models involving extensive physiological data. To satisfy the need for a relatively accurate model of intermediate complexity, a novel approach has been devised to predict organic chemical concentrations in agricultural plants as a function of soil and air concentrations, without the need for extensive plant physiological data. The plant is treated as three compartments, namely, leaves, roots and stems (including fruit and seeds). Data readily available from the literature, including chemical properties, volume, density and composition of each compartment; metabolic and growth rate of plant; and readily obtainable environmental conditions at the site are required as input. Results calculated from the model are compared with observed and experimentally-determined concentrations. It is suggested that the model, which includes a physiological database for agricultural plants, gives acceptably accurate predictions of chemical partitioning between plants, air and soil.

  12. How Coaches' Motivations Mediate Between Basic Psychological Needs and Well-Being/Ill-Being.

    PubMed

    Alcaraz, Saul; Torregrosa, Miquel; Viladrich, Carme

    2015-01-01

    The purpose of the present research was to test how behavioral regulations mediate between basic psychological needs and psychological well-being and ill-being in a sample of team-sport coaches. Based on self-determination theory, we hypothesized a model in which satisfaction and thwarting of the basic psychological needs predicted coaches' behavioral regulations, which in turn led them to experience well-being (i.e., subjective vitality, positive affect) or ill-being (i.e., perceived stress, negative affect). Three hundred and two coaches participated in the study (Mage = 25.97 years; 82% male). For each instrument employed, the measurement model with the best psychometric properties was selected from a sequence of nested models sustained by previous research, including exploratory structural equation models and confirmatory factor analysis. These measurement models were included in 3 structural equation models to test for mediation: partial mediation, complete mediation, and absence of mediation. The results provided support for the partial mediation model. Coaches' motivation mediated the relationships of both relatedness need satisfaction and basic psychological needs thwarting with coaches' well-being. In contrast, ill-being was related to basic psychological needs satisfaction and thwarting only through direct effects. Our results highlight that 3 conditions seem necessary for coaches to experience psychological well-being in their teams: basic psychological needs satisfaction, especially relatedness; lack of basic psychological needs thwarting; and self-determined motivation.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puskar, Joseph David; Quintana, Michael A.; Sorensen, Neil Robert

    A program is underway at Sandia National Laboratories to predict long-term reliability of photovoltaic (PV) systems. The vehicle for the reliability predictions is a Reliability Block Diagram (RBD), which models system behavior. Because this model is based mainly on field failure and repair times, it can be used to predict current reliability, but it cannot currently be used to accurately predict lifetime. In order to be truly predictive, physics-informed degradation processes and failure mechanisms need to be included in the model. This paper describes accelerated life testing of metal foil tapes used in thin-film PV modules, and how tape joint degradation, a possible failure mode, can be incorporated into the model.
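An RBD evaluates system reliability from the series/parallel arrangement of its component blocks. A minimal sketch, assuming exponentially distributed component lifetimes with invented failure rates (not Sandia's field data):

```python
# Toy reliability-block-diagram evaluation for a PV-like system:
# series blocks must all survive; parallel blocks are redundant.
import math

def reliability(rate, t):
    """Survival probability of one exponential component at time t."""
    return math.exp(-rate * t)

def series(*r):
    """All blocks in the chain must survive."""
    out = 1.0
    for x in r:
        out *= x
    return out

def parallel(*r):
    """At least one redundant block must survive."""
    out = 1.0
    for x in r:
        out *= (1.0 - x)
    return 1.0 - out

t = 10.0                            # years (illustrative horizon)
module = reliability(0.02, t)       # module string, incl. tape joints
inverter = reliability(0.05, t)     # one inverter
wiring = reliability(0.01, t)       # balance-of-system wiring
# two redundant inverters, in series with the module string and wiring
system = series(module, parallel(inverter, inverter), wiring)
print(round(system, 4))
```

A physics-informed degradation mechanism such as tape-joint aging would enter by replacing the constant failure rate with a time-dependent one.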

  14. A Local to National Scale Catchment Model Simulation Framework for Hydrological Predictions and Impact Assessments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Freer, Jim; Coxon, Gemma; Quinn, Niall; Dunne, Toby; Lane, Rosie; Bates, Paul; Wagener, Thorsten; Woods, Ross; Neal, Jeff; Howden, Nicholas; Musuuza, Jude

    2017-04-01

    There is a huge challenge in developing hydrological model structures that can be used for hypothesis testing, prediction, impact assessment and risk analyses over a wide range of spatial scales. There are many reasons why this is the case, from computational demands to how we define and characterize different landscape features and pathway connectivities, which differ depending on the objectives of the study. However, there is a growing need to explore the trade-offs between the complexity of the modelling applied (i.e. spatial discretization, levels of process representation, complexity of landscape representation) and the benefits realized in terms of predictive capability and the robustness of those predictions during hydrological extremes and during change. A further balance, particularly associated with prediction uncertainties, is that modelling systems should not be more complex than the observed data available to constrain them. This is particularly the case when models are applied to quantify national impact assessments, especially if these are based on validation assessments from smaller, more detailed case studies. The hydrological community therefore needs modelling tools and approaches that enable these trade-offs to be explored, and that reveal the level of representation needed for models to be 'fit-for-purpose' for a given application. This paper presents a catchment-scale national modelling framework based on Dynamic-TOPMODEL specifically set up to fulfil these aims. Key components of the modelling framework are its structural flexibility and the ability to assess model outputs using Monte Carlo simulation techniques. The model build has been automated to work at any spatial scale up to the national scale, and within that to control the level of spatial discretisation and the connectivity of locally accounted landscape elements in the form of hydrological response units (HRUs). This allows for the explicit consideration of spatial rainfall fields; landscape, soils and geological attributes; and the spatial connectivity of hydrological flow pathways, to explore what level of modelling complexity is needed for different prediction problems. We present this framework and show how it can be used in flood and drought risk analyses, and how attributes and features within the landscape can be included to explore societal and climate impacts within an uncertainty analysis framework.

  15. Concepts and tools for predictive modeling of microbial dynamics.

    PubMed

    Bernaerts, Kristel; Dens, Els; Vereecken, Karen; Geeraerd, Annemie H; Standaert, Arnout R; Devlieghere, Frank; Debevere, Johan; Van Impe, Jan F

    2004-09-01

    Description of microbial cell (population) behavior as influenced by dynamically changing environmental conditions intrinsically needs dynamic mathematical models. In the past, major effort has been put into the modeling of microbial growth and inactivation within a constant environment (static models). In the early 1990s, differential equation models (dynamic models) were introduced in the field of predictive microbiology. Here, we present a general dynamic model-building concept describing microbial evolution under dynamic conditions. Starting from an elementary model building block, the model structure can be gradually complexified to incorporate increasing numbers of influencing factors. Based on two case studies, the fundamentals of both macroscopic (population) and microscopic (individual) modeling approaches are revisited. These illustrations deal with the modeling of (i) microbial lag under variable temperature conditions and (ii) interspecies microbial interactions mediated by lactic acid production (product inhibition). Current and future research trends should address the need for (i) more specific measurements at the cell and/or population level, (ii) measurements under dynamic conditions, and (iii) more comprehensive (mechanistically inspired) model structures. In the context of quantitative microbial risk assessment, complexity of the mathematical model must be kept under control. An important challenge for the future is determination of a satisfactory trade-off between predictive power and manageability of predictive microbiology models.
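The elementary dynamic building block the abstract describes is a differential equation whose growth rate tracks the changing environment. A minimal sketch, using a logistic growth law and a Ratkowsky-type square-root temperature dependence, with invented parameter values and forward-Euler integration:

```python
# Dynamic predictive-microbiology sketch: logistic growth whose maximum
# specific growth rate depends on a time-varying temperature profile.
# Parameter values (b, t_min, n_max) are illustrative only.

def mu_max(temp_c, b=0.03, t_min=4.0):
    """Square-root (Ratkowsky-type) temperature dependence of growth rate."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

def simulate(n0, n_max, temp_profile, dt=0.1):
    """Integrate dN/dt = mu(T(t)) * N * (1 - N/n_max) by forward Euler."""
    n = n0
    trajectory = [n]
    for temp_c in temp_profile:
        n += dt * mu_max(temp_c) * n * (1.0 - n / n_max)
        trajectory.append(n)
    return trajectory

# temperature ramps from 10 C to 30 C over the simulated period
profile = [10.0 + 20.0 * i / 499 for i in range(500)]
traj = simulate(n0=1e3, n_max=1e9, temp_profile=profile)
print(f"final density: {traj[-1]:.3g} CFU/ml")
```

A static model fitted at one temperature would miss the acceleration along the ramp; the dynamic form handles it directly, which is the point made above.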

  16. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in medicine and the health care sector. In machine learning, classification or prediction is a major field of AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most well-known machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.

  17. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect a unit in a future state.
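A discrete event simulation of this kind advances a clock from event to event (admissions, discharges) rather than in fixed time steps. A minimal sketch with an event queue; the arrival and length-of-stay distributions and the 1:3 nurse-to-patient ratio are illustrative assumptions, not the unit's administrative data:

```python
# Minimal discrete event simulation of unit occupancy and staffing demand.
import heapq
import random

def simulate(days=365.0, mean_interarrival=0.5, mean_los=14.0,
             patients_per_nurse=3, seed=42):
    """Exponential interarrival times and lengths of stay (in days);
    returns total admissions, peak census, and peak nurses needed."""
    rng = random.Random(seed)
    events = []  # heap of (time, +1 admission / -1 discharge)
    t = rng.expovariate(1.0 / mean_interarrival)
    while t < days:
        heapq.heappush(events, (t, +1))
        heapq.heappush(events, (t + rng.expovariate(1.0 / mean_los), -1))
        t += rng.expovariate(1.0 / mean_interarrival)
    occupancy = peak = admissions = 0
    while events:
        _, delta = heapq.heappop(events)   # process events in time order
        occupancy += delta
        admissions += delta == 1
        peak = max(peak, occupancy)
    nurses_needed = -(-peak // patients_per_nurse)  # ceiling division
    return admissions, peak, nurses_needed

admissions, peak, nurses = simulate()
print(admissions, peak, nurses)
```

Re-running with different staffing ratios or admission rates gives the kind of what-if comparison the abstract describes, without touching the process model.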

  18. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion; material instability due to radiation and thermo-mechanical cycling; and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies in which atomistic properties are calibrated by quantum calculations.
We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements, and packages the information in a form suitable for UQ at various scales, needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power and advanced verification, validation, and UQ methodologies.

  19. Multivariate Analysis of Seismic Field Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alam, M. Kathleen

    1999-06-01

    This report includes the details of the model building procedure and prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built that included only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present data sets, the present data analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.
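Principal Components Regression projects the predictors onto their leading principal components and regresses the response on those scores, which stabilises the fit when channels are collinear. A minimal sketch on synthetic low-rank data standing in for the seismic spectra (the data, dimensions and component count are all invented):

```python
# Principal Components Regression via SVD: PCA on the centered predictors,
# least squares on the component scores, coefficients mapped back to X-space.
import numpy as np

def pcr_fit(X, y, n_components):
    x_mean, y_mean = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - x_mean, full_matrices=False)
    V = Vt[:n_components].T                    # loadings (p x k)
    scores = (X - x_mean) @ V                  # sample scores (n x k)
    beta_scores, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return V @ beta_scores, x_mean, y_mean     # coefficients in X-space

def pcr_predict(X, coef, x_mean, y_mean):
    return (X - x_mean) @ coef + y_mean

# synthetic "spectra": 3 latent sources mixed into 10 measurement channels
rng = np.random.default_rng(0)
latent = rng.normal(size=(60, 3))
X = latent @ rng.normal(size=(3, 10)) + 0.01 * rng.normal(size=(60, 10))
y = 2.0 * latent[:, 0] + 0.01 * rng.normal(size=60)
coef, xm, ym = pcr_fit(X, y, n_components=3)
residual = np.abs(pcr_predict(X, coef, xm, ym) - y).max()
print(residual)
```

The failure mode the report describes corresponds to a signal source absent from the training data: its variation is then not spanned by the retained components, and predictions degrade.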

  20. Modeling Distributions of Immediate Memory Effects: No Strategies Needed?

    ERIC Educational Resources Information Center

    Beaman, C. Philip; Neath, Ian; Surprenant, Aimee M.

    2008-01-01

    Many models of immediate memory predict the presence or absence of various effects, but none have been tested to see whether they predict an appropriate distribution of effect sizes. The authors show that the feature model (J. S. Nairne, 1990) produces appropriate distributions of effect sizes for both the phonological confusion effect and the…

  1. Invasive Species Distribution Modeling (iSDM): Are absence data and dispersal constraints needed to predict actual distributions?

    Treesearch

    Tomáš Václavík; Ross K. Meentemeyer

    2009-01-01

    Species distribution models (SDMs) based on statistical relationships between occurrence data and underlying environmental conditions are increasingly used to predict spatial patterns of biological invasions and prioritize locations for early detection and control of invasion outbreaks. However, invasive species distribution models (iSDMs) face special challenges...

  2. A test of self-determination theory in school physical education.

    PubMed

    Standage, Martyn; Duda, Joan L; Ntoumanis, Nikos

    2005-09-01

    Contemporary research conducted in the context of school physical education (PE) has increasingly embraced various tenets of self-determination theory (Deci & Ryan, 1985, 1991). Despite this increase in research attention, some postulates of the framework remain unexplored (e.g. the impact of a need-supportive climate). As such, the present study sought to provide a more comprehensive test of self-determination theory. The present work also examined Deci and Ryan's claim that the motivational sequence embraced by their framework is invariant across gender. The aims were (i) to examine a model of motivation based on the tenets of self-determination theory, and (ii) to explore the invariance of the model across gender. Participants were 950 British secondary school students (443 male, 490 female, 17 gender not specified). Participants completed a questionnaire that included measures of need support, need satisfaction, motivation, positive and negative affect, task challenge, and concentration. Structural equation modelling (SEM) analysis revealed that students who perceived a need-supporting environment experienced greater levels of need satisfaction. Need satisfaction predicted intrinsic motivation, which, in turn, was linked to adaptive PE-related outcomes. In contrast, need satisfaction negatively predicted amotivation, which, in turn, was positively predictive of feelings of unhappiness. Multisample SEM invariance testing revealed the model to be largely invariant for male and female students. The results of the study provide support for self-determination theory and corroborate the application of the framework to the context of school PE. Further, we largely found support for the invariance of the motivational processes embraced by self-determination theory across gender.
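The mediation logic tested here (need support predicts motivation, which predicts outcomes) can be illustrated numerically: under complete mediation, the indirect effect is the product of the two path coefficients. The toy example below uses simple OLS on synthetic data with invented path values and does not reproduce the authors' SEM:

```python
# Toy complete-mediation chain: support -> motivation -> vitality.
# Indirect effect = (x->m slope) * (m->y slope). All values illustrative.
import random

def ols_slope(x, y):
    """Slope of y regressed on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

rng = random.Random(7)
n = 5000
support = [rng.gauss(0.0, 1.0) for _ in range(n)]                  # need support
motivation = [0.6 * x + rng.gauss(0.0, 0.5) for x in support]      # a path = 0.6
vitality = [0.5 * m + rng.gauss(0.0, 0.5) for m in motivation]     # b path = 0.5

a_path = ols_slope(support, motivation)
b_path = ols_slope(motivation, vitality)
indirect = a_path * b_path   # should recover roughly 0.6 * 0.5 = 0.30
print(round(indirect, 2))
```

Partial mediation, as supported in the study, would additionally fit a direct support-to-vitality path alongside the indirect one.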

  3. Water Quality, Cyanobacteria, and Environmental Factors and Their Relations to Microcystin Concentrations for Use in Predictive Models at Ohio Lake Erie and Inland Lake Recreational Sites, 2013-14

    USGS Publications Warehouse

    Francy, Donna S.; Graham, Jennifer L.; Stelzer, Erin A.; Ecker, Christopher D.; Brady, Amie M. G.; Pam Struffolino,; Loftin, Keith A.

    2015-11-06

    The results of this study showed that water-quality and environmental variables are promising for use in site-specific daily or long-term predictive models. In order to develop more accurate models to predict toxin concentrations at freshwater lake sites, data need to be collected more frequently and for consecutive days in future studies.

  4. Predictive Capabilities of Multiphysics and Multiscale Models in Modeling Solidification of Steel Ingots and DC Casting of Aluminum

    NASA Astrophysics Data System (ADS)

    Combeau, Hervé; Založnik, Miha; Bedel, Marie

    2016-08-01

    Prediction of solidification defects, such as macrosegregation and inhomogeneous microstructures, constitutes a key issue for industry. The development of models of casting processes needs to account for several imbricated length scales and different physical phenomena. For example, the kinetics of the growth of microstructures needs to be coupled with the multiphase flow at the process scale. We introduce such a state-of-the-art model and outline its principles. We present the most recent applications of the model to casting of a heavy steel ingot and to direct chill casting of a large Al alloy sheet ingot. Their ability to help in the understanding of complex phenomena, such as the competition between nucleation and growth of grains in the presence of convection of the liquid and of grain motion is shown, and its predictive capabilities are discussed. Key issues for future developments and research are addressed.

  5. Development of a Simulation Capability for the Space Station Active Rack Isolation System

    NASA Technical Reports Server (NTRS)

    Johnson, Terry L.; Tolson, Robert H.

    1998-01-01

    To realize quality microgravity science on the International Space Station, many microgravity facilities will utilize the Active Rack Isolation System (ARIS). Simulation capabilities for ARIS will be needed to predict the microgravity environment. This paper discusses the development of a simulation model for use in predicting the performance of the ARIS in attenuating disturbances with frequency content between 0.01 Hz and 10 Hz. The derivation of the model utilizes an energy-based approach. The complete simulation includes the dynamic model of the ISPR integrated with the model for the ARIS controller so that the entire closed-loop system is simulated. Preliminary performance predictions are made for the ARIS in attenuating both off-board disturbances as well as disturbances from hardware mounted onboard the microgravity facility. These predictions suggest that the ARIS does eliminate resonant behavior detrimental to microgravity experimentation. A limited comparison is made between the simulation predictions of ARIS attenuation of off-board disturbances and results from the ARIS flight test. These comparisons show promise, but further tuning of the simulation is needed.

  6. Modeling strength loss in wood by chemical composition. Part I, An individual component model for southern pine

    Treesearch

    J. E. Winandy; P. K. Lebow

    2001-01-01

    In this study, we develop models for predicting loss in bending strength of clear, straight-grained pine from changes in chemical composition. Although significant work needs to be done before truly universal predictive models are developed, a quantitative fundamental relationship between changes in chemical composition and strength loss for pine was demonstrated. In...

  7. Evaluating Air-Quality Models: Review and Outlook.

    NASA Astrophysics Data System (ADS)

    Weil, J. C.; Sykes, R. I.; Venkatram, A.

    1992-10-01

    Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, i.e., a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is comparable to or larger than C, large residuals exist between predicted and observed concentrations, which confuse model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error (the difference between the predicted and observed C) from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals, as well as correlation analyses and laboratory data, in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration (c), p(c), are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that p(c) approximates a self-similar distribution along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and p(c) for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. However, it predicts an important quantity for applications: the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.

  8. A review of statistical updating methods for clinical prediction models.

    PubMed

    Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew

    2018-01-01

    A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models to a new population or context, and these should be implemented, using a breadth of complementary statistical methods, rather than developing a new clinical prediction model from scratch.
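The simplest of the three updating categories, coefficient updating, can be illustrated by logistic recalibration: keep the old model's linear predictor and refit only an intercept and slope on data from the new context. A self-contained sketch with synthetic data; the Newton-Raphson fit and all parameter values are illustrative, not the strategies evaluated in the article:

```python
# Logistic recalibration of an existing model's linear predictor (lp):
# fit P(y=1) = sigmoid(a + b*lp) on new-context data.
import math
import random

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def recalibrate(lp, y, iters=25):
    """Two-parameter logistic fit by Newton-Raphson."""
    a, b = 0.0, 1.0  # start from the original model (no recalibration)
    for _ in range(iters):
        g_a = g_b = h_aa = h_ab = h_bb = 0.0
        for x, t in zip(lp, y):
            p = sigmoid(a + b * x)
            w = p * (1.0 - p)
            g_a += t - p            # log-likelihood gradient
            g_b += (t - p) * x
            h_aa += w               # negative-Hessian entries
            h_ab += w * x
            h_bb += w * x * x
        det = h_aa * h_bb - h_ab * h_ab
        a += (h_bb * g_a - h_ab * g_b) / det
        b += (h_aa * g_b - h_ab * g_a) / det
    return a, b

# synthetic new population on which the old model is miscalibrated:
# true outcome follows sigmoid(-0.5 + 0.7*lp) rather than sigmoid(lp)
rng = random.Random(1)
lp = [rng.gauss(0.0, 1.0) for _ in range(2000)]
y = [1 if rng.random() < sigmoid(-0.5 + 0.7 * x) else 0 for x in lp]
a, b = recalibrate(lp, y)
print(round(a, 2), round(b, 2))
```

A recovered intercept below 0 and slope below 1 indicate the old model over-predicted risk and was too extreme in this population, the typical miscalibration pattern the review addresses.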

  9. Emerging approaches in predictive toxicology.

    PubMed

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.

  10. Emerging Approaches in Predictive Toxicology

    PubMed Central

    Zhang, Luoping; McHale, Cliona M.; Greene, Nigel; Snyder, Ronald D.; Rich, Ivan N.; Aardema, Marilyn J.; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2016-01-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. PMID:25044351

  11. Using a Prediction Model to Manage Cyber Security Threats.

    PubMed

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  12. Using a Prediction Model to Manage Cyber Security Threats

    PubMed Central

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization. PMID:26065024

  13. ADOT state-specific crash prediction models : an Arizona needs study.

    DOT National Transportation Integrated Search

    2016-12-01

    The predictive method in the Highway Safety Manual (HSM) includes a safety performance function (SPF), : crash modification factors (CMFs), and a local calibration factor (C), if available. Two alternatives exist for : applying the HSM prediction met...

  14. Independent external validation of predictive models for urinary dysfunction following external beam radiotherapy of the prostate: Issues in model development and reporting.

    PubMed

    Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W

    2016-08-01

    Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunction following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. The overall performance, calibration and discrimination were assessed. Fourteen models from four publications were validated. The discrimination of the predictive models in an independent external validation cohort, measured using the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation. Four models had ROC areas >0.6. Shrinkage was required for all predictive models' coefficients, ranging from -0.309 (prediction probability was inverse to observed proportion) to 0.823. Predictive models that included baseline symptoms as a feature produced the highest discrimination. Two models produced a predicted probability of 0 and 1 for all patients. Predictive models vary in performance and transferability, illustrating the need for improvements in model development and reporting. Several models showed reasonable potential, but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
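The discrimination figures quoted are areas under the ROC curve, which equal the probability that a randomly chosen event patient is scored higher than a randomly chosen non-event patient. A small self-contained sketch via the Mann-Whitney pairwise count (toy scores, not trial data):

```python
# AUC as the Mann-Whitney statistic: fraction of (event, non-event) pairs
# ranked correctly, with ties counted as half.

def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # perfect ranking -> 1.0
print(auc([0.2, 0.3, 0.8, 0.9], [1, 1, 0, 0]))  # inverted ranking -> 0.0
```

An AUC below 0.5, like the inverted example, corresponds to the negative-shrinkage case reported above, where predicted probability ran opposite to the observed proportion.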

  15. Evaluation of the TBET model for potential improvement of southern P indices

    USDA-ARS's Scientific Manuscript database

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  16. Dynamic-landscape metapopulation models predict complex response of wildlife populations to climate and landscape change

    Treesearch

    Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh

    2017-01-01

    The increasing need to predict how climate change will impact wildlife species has exposed limitations in how well current approaches model important biological processes at scales at which those processes interact with climate. We used a comprehensive approach that combined recent advances in landscape and population modeling into dynamic-landscape metapopulation...

  17. Refinement of the Arc-Habcap model to predict habitat effectiveness for elk

    Treesearch

    Lakhdar Benkobi; Mark A. Rumble; Gary C. Brundige; Joshua J. Millspaugh

    2004-01-01

    Wildlife habitat modeling is increasingly important for managers who need to assess the effects of land management activities. We evaluated the performance of a spatially explicit deterministic habitat model (Arc-Habcap) that predicts habitat effectiveness for elk. We used five years of radio-telemetry locations of elk from Custer State Park (CSP), South Dakota, to...

  18. Mental workload prediction based on attentional resource allocation and information processing.

    PubMed

    Xiao, Xu; Wanyan, Xiaoru; Zhuang, Damin

    2015-01-01

    Mental workload is an important component in complex human-machine systems. The limited applicability of empirical workload measures produces the need for workload modeling and prediction methods. In the present study, a mental workload prediction model is built on the basis of attentional resource allocation and information processing to ensure pilots' accuracy and speed in understanding large amounts of flight information on the cockpit display interface. Validation with an empirical study of an abnormal attitude recovery task showed that this model's prediction of mental workload highly correlated with experimental results. This mental workload prediction model provides a new tool for optimizing human factors interface design and reducing human errors.

  19. Dealing with Diversity in Computational Cancer Modeling

    PubMed Central

    Johnson, David; McKeever, Steve; Stamatakos, Georgios; Dionysiou, Dimitra; Graf, Norbert; Sakkalis, Vangelis; Marias, Konstantinos; Wang, Zhihui; Deisboeck, Thomas S.

    2013-01-01

    This paper discusses the need for interconnecting computational cancer models from different sources and scales within clinically relevant scenarios to increase the accuracy of the models and speed up their clinical adaptation, validation, and eventual translation. We briefly review current interoperability efforts drawing upon our experiences with the development of in silico models for predictive oncology within a number of European Commission Virtual Physiological Human initiative projects on cancer. A clinically relevant scenario, addressing brain tumor modeling that illustrates the need for coupling models from different sources and levels of complexity, is described. General approaches to enabling interoperability using XML-based markup languages for biological modeling are reviewed, concluding with a discussion on efforts towards developing cancer-specific XML markup to couple multiple component models for predictive in silico oncology. PMID:23700360

  20. Bias and uncertainty in regression-calibrated models of groundwater flow in heterogeneous media

    USGS Publications Warehouse

    Cooley, R.L.; Christensen, S.

    2006-01-01

    Groundwater models need to account for detailed but generally unknown spatial variability (heterogeneity) of the hydrogeologic model inputs. To address this problem we replace the large, m-dimensional stochastic vector β that reflects both small and large scales of heterogeneity in the inputs by a lumped or smoothed approximation Yβ*, where Y is an interpolation matrix and β* is a stochastic vector of parameters. Vector β* has small enough dimension to allow its estimation with the available data. The consequence of the replacement is that the model function f(Yβ*) written in terms of the approximate inputs is in error with respect to the same model function written in terms of β, f(β), which is assumed to be nearly exact. The difference f(β) - f(Yβ*), termed model error, is spatially correlated, generates prediction biases, and causes standard confidence and prediction intervals to be too small. Model error is accounted for in the weighted nonlinear regression methodology developed to estimate β* and assess model uncertainties by incorporating the second-moment matrix of the model errors into the weight matrix. Techniques developed by statisticians to analyze classical nonlinear regression methods are extended to analyze the revised method. The analysis develops analytical expressions for bias terms reflecting the interaction of model nonlinearity and model error, for correction factors needed to adjust the sizes of confidence and prediction intervals for this interaction, and for correction factors needed to adjust the sizes of confidence and prediction intervals for possible use of a diagonal weight matrix in place of the correct one. If terms expressing the degree of intrinsic nonlinearity for f(β) and f(Yβ*) are small, then most of the biases are small and the correction factors are reduced in magnitude. Biases, correction factors, and confidence and prediction intervals were obtained for a test problem for which model error is large to test robustness of the methodology. Numerical results conform with the theoretical analysis. © 2005 Elsevier Ltd. All rights reserved.

  1. A basic need theory approach to problematic Internet use and the mediating effect of psychological distress

    PubMed Central

    Wong, Ting Yat; Yuen, Kenneth S. L.; Li, Wang On

    2015-01-01

    The Internet provides an easily accessible way to meet certain needs. Over-reliance on it leads to problematic use, which studies show can be predicted by psychological distress. Self-determination theory proposes that we all have the basic need for autonomy, competency, and relatedness. This has been shown to explain the motivations behind problematic Internet use. This study hypothesizes that individuals who are psychologically disturbed because their basic needs are not being met are more vulnerable to becoming reliant on the Internet when they seek such needs satisfaction from online activities, and tests a model in which basic needs predict problematic Internet use, fully mediated by psychological distress. Problematic Internet use, psychological distress, and basic needs satisfaction were psychometrically measured in a sample of 229 Hong Kong University students and structural equation modeling was used to test the hypothesized model. All indices showed the model has a good fit. Further, statistical testing supported a mediation effect for psychological distress between needs satisfaction and problematic Internet use. The results extend our understanding of the development and prevention of problematic Internet use based on the framework of self-determination theory. Psychological distress could be used as an early predictor, while preventing and treating problematic Internet use should emphasize the fulfillment of unmet needs. PMID:25642201

  2. A basic need theory approach to problematic Internet use and the mediating effect of psychological distress.

    PubMed

    Wong, Ting Yat; Yuen, Kenneth S L; Li, Wang On

    2014-01-01

    The Internet provides an easily accessible way to meet certain needs. Over-reliance on it leads to problematic use, which studies show can be predicted by psychological distress. Self-determination theory proposes that we all have the basic need for autonomy, competency, and relatedness. This has been shown to explain the motivations behind problematic Internet use. This study hypothesizes that individuals who are psychologically disturbed because their basic needs are not being met are more vulnerable to becoming reliant on the Internet when they seek such needs satisfaction from online activities, and tests a model in which basic needs predict problematic Internet use, fully mediated by psychological distress. Problematic Internet use, psychological distress, and basic needs satisfaction were psychometrically measured in a sample of 229 Hong Kong University students and structural equation modeling was used to test the hypothesized model. All indices showed the model has a good fit. Further, statistical testing supported a mediation effect for psychological distress between needs satisfaction and problematic Internet use. The results extend our understanding of the development and prevention of problematic Internet use based on the framework of self-determination theory. Psychological distress could be used as an early predictor, while preventing and treating problematic Internet use should emphasize the fulfillment of unmet needs.
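The mediation structure tested in this record (needs satisfaction → psychological distress → problematic use) can be illustrated with ordinary regressions in the product-of-coefficients form; the scores below are hypothetical, not the study's data:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. Each row of X includes an intercept 1."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                        # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * c for a, c in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):              # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, k))) / A[i][i]
    return coef

# Hypothetical scores: needs satisfaction (X), distress (M), problematic use (Y)
needs = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
distress = [9.1, 8.0, 7.4, 6.6, 6.1, 5.0, 4.2, 3.9, 2.8, 2.2]
piu = [8.8, 8.1, 7.0, 6.4, 5.9, 5.2, 4.1, 3.6, 3.0, 2.1]

a_path = ols([[1, x] for x in needs], distress)[1]                  # X -> M
b_path = ols([[1, x, m] for x, m in zip(needs, distress)], piu)[2]  # M -> Y | X
indirect_effect = a_path * b_path  # here negative: needs lower use via distress
```

Structural equation modeling as used in the study estimates these paths jointly with latent variables; the two-regression product above is only the simplest version of the same mediation logic.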

  3. Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.

    PubMed

    Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F

    2015-08-01

    This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. 
However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
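One common way to assess the clinical utility the review calls for is decision-curve analysis, which compares a model's net benefit against treat-all and treat-none strategies. A minimal sketch with hypothetical patients (not from the review):

```python
def net_benefit(y_true, y_prob, pt):
    """Net benefit of treating patients whose predicted risk exceeds pt:
    (TP - FP * pt / (1 - pt)) / N. False positives are weighted by the
    odds of the threshold, reflecting the harm/benefit trade-off at pt."""
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 0)
    return (tp - fp * pt / (1.0 - pt)) / n

# Toy cohort: a model beats "treat everyone" at a 50% risk threshold
y = [1, 1, 0, 0]
risk = [0.9, 0.8, 0.1, 0.2]
nb_model = net_benefit(y, risk, 0.5)            # 0.5 for this toy model
nb_treat_all = net_benefit(y, [1.0] * 4, 0.5)   # 0.0
```

A model adds clinical value at a threshold only if its net benefit exceeds both the treat-all and treat-none (zero) lines there.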

  4. Modelling ecological systems in a changing world

    PubMed Central

    Evans, Matthew R.

    2012-01-01

    The world is changing at an unprecedented rate. In such a situation, we need to understand the nature of the change and to make predictions about the way in which it might affect systems of interest; often we may also wish to understand what might be done to mitigate the predicted effects. In ecology, we usually make such predictions (or forecasts) by making use of mathematical models that describe the system and projecting them into the future, under changed conditions. Approaches emphasizing the desirability of simple models with analytical tractability and those that use assumed causal relationships derived statistically from data currently dominate ecological modelling. Although such models are excellent at describing the way in which a system has behaved, they are poor at predicting its future state, especially in novel conditions. In order to address questions about the impact of environmental change, and to understand what, if any, action might be taken to ameliorate it, ecologists need to develop the ability to project models into novel, future conditions. This will require the development of models based on understanding the processes that result in a system behaving the way it does, rather than relying on a description of the system, as a whole, remaining valid indefinitely. PMID:22144381

  5. A Statistical Weather-Driven Streamflow Model: Enabling future flow predictions in data-scarce headwater streams

    NASA Astrophysics Data System (ADS)

    Rosner, A.; Letcher, B. H.; Vogel, R. M.

    2014-12-01

    Predicting streamflow in headwaters and over a broad spatial scale poses unique challenges due to limited data availability. Flow observation gages for headwater streams are less common than for larger rivers, and gages with record lengths of ten years or more are even scarcer. Thus, there is a great need for estimating streamflows in ungaged or sparsely gaged headwaters. Further, there is often insufficient basin information to develop rainfall-runoff models that could be used to predict future flows under various climate scenarios. Headwaters in the northeastern U.S. are of particular concern to aquatic biologists, as these streams serve as essential habitat for native coldwater fish. In order to understand fish response to past or future environmental drivers, estimates of seasonal streamflow are needed. While flow data are limited, there is a wealth of data on historic weather conditions: observed data have been modeled to interpolate a spatially continuous historic weather dataset (Maurer et al. 2002). We present a statistical model developed by pairing streamflow observations with precipitation and temperature information for the same and preceding time-steps. We demonstrate this model's use to predict flow metrics at the seasonal time-step. While not a physical model, this statistical model represents the weather drivers. Since this model can predict flows not directly tied to reference gages, we can generate flow estimates for historic as well as potential future conditions.
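The pairing of seasonal flow with same- and preceding-season weather described above can be sketched as a lagged feature table plus a least-squares fit; all records and field names below are hypothetical:

```python
def lagged_table(flow, precip, temp):
    """Pair each season's flow with current and previous-season weather;
    the first season is dropped because it has no antecedent record."""
    return [{"flow": flow[t], "precip": precip[t], "temp": temp[t],
             "precip_lag1": precip[t - 1], "temp_lag1": temp[t - 1]}
            for t in range(1, len(flow))]

def slope(x, y):
    """Closed-form least-squares slope for a single predictor."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Hypothetical seasonal records (flow in m3/s, precip in mm, temp in C)
flow = [10.0, 14.0, 9.0, 12.0, 8.0]
precip = [100.0, 140.0, 90.0, 120.0, 80.0]
temp = [4.0, 12.0, 20.0, 11.0, 3.0]
rows = lagged_table(flow, precip, temp)
b_precip = slope([r["precip"] for r in rows], [r["flow"] for r in rows])
```

Because the predictors are weather variables rather than gage records, the same fitted relationship can be driven by downscaled future climate series to produce flow estimates at ungaged sites.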

  6. Validating the Malheur model for predicting ponderosa pine post-fire mortality using 24 fires in the Pacific Northwest, USA

    Treesearch

    Walter G. Thies; Douglas J. Westlind

    2012-01-01

    Fires, whether intentionally or accidentally set, commonly occur in western interior forests of the US. Following fire, managers need the ability to predict mortality of individual trees based on easily observed characteristics. Previously, a two-factor model using crown scorch and bole scorch proportions was developed with data from 3415 trees for predicting the...

  7. Predicting past and future diameter growth for trees in the northeastern United States

    Treesearch

    James A. Westfall

    2006-01-01

    Tree diameter growth models are widely used in forestry applications, often to predict tree size at a future point in time. Also, there are instances where projections of past diameters are needed. A relative diameter growth model was developed to allow prediction of both future and past growth rates. Coefficients were estimated for 15 species groups that cover most...

  8. Predicting homeowners' approval of fuel management at the wild-urban interface using the theory of reasoned action.

    Treesearch

    Christine A. Vogt; Greg Winter; Jeremy S. Fried

    2005-01-01

    Social science models are increasingly needed as a framework for explaining and predicting how members of the public respond to the natural environment and their communities. The theory of reasoned action is widely used in human dimensions research on natural resource problems and work is ongoing to increase the predictive power of models based on this theory. This...

  9. Predicting post-fire tree mortality for 14 conifers in the Pacific Northwest, USA: Model evaluation, development, and thresholds

    Treesearch

    Lindsay M. Grayson; Robert A. Progar; Sharon M. Hood

    2017-01-01

    Fire is a driving force in the North American landscape and predicting post-fire tree mortality is vital to land management. Post-fire tree mortality can have substantial economic and social impacts, and natural resource managers need reliable predictive methods to anticipate potential mortality following fire events. Current fire mortality models are limited to a few...

  10. A reciprocal effects model of the temporal ordering of basic psychological needs and motivation.

    PubMed

    Martinent, Guillaume; Guillet-Descas, Emma; Moiret, Sophie

    2015-04-01

    Using self-determination theory as the framework, we examined the temporal ordering between satisfaction and thwarting of basic psychological needs and motivation. We accomplished this goal by using a two-wave 7-month partial least squares path modeling approach (PLS-PM) among a sample of 94 adolescent athletes (Mage = 15.96) in an intensive training setting. The PLS-PM results showed significant paths leading: (a) from T1 satisfaction of basic psychological need for competence to T2 identified regulation, (b) from T1 external regulation to T2 thwarting and satisfaction of basic psychological need for competence, and (c) from T1 amotivation to T2 satisfaction of basic psychological need for relatedness. Overall, our results suggest that the relationship between basic psychological need and motivation varied depending on the type of basic need and motivation assessed. Basic psychological need for competence predicted identified regulation over time whereas amotivation and external regulation predicted basic psychological need for relatedness or competence over time.

  11. Exploring the social-environmental determinants of well- and ill-being in dancers: a test of basic needs theory.

    PubMed

    Quested, Eleanor; Duda, Joan L

    2010-02-01

    Grounded in the basic needs mini-theory (Deci & Ryan, 2000), this study examined the interplay among perceptions of the social environment manifested in vocational dance schools, basic need satisfaction, and indices of elite dancers' well- and ill-being. The hypothesized mediating role of need satisfaction was also tested. Dancers (N = 392) completed a questionnaire tapping the targeted variables. Structural equation modeling supported a model in which perceptions of task-involving dance environments positively predicted need satisfaction. Perceived ego-involving climates negatively corresponded with competence and relatedness. Perceptions of autonomy support were positively related to autonomy and relatedness. Need satisfaction positively predicted positive affect. Competence and relatedness satisfaction corresponded negatively to reported negative affect. Emotional and physical exhaustion was not related to need satisfaction. Partial support emerged for the assumed mediation of the needs. Results highlight the relevance of task-involving and autonomy-supportive dance climates for elite dancers' need satisfaction and healthful engagement in vocational dance.

  12. Effects of soil moisture on the diurnal pattern of pesticide emission: Numerical simulation and sensitivity analysis

    USDA-ARS's Scientific Manuscript database

    Accurate prediction of pesticide volatilization is important for the protection of human and environmental health. Due to the complexity of the volatilization process, sophisticated predictive models are needed, especially for dry soil conditions. A mathematical model was developed to allow simulati...

  13. Impact of predictive model-directed end-of-life counseling for Medicare beneficiaries.

    PubMed

    Hamlet, Karen S; Hobgood, Adam; Hamar, Guy Brent; Dobbs, Angela C; Rula, Elizabeth Y; Pope, James E

    2010-05-01

    To validate a predictive model for identifying Medicare beneficiaries who need end-of-life care planning and to determine the impact on cost and hospice care of a telephonic counseling program utilizing this predictive model in 2 Medicare Health Support (MHS) pilots. Secondary analysis of data from 2 MHS pilot programs that used a randomized controlled design. A predictive model was developed using intervention group data (N = 43,497) to identify individuals at greatest risk of death. Model output guided delivery of a telephonic intervention designed to support educated end-of-life decisions and improve end-of-life provisions. Control group participants received usual care. As a primary outcome, Medicare costs in the last 6 months of life were compared between intervention group decedents (n = 3112) and control group decedents (n = 1630). Hospice admission rates and duration of hospice care were compared as secondary measures. The predictive model was highly accurate, and more than 80% of intervention group decedents were contacted during the 12 months before death. Average Medicare costs were $1913 lower for intervention group decedents compared with control group decedents in the last 6 months of life (P = .05), for a total savings of $5.95 million. There were no significant changes in hospice admissions or mean duration of hospice care. Telephonic end-of-life counseling provided as an ancillary Medicare service, guided by a predictive model, can reach a majority of individuals needing support and can reduce costs by facilitating voluntary election of less intensive care.

  14. An Overview of Numerical Weather Prediction on Various Scales

    NASA Astrophysics Data System (ADS)

    Bao, J.-W.

    2009-04-01

    The increasing public need for detailed weather forecasts, along with the advances in computer technology, has motivated many research institutes and national weather forecasting centers to develop and run global as well as regional numerical weather prediction (NWP) models at high resolutions (i.e., with horizontal resolutions of ~10 km or higher for global models and 1 km or higher for regional models, and with ~60 vertical levels or higher). The need for running NWP models at high horizontal and vertical resolutions requires the implementation of a non-hydrostatic dynamic core with a choice of horizontal grid configurations and vertical coordinates that are appropriate for high resolutions. Development of advanced numerics will also be needed for high-resolution global and regional models, in particular when the models are applied to transport problems and air quality applications. In addition to the challenges in numerics, the NWP community is also facing the challenges of developing physics parameterizations that are well suited for high-resolution NWP models. For example, when NWP models are run at resolutions of ~5 km or higher, the use of much more detailed microphysics parameterizations than those currently used in NWP models will become important. Another example is that regional NWP models at ~1 km or higher only partially resolve convective energy-containing eddies in the lower troposphere. Parameterizations to account for the subgrid diffusion associated with unresolved turbulence still need to be developed. Further, physically sound parameterizations for air-sea interaction will be a critical component for tropical NWP models, particularly for hurricane prediction models. In this review presentation, the above issues will be elaborated on and the approaches to address them will be discussed.

  15. Epigenome-wide cross-tissue predictive modeling and comparison of cord blood and placental methylation in a birth cohort

    PubMed Central

    De Carli, Margherita M; Baccarelli, Andrea A; Trevisi, Letizia; Pantic, Ivan; Brennan, Kasey JM; Hacker, Michele R; Loudon, Holly; Brunst, Kelly J; Wright, Robert O; Wright, Rosalind J; Just, Allan C

    2017-01-01

    Aim: We compared predictive modeling approaches to estimate placental methylation using cord blood methylation. Materials & methods: We performed locus-specific methylation prediction using both linear regression and support vector machine models with 174 matched pairs of 450k arrays. Results: At most CpG sites, both approaches gave poor predictions in spite of a misleading improvement in array-wide correlation. CpG islands and gene promoters, but not enhancers, were the genomic contexts where the correlation between measured and predicted placental methylation levels achieved higher values. We provide a list of 714 sites where both models achieved an R2 ≥0.75. Conclusion: The present study indicates the need for caution in interpreting cross-tissue predictions. Few methylation sites can be predicted between cord blood and placenta. PMID:28234020
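The gap between array-wide correlation and locus-specific prediction noted above can be reproduced on toy numbers: pooled correlation is dominated by between-site mean differences even when the predictions carry no within-site (between-sample) signal. The sites and values below are fabricated for illustration:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Two hypothetical CpG sites, four samples each: the predictions track each
# site's mean methylation level but are uncorrelated with samples within it.
site1_meas, site1_pred = [0.18, 0.22, 0.20, 0.24], [0.21, 0.23, 0.17, 0.19]
site2_meas, site2_pred = [0.78, 0.82, 0.80, 0.84], [0.81, 0.83, 0.77, 0.79]

pooled_r = pearson(site1_meas + site2_meas, site1_pred + site2_pred)  # ~0.99
site1_r2 = pearson(site1_meas, site1_pred) ** 2                       # ~0
```

This is why the abstract's per-site R² ≥ 0.75 criterion is a far stricter (and more honest) standard than an array-wide correlation.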

  16. Incorporating genetic variation into a model of budburst phenology of coast Douglas-fir (Pseudotsuga menziesii var

    Treesearch

    Peter J. Gould; Constance A. Harrington; Bradley J. St Clair

    2011-01-01

    Models to predict budburst and other phenological events in plants are needed to forecast how climate change may impact ecosystems and for the development of mitigation strategies. Differences among genotypes are important to predicting phenological events in species that show strong clinal variation in adaptive traits. We present a model that incorporates the effects...

  17. Gap model development, validation, and application to succession of secondary subtropical dry forests of Puerto Rico

    Treesearch

    Jennifer A. Holm; H.H. Shugart; Skip J. Van Bloem; G.R. Larocque

    2012-01-01

    Because of human pressures, the need to understand and predict the long-term dynamics and development of subtropical dry forests is urgent. Through modifications to the ZELIG simulation model, including the development of species- and site-specific parameters and internal modifications, the capability to model and predict forest change within the 4500-ha Guanica State...

  18. Model structures amplify uncertainty in predicted soil carbon responses to climate change.

    PubMed

    Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien

    2018-06-04

    Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbial explicit model project much greater uncertainties to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projection. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.
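The conventional soil C structure the abstract refers to can be sketched as a one-pool model with temperature-sensitive first-order decay; the parameters below are illustrative, and the alternative structures discussed (vertically resolved, microbial explicit) would replace the decay term rather than just its parameters:

```python
def soil_c_first_order(c0, inputs, k_ref, q10, temps, dt=1.0):
    """Euler integration of the conventional one-pool model
    dC/dt = I - k(T) * C, with k(T) = k_ref * q10 ** ((T - 15) / 10).
    All parameter values used here are illustrative, not calibrated."""
    c, series = c0, []
    for t in temps:
        k = k_ref * q10 ** ((t - 15.0) / 10.0)
        c += dt * (inputs - k * c)
        series.append(c)
    return series

# Baseline climate vs a +10 C warming scenario
baseline = soil_c_first_order(0.0, 1.0, 0.1, 2.0, [15.0] * 500)
warmed = soil_c_first_order(0.0, 1.0, 0.1, 2.0, [25.0] * 500)
# baseline approaches I/k = 10; warming doubles k, halving the stock
```

In this structure the equilibrium stock is always I/k(T), so warming can only reduce soil C (a positive C-climate feedback); the abstract's point is that other structures do not share this guarantee, which is a structural rather than parametric source of uncertainty.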

  19. Scientific reporting is suboptimal for aspects that characterize genetic risk prediction studies: a review of published articles based on the Genetic RIsk Prediction Studies statement.

    PubMed

    Iglesias, Adriana I; Mihaescu, Raluca; Ioannidis, John P A; Khoury, Muin J; Little, Julian; van Duijn, Cornelia M; Janssens, A Cecile J W

    2014-05-01

    Our main objective was to raise awareness of the areas that need improvements in the reporting of genetic risk prediction articles for future publications, based on the Genetic RIsk Prediction Studies (GRIPS) statement. We evaluated studies that developed or validated a prediction model based on multiple DNA variants, using empirical data, and were published in 2010. A data extraction form based on the 25 items of the GRIPS statement was created and piloted. Forty-two studies met our inclusion criteria. Overall, more than half of the evaluated items (34 of 62) were reported in at least 85% of included articles. Seventy-seven percent of the articles were identified as genetic risk prediction studies through title assessment, but only 31% used the keywords recommended by GRIPS in the title or abstract. Seventy-four percent mentioned which allele was the risk variant. Overall, only 10% of the articles reported all essential items needed to perform external validation of the risk model. Completeness of reporting in genetic risk prediction studies is adequate for general elements of study design but is suboptimal for several aspects that characterize genetic risk prediction studies, such as description of the model construction. Improvements in the transparency of reporting of these aspects would facilitate the identification, replication, and application of genetic risk prediction models. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Life extending control for rocket engines

    NASA Technical Reports Server (NTRS)

    Lorenzo, C. F.; Saus, J. R.; Ray, A.; Carpino, M.; Wu, M.-K.

    1992-01-01

    The concept of life extending control is defined. A brief discussion of current fatigue life prediction methods is given and the need for an alternative life prediction model based on a continuous functional relationship is established. Two approaches to life extending control are considered: (1) the implicit approach which uses cyclic fatigue life prediction as a basis for control design; and (2) the continuous life prediction approach which requires a continuous damage law. Progress on an initial formulation of a continuous (in time) fatigue model is presented. Finally, nonlinear programming is used to develop initial results for life extension for a simplified rocket engine (model).
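A continuous (in time) damage law of the kind the abstract calls for can be caricatured as a rate equation integrated alongside the stress history, in contrast to cyclic counting. This toy law and its constants are our illustration, not the paper's formulation:

```python
def damage(stress_history, dt, s_ref, m):
    """Toy continuous damage law dD/dt = (s(t) / s_ref) ** m, integrated by
    Euler in time; failure is declared when D reaches 1. The power-law form,
    reference stress s_ref, and exponent m are hypothetical stand-ins."""
    d = 0.0
    for s in stress_history:
        d += dt * (s / s_ref) ** m
        if d >= 1.0:          # accumulated damage reaches the failure level
            break
    return d

# Running at reference stress for 0.5 time units consumes half the life;
# with exponent m = 4, halving the stress cuts the damage rate 16-fold.
d_half = damage([100.0] * 5, 0.1, 100.0, 4)
d_low = damage([50.0] * 5, 0.1, 100.0, 4)
```

A life extending controller would exploit exactly this kind of differentiable damage-versus-load trade-off when shaping engine transients.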

  1. Comparison of modeled backscatter with SAR data at P-band

    NASA Technical Reports Server (NTRS)

    Wang, Yong; Davis, Frank W.; Melack, John M.

    1992-01-01

    In recent years several analytical models have been developed to predict microwave scattering by trees and forest canopies. These models contribute to the understanding of radar backscatter over forested regions to the extent that they capture the basic interactions between microwave radiation and tree canopies, understories, and ground layers as functions of incidence angle, wavelength, and polarization. The Santa Barbara microwave backscatter model for woodland (i.e. with discontinuous tree canopies) combines a single-tree backscatter model and a gap probability model. Comparison of model predictions with synthetic aperture radar (SAR) data at L-band (lambda = 0.235 m) is promising, but much work is still needed to test the validity of model predictions at other wavelengths. The validity of the model predictions at P-band (lambda = 0.68 m) was tested for woodland stands at our Mt. Shasta test site.

  2. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, the need for sophisticated computer analytics is required to keep track of the number and complex interplay among the molecular abnormalities. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  3. Predicting the need for CT imaging in children with minor head injury using an ensemble of Naive Bayes classifiers.

    PubMed

    Klement, William; Wilk, Szymon; Michalowski, Wojtek; Farion, Ken J; Osmond, Martin H; Verter, Vedat

    2012-03-01

    Using an automatic data-driven approach, this paper develops a prediction model that achieves more balanced performance (in terms of sensitivity and specificity) than the Canadian Assessment of Tomography for Childhood Head Injury (CATCH) rule, when predicting the need for computed tomography (CT) imaging of children after a minor head injury. CT is widely considered an effective tool for evaluating patients with minor head trauma who have potentially suffered serious intracranial injury. However, its use poses possible harmful effects, particularly for children, due to exposure to radiation. Safety concerns, along with issues of cost and practice variability, have led to calls for the development of effective methods to decide when CT imaging is needed. Clinical decision rules represent such methods and are normally derived from the analysis of large prospectively collected patient data sets. The CATCH rule was created by a group of Canadian pediatric emergency physicians to support the decision of referring children with minor head injury to CT imaging. The goal of the CATCH rule was to maximize the sensitivity of predictions of potential intracranial lesion while keeping specificity at a reasonable level. After extensive analysis of the CATCH data set, characterized by severe class imbalance, and after a thorough evaluation of several data mining methods, we derived an ensemble of multiple Naive Bayes classifiers as the prediction model for CT imaging decisions. In the first phase of the experiment we compared the proposed ensemble model to other ensemble models employing rule-, tree- and instance-based member classifiers. Our prediction model demonstrated the best performance in terms of AUC, G-mean and sensitivity measures. 
In the second phase, using a bootstrapping experiment similar to that reported by the CATCH investigators, we showed that the proposed ensemble model achieved a more balanced predictive performance than the CATCH rule with an average sensitivity of 82.8% and an average specificity of 74.4% (vs. 98.1% and 50.0% for the CATCH rule respectively). Automatically derived prediction models cannot replace a physician's acumen. However, they help establish reference performance indicators for the purpose of developing clinical decision rules so the trade-off between prediction sensitivity and specificity is better understood. Copyright © 2011 Elsevier B.V. All rights reserved.
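The ensemble idea described above can be sketched in a few lines. The following is a hypothetical illustration on synthetic data, not the CATCH dataset or the authors' exact pipeline: each Naive Bayes member is trained on a class-balanced bootstrap sample (one common way to handle severe class imbalance), and the members vote on the final referral decision.

```python
import math
import random
from statistics import mean, pstdev

class GaussianNB:
    """Minimal Gaussian Naive Bayes for continuous features."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats, self.priors = {}, {}
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            self.priors[c] = len(rows) / len(X)
            cols = list(zip(*rows))
            # Per-feature mean and spread; floor the spread to avoid div-by-zero.
            self.stats[c] = [(mean(col), pstdev(col) or 1e-9) for col in cols]
        return self

    def _loglik(self, x, c):
        ll = math.log(self.priors[c])
        for v, (mu, sd) in zip(x, self.stats[c]):
            ll += -math.log(sd * math.sqrt(2 * math.pi)) - (v - mu) ** 2 / (2 * sd ** 2)
        return ll

    def predict(self, X):
        return [max(self.classes, key=lambda c: self._loglik(x, c)) for x in X]

def balanced_bootstrap_ensemble(X, y, n_members=11, seed=0):
    """Train each member on a class-balanced bootstrap sample."""
    rng = random.Random(seed)
    pos = [x for x, l in zip(X, y) if l == 1]
    neg = [x for x, l in zip(X, y) if l == 0]
    n = min(len(pos), len(neg))
    members = []
    for _ in range(n_members):
        sample_x = [rng.choice(pos) for _ in range(n)] + [rng.choice(neg) for _ in range(n)]
        sample_y = [1] * n + [0] * n
        members.append(GaussianNB().fit(sample_x, sample_y))
    return members

def vote(members, X):
    """Majority vote across ensemble members."""
    preds = [m.predict(X) for m in members]
    return [1 if sum(col) > len(members) / 2 else 0 for col in zip(*preds)]

# Synthetic imbalanced data: 10% positives with a shifted mean.
rng = random.Random(42)
X = [[rng.gauss(2.0, 1.0), rng.gauss(2.0, 1.0)] for _ in range(30)] \
  + [[rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)] for _ in range(270)]
y = [1] * 30 + [0] * 270
preds = vote(balanced_bootstrap_ensemble(X, y), X)
sensitivity = sum(p for p, t in zip(preds, y) if t == 1) / 30
specificity = sum(1 - p for p, t in zip(preds, y) if t == 0) / 270
```

Because every member sees a balanced sample, the vote does not collapse toward the majority class, which is what gives the more balanced sensitivity/specificity trade-off the paper reports.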

  4. Breeding Jatropha curcas by genomic selection: A pilot assessment of the accuracy of predictive models.

    PubMed

    Azevedo Peixoto, Leonardo de; Laviola, Bruno Galvêas; Alves, Alexandre Alonso; Rosado, Tatiana Barbosa; Bhering, Leonardo Lopes

    2017-01-01

    Genome-wide selection (GWS) is a promising approach for improving selection accuracy in plant breeding, particularly in species with long life cycles, such as Jatropha. Therefore, the objectives of this study were to estimate the genetic parameters for grain yield (GY) and the weight of 100 seeds (W100S) using restricted maximum likelihood (REML); to compare the performance of GWS methods to predict GY and W100S; and to estimate how many markers are needed to train the GWS model to obtain the maximum accuracy. Eight GWS models were compared in terms of predictive ability. The impact of marker density on predictive ability was investigated using a varying number of markers, from 2 to 1,248. Because the genetic variance between evaluated genotypes was significant, it was possible to obtain selection gain. All of the GWS methods tested in this study can be used to predict GY and W100S in Jatropha. A training model fitted using 1,000 and 800 markers is sufficient to capture the maximum genetic variance and, consequently, the maximum prediction ability for GY and W100S, respectively. This study demonstrated the applicability of genome-wide prediction to identify useful genetic sources of GY and W100S for Jatropha breeding. Further research is needed to confirm the applicability of the proposed approach to other complex traits.
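One of the standard GWS methods compared in studies of this kind is ridge regression of phenotypes on marker genotypes (the RR-BLUP family). The sketch below is a minimal, hypothetical illustration on simulated markers, not the authors' Jatropha data or their exact eight models:

```python
import numpy as np

def ridge_marker_effects(markers, phenotypes, lam=1.0):
    """RR-BLUP-style ridge regression: estimate marker effects by solving
    (X'X + lam*I) beta = X'y on centred marker codes (0/1/2)."""
    X = markers - markers.mean(axis=0)
    y = phenotypes - phenotypes.mean()
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def predictive_ability(markers, phenotypes, beta):
    """Correlation between predicted and observed phenotype, the usual
    'accuracy' statistic reported in GWS studies."""
    X = markers - markers.mean(axis=0)
    pred = X @ beta
    return float(np.corrcoef(pred, phenotypes - phenotypes.mean())[0, 1])

# Simulate 100 genotypes x 300 SNPs; 20 SNPs carry true effects.
rng = np.random.default_rng(1)
M = rng.integers(0, 3, size=(100, 300)).astype(float)
true = np.zeros(300)
true[:20] = rng.normal(0.0, 1.0, 20)
y = (M - M.mean(axis=0)) @ true + rng.normal(0.0, 1.0, 100)
beta = ridge_marker_effects(M, y, lam=10.0)
r = predictive_ability(M, y, beta)
```

In practice the correlation would be estimated by cross-validation on held-out genotypes rather than in-sample as here; this sketch only shows the shape of the computation.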

  5. A self-determination theory approach to understanding the antecedents of teachers' motivational strategies in physical education.

    PubMed

    Taylor, Ian M; Ntoumanis, Nikos; Standage, Martyn

    2008-02-01

    Physical education teachers can influence students' self-determination through the motivational strategies that they use. The current study examined how teachers' reported use of three motivational strategies (providing a meaningful rationale, providing instrumental help and support, and gaining an understanding of the students) was predicted by perceived job pressure, perceptions of student self-determination, the teachers' autonomous orientation, psychological need satisfaction, and self-determination to teach. Structural equation modeling supported a model in which perceived job pressure, perceptions of student self-determination, and teacher autonomous orientation predicted teacher psychological need satisfaction, which, in turn, positively influenced teacher self-determination. The latter positively predicted the use of all three strategies. Direct positive effects of teachers' psychological need satisfaction on the strategies of gaining an understanding of students and providing instrumental help and support were also found. In summary, factors that influence teacher motivation may also indirectly affect their motivational strategies toward students.

  6. A hypothetical model for predicting the toxicity of high aspect ratio nanoparticles (HARN)

    NASA Astrophysics Data System (ADS)

    Tran, C. L.; Tantra, R.; Donaldson, K.; Stone, V.; Hankin, S. M.; Ross, B.; Aitken, R. J.; Jones, A. D.

    2011-12-01

    The ability to predict nanoparticle (dimensional structures less than 100 nm in size) toxicity through the use of a suitable model is an important goal if nanoparticles are to be regulated in terms of exposures and toxicological effects. Recently, a model to predict the toxicity of nanoparticles with a high aspect ratio has been put forward by a consortium of scientists. The high aspect ratio nanoparticle (HARN) model is a platform that relates the physical dimensions of HARN (specifically length and diameter ratio) and biopersistence to their toxicity in biological environments. Potentially, this model is of great public health and economic importance, as it could be used not only to predict toxicological activity but also to classify the toxicity of various fibrous nanoparticles, without the need to carry out time-consuming and expensive toxicology studies. However, this model of toxicity is currently hypothetical in nature and is based solely on drawing similarities in its dimensional geometry with that of asbestos and synthetic vitreous fibres. The aim of this review is two-fold: (a) to present findings from past literature on the physicochemical property and pathogenicity bioassay testing of HARN, and (b) to identify some of the challenges and future research steps crucial before the HARN model can be accepted as a predictive model. By presenting what has been done, we are able to identify scientific challenges and research directions that are needed for the HARN model to gain public acceptance.
Our recommendations for future research include the need to: (a) accurately link physicochemical data with corresponding pathogenicity assay data, through the use of suitable reference standards and standardised protocols; (b) develop better tools/techniques for physicochemical characterisation; (c) develop better ways of monitoring HARN in the workplace; and (d) reliably measure dose exposure levels, in order to support future epidemiological studies.

  7. CONFOLD2: improved contact-driven ab initio protein structure modeling.

    PubMed

    Adhikari, Badri; Cheng, Jianlin

    2018-01-25

    Contact-guided protein structure prediction methods are becoming more and more successful because of the latest advances in residue-residue contact prediction. To support contact-driven structure prediction, effective tools are needed that can quickly build tertiary structural models of good quality from predicted contacts. We develop an improved contact-driven protein modelling method, CONFOLD2, and study how it may be effectively used for ab initio protein structure prediction with predicted contacts as input. It builds models using various subsets of the input contacts to explore the fold space under the guidance of a soft square energy function, and then clusters the models to obtain the top five models. CONFOLD2 obtains an average reconstruction accuracy of 0.57 TM-score for the 150 proteins in the PSICOV contact prediction dataset. When benchmarked on the CASP11 contacts predicted using CONSIP2 and the CASP12 contacts predicted using Raptor-X, CONFOLD2 achieves a mean TM-score of 0.41 on both datasets. CONFOLD2 allows quick generation of the top five structural models for a protein sequence when its secondary structure and contact predictions are at hand. The source code of CONFOLD2 is publicly available at https://github.com/multicom-toolbox/CONFOLD2/.

  8. Critical research issues in development of biomathematical models of fatigue and performance.

    PubMed

    Dinges, David F

    2004-03-01

    This article reviews the scientific research needed to ensure the continued development, validation, and operational transition of biomathematical models of fatigue and performance. These models originated from the need to ascertain the formal underlying relationships among sleep and circadian dynamics in the control of alertness and neurobehavioral performance capability. Priority should be given to research that further establishes their basic validity, including the accuracy of the core mathematical formulae and parameters that instantiate the interactions of sleep/wake and circadian processes. Since individuals can differ markedly and reliably in their responses to sleep loss and to countermeasures for it, models must incorporate estimates of these inter-individual differences, and research should identify predictors of them. To ensure models accurately predict recovery of function with sleep of varying durations, dose-response curves for recovery of performance as a function of prior sleep homeostatic load and the number of days of recovery are needed. It is also necessary to establish whether the accuracy of models is affected by using work/rest schedules as surrogates for sleep/wake inputs to models. Given the importance of light as both a circadian entraining agent and an alerting agent, research should determine the extent to which light input could incrementally improve model predictions of performance, especially in persons exposed to night work, jet lag, and prolonged work. Models seek to estimate behavioral capability and/or the relative risk of adverse events in a fatigued state. Research is needed on how best to scale and interpret metrics of behavioral capability, and incorporate factors that amplify or diminish the relationship between model predictions of performance and risk outcomes.

  9. Incorporation of Predictive Population Modeling into the AOP Framework: A Case Study with White Suckers Exposed to Pulp Effluent

    EPA Science Inventory

    A need in ecological risk assessment is the ability to create linkages between chemically-induced alterations at molecular and biochemical levels of organization with adverse outcomes in whole organisms and populations. A predictive model was developed to translate changes in th...

  10. Modeling the Earth system in the Mission to Planet Earth era

    NASA Technical Reports Server (NTRS)

    Unninayar, Sushel; Bergman, Kenneth H.

    1993-01-01

    A broad overview is made of global earth system modeling in the Mission to Planet Earth (MTPE) era for the multidisciplinary audience encompassed by the Global Change Research Program (GCRP). Time scales of global system fluctuation and change are described in Section 2. Section 3 provides a rubric for modeling the global earth system, as presently understood. The ability of models to predict the future state of the global earth system and the extent to which their predictions are reliable are covered in Sections 4 and 5. The 'engineering' use of global system models (and predictions) is covered in Section 6. Section 7 covers aspects of an increasing need for improved transform algorithms and better methods to assimilate this information into global models. Future monitoring and data requirements are detailed in Section 8. Section 9 covers the NASA-initiated concept 'Mission to Planet Earth,' which employs space and ground based measurement systems to provide the scientific basis for understanding global change. Section 10 concludes this review with general remarks concerning the state of global system modeling and observing technology and the need for future research.

  11. Defense Waste Processing Facility Nitric- Glycolic Flowsheet Chemical Process Cell Chemistry: Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamecnik, J.; Edwards, T.

    The conversions of nitrite to nitrate, the destruction of glycolate, and the conversion of glycolate to formate and oxalate were modeled for the Nitric-Glycolic flowsheet using data from Chemical Process Cell (CPC) simulant runs conducted by Savannah River National Laboratory (SRNL) from 2011 to 2016. The goal of this work was to develop empirical correlation models that predict these values from measurable variables of the chemical process, so that these quantities could be predicted a priori from the sludge or simulant composition and measurable processing variables. The need for these predictions arises from the need to predict the REDuction/OXidation (REDOX) state of the glass from the Defense Waste Processing Facility (DWPF) melter. This report summarizes the work on these correlations based on the aforementioned data. Previous work on these correlations was documented in a technical report covering data from 2011-2015; this current report supersedes that previous report. Further refinement of the models as additional data are collected is recommended.

  12. Need satisfaction, motivational regulations and exercise: moderation and mediation effects.

    PubMed

    Weman-Josefsson, Karin; Lindwall, Magnus; Ivarsson, Andreas

    2015-05-20

    Based on the Self-determination theory process model, this study aimed to explore relationships between the latent constructs of psychological need satisfaction, autonomous motivation and exercise behaviour; the mediational role of autonomous motivation in the association of psychological need satisfaction with exercise behaviour; as well as gender and age differences in the aforementioned associations. Adult active members of an Internet-based exercise program (n = 1091) between 18 and 78 years of age completed a test battery on motivational aspects based on Self-determination theory. The Basic Psychological Needs in Exercise Scale and the Behavioural Regulation in Exercise Questionnaire-2 were used to measure need satisfaction and type of motivation and the Leisure Time Exercise Questionnaire to measure self-reported exercise. Need satisfaction predicted autonomous motivation, which in turn predicted exercise, especially for women. Autonomous motivation was found to mediate the association between need satisfaction and exercise. Age and gender moderated several of the paths in the model linking need satisfaction with motivation and exercise. The results demonstrated gender and age differences in the proposed sequential mechanisms between autonomous motivation and exercise in the process model. This study thus highlights a potential value in considering moderating factors and the need to further examine the underlying mechanisms between needs, autonomous motivation, and exercise behaviour.

  13. Validation of model predictions of pore-scale fluid distributions during two-phase flow

    NASA Astrophysics Data System (ADS)

    Bultreys, Tom; Lin, Qingyang; Gao, Ying; Raeini, Ali Q.; AlRatrout, Ahmed; Bijeljic, Branko; Blunt, Martin J.

    2018-05-01

    Pore-scale two-phase flow modeling is an important technology to study a rock's relative permeability behavior. To investigate if these models are predictive, the calculated pore-scale fluid distributions which determine the relative permeability need to be validated. In this work, we introduce a methodology to quantitatively compare models to experimental fluid distributions in flow experiments visualized with microcomputed tomography. First, we analyzed five repeated drainage-imbibition experiments on a single sample. In these experiments, the exact fluid distributions were not fully repeatable on a pore-by-pore basis, while the global properties of the fluid distribution were. Then two fractional flow experiments were used to validate a quasistatic pore network model. The model correctly predicted the fluid present in more than 75% of pores and throats in drainage and imbibition. To quantify what this means for the relevant global properties of the fluid distribution, we compare the main flow paths and the connectivity across the different pore sizes in the modeled and experimental fluid distributions. These essential topology characteristics matched well for drainage simulations, but not for imbibition. This suggests that the pore-filling rules in the network model we used need to be improved to make reliable predictions of imbibition. The presented analysis illustrates the potential of our methodology to systematically and robustly test two-phase flow models to aid in model development and calibration.
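The pore-by-pore comparison used in this validation reduces to a simple agreement score between the simulated and the experimentally imaged occupying phase. A minimal sketch (the phase labels and values are hypothetical, not the paper's data):

```python
def pore_agreement(model_phase, exp_phase):
    """Fraction of pores/throats whose occupying phase (e.g. 'oil'/'brine')
    matches between a simulation and a segmented micro-CT experiment."""
    assert len(model_phase) == len(exp_phase)
    matches = sum(m == e for m, e in zip(model_phase, exp_phase))
    return matches / len(model_phase)

# Toy example: the model gets 3 of 4 pores right.
score = pore_agreement(["oil", "oil", "brine", "oil"],
                       ["oil", "brine", "brine", "oil"])
```

The paper's fuller analysis also compares global descriptors (main flow paths, connectivity by pore size), since, as the repeatability experiments show, exact pore-by-pore occupancy is not fully reproducible even between repeated experiments.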

  14. UXO Burial Prediction Fidelity

    DTIC Science & Technology

    2017-07-01

    been developed to predict the initial penetration depth of underwater mines. SERDP would like to know if and how these existing mine models could be...designed for near-cylindrical mines; for munitions, however, projectile-specific drag, lift, and moment coefficients are needed for estimating...as inputs. Other models have been built to estimate these initial conditions for mines dropped into water. Can these mine models be useful for

  15. SU-E-T-630: Predictive Modeling of Mortality, Tumor Control, and Normal Tissue Complications After Stereotactic Body Radiotherapy for Stage I Non-Small Cell Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindsay, WD; Oncora Medical, LLC, Philadelphia, PA; Berlind, CG

    Purpose: While rates of local control have been well characterized after stereotactic body radiotherapy (SBRT) for stage I non-small cell lung cancer (NSCLC), less data are available characterizing survival and normal tissue toxicities, and no validated models exist assessing these parameters after SBRT. We evaluate the reliability of various machine learning techniques when applied to radiation oncology datasets to create predictive models of mortality, tumor control, and normal tissue complications. Methods: A dataset of 204 consecutive patients with stage I NSCLC treated with SBRT at the University of Pennsylvania between 2009 and 2013 was used to create predictive models of tumor control, normal tissue complications, and mortality in this IRB-approved study. Nearly 200 data fields of detailed patient- and tumor-specific information, radiotherapy dosimetric measurements, and clinical outcomes data were collected. Predictive models were created for local tumor control, 1- and 3-year overall survival, and nodal failure using 60% of the data (leaving the remainder as a test set). After applying feature selection and dimensionality reduction, nonlinear support vector classification was applied to the resulting features. Models were evaluated for accuracy and area under the ROC curve on the 81-patient test set. Results: Models for common events in the dataset (such as mortality at one year) had the highest predictive power (AUC = .67, p < 0.05). For rare occurrences such as radiation pneumonitis and local failure (each occurring in less than 10% of patients), too few events were present to create reliable models.
Conclusion: Although this study demonstrates the validity of predictive analytics using information extracted from patient medical records and can most reliably predict survival after SBRT, larger sample sizes are needed to develop predictive models for normal tissue toxicities, and more advanced machine learning methodologies need to be considered in the future.

  16. A generalized procedure for the prediction of multicomponent adsorption equilibria

    DOE PAGES

    Ladshaw, Austin; Yiacoumi, Sotira; Tsouris, Costas

    2015-04-07

    Prediction of multicomponent adsorption equilibria has been investigated for several decades. While theories are available to predict the adsorption behavior of ideal mixtures, there are few purely predictive theories that account for nonidealities in real systems. Most models for dealing with nonidealities contain interaction parameters that must be obtained through correlation with binary-mixture data; however, as the number of components in a system grows, the number of parameters that need to be obtained increases exponentially. Here, a generalized procedure is proposed, as an extension of the predictive real adsorbed solution theory, for determining the parameters of any activity model, for any number of components, without correlation. This procedure is then combined with the adsorbed solution theory to predict the adsorption behavior of mixtures. As this method can be applied to any isotherm model and any activity model, it is referred to as the generalized predictive adsorbed solution theory.
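A much simpler cousin of the adsorbed solution theory described above is the extended Langmuir model, which also predicts mixture loadings from pure-component parameters alone, but assumes an ideal mixture and so cannot capture the nonidealities this paper targets. A minimal sketch with hypothetical parameters:

```python
def extended_langmuir(p, qmax, b):
    """Extended Langmuir prediction of multicomponent adsorption loadings.
    p:    partial pressures of each component
    qmax: saturation capacities from pure-component isotherm fits
    b:    affinity constants from pure-component isotherm fits
    Loading of component i: q_i = qmax_i * b_i * p_i / (1 + sum_j b_j * p_j)."""
    denom = 1.0 + sum(bi * pi for bi, pi in zip(b, p))
    return [qm * bi * pi / denom for qm, bi, pi in zip(qmax, b, p)]

# Two-component example (made-up parameters): the shared denominator is
# what couples the components -- each one suppresses the other's loading.
q = extended_langmuir(p=[0.5, 1.0], qmax=[3.0, 2.0], b=[2.0, 0.5])
```

The design choice that the paper generalizes is precisely the one this simple model skips: replacing the ideal-mixture assumption with an activity model whose parameters can be determined without binary-mixture correlation.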

  17. Using Speculative Execution to Automatically Hide I/O Latency

    DTIC Science & Technology

    2001-12-07

    sion of the Lempel-Ziv algorithm and the finite multi-order context models (FMOC) that originated from prediction-by-partial-match data compressors...allowed the cancellation of a single hint at a time.) 2.2.4 Predicting future data needs. In order to take advantage of any of the algorithms described...modelling techniques generally used for data compression to perform probabilistic prediction of an application's next page fault (or, in an object-oriented

  18. Handling a Small Dataset Problem in Prediction Model by employ Artificial Data Generation Approach: A Review

    NASA Astrophysics Data System (ADS)

    Lateh, Masitah Abdul; Kamilah Muda, Azah; Yusof, Zeratul Izzah Mohd; Azilah Muda, Noor; Sanusi Azmi, Mohd

    2017-09-01

    The emerging era of big data in recent years has led to large and complex data sets that demand faster and better decision making. However, small dataset problems still arise in certain areas, making analysis difficult and decisions hard to reach. To build a prediction model, a large sample is required for training; a small dataset is insufficient to produce an accurate prediction model. This paper reviews artificial data generation approaches as one solution to the small dataset problem.
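Among the artificial data generation approaches such a review covers, one of the simplest is noise-based augmentation: resample the real rows and jitter them with noise scaled to each feature's spread. The sketch below is a generic, hypothetical illustration of that family, not a specific method from the paper:

```python
import random
from statistics import stdev

def augment(samples, n_new, noise_frac=0.1, seed=0):
    """Generate synthetic rows by resampling real rows and adding Gaussian
    noise whose standard deviation is noise_frac times each feature's spread."""
    rng = random.Random(seed)
    cols = list(zip(*samples))
    spreads = [stdev(c) if len(set(c)) > 1 else 1.0 for c in cols]
    out = []
    for _ in range(n_new):
        base = rng.choice(samples)
        out.append([v + rng.gauss(0.0, noise_frac * s)
                    for v, s in zip(base, spreads)])
    return out

# Four real rows (hypothetical), expanded to twenty synthetic training rows.
small = [[1.0, 10.0], [1.2, 11.0], [0.9, 9.5], [1.1, 10.5]]
synthetic = augment(small, n_new=20)
```

The synthetic rows stay close to the empirical distribution of the originals; the risk, which more refined methods try to manage, is that they add no genuinely new information about regions the small sample never observed.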

  19. Using Pareto points for model identification in predictive toxicology

    PubMed Central

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature on best practice for model generation and data integration, but the management and automated identification of relevant models from available collections of models are still open problems. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
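The core of identifying models via Pareto points can be sketched as follows. The two criteria used here (predictive accuracy and applicability-domain coverage) are illustrative assumptions, not the paper's exact criteria: a model is kept if no other model is at least as good on both criteria and strictly better on one.

```python
def pareto_front(models):
    """models: dict name -> (accuracy, coverage), both to be maximised.
    Returns the names of models not dominated by any other model."""
    front = []
    for name, (a, c) in models.items():
        dominated = any(
            a2 >= a and c2 >= c and (a2 > a or c2 > c)
            for n2, (a2, c2) in models.items() if n2 != name
        )
        if not dominated:
            front.append(name)
    return sorted(front)

# Hypothetical model collection: m3 is dominated by m2 on both criteria.
candidates = {
    "m1": (0.90, 0.40),
    "m2": (0.80, 0.70),
    "m3": (0.75, 0.65),
    "m4": (0.60, 0.90),
}
front = pareto_front(candidates)
```

Selecting from the Pareto front rather than by a single score keeps every defensible accuracy/coverage trade-off available, which is what makes the approach suitable for automated identification across heterogeneous model collections.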

  20. Model Design for Military Advisors

    DTIC Science & Technology

    2013-05-02

    needs of their counterpart. This paper explores one area that would significantly improve advising outcomes: using advising models to match the...more specific. This paper develops three dominant models for advisors: the Stoic Acquaintance, the General Manager, and the Entertainer, which can...then outcomes related to the individual counterpart's developmental needs will be more predictable and specific. This paper will focus only on

  1. NASA Langley developments in response calculations needed for failure and life prediction

    NASA Technical Reports Server (NTRS)

    Housner, Jerrold M.

    1993-01-01

    NASA Langley developments in response calculations needed for failure and life predictions are discussed. Topics covered include: structural failure analysis in concurrent engineering; accuracy of independent regional modeling demonstrated on a classical example; a functional interface method that accurately joins incompatible finite element models; extension of the interface method for insertion of local detail modeling to a curved pressurized fuselage window panel; an interface concept for joining structural regions; motivation for coupled 2D-3D analysis; a compression panel with a discontinuous stiffener coupled 2D-3D model and axial surface strains at the middle of the hat stiffener; use of adaptive refinement with multiple methods; adaptive mesh refinement; and studies quantifying the effect of bow-type initial imperfections on the reliability of stiffened panels.

  2. A Global Model for Bankruptcy Prediction

    PubMed Central

    Alaminos, David; del Castillo, Agustín; Fernández, Manuel Ángel

    2016-01-01

    The recent world financial crisis has increased the number of bankruptcies in numerous countries and has resulted in a new area of research which responds to the need to predict this phenomenon, not only at the level of individual countries, but also at a global level, offering explanations of the common characteristics shared by the affected companies. Nevertheless, few studies focus on the prediction of bankruptcies globally. In order to compensate for this lack of empirical literature, this study has used a methodological framework of logistic regression to construct predictive bankruptcy models for Asia, Europe and America, and other global models for the whole world. The objective is to construct a global model with a high capacity for predicting bankruptcy in any region of the world. The results obtained have allowed us to confirm the superiority of the global model in comparison to regional models over periods of up to three years prior to bankruptcy. PMID:27880810
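The methodological core of such bankruptcy models is plain logistic regression on financial ratios. The self-contained sketch below uses made-up ratios (leverage, profitability) and toy data, not the study's actual variables, regions, or sample:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Batch gradient descent on the logistic log-loss (no regularisation)."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * len(w), 0.0
        for x, t in zip(X, y):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - t   # predicted prob minus label
            for j, xj in enumerate(x):
                gw[j] += err * xj
            gb += err
        w = [wi - lr * g / n for wi, g in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict_prob(w, b, x):
    """Probability of bankruptcy for one firm's ratio vector."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Toy firms: [leverage, profitability]; label 1 = bankrupt.
X = [[0.9, -0.2], [0.8, -0.1], [0.7, -0.3],
     [0.2, 0.3], [0.3, 0.4], [0.1, 0.2]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
probs = [predict_prob(w, b, x) for x in X]
```

A "global" model in the paper's sense would simply be this fit pooled over firms from all regions, then compared against region-specific fits on held-out data.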

  3. A systematic review of models to predict recruitment to multicentre clinical trials.

    PubMed

    Barnard, Katharine D; Dent, Louise; Cook, Andrew

    2010-07-06

    Less than one third of publicly funded trials managed to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models which might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as usability, being based on experience, reflecting time trends, accounting for centre recruitment, and contributing to a commissioning decision. A systematic review of English language articles was conducted using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enroll, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Out of 326 identified abstracts, only 8 met all the inclusion criteria. Among these 8 studies, five major classes of model are discussed: the unconditional model, the conditional model, the Poisson model, Bayesian models and Monte Carlo simulation of Markov models. None of these meets all the pre-identified needs of the funder. To meet the needs of a number of research programmes, a new model is required as a matter of importance. Any model chosen should be validated against both retrospective and prospective data, to ensure the predictions it gives are superior to those currently used.
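Two of the model classes listed above are easy to illustrate: the unconditional model (a constant accrual rate) gives a point estimate of recruitment time, and a Poisson model adds uncertainty by simulation. The rates, centre counts, and target below are hypothetical:

```python
import random

def expected_months(target_n, n_centres, rate_per_centre):
    """Unconditional model: constant total accrual rate, deterministic time."""
    return target_n / (n_centres * rate_per_centre)

def simulate_months(target_n, n_centres, rate_per_centre, n_sims=2000, seed=0):
    """Poisson model: each centre recruits Poisson(rate) patients per month.
    Returns the mean number of whole months needed across simulations."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; adequate for small lam.
        limit, k, p = pow(2.718281828459045, -lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    totals = []
    for _ in range(n_sims):
        recruited, months = 0, 0
        while recruited < target_n:
            months += 1
            recruited += sum(poisson(rate_per_centre) for _ in range(n_centres))
        totals.append(months)
    return sum(totals) / len(totals)

# 10 centres, 2 patients/centre/month, target 300 patients.
point = expected_months(target_n=300, n_centres=10, rate_per_centre=2.0)
mc = simulate_months(target_n=300, n_centres=10, rate_per_centre=2.0)
```

The simulated mean sits slightly above the deterministic estimate because months are counted whole; richer models in the review additionally let the rate vary over time and between centres.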

  4. Psychopathy and Deviant Workplace Behavior: A Comparison of Two Psychopathy Models.

    PubMed

    Carre, Jessica R; Mueller, Steven M; Schleicher, Karly M; Jones, Daniel N

    2018-04-01

    Although psychopathy is an interpersonally harmful construct, few studies have compared different psychopathy models in predicting different types of workplace deviance. We examined how the Triarchic Psychopathy Model (TRI-PM) and the Self-Report Psychopathy-Short Form (SRP-SF) predicted deviant workplace behaviors in two forms: sexual harassment and deviant work behaviors. Using structural equation modeling, the latent factor of psychopathy was predictive of both types of deviant workplace behavior. Specifically, the SRP-SF significantly predicted both measures of deviant workplace behavior. With respect to the TRI-PM, meanness and disinhibition significantly predicted higher scores on workplace deviance and workplace sexual harassment measures. Future research needs to investigate the influence of psychopathy on deviant workplace behaviors and to consider the measures used when investigating these constructs.

  5. Mathematical prediction of core body temperature from environment, activity, and clothing: The heat strain decision aid (HSDA).

    PubMed

    Potter, Adam W; Blanchard, Laurie A; Friedl, Karl E; Cadarette, Bruce S; Hoyt, Reed W

    2017-02-01

    Physiological models provide useful summaries of complex interrelated regulatory functions. These can often be reduced to simple input requirements and simple predictions for pragmatic applications. This paper demonstrates this modeling efficiency by tracing the development of one such simple model, the Heat Strain Decision Aid (HSDA), originally developed to address Army needs. The HSDA, which derives from the Givoni-Goldman equilibrium body core temperature prediction model, uses 16 inputs from four elements: individual characteristics, physical activity, clothing biophysics, and environmental conditions. These inputs are used to mathematically predict core temperature (Tc) rise over time and can estimate water turnover from sweat loss. Based on a history of military applications such as derivation of training and mission planning tools, we conclude that the HSDA model is a robust integration of physiological rules that can guide a variety of useful predictions. The HSDA model is limited to generalized predictions of thermal strain and does not provide individualized predictions that could be obtained from physiological sensor data-driven predictive models. This fully transparent physiological model should be improved and extended with new findings and new challenging scenarios. Published by Elsevier Ltd.
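The structural form behind the Givoni-Goldman equilibrium model is an exponential approach of core temperature toward an equilibrium value set by activity, clothing, and environment. The sketch below shows only that generic form; the time constant and temperatures are hypothetical values for illustration, not HSDA's actual 16-input coefficients:

```python
import math

def core_temp(t_min, tc0, tc_eq, tau=60.0):
    """Exponential approach to an equilibrium core temperature:
    Tc(t) = Tc_eq + (Tc0 - Tc_eq) * exp(-t / tau).
    t_min: elapsed minutes; tau: time constant in minutes (assumed)."""
    return tc_eq + (tc0 - tc_eq) * math.exp(-t_min / tau)

# Transition from rest (37.0 C) to heavy work in heat with an assumed
# equilibrium of 38.5 C; sample the trajectory at a few time points.
traj = [core_temp(t, 37.0, 38.5) for t in (0, 30, 60, 120, 600)]
```

In the full HSDA, the equilibrium temperature and time constant are themselves computed from the individual, activity, clothing, and environmental inputs; this sketch only fixes the shape of the rise that those inputs parameterise.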

  6. The Current State of Predicting Furrow Irrigation Erosion

    USDA-ARS?s Scientific Manuscript database

    There continues to be a need to predict furrow irrigation erosion to estimate on- and off-site impacts of irrigation management. The objective of this paper is to review the current state of furrow erosion prediction technology considering four models: SISL, WEPP, WinSRFR and APEX. SISL is an empiri...

  7. The importance of accurate muscle modelling for biomechanical analyses: a case study with a lizard skull

    PubMed Central

    Gröning, Flora; Jones, Marc E. H.; Curtis, Neil; Herrel, Anthony; O'Higgins, Paul; Evans, Susan E.; Fagan, Michael J.

    2013-01-01

    Computer-based simulation techniques such as multi-body dynamics analysis are becoming increasingly popular in the field of skull mechanics. Multi-body models can be used for studying the relationships between skull architecture, muscle morphology and feeding performance. However, to be confident in the modelling results, models need to be validated against experimental data, and the effects of uncertainties or inaccuracies in the chosen model attributes need to be assessed with sensitivity analyses. Here, we compare the bite forces predicted by a multi-body model of a lizard (Tupinambis merianae) with in vivo measurements, using anatomical data collected from the same specimen. This subject-specific model predicts bite forces that are very close to the in vivo measurements and also shows a consistent increase in bite force as the bite position is moved posteriorly on the jaw. However, the model is very sensitive to changes in muscle attributes such as fibre length, intrinsic muscle strength and force orientation, with bite force predictions varying considerably when these three variables are altered. We conclude that accurate muscle measurements are crucial to building realistic multi-body models and that subject-specific data should be used whenever possible. PMID:23614944

  8. Synthesis of User Needs for Arctic Sea Ice Predictions

    NASA Astrophysics Data System (ADS)

    Wiggins, H. V.; Turner-Bogren, E. J.; Sheffield Guy, L.

    2017-12-01

    Forecasting Arctic sea ice on sub-seasonal to seasonal scales in a changing Arctic is of interest to a diverse range of stakeholders. However, sea ice forecasting is still challenging due to high variability in weather and ocean conditions and limits to prediction capabilities; the science needs for observations and modeling are extensive. At a time of challenged science funding, one way to prioritize sea ice prediction efforts is to examine the information needs of various stakeholder groups. This poster will present a summary and synthesis of existing surveys, reports, and other literature that examines user needs for sea ice predictions. The synthesis will include lessons learned from the Sea Ice Prediction Network (a collaborative, multi-agency-funded project focused on seasonal Arctic sea ice predictions), the Sea Ice for Walrus Outlook (a resource for Alaska Native subsistence hunters and coastal communities, that provides reports on weather and sea ice conditions), and other efforts. The poster will specifically compare the scales and variables of sea ice forecasts currently available, as compared to what information is requested by various user groups.

  9. Tailoring Mathematical Models to Stem-Cell Derived Cardiomyocyte Lines Can Improve Predictions of Drug-Induced Changes to Their Electrophysiology.

    PubMed

    Lei, Chon Lok; Wang, Ken; Clerx, Michael; Johnstone, Ross H; Hortigon-Vinagre, Maria P; Zamora, Victor; Allan, Andrew; Smith, Godfrey L; Gavaghan, David J; Mirams, Gary R; Polonchuk, Liudmila

    2017-01-01

Human induced pluripotent stem cell derived cardiomyocytes (iPSC-CMs) have applications in disease modeling, cell therapy, drug screening and personalized medicine. Computational models can be used to interpret experimental findings in iPSC-CMs, provide mechanistic insights, and translate these findings to adult cardiomyocyte (CM) electrophysiology. However, different cell lines display different expression of ion channels, pumps and receptors, and show differences in electrophysiology. In this exploratory study, we use a mathematical model based on iPSC-CMs from Cellular Dynamics International (CDI, iCell), and compare its predictions to novel experimental recordings made with the Axiogenesis Cor.4U line. We show that tailoring this model to the specific cell line, even using limited data and a relatively simple approach, leads to improved predictions of baseline behavior and response to drugs. This demonstrates the need and the feasibility to tailor models to individual cell lines, although a more refined approach will be needed to characterize individual currents, address differences in ion current kinetics, and further improve these results.

  10. Adapting the Water Erosion Prediction Project (WEPP) model for forest applications

    Treesearch

    Shuhui Dun; Joan Q. Wu; William J. Elliot; Peter R. Robichaud; Dennis C. Flanagan; James R. Frankenberger; Robert E. Brown; Arthur C. Xu

    2009-01-01

    There has been an increasing public concern over forest stream pollution by excessive sedimentation due to natural or human disturbances. Adequate erosion simulation tools are needed for sound management of forest resources. The Water Erosion Prediction Project (WEPP) watershed model has proved useful in forest applications where Hortonian flow is the major form of...

  11. Validation of a probabilistic post-fire erosion model

    Treesearch

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller

    2016-01-01

    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  12. Detecting Protected Health Information in Heterogeneous Clinical Notes.

    PubMed

    Henriksson, Aron; Kvist, Maria; Dalianis, Hercules

    2017-01-01

    To enable secondary use of healthcare data in a privacy-preserving manner, there is a need for methods capable of automatically identifying protected health information (PHI) in clinical text. To that end, learning predictive models from labeled examples has emerged as a promising alternative to rule-based systems. However, little is known about differences with respect to PHI prevalence in different types of clinical notes and how potential domain differences may affect the performance of predictive models trained on one particular type of note and applied to another. In this study, we analyze the performance of a predictive model trained on an existing PHI corpus of Swedish clinical notes and applied to a variety of clinical notes: written (i) in different clinical specialties, (ii) under different headings, and (iii) by persons in different professions. The results indicate that domain adaption is needed for effective detection of PHI in heterogeneous clinical notes.

  13. The relationship between language use and depression: illuminating the importance of self-reflection, self-rumination, and the need for absolute truth.

    PubMed

    Şimşek, Ömer Faruk

    2013-01-01

    The main aim of the present study was to provide additional knowledge about the mediatory processes through which language relates to depression. Although previous research gave clear evidence that language is closely related to depression, the research on intervening variables in the relationship has been limited. The present investigation tested a structural equation model in which self-concept clarity and self-consciousness mediated the relationship between personal perceptions of language and depression. Since "the need for absolute truth" construct has been shown to be important in providing greater consistency in estimates of the relationships among the variables, it has been added to the model as a control variable. The results supported the model and showed that personal perceptions of language predicted self-concept clarity, which in turn predicted the participants' self-reflection and self-rumination. Self-reflection and self-rumination, in turn, predicted depression.

  14. The scaling of geographic ranges: implications for species distribution models

    USGS Publications Warehouse

    Yackulic, Charles B.; Ginsberg, Joshua R.

    2016-01-01

    There is a need for timely science to inform policy and management decisions; however, we must also strive to provide predictions that best reflect our understanding of ecological systems. Species distributions evolve through time and reflect responses to environmental conditions that are mediated through individual and population processes. Species distribution models that reflect this understanding, and explicitly model dynamics, are likely to give more accurate predictions.

  15. Distributed collaborative decision support environments for predictive awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.; Stilman, Boris; Yakhnis, Vlad

    2005-05-01

The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, rapidly assess the enemy's course of action (eCOA) or possible actions and promulgate their own course of action (COA) - a need for predictive awareness. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Revolutionary new approaches to strategy generation and assessment such as Linguistic Geometry (LG) permit the rapid development of COA vs. enemy COA (eCOA). LG tools automatically generate and permit the operators to take advantage of winning strategies and tactics for mission planning and execution in near real-time. LG is predictive and employs deep "look-ahead" from the current state and provides a realistic, reactive model of adversary reasoning and behavior. Collaborative environments provide the framework and integrate models, simulations, and domain specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing research efforts in applying distributed collaborative environments to decision support for predictive mission awareness.

  16. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program : Bell Helicopter Textron accomplishments

    NASA Technical Reports Server (NTRS)

    Cronkhite, James D.

    1993-01-01

    Accurate vibration prediction for helicopter airframes is needed to 'fly from the drawing board' without costly development testing to solve vibration problems. The principal analytical tool for vibration prediction within the U.S. helicopter industry is the NASTRAN finite element analysis. Under the NASA DAMVIBS research program, Bell conducted NASTRAN modeling, ground vibration testing, and correlations of both metallic (AH-1G) and composite (ACAP) airframes. The objectives of the program were to assess NASTRAN airframe vibration correlations, to investigate contributors to poor agreement, and to improve modeling techniques. In the past, there has been low confidence in higher frequency vibration prediction for helicopters that have multibladed rotors (three or more blades) with predominant excitation frequencies typically above 15 Hz. Bell's findings under the DAMVIBS program, discussed in this paper, included the following: (1) accuracy of finite element models (FEM) for composite and metallic airframes generally were found to be comparable; (2) more detail is needed in the FEM to improve higher frequency prediction; (3) secondary structure not normally included in the FEM can provide significant stiffening; (4) damping can significantly affect phase response at higher frequencies; and (5) future work is needed in the areas of determination of rotor-induced vibratory loads and optimization.

  17. A general structure-property relationship to predict the enthalpy of vaporisation at ambient temperatures.

    PubMed

    Oberg, T

    2007-01-01

    The vapour pressure is the most important property of an anthropogenic organic compound in determining its partitioning between the atmosphere and the other environmental media. The enthalpy of vaporisation quantifies the temperature dependence of the vapour pressure and its value around 298 K is needed for environmental modelling. The enthalpy of vaporisation can be determined by different experimental methods, but estimation methods are needed to extend the current database and several approaches are available from the literature. However, these methods have limitations, such as a need for other experimental results as input data, a limited applicability domain, a lack of domain definition, and a lack of predictive validation. Here we have attempted to develop a quantitative structure-property relationship (QSPR) that has general applicability and is thoroughly validated. Enthalpies of vaporisation at 298 K were collected from the literature for 1835 pure compounds. The three-dimensional (3D) structures were optimised and each compound was described by a set of computationally derived descriptors. The compounds were randomly assigned into a calibration set and a prediction set. Partial least squares regression (PLSR) was used to estimate a low-dimensional QSPR model with 12 latent variables. The predictive performance of this model, within the domain of application, was estimated at n=560, q2Ext=0.968 and s=0.028 (log transformed values). The QSPR model was subsequently applied to a database of 100,000+ structures, after a similar 3D optimisation and descriptor generation. Reliable predictions can be reported for compounds within the previously defined applicability domain.

  18. On the predictive ability of mechanistic models for the Haitian cholera epidemic.

    PubMed

    Mari, Lorenzo; Bertuzzo, Enrico; Finger, Flavio; Casagrandi, Renato; Gatto, Marino; Rinaldo, Andrea

    2015-03-06

    Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. We address the above issue in a formal model comparison framework and provide a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels and coupling mechanisms. Reference is made to records of the recent Haiti cholera epidemics. Our intensive computations and objective model comparisons show that spatially explicit models accounting for spatial connections have better explanatory power than spatially disconnected ones for short-to-intermediate calibration windows, while parsimonious, spatially disconnected models perform better with long training sets. On average, spatially connected models show better predictive ability than disconnected ones. We suggest limits and validity of the various approaches and discuss the pathway towards the development of case-specific predictive tools in the context of emergency management. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Sci-Fri AM: Quality, Safety, and Professional Issues 04: Predicting waiting times in Radiation Oncology using machine learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Ackeem; Herrera, David; Hijal, Tarek

We describe a method for predicting waiting times in radiation oncology. Machine learning is a powerful predictive modelling tool that benefits from large, potentially complex, datasets. The essence of machine learning is to predict future outcomes by learning from previous experience. The patient waiting experience remains one of the most vexing challenges facing healthcare. Waiting time uncertainty can cause patients, who are already sick and in pain, to worry about when they will receive the care they need. In radiation oncology, patients typically experience three types of waiting: (1) waiting at home for their treatment plan to be prepared; (2) waiting in the waiting room for daily radiotherapy; and (3) waiting in the waiting room to see a physician in consultation or follow-up. These waiting periods are difficult for staff to predict and only rough estimates are typically provided, based on personal experience. In the present era of electronic health records, waiting times need not be so uncertain. At our centre, we have incorporated the electronic treatment records of all previously-treated patients into our machine learning model. We found that the Random Forest Regression model provides the best predictions for daily radiotherapy treatment waiting times (type 2). Using this model, we achieved a median residual (actual minus predicted value) of 0.25 minutes and a standard deviation residual of 6.5 minutes. The main features that generated the best fit model (from most to least significant) are: allocated time, median past duration, fraction number and the number of treatment fields.
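The modeling setup described above can be sketched with a random forest regressor over the four named features. The feature names come from the abstract; the simulated data, the relationship between features and waiting time, and the hyperparameters are illustrative assumptions, not the centre's records or tuned model.

```python
# Sketch of a random-forest waiting-time model. Feature order follows
# the abstract; all data here is simulated for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.uniform(10, 30, n),   # allocated time (min)
    rng.uniform(8, 25, n),    # median past duration (min)
    rng.integers(1, 35, n),   # fraction number
    rng.integers(1, 6, n),    # number of treatment fields
])
# Illustrative ground truth: waiting time driven mostly by past duration
y = 0.8 * X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=2.0, size=n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
residuals = y - model.predict(X)
median_residual = float(np.median(residuals))  # analogous to the reported 0.25 min
```

In practice the residuals would be computed on held-out treatments, as the abstract's reported median residual and standard deviation imply.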

  20. Separating predictable and unpredictable work to manage interruptions and promote safe and effective work flow.

    PubMed

    Kowinsky, Amy M; Shovel, Judith; McLaughlin, Maribeth; Vertacnik, Lisa; Greenhouse, Pamela K; Martin, Susan Christie; Minnier, Tamra E

    2012-01-01

    Predictable and unpredictable patient care tasks compete for caregiver time and attention, making it difficult for patient care staff to reliably and consistently meet patient needs. We have piloted a redesigned care model that separates the work of patient care technicians based on task predictability and creates role specificity. This care model shows promise in improving the ability of staff to reliably complete tasks in a more consistent and timely manner.

  1. FINAL REPORT: Mechanistically-Based Field Scale Models of Uranium Biogeochemistry from Upscaling Pore-Scale Experiments and Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Brian D.

    2013-11-04

Biogeochemical reactive transport processes in the subsurface environment are important to many contemporary environmental issues of significance to DOE. Quantification of risks and impacts associated with environmental management options, and design of remediation systems where needed, require that we have at our disposal reliable predictive tools (usually in the form of numerical simulation models). However, it is well known that even the most sophisticated reactive transport models available today have poor predictive power, particularly when applied at the field scale. Although the lack of predictive ability is associated in part with our inability to characterize the subsurface and limitations in computational power, significant advances have been made in both of these areas in recent decades and can be expected to continue. In this research, we examined the upscaling (pore to Darcy and Darcy to field) of the problem of bioremediation via biofilms in porous media. The principle idea was to start with a conceptual description of the bioremediation process at the pore scale, and apply upscaling methods to formally develop the appropriate upscaled model at the so-called Darcy scale. The purpose was to determine (1) what forms the upscaled models would take, and (2) how one might parameterize such upscaled models for applications to bioremediation in the field. We were able to effectively upscale the bioremediation process to explain how the pore-scale phenomena were linked to the field scale. The end product of this research was to produce a set of upscaled models that could be used to help predict field-scale bioremediation. These models were mechanistic, in the sense that they directly incorporated pore-scale information, but upscaled so that only the essential features of the process were needed to predict the effective parameters that appear in the model. In this way, a direct link between the microscale and the field scale was made, but the upscaling process helped inform potential users of the model what kinds of information would be needed to accurately characterize the system.

  2. Thermal sensation prediction by soft computing methodology.

    PubMed

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

Thermal comfort in open urban areas depends strongly on environmental factors, so demands for suitable thermal comfort need to be met during urban planning and design. Thermal comfort can be modeled from climatic parameters and other factors. These factors are variable, changing throughout the year and the day, so an algorithm is needed that predicts thermal comfort from the input variables. The prediction results could be used to plan when urban areas are used. Since this is a highly nonlinear task, this investigation applied a soft computing methodology to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) to forecast physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
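An extreme learning machine is a single-hidden-layer network in which the input weights are drawn at random and fixed, and only the output weights are fit, by least squares. A minimal sketch with the four inputs named in the abstract follows; the data, network size, and target relationship are illustrative assumptions, not the study's measurements.

```python
# Minimal extreme learning machine (ELM) sketch: random fixed hidden
# layer, output weights solved by least squares. All data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))             # temperature, pressure, wind, irradiance
y = X @ np.array([1.5, 0.2, -0.8, 1.0])   # synthetic PET-like target

n_hidden = 50
W = 0.5 * rng.normal(size=(4, n_hidden))  # random input weights (never trained)
b = rng.normal(size=n_hidden)             # random hidden biases
H = np.tanh(X @ W + b)                    # hidden-layer activations

# Only this step is "learning": a linear solve for the output weights
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ beta
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
```

Because training reduces to one linear solve, ELMs are fast to fit, which is part of their appeal for forecasting tasks like the one described.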

  3. Predicting and Managing Lighting and Visibility for Human Operations in Space

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Peacock, Brian

    2003-01-01

Lighting is critical to human visual performance. On Earth this problem is well understood and solutions are well defined and executed. Because the sun rises and sets on average every 45 minutes during Earth orbit, humans working in space must cope with extremely dynamic lighting conditions varying from very low light to severe glare and contrast. For critical operations, it is essential that lighting conditions be predictable and manageable. Mission planners need to determine whether low-light video cameras are required or whether additional luminaires, or lamps, need to be flown. Crew and flight directors need to have up-to-date daylight orbit timelines showing the best and worst viewing conditions for sunlight and shadowing. Where applicable and possible, lighting conditions need to be part of crew training. In addition, it is desirable to optimize the quantity and quality of light because of the potential impacts on crew safety, delivery costs, electrical power and equipment maintainability for both exterior and interior conditions. Addressing these issues, an illumination modeling system has been developed in the Space Human Factors Laboratory at NASA Johnson Space Center. The system is the integration of a physically based ray-tracing package ("Radiance"), developed at the Lawrence Berkeley Laboratories, a human factors oriented geometric modeling system developed by NASA, and an extensive database of humans and their work environments. Measured and published data have been collected for exterior and interior surface reflectivity; luminaire beam spread distribution, color and intensity; and video camera light sensitivity, and have been associated with their corresponding geometric models. Selecting an eye point and one or more light sources, including sun and earthshine, a snapshot of the light energy reaching the surfaces or reaching the eye point is computed. This energy map is then used to extract the information needed for useful predictions. Using a validated, comprehensive illumination model integrated with empirically derived data, predictions of lighting and viewing conditions have been successfully used for Shuttle and Space Station planning and assembly operations. It has successfully balanced the needs for adequate human performance with the utilization of resources. Keywords: modeling, ray tracing, luminaires, reflectivity, luminance, illuminance.

  4. Predicting acute pain after cesarean delivery using three simple questions.

    PubMed

    Pan, Peter H; Tonidandel, Ashley M; Aschenbrenner, Carol A; Houle, Timothy T; Harris, Lynne C; Eisenach, James C

    2013-05-01

Interindividual variability in postoperative pain presents a clinical challenge. Preoperative quantitative sensory testing is useful but time consuming in predicting postoperative pain intensity. The current study was conducted to develop and validate a predictive model of acute postcesarean pain using a simple three-item preoperative questionnaire. A total of 200 women scheduled for elective cesarean delivery under subarachnoid anesthesia were enrolled (192 subjects analyzed). Patients were asked to rate the intensity of loudness of audio tones, their level of anxiety and anticipated pain, and analgesic need from surgery. Postoperatively, patients reported the intensity of evoked pain. Regression analysis was performed to generate a predictive model for pain from these measures. A validation cohort of 151 women was enrolled to test the reliability of the model (131 subjects analyzed). Responses from each of the three preoperative questions correlated moderately with 24-h evoked pain intensity (r = 0.24-0.33, P < 0.001). Audio tone rating added uniquely, but minimally, to the model and was not included in the predictive model. The multiple regression analysis yielded a statistically significant model (R = 0.20, P < 0.001), and the validation cohort reliably showed a very similar regression line (R = 0.18). In predicting the upper 20th percentile of evoked pain scores, the optimal cut point was 46.9 (z = 0.24), such that sensitivity of 0.68 and specificity of 0.67 were as balanced as possible. This simple three-item questionnaire is useful to help predict postcesarean evoked pain intensity, and could be applied to further research and clinical application to tailor analgesic therapy to those who need it most.
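The multiple regression step described above can be sketched directly: the three question scores become predictors in an ordinary least-squares fit for evoked pain. The predictor names follow the abstract; the simulated scores, weights, and noise level are illustrative assumptions, not the study's data or fitted coefficients.

```python
# Sketch of a three-predictor multiple regression for evoked pain,
# fit by ordinary least squares on simulated 0-100 question scores.
import numpy as np

rng = np.random.default_rng(0)
n = 192
anxiety = rng.uniform(0, 100, n)           # preoperative anxiety rating
anticipated_pain = rng.uniform(0, 100, n)  # anticipated pain rating
analgesic_need = rng.uniform(0, 100, n)    # anticipated analgesic need
evoked_pain = (0.3 * anxiety + 0.4 * anticipated_pain
               + 0.2 * analgesic_need + rng.normal(scale=15, size=n))

# Design matrix with an intercept column, solved by least squares
X = np.column_stack([np.ones(n), anxiety, anticipated_pain, analgesic_need])
coef, *_ = np.linalg.lstsq(X, evoked_pain, rcond=None)

predicted = X @ coef
r_squared = 1 - (np.sum((evoked_pain - predicted) ** 2)
                 / np.sum((evoked_pain - evoked_pain.mean()) ** 2))
```

As in the study, a model like this would then be re-fit or re-evaluated on an independent validation cohort to check that the regression line is stable.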

  5. Planning, creating and documenting a NASTRAN finite element model of a modern helicopter

    NASA Technical Reports Server (NTRS)

    Gabal, R.; Reed, D.; Ricks, R.; Kesack, W.

    1985-01-01

    Mathematical models based on the finite element method of structural analysis as embodied in the NASTRAN computer code are widely used by the helicopter industry to calculate static internal loads and vibration of airframe structure. The internal loads are routinely used for sizing structural members. The vibration predictions are not yet relied on during design. NASA's Langley Research Center sponsored a program to conduct an application of the finite element method with emphasis on predicting structural vibration. The Army/Boeing CH-47D helicopter was used as the modeling subject. The objective was to engender the needed trust in vibration predictions using these models and establish a body of modeling guides which would enable confident future prediction of airframe vibration as part of the regular design process.

  6. Meteorological Processes Affecting Air Quality – Research and Model Development Needs

    EPA Science Inventory

    Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...

  7. Simulation and Prediction of Warm Season Drought in North America

    NASA Technical Reports Server (NTRS)

    Wang, Hailan; Chang, Yehui; Schubert, Siegfried D.; Koster, Randal D.

    2018-01-01

    This presentation presents our recent work on model simulation and prediction of warm season drought in North America. The emphasis will be on the contribution from the leading modes of subseasonal atmospheric circulation variability, which are often present in the form of stationary Rossby waves. Here we take advantage of the results from observations, reanalyses, and simulations and reforecasts performed using the NASA Goddard Earth Observing System (GEOS-5) atmospheric and coupled General Circulation Model (GCM). Our results show that stationary Rossby waves play a key role in Northern Hemisphere (NH) atmospheric circulation and surface meteorology variability on subseasonal timescales. In particular, such waves have been crucial to the development of recent short-term warm season heat waves and droughts over North America (e.g. the 1988, 1998, and 2012 summer droughts) and northern Eurasia (e.g., the 2003 summer heat wave over Europe and the 2010 summer drought and heat wave over Russia). Through an investigation of the physical processes by which these waves lead to the development of warm season drought in North America, it is further found that these waves can serve as a potential source of drought predictability. In order to properly represent their effect and exploit this source of predictability, a model needs to correctly simulate the Northern Hemisphere (NH) mean jet streams and be able to predict the sources of these waves. Given the NASA GEOS-5 AGCM deficiency in simulating the NH jet streams and tropical convection during boreal summer, an approach has been developed to artificially remove much of model mean biases, which leads to considerable improvement in model simulation and prediction of stationary Rossby waves and drought development in North America. 
Our study points to the need to identify key model biases that limit model simulation and prediction of regional climate extremes, and to diagnose the origin of these biases so as to inform modeling groups for model improvement.

  8. Wanting, having, and needing: integrating motive disposition theory and self-determination theory.

    PubMed

    Sheldon, Kennon M; Schüler, Julia

    2011-11-01

    Four studies explored the motivational and experiential dynamics of psychological needs, applying both self-determination theory and motive disposition theory. In all 4 studies, motive dispositions toward achievement and affiliation ("wanting" particular experiences) predicted corresponding feelings of competence and relatedness ("having" those experiences). Competence and relatedness in turn predicted well-being, again indicating that these 2 experiences may really be "needed." Illuminating how wanting gets to having, in Studies 2 and 3, participants reported greater self-concordance for motive-congruent goals, which, in longitudinal Study 3, predicted greater attainment of those goals and thus enhanced well-being. Study 4 replicated selected earlier results using an implicit as well as an explicit motive disposition measure. Supporting the presumed universality of competence and relatedness needs, in no studies did motive dispositions moderate the effects of corresponding need-satisfaction on well-being. Discussion focuses on a "sequential process" model of psychological needs that views needs as both motives that instigate and outcomes that reward behavior.

  9. Joint genome-wide prediction in several populations accounting for randomness of genotypes: A hierarchical Bayes approach. I: Multivariate Gaussian priors for marker effects and derivation of the joint probability mass function of genotypes.

    PubMed

    Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A

    2017-03-21

    It is important to consider heterogeneity of marker effects and allelic frequencies in across-population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook randomness of genotypes. In this study, a family of hierarchical Bayesian models was developed to perform across-population genome-wide prediction, modeling genotypes as random variables and allowing population-specific effects for each marker. Models shared a common structure and differed in the priors used and the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that made it possible not only to account for heterogeneity of allelic frequencies, but also to include individuals with missing genotypes at some or all loci without the need for previous imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes, was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction; the use of information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies to assess the performance of the models proposed here and also to compare them with conventional models used in genome-wide prediction are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
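
A drastically reduced sketch of the genotype probability mass function idea: assuming Hardy-Weinberg proportions at a single biallelic locus and ignoring the pedigree conditioning the authors actually derive (so this illustrates the concept, not their model):

```python
from math import comb

def genotype_pmf(g, p):
    """P(genotype carries g copies of the reference allele | allele
    frequency p), g in {0, 1, 2}, under Hardy-Weinberg equilibrium."""
    return comb(2, g) * p**g * (1 - p)**(2 - g)

# A missing genotype can then be treated as an unknown model parameter with
# this prior, rather than being imputed beforehand.
probs = [genotype_pmf(g, 0.3) for g in (0, 1, 2)]
```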

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, Michael L.

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  11. Spatial Statistical Network Models for Stream and River Temperature in the Chesapeake Bay Watershed, USA

    EPA Science Inventory

    Regional temperature models are needed for characterizing and mapping stream thermal regimes, establishing reference conditions, predicting future impacts and identifying critical thermal refugia. Spatial statistical models have been developed to improve regression modeling techn...

  12. Effect of Interfacial Turbulence and Accommodation Coefficient on CFD Predictions of Pressurization and Pressure Control in Cryogenic Storage Tank

    NASA Technical Reports Server (NTRS)

    Kassemi, Mohammad; Kartuzova, Olga; Hylton, Sonya

    2015-01-01

    Laminar models agree closely with the pressure evolution and vapor phase temperature stratification but under-predict liquid temperatures. Turbulent SST k-ω and k-ε models under-predict the pressurization rate and extent of stratification in the vapor but represent liquid temperature distributions fairly well. These conclusions seem to apply equally to large cryogenic tank simulations as well as small-scale simulant fluid pressurization cases. Appropriate turbulence models that represent both interfacial and bulk vapor phase turbulence with greater fidelity are needed. Application of LES models to the tank pressurization problem can serve as a starting point.

  13. What do we need to measure, how much, and where? A quantitative assessment of terrestrial data needs across North American biomes through data-model fusion and sampling optimization

    NASA Astrophysics Data System (ADS)

    Dietze, M. C.; Davidson, C. D.; Desai, A. R.; Feng, X.; Kelly, R.; Kooper, R.; LeBauer, D. S.; Mantooth, J.; McHenry, K.; Serbin, S. P.; Wang, D.

    2012-12-01

    Ecosystem models are designed to synthesize our current understanding of how ecosystems function and to predict responses to novel conditions, such as climate change. Reducing uncertainties in such models can thus improve both basic scientific understanding and our predictive capacity, but rarely have the models themselves been employed in the design of field campaigns. In the first part of this paper we provide a synthesis of uncertainty analyses conducted using the Predictive Ecosystem Analyzer (PEcAn) ecoinformatics workflow on the Ecosystem Demography model v2 (ED2). This work spans a number of projects synthesizing trait databases and using Bayesian data assimilation techniques to incorporate field data across temperate forests, grasslands, agriculture, short rotation forestry, boreal forests, and tundra. We report on a number of data needs that span a diverse array of biomes, such as the need for better constraint on growth respiration. We also identify other data needs that are biome specific, such as reproductive allocation in tundra, leaf dark respiration in forestry and early-successional trees, and root allocation and turnover in mid- and late-successional trees. Future data collection needs to balance the unequal distribution of past measurements across biomes (temperate biased) and processes (aboveground biased) with the sensitivities of different processes. In the second part we present the development of a power analysis and sampling optimization module for the PEcAn system. This module uses the results of variance decomposition analyses to estimate the further reduction in model predictive uncertainty for different sample sizes of different variables. By assigning a cost to each measurement type, we apply basic economic theory to optimize the reduction in model uncertainty for any total expenditure, or to determine the cost required to reduce uncertainty to a given threshold.
Using this system we find that sampling switches among multiple measurement types but favors those with no prior measurements, due to the need to integrate over prior uncertainty in within- and among-site variability. When starting from scratch in a new system, the optimal design favors initial measurements of SLA due to high sensitivity and low cost. The value of many data types, such as photosynthetic response curves, depends strongly on whether one includes initial equipment costs or just per-sample costs. Similarly, sampling at previously measured locations is favored when infrastructure costs are high; otherwise across-site sampling is favored over intensive sampling except when within-site variability strongly dominates.
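
The cost-aware optimization can be caricatured with a greedy allocator. The 1/(n+1) variance-shrinkage rule below is an invented stand-in for PEcAn's actual variance-decomposition update; the point is only the structure of "best uncertainty reduction per unit cost":

```python
def allocate_budget(variances, costs, budget):
    """Greedy sketch: each additional sample of variable i is assumed to
    shrink its variance contribution as V_i / (n_i + 1); repeatedly buy
    the sample with the largest variance reduction per unit cost."""
    n = [0] * len(variances)
    spent = 0.0
    while True:
        best, best_gain = None, 0.0
        for i, (v, c) in enumerate(zip(variances, costs)):
            if spent + c > budget:
                continue  # cannot afford this measurement
            gain = (v / (n[i] + 1) - v / (n[i] + 2)) / c
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            return n  # budget exhausted
        n[best] += 1
        spent += costs[best]

# a high-variance variable dominates sampling until its marginal value drops
plan = allocate_budget([10.0, 1.0], [1.0, 1.0], 3.0)
```

Adding per-variable fixed (equipment) costs to this toy changes the optimum in exactly the way the abstract describes for photosynthetic response curves.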

  14. Predisposing characteristics, enabling resources and need as predictors of utilization and clinical outcomes for veterans receiving mental health services.

    PubMed

    Fasoli, DiJon R; Glickman, Mark E; Eisen, Susan V

    2010-04-01

    Though demand for mental health services (MHS) among US veterans is increasing, MHS utilization per veteran is decreasing. With health and social service needs competing for limited resources, it is important to understand the association between patient factors, MHS utilization, and clinical outcomes. We use a framework based on Andersen's behavioral model of health service utilization to examine predisposing characteristics, enabling resources, and clinical need as predictors of MHS utilization and clinical outcomes. This was a prospective observational study of veterans receiving inpatient or outpatient MHS through Veterans Administration programs. Clinician ratings (Global Assessment of Functioning [GAF]) and self-report assessments (Behavior and Symptom Identification Scale-24) were completed for 421 veterans at enrollment and 3 months later. Linear and logistic regression analyses were conducted to examine: (1) predisposing characteristics, enabling resources, and need as predictors of MHS inpatient, residential, and outpatient utilization and (2) the association between individual characteristics, utilization, and clinical outcomes. Being older, female, having greater clinical need, lack of enabling resources (employment, stable housing, and social support), and easy access to treatment significantly predicted greater MHS utilization at 3-month follow-up. Less clinical need and no inpatient psychiatric hospitalization predicted better GAF and Behavior and Symptom Identification Scale-24 scores. White race and residential treatment also predicted better GAF scores. Neither enabling resources nor number of outpatient mental health visits predicted clinical outcomes. This application of Andersen's behavioral model of health service utilization confirmed associations of some predisposing characteristics, need, and enabling resources with MHS utilization, but only predisposing characteristics, need, and utilization were associated with clinical outcomes.
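
As a generic sketch of the kind of logistic regression analysis reported (the data and predictor values below are toy inventions, not the study's): a binary utilization outcome regressed on standardized "need" and "enabling resources" scores.

```python
from math import exp

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Plain stochastic gradient ascent on the logistic log-likelihood;
    returns [intercept, coefficient_1, coefficient_2, ...]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + exp(-z))
            g = (yi - p) * lr / len(X)
            w[0] += g
            for j, xj in enumerate(xi):
                w[j + 1] += g * xj
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + exp(-z))

# toy rows are (need, enabling resources); 1 = used services
X = [[2.0, -1.0], [1.5, -0.5], [0.2, 0.1], [-1.0, 1.0], [-1.5, 0.5], [1.0, -1.0]]
y = [1, 1, 0, 0, 0, 1]
w = fit_logistic(X, y)
```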

  15. Development of a Response Surface Thermal Model for Orion Mated to the International Space Station

    NASA Technical Reports Server (NTRS)

    Miller, Stephen W.; Meier, Eric J.

    2010-01-01

    A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSE) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer mold line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive powers of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures matching the engineering data within the uncertainty bands, while others had very large errors. While this study does not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.
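
A response surface equation of the kind described is typically a second-order polynomial fit to the DOE runs; a minimal two-factor sketch with synthetic data (the factor names and numbers are invented, not the study's 110-run matrix):

```python
import numpy as np

def fit_rse(x1, x2, y):
    """Least-squares fit of a second-order response surface
    y ~ 1, x1, x2, x1^2, x2^2, x1*x2 (e.g., x1 = yaw, x2 = beta angle)."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict_rse(coef, x1, x2):
    return (coef[0] + coef[1] * x1 + coef[2] * x2
            + coef[3] * x1**2 + coef[4] * x2**2 + coef[5] * x1 * x2)

# synthetic check: recover a known quadratic temperature response (K)
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 40)
x2 = rng.uniform(-1, 1, 40)
y = 250 + 5 * x1 - 3 * x2 + 2 * x1**2
coef = fit_rse(x1, x2, y)
```

With noise-free synthetic data the fit recovers the generating polynomial exactly; real DOE data would leave the residual uncertainty the authors report.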

  16. Predicting cyanobacterial abundance, microcystin, and geosmin in a eutrophic drinking-water reservoir using a 14-year dataset

    USGS Publications Warehouse

    Harris, Ted D.; Graham, Jennifer L.

    2017-01-01

    Cyanobacterial blooms degrade water quality in drinking water supply reservoirs by producing toxic and taste-and-odor-causing secondary metabolites, which ultimately cause public health concerns and lead to increased treatment costs for water utilities. There have been numerous attempts to create models that predict cyanobacteria and their secondary metabolites, most using linear models; however, linear models are limited by assumptions about the data and have had limited success as predictive tools. Thus, lake and reservoir managers need improved modeling techniques that can accurately predict large bloom events that have the highest impact on recreational activities and drinking-water treatment processes. In this study, we compared 12 unique linear and nonlinear regression modeling techniques to predict cyanobacterial abundance and the cyanobacterial secondary metabolites microcystin and geosmin using 14 years of physicochemical water quality data collected from Cheney Reservoir, Kansas. Support vector machine (SVM), random forest (RF), boosted tree (BT), and Cubist modeling techniques were the most predictive of the compared modeling approaches. SVM, RF, and BT modeling techniques were able to successfully predict cyanobacterial abundance, microcystin, and geosmin concentrations <60,000 cells/mL, 2.5 µg/L, and 20 ng/L, respectively. Only Cubist modeling predicted maximum concentrations of cyanobacteria and geosmin; no modeling technique was able to predict maximum microcystin concentrations. Because maximum concentrations are a primary concern for lake and reservoir managers, Cubist modeling may help predict the largest and most noxious concentrations of cyanobacteria and their secondary metabolites.
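
The advantage of nonlinear over linear techniques for bloom-like responses can be illustrated in a few stdlib-only lines; a 3-nearest-neighbour regressor stands in for the far more capable SVM/RF/BT/Cubist methods, and the data are synthetic:

```python
def linreg(xs, ys):
    """Ordinary least-squares line; returns a predictor function."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

def knn(xs, ys, k=3):
    """k-nearest-neighbour regressor; returns a predictor function."""
    def f(x):
        idx = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x))[:k]
        return sum(ys[i] for i in idx) / k
    return f

def rmse(model, xs, ys):
    return (sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5

xs = [i / 2 for i in range(20)]
ys = [1.5 ** x for x in xs]          # sharply nonlinear "bloom" response
lin, near = linreg(xs, ys), knn(xs, ys)
```

On this exponential-like response the straight line misses both the low baseline and the bloom peak, mirroring why linear models struggle with maxima.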

  17. A Case Study Using Modeling and Simulation to Predict Logistics Supply Chain Issues

    NASA Technical Reports Server (NTRS)

    Tucker, David A.

    2007-01-01

    Optimization of critical supply chains to deliver thousands of parts, materials, sub-assemblies, and vehicle structures as needed is vital to the success of the Constellation Program. Thorough analysis needs to be performed on the integrated supply chain processes to plan, source, make, deliver, and return critical items efficiently. Process modeling provides simulation technology-based, predictive solutions for supply chain problems which enable decision makers to reduce costs, accelerate cycle time and improve business performance. For example, United Space Alliance, LLC utilized this approach in late 2006 to build simulation models that recreated shuttle orbiter thruster failures and predicted the potential impact of thruster removals on logistics spare assets. The main objective was the early identification of possible problems in providing thruster spares for the remainder of the Shuttle Flight Manifest. After extensive analysis the model results were used to quantify potential problems and led to improvement actions in the supply chain. Similarly the proper modeling and analysis of Constellation parts, materials, operations, and information flows will help ensure the efficiency of the critical logistics supply chains and the overall success of the program.

  18. Applying multibeam sonar and mathematical modeling for mapping seabed substrate and biota of offshore shallows

    NASA Astrophysics Data System (ADS)

    Herkül, Kristjan; Peterson, Anneliis; Paekivi, Sander

    2017-06-01

    Both basic science and marine spatial planning are in need of high-resolution, spatially continuous data on seabed habitats and biota. As conventional point-wise sampling is unable to cover large spatial extents in high detail, it must be supplemented with remote sensing and modeling in order to fulfill the scientific and management needs. The combined use of in situ sampling, sonar scanning, and mathematical modeling is becoming the main method for mapping both abiotic and biotic seabed features. Further development and testing of the methods in varying locations and environmental settings is essential for moving towards unified and generally accepted methodology. To fill the relevant research gap in the Baltic Sea, we used multibeam sonar and mathematical modeling methods - generalized additive models (GAM) and random forest (RF) - together with underwater video to map seabed substrate and epibenthos of offshore shallows. In addition to testing the general applicability of the proposed complex of techniques, the predictive power of different sonar-based variables and modeling algorithms was tested. Mean depth, followed by mean backscatter, were the most influential variables in most of the models. Generally, mean values of sonar-based variables had higher predictive power than their standard deviations. The predictive accuracy of RF was higher than that of GAM. To conclude, we found the method to be feasible and with predictive accuracy similar to previous studies of sonar-based mapping.

  19. Prediction of community prevalence of human onchocerciasis in the Amazonian onchocerciasis focus: Bayesian approach.

    PubMed Central

    Carabin, Hélène; Escalona, Marisela; Marshall, Clare; Vivas-Martínez, Sarai; Botto, Carlos; Joseph, Lawrence; Basáñez, María-Gloria

    2003-01-01

    OBJECTIVE: To develop a Bayesian hierarchical model for human onchocerciasis with which to explore the factors that influence prevalence of microfilariae in the Amazonian focus of onchocerciasis and predict the probability of any community being at least mesoendemic (>20% prevalence of microfilariae), and thus in need of priority ivermectin treatment. METHODS: Models were developed with data from 732 individuals aged > or =15 years who lived in 29 Yanomami communities along four rivers of the south Venezuelan Orinoco basin. The models' abilities to predict prevalences of microfilariae in communities were compared. The deviance information criterion, Bayesian P-values, and residual values were used to select the best model with an approximate cross-validation procedure. FINDINGS: A three-level model that acknowledged clustering of infection within communities performed best, with host age and sex included at the individual level, a river-dependent altitude effect at the community level, and additional clustering of communities along rivers. This model correctly classified 25/29 (86%) villages with respect to their need for priority ivermectin treatment. CONCLUSION: Bayesian methods are a flexible and useful approach for public health research and control planning. Our model acknowledges the clustering of infection within communities, allows investigation of links between individual- or community-specific characteristics and infection, incorporates additional uncertainty due to missing covariate data, and informs policy decisions by predicting the probability that a new community is at least mesoendemic. PMID:12973640

  20. Prediction of community prevalence of human onchocerciasis in the Amazonian onchocerciasis focus: Bayesian approach.

    PubMed

    Carabin, Hélène; Escalona, Marisela; Marshall, Clare; Vivas-Martínez, Sarai; Botto, Carlos; Joseph, Lawrence; Basáñez, María-Gloria

    2003-01-01

    To develop a Bayesian hierarchical model for human onchocerciasis with which to explore the factors that influence prevalence of microfilariae in the Amazonian focus of onchocerciasis and predict the probability of any community being at least mesoendemic (>20% prevalence of microfilariae), and thus in need of priority ivermectin treatment. Models were developed with data from 732 individuals aged > or =15 years who lived in 29 Yanomami communities along four rivers of the south Venezuelan Orinoco basin. The models' abilities to predict prevalences of microfilariae in communities were compared. The deviance information criterion, Bayesian P-values, and residual values were used to select the best model with an approximate cross-validation procedure. A three-level model that acknowledged clustering of infection within communities performed best, with host age and sex included at the individual level, a river-dependent altitude effect at the community level, and additional clustering of communities along rivers. This model correctly classified 25/29 (86%) villages with respect to their need for priority ivermectin treatment. Bayesian methods are a flexible and useful approach for public health research and control planning. Our model acknowledges the clustering of infection within communities, allows investigation of links between individual- or community-specific characteristics and infection, incorporates additional uncertainty due to missing covariate data, and informs policy decisions by predicting the probability that a new community is at least mesoendemic.
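
The model's headline output, the probability that a community exceeds 20% prevalence, can be sketched for a single community with a conjugate Beta-binomial calculation. This is a drastic simplification of the authors' three-level hierarchical model; the uniform prior and the helper below are illustrative assumptions only:

```python
from math import comb

def prob_mesoendemic(k, n, threshold=0.2):
    """P(prevalence > threshold | k positives out of n examined) under a
    uniform Beta(1, 1) prior: the posterior is Beta(k+1, n-k+1), and for
    integer parameters its upper tail has a closed binomial form."""
    a = k + 1
    m = n + 1  # = a + b - 1 with b = n - k + 1
    return sum(comb(m, j) * threshold**j * (1 - threshold)**(m - j)
               for j in range(a))

# with no data the answer is just the prior mass above 20% prevalence
prior_only = prob_mesoendemic(0, 0)
```

The paper's three-level model additionally borrows strength across communities and rivers and propagates missing-covariate uncertainty, which this one-community sketch cannot do.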

  1. Comparing aboveground biomass predictions for an uneven-aged pine-dominated stand using local, regional, and national models

    Treesearch

    D.C. Bragg; K.M. McElligott

    2013-01-01

    Sequestration by Arkansas forests removes carbon dioxide from the atmosphere, storing this carbon in biomass that fills a number of critical ecological and socioeconomic functions. We need a better understanding of the contribution of forests to the carbon cycle, including the accurate quantification of tree biomass. Models have long been developed to predict...

  2. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    Treesearch

    Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll

    2016-01-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...

  3. Predicting the spread of sudden oak death in California (2010-2030): epidemic outcomes under no control

    Treesearch

    Ross K. Meentemeyer; Nik Cunniffe; Alex Cook; David M. Rizzo; Chris A. Gilligan

    2010-01-01

    Landscape- to regional-scale models of plant epidemics are direly needed to predict large-scale impacts of disease and assess practicable options for control. While landscape heterogeneity is recognized as a major driver of disease dynamics, epidemiological models are rarely applied to realistic landscape conditions due to computational and data limitations. Here we...

  4. Climate change in grasslands, shrublands, and deserts of the interior American West: a review and needs assessment

    Treesearch

    Deborah M. Finch

    2012-01-01

    Recent research and species distribution modeling predict large changes in the distributions of species and vegetation types in the western interior of the United States in response to climate change. This volume reviews existing climate models that predict species and vegetation changes in the western United States, and it synthesizes knowledge about climate change...

  5. Measurement error and timing of predictor values for multivariable risk prediction models are poorly reported.

    PubMed

    Whittle, Rebecca; Peat, George; Belcher, John; Collins, Gary S; Riley, Richard D

    2018-05-18

    Measurement error in predictor variables may threaten the validity of clinical prediction models. We sought to evaluate the possible extent of the problem. A secondary objective was to examine whether predictors are measured at the intended moment of model use. A systematic search of Medline was used to identify a sample of articles reporting the development of a clinical prediction model published in 2015. After screening according to predefined inclusion criteria, information on predictors, strategies to control for measurement error and intended moment of model use were extracted. Susceptibility to measurement error for each predictor was classified into low and high risk. Thirty-three studies were reviewed, including 151 different predictors in the final prediction models. Fifty-one (33.7%) predictors were categorised as high risk of error; however, this was not accounted for in the model development. Only 8 (24.2%) studies explicitly stated the intended moment of model use and when the predictors were measured. Reporting of measurement error and intended moment of model use is poor in prediction model studies. There is a need to identify circumstances where ignoring measurement error in prediction models is consequential and whether accounting for the error will improve the predictions. Copyright © 2018. Published by Elsevier Inc.
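
Why predictor measurement error matters can be shown with a small simulation (not from the paper): classical error in a predictor attenuates its estimated slope by roughly var(x) / (var(x) + var(error)), degrading any model built on it.

```python
import random

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

rng = random.Random(42)
true_x = [rng.gauss(0, 1) for _ in range(20000)]
y = [2.0 * x for x in true_x]                    # true slope is 2
noisy_x = [x + rng.gauss(0, 1) for x in true_x]  # error variance equals var(x)
# the slope on noisy_x is attenuated toward 2 * 1 / (1 + 1) = 1
```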

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamecnik, J. R.; Edwards, T. B.

    The conversions of nitrite to nitrate, the destruction of glycolate, and the conversion of glycolate to formate and oxalate were modeled for the Nitric-Glycolic flowsheet using data from Chemical Process Cell (CPC) simulant runs conducted by SRNL from 2011 to 2015. The goal of this work was to develop empirical correlations for these variables versus measurable variables from the chemical process so that these quantities could be predicted a priori from the sludge composition and measurable processing variables. The need for these predictions arises from the need to predict the REDuction/OXidation (REDOX) state of the glass from the Defense Waste Processing Facility (DWPF) melter. This report summarizes the initial work on these correlations based on the aforementioned data. Further refinement of the models as additional data are collected is recommended.

  7. Accurate prediction of energy expenditure using a shoe-based activity monitor.

    PubMed

    Sazonova, Nadezhda; Browning, Raymond C; Sazonov, Edward

    2011-07-01

    The aim of this study was to develop and validate a method for predicting energy expenditure (EE) using a footwear-based system with integrated accelerometer and pressure sensors. We developed a footwear-based device with an embedded accelerometer and insole pressure sensors for the prediction of EE. The data from the device can be used to perform accurate recognition of major postures and activities and to estimate EE using the acceleration, pressure, and posture/activity classification information in a branched algorithm without the need for individual calibration. We measured EE via indirect calorimetry as 16 adults (body mass index = 19-39 kg·m⁻²) performed various low- to moderate-intensity activities and compared measured versus predicted EE using several models based on the acceleration and pressure signals. Inclusion of pressure data resulted in better accuracy of EE prediction during static postures such as sitting and standing. The activity-based branched model that included predictors from accelerometer and pressure sensors (BACC-PS) achieved the lowest error (e.g., root mean squared error (RMSE)=0.69 METs) compared with the accelerometer-only-based branched model BACC (RMSE=0.77 METs) and nonbranched model (RMSE=0.94-0.99 METs). Comparison of EE prediction models using data from both legs versus models using data from a single leg indicates that only one shoe needs to be equipped with sensors. These results suggest that foot acceleration combined with insole pressure measurement, when used in an activity-specific branched model, can accurately estimate the EE associated with common daily postures and activities. The accuracy and unobtrusiveness of a footwear-based device may make it an effective physical activity monitoring tool.
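
The "branched algorithm" idea, classify posture/activity first and then apply an activity-specific EE equation, can be sketched as follows; the features, thresholds, and coefficients are all invented for illustration:

```python
def classify(accel_var, pressure_ratio):
    """Toy posture/activity classifier from two invented features:
    acceleration variance and forefoot-to-total pressure ratio."""
    if accel_var < 0.01:
        return "sit" if pressure_ratio < 0.5 else "stand"
    return "walk"

# per-activity linear equations: METs = a0 + a1*accel_var + a2*pressure_ratio
MODELS = {
    "sit":   (1.0,  0.0, 0.3),
    "stand": (1.2,  0.0, 0.5),
    "walk":  (2.0, 40.0, 0.5),
}

def predict_mets(accel_var, pressure_ratio):
    """Branch on the classified activity, then apply that branch's equation."""
    a0, a1, a2 = MODELS[classify(accel_var, pressure_ratio)]
    return a0 + a1 * accel_var + a2 * pressure_ratio
```

Branching lets each regression stay simple while the ensemble captures the very different acceleration-to-EE relationships of static and dynamic activities.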

  8. Stem mortality in surface fires: Part II, experimental methods for characterizing the thermal response of tree stems to heating by fires

    Treesearch

    D. M. Jimenez; B. W. Butler; J. Reardon

    2003-01-01

    Current methods for predicting fire-induced plant mortality in shrubs and trees are largely empirical. These methods are not readily linked to duff burning, soil heating, and surface fire behavior models. In response to the need for a physics-based model of this process, a detailed model for predicting the temperature distribution through a tree stem as a function of...

  9. Construction and evaluation of FiND, a fall risk prediction model of inpatients from nursing data.

    PubMed

    Yokota, Shinichiroh; Ohe, Kazuhiko

    2016-04-01

    To construct and evaluate an easy-to-use fall risk prediction model based on the daily condition of inpatients, using secondary-use electronic medical record system data. The present authors scrutinized electronic medical record system data and created a dataset for analysis by including inpatient fall report data and Intensity of Nursing Care Needs data. The authors divided the analysis dataset into training data and testing data, then constructed the fall risk prediction model FiND from the training data, and tested the model using the testing data. The dataset for analysis contained 1,230,604 records from 46,241 patients. The sensitivity of the model constructed from the training data was 71.3% and the specificity was 66.0%. The verification result from the testing dataset was almost equivalent to the theoretical value. Although the model's accuracy did not surpass that of models developed in previous research, the authors believe FiND will be useful in medical institutions all over Japan because it is composed of few variables (only age, sex, and the Intensity of Nursing Care Needs items), and the accuracy for unknown data was clear. © 2016 Japan Academy of Nursing Science.
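
The sensitivity and specificity figures reported above come straight from a confusion matrix; a minimal helper with toy labels (not the study's data):

```python
def sens_spec(y_true, y_pred):
    """Return (sensitivity, specificity) for binary 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

truth = [1, 1, 1, 0, 0, 0, 0, 1]  # 1 = patient actually fell (toy data)
pred  = [1, 0, 1, 0, 0, 1, 0, 1]  # 1 = model flagged as at risk
sensitivity, specificity = sens_spec(truth, pred)
```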

  10. The Antecedents of Coaches' Interpersonal Behaviors: The Role of the Coaching Context, Coaches' Psychological Needs, and Coaches' Motivation.

    PubMed

    Rocchi, Meredith; Pelletier, Luc G

    2017-10-01

    This study explored how the coaching context influences coaches' psychological needs, motivation, and reported interpersonal behaviors, using self-determination theory. In Study 1, 56 coaches identified how contextual factors influence their coaching experience. Coaches identified administration, athlete motivation, colleagues, parents, professional development, time, and work-life as having the largest impact on them. In Study 2, 424 coaches reported on their perceptions of the factors identified in Study 1 and their psychological needs, motivation, and interpersonal behaviors. Structural equation modeling analyses suggested perceptions of the coaching context supported or thwarted their psychological needs, which positively or negatively predicted their autonomous and controlled motivation. Coaches' autonomous motivation predicted their reported supportive interpersonal behaviors and controlled motivation predicted thwarting behaviors. Overall, the results provided additional support for understanding how the coaching context, coaches' psychological needs, and their motivation for coaching relate to their coaching behaviors.

  11. Near infrared spectroscopy to estimate the temperature reached on burned soils: strategies to develop robust models.

    NASA Astrophysics Data System (ADS)

    Guerrero, César; Pedrosa, Elisabete T.; Pérez-Bejarano, Andrea; Keizer, Jan Jacob

    2014-05-01

    The temperature reached on soils is an important parameter needed to describe wildfire effects. However, methods for measuring the temperature reached on burned soils are poorly developed. Recently, the use of near-infrared (NIR) spectroscopy has been proposed as a valuable tool for this purpose. The NIR spectrum of a soil sample contains information on the organic matter (quantity and quality), clay (quantity and quality), minerals (such as carbonates and iron oxides) and water contents. Some of these components are modified by the heat, and each temperature causes a group of changes, leaving a typical fingerprint on the NIR spectrum. This technique needs the use of a model (or calibration) where the changes in the NIR spectra are related to the temperature reached. For the development of the model, several aliquots are heated at known temperatures, and used as standards in the calibration set. This model offers the possibility of estimating the temperature reached on a burned sample from its NIR spectrum. However, the estimation of the temperature reached using NIR spectroscopy is due to changes in several components, and cannot be attributed to changes in a single soil component. Thus, we can estimate the temperature reached through the interaction between temperature and the thermo-sensitive soil components. In addition, we cannot expect a uniform distribution of these components, even at small scale. Consequently, the proportion of these soil components can vary spatially across the site. This variation will be present in the samples used to construct the model and also in the samples affected by the wildfire. Therefore, the strategies followed to develop robust models should focus on managing this expected variation. In this work we compared the prediction accuracy of models constructed with different approaches.
These approaches were designed to provide insight into how to distribute the effort needed to develop robust models, since this step is the bottleneck of the technique. In the first approach, a plot-scale model was used to predict the temperature reached in samples collected in other plots from the same site. In a plot-scale model, all the heated aliquots come from a single plot-scale sample. As expected, the results obtained with this approach were disappointing, because it assumed that a plot-scale model would be enough to represent the whole variability of the site. The accuracy (measured as the root mean square error of prediction, hereinafter RMSEP) was 86°C, and the bias was also high (>30°C). In the second approach, the temperatures predicted by several plot-scale models were averaged. Accuracy improved (RMSEP = 65°C) with respect to the first approach, because the variability from several plots was considered and biased predictions were partially counterbalanced. However, this approach requires more effort, since several plot-scale models are needed. In the third approach, the predictions were obtained with site-scale models, constructed with aliquots from several plots. In this case the results were accurate: the RMSEP was around 40°C, the bias was very small (<1°C) and the R2 was 0.92. As expected, this approach clearly outperformed the second, despite requiring the same effort. In a plot-scale model, only one interaction between temperature and soil components is modelled, whereas several different interactions are present in the calibration matrix of a site-scale model. Consequently, the site-scale models were able to model the temperature reached while excluding the influence of differences in soil composition, making them more robust to that variation. 
In summary, the results highlight the importance of an adequate strategy for developing robust and accurate models with moderate effort, and how a wrong strategy can result in misleading predictions.
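
    The plot-scale and site-scale comparisons in this record hinge on two summary statistics, RMSEP and bias. A minimal sketch of how they are computed (the aliquot temperatures below are hypothetical placeholders, not the study's data):

```python
def rmsep(predicted, observed):
    """Root mean square error of prediction (°C)."""
    n = len(predicted)
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n) ** 0.5

def bias(predicted, observed):
    """Mean signed error (°C): positive means systematic over-prediction."""
    return sum(p - o for p, o in zip(predicted, observed)) / len(predicted)

# Hypothetical temperatures (°C) for aliquots heated to known values
observed  = [25, 100, 200, 300, 400, 500]
predicted = [60, 130, 250, 330, 420, 560]

print(round(rmsep(predicted, observed), 1))   # → 39.8
print(round(bias(predicted, observed), 1))    # → 37.5
```

    A model like the first approach's (RMSEP 86°C, bias >30°C) would show both a large spread and a large systematic offset in these two numbers.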

  12. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    NASA Astrophysics Data System (ADS)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for a simpler and more efficient model with better prediction capability. In this paper, an evolutionary framework is proposed that uses an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange rate data sets (USD/CAD, USD/CHF, and USD/JPY) accumulated over the same period of time. The model's performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and particle swarm optimization. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rate prediction compared to the other models included in the study.
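
    The abstract does not give the ISFL update rules, but the classic shuffled frog leaping algorithm it improves on can be sketched as follows: frogs (candidate solutions) are sorted and dealt into memeplexes; each memeplex's worst frog leaps toward its local best, falls back to leaping toward the global best, and is randomly reset if neither leap helps. The sphere objective and all parameters here are illustrative placeholders, not the paper's CEFLANN training setup:

```python
import random

def sfla_minimize(f, dim=2, frogs=20, memeplexes=4, iters=50, seed=0):
    """Minimal shuffled frog leaping sketch for minimizing f."""
    rng = random.Random(seed)
    lo, hi = -2.0, 2.0
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(frogs)]
    for _ in range(iters):
        pop.sort(key=f)                                   # best frog first
        global_best = pop[0]
        plexes = [pop[i::memeplexes] for i in range(memeplexes)]  # round-robin deal
        for plex in plexes:
            plex.sort(key=f)
            best, worst = plex[0], plex[-1]
            # leap toward the memeplex best
            cand = [w + rng.random() * (b - w) for b, w in zip(best, worst)]
            if f(cand) >= f(worst):
                # failed: leap toward the global best instead
                cand = [w + rng.random() * (b - w) for b, w in zip(global_best, worst)]
            if f(cand) >= f(worst):
                # still no improvement: replace with a random frog
                cand = [rng.uniform(lo, hi) for _ in range(dim)]
            plex[-1] = cand
        pop = [frog for plex in plexes for frog in plex]  # shuffle memeplexes back
    return min(pop, key=f)

sphere = lambda x: sum(v * v for v in x)   # stand-in objective
best = sfla_minimize(sphere)
```

    In the paper's framework, f would instead be the prediction error of a CEFLANN network whose weights are encoded in each frog.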

  13. Multi-Temporal Decomposed Wind and Load Power Models for Electric Energy Systems

    NASA Astrophysics Data System (ADS)

    Abdel-Karim, Noha

    This thesis is motivated by the recognition that sources of uncertainties in electric power systems are multifold and may have potentially far-reaching effects. In the past, only system load forecast was considered to be the main challenge. More recently, however, the uncertain price of electricity and hard-to-predict power produced by renewable resources, such as wind and solar, are making the operating and planning environment much more challenging. The near-real-time power imbalances are compensated by means of frequency regulation and generally require fast-responding costly resources. Because of this, a more accurate forecast and look-ahead scheduling would result in a reduced need for expensive power balancing. Similarly, long-term planning and seasonal maintenance need to take into account long-term demand forecast as well as how the short-term generation scheduling is done. The better the demand forecast, the more efficient planning will be as well. Moreover, computer algorithms for scheduling and planning are essential in helping the system operators decide what to schedule and planners what to build. This is needed given the overall complexity created by different abilities to adjust the power output of generation technologies, demand uncertainties and by the network delivery constraints. Given the growing presence of major uncertainties, it is likely that the main control applications will use more probabilistic approaches. Today's predominantly deterministic methods will be replaced by methods which account for key uncertainties as decisions are made. It is well-understood that although demand and wind power cannot be predicted at very high accuracy, taking into consideration predictions and scheduling in a look-ahead way over several time horizons generally results in more efficient and reliable utilization, than when decisions are made assuming deterministic, often worst-case scenarios. 
This change in approach is ultimately going to require new electricity market rules capable of providing the right incentives to manage uncertainties and of differentiating various technologies according to the rate at which they can respond to ever-changing conditions. Given the overall need for modeling uncertainties in electric energy systems, we consider in this thesis the problem of multi-temporal modeling of wind and demand power in particular. Historic data are used to derive prediction models for several future time horizons. The short-term prediction models derived can be used for look-ahead economic dispatch and unit commitment, while the long-term annual predictive models can be used for investment planning. As expected, the accuracy of such predictive models depends on the time horizons over which the predictions are made, as well as on the nature of the uncertain signals. It is shown that predictive models obtained using the same general modeling approaches result in different accuracy for wind than for demand power. In what follows, we introduce several models which have qualitatively different patterns, ranging from hourly to annual. We first transform historic time-stamped data into a Fourier Transform (FT) representation. The frequency-domain data representation is used to decompose the wind and load power signals and to derive predictive models relevant for short-term and long-term predictions using spectral techniques. The short-term results are interpreted next as a Linear Prediction Coding (LPC) model and their accuracy is analyzed. Next, a new Markov-Based Sensitivity Model (MBSM) for short-term prediction is proposed, and the dispatch costs of uncertainties under different predictive models are developed and compared. Moreover, the Discrete Markov Process (DMP) representation is applied to help assess the probabilities of the most likely short-, medium- and long-term states and the related multi-temporal risks. 
In addition, this thesis discusses operational impacts of wind power integration in different scenario levels by performing more than 9,000 AC Optimal Power Flow runs. The effects of both wind and load variations on system constraints and costs are presented. The limitations of DC Optimal Power Flow (DCOPF) vs. ACOPF are emphasized by means of system convergence problems due to the effect of wind power on changing line flows and net power injections. By studying the effect of having wind power on line flows, we found that the divergence problem applies in areas with high wind and hydro generation capacity share (cheap generations). (Abstract shortened by UMI.).
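
    The Fourier-based decomposition step described in this record can be illustrated with a naive DFT (adequate for a short illustrative series): transform a load signal, find the dominant spectral peak, and read off its period. The hourly load series below is synthetic (a pure 24-hour cycle), not the thesis data:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform, O(n^2): fine for short series."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Synthetic hourly load: 100 MW base plus a 20 MW daily (24 h) cycle
n = 96
load = [100 + 20 * math.sin(2 * math.pi * t / 24) for t in range(n)]

spectrum = dft(load)
# Magnitudes at non-zero frequency bins; bin k corresponds to a period of n/k hours
mags = [abs(c) for c in spectrum[1:n // 2]]
dominant_bin = mags.index(max(mags)) + 1
print(n / dominant_bin)   # dominant period in hours → 24.0
```

    In the thesis's setting, the low-frequency components recovered this way would feed the long-term (seasonal/annual) models, while the residual drives the short-term LPC-style predictors.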

  14. Simple, empirical approach to predict neutron capture cross sections from nuclear masses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couture, Aaron Joseph; Casten, Richard F.; Cakirli, R. B.

    Here, neutron capture cross sections are essential to understanding the astrophysical s and r processes, the modeling of nuclear reactor design and performance, and for a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly differing by an order of magnitude a few nucleons from the last measurement.

  15. Simple, empirical approach to predict neutron capture cross sections from nuclear masses

    DOE PAGES

    Couture, Aaron Joseph; Casten, Richard F.; Cakirli, R. B.

    2017-12-20

    Here, neutron capture cross sections are essential to understanding the astrophysical s and r processes, the modeling of nuclear reactor design and performance, and for a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly differing by an order of magnitude a few nucleons from the last measurement.

  16. A systematic review of models to predict recruitment to multicentre clinical trials

    PubMed Central

    2010-01-01

    Background Less than one third of publicly funded trials managed to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models which might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as usability, being based on experience, the ability to reflect time trends, accounting for centre recruitment, and contribution to a commissioning decision. Methods A systematic review of English language articles using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enrol, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Results Out of 326 identified abstracts, only 8 met all the inclusion criteria. The 8 studies examined discussed five major classes of model: the unconditional model, the conditional model, the Poisson model, Bayesian models and Monte Carlo simulation of Markov models. None of these meets all the pre-identified needs of the funder. Conclusions To meet the needs of a number of research programmes, a new model is required as a matter of importance. Any model chosen should be validated against both retrospective and prospective data, to ensure the predictions it gives are superior to those currently used. PMID:20604946
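
    Of the model classes listed in this record, the Poisson model is the simplest to sketch: if centres recruit independently at constant rates, total accrual is a Poisson process with the summed rate, so the expected completion time follows directly, and a small Monte Carlo simulation can check it. The centre rates below are hypothetical:

```python
import random

def expected_completion_time(target, centre_rates):
    """Poisson accrual model: with centres recruiting independently at
    constant rates (patients/month), total accrual is Poisson with
    rate = sum of centre rates, so expected time = target / total rate."""
    return target / sum(centre_rates)

def simulate_completion_time(target, centre_rates, seed=0):
    """Monte Carlo check: sample exponential inter-arrival times."""
    rng = random.Random(seed)
    rate = sum(centre_rates)
    t, recruited = 0.0, 0
    while recruited < target:
        t += rng.expovariate(rate)   # time to the next recruit
        recruited += 1
    return t

rates = [2.0, 3.5, 1.5, 3.0]                  # hypothetical patients/month per centre
print(expected_completion_time(500, rates))   # → 50.0 months
sim = simulate_completion_time(500, rates)
```

    The review's point is that this constant-rate assumption ignores time trends and staggered centre start-up, which is exactly why none of the five classes met all of the funder's needs.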

  17. College Students' Drinking and Posting About Alcohol: Forwarding a Model of Motivations, Behaviors, and Consequences.

    PubMed

    Thompson, Charee M; Romo, Lynsey K

    2016-06-01

    College drinking continues to be a public health problem that has been exacerbated by alcohol-related posts on social networking sites (SNSs). Although existing research has linked alcohol consumption, alcohol posts, and adverse consequences to one another, comprehensive explanations for these associations have been largely unexplored. Thus, we reasoned that students' personal motivations (i.e., espousing an alcohol identity, needing entertainment, and adhering to social norms) influence their behaviors (i.e., alcohol consumption and alcohol-related posting on SNSs), which can lead to alcohol problems. Using structural equation modeling, we analyzed data from 364 undergraduate students and found general support for our model. In particular, espousing an alcohol identity predicted alcohol consumption and alcohol-related SNS posting, needing entertainment predicted alcohol consumption but not alcohol-related SNS posting, and adhering to social norms predicted alcohol-related SNS posting but not alcohol consumption. In turn, alcohol consumption and alcohol-related SNS posting predicted alcohol problems. It is surprising that alcohol-related SNS posting was a stronger predictor of alcohol problems than alcohol consumption. We discuss the findings and their applied implications for college student health.

  18. Multiaxial and Thermomechanical Fatigue of Materials: A Historical Perspective and Some Future Challenges

    NASA Technical Reports Server (NTRS)

    Kalluri, Sreeramesh

    2013-01-01

    Structural materials used in engineering applications are routinely subjected to repetitive mechanical loads in multiple directions under non-isothermal conditions. Over the past few decades, several multiaxial fatigue life estimation models (stress- and strain-based) have been developed for isothermal conditions. Historically, numerous fatigue life prediction models have also been developed for thermomechanical fatigue (TMF), predominantly for uniaxial mechanical loading conditions. Realistic structural components encounter multiaxial loads and non-isothermal loading conditions, which increase the potential for interaction of damage modes. A need exists for mechanical testing and for the development and verification of life prediction models under such conditions.

  19. Predicting recycling behaviour: Comparison of a linear regression model and a fuzzy logic model.

    PubMed

    Vesely, Stepan; Klöckner, Christian A; Dohnal, Mirko

    2016-03-01

    In this paper we demonstrate that fuzzy logic can provide a better tool for predicting recycling behaviour than the customarily used linear regression. To show this, we take a set of empirical data on recycling behaviour (N=664), which we randomly divide into two halves. The first half is used to estimate a linear regression model of recycling behaviour, and to develop a fuzzy logic model of recycling behaviour. As the first comparison, the fit of both models to the data included in estimation of the models (N=332) is evaluated. As the second comparison, predictive accuracy of both models for "new" cases (hold-out data not included in building the models, N=332) is assessed. In both cases, the fuzzy logic model significantly outperforms the regression model in terms of fit. To conclude, when accurate predictions of recycling and possibly other environmental behaviours are needed, fuzzy logic modelling seems to be a promising technique. Copyright © 2015 Elsevier Ltd. All rights reserved.
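
    The paper's protocol — fit on one half of the data, then score predictive accuracy on the hold-out half — can be sketched for the linear regression baseline (the fuzzy model is harder to compress into a few lines). The intention/behaviour data below are invented for illustration, not the study's N=664 sample:

```python
def fit_ols(xs, ys):
    """One-predictor least squares: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def mse(model, xs, ys):
    """Mean squared prediction error of (intercept, slope) on data."""
    a, b = model
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Invented data: recycling intention score -> recycling frequency
data = [(1, 0.8), (2, 2.2), (3, 2.9), (4, 4.1), (5, 5.0), (6, 6.2)]
train, holdout = data[:3], data[3:]       # random split in the paper; fixed here
model = fit_ols(*zip(*train))
holdout_error = mse(model, *zip(*holdout))
```

    The study's comparison amounts to computing this hold-out error for both the regression and the fuzzy model and asking which is smaller.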

  20. Demonstrating the improvement of predictive maturity of a computational model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M; Unal, Cetin; Atamturktur, Huriye S

    2010-01-01

    We demonstrate an improvement of predictive capability brought to a non-linear material model using a combination of test data, sensitivity analysis, uncertainty quantification, and calibration. A model that captures increasingly complicated phenomena, such as plasticity, temperature and strain rate effects, is analyzed. Predictive maturity is defined, here, as the accuracy of the model to predict multiple Hopkinson bar experiments. A statistical discrepancy quantifies the systematic disagreement (bias) between measurements and predictions. Our hypothesis is that improving the predictive capability of a model should translate into better agreement between measurements and predictions. This agreement, in turn, should lead to a smaller discrepancy. We have recently proposed to use discrepancy and coverage, that is, the extent to which the physical experiments used for calibration populate the regime of applicability of the model, as the basis to define a Predictive Maturity Index (PMI). It was shown that predictive maturity could be improved when additional physical tests are made available to increase coverage of the regime of applicability. This contribution illustrates how the PMI changes as 'better' physics are implemented in the model. The application is the non-linear Preston-Tonks-Wallace (PTW) strength model applied to Beryllium metal. We demonstrate that our framework tracks the evolution of maturity of the PTW model. Robustness of the PMI with respect to the selection of coefficients needed in its definition is also studied.

  1. Functional neuroimaging of psychotherapeutic processes in anxiety and depression: from mechanisms to predictions.

    PubMed

    Lueken, Ulrike; Hahn, Tim

    2016-01-01

    The review provides an update of functional neuroimaging studies that identify neural processes underlying psychotherapy and predict outcomes following psychotherapeutic treatment in anxiety and depressive disorders. Following current developments in this field, studies were classified as 'mechanistic' or 'predictor' studies (i.e., informing neurobiological models about putative mechanisms versus aiming to provide predictive information). Mechanistic evidence points toward a dual-process model of psychotherapy in anxiety disorders with abnormally increased limbic activation being decreased, while prefrontal activity is increased. Partly overlapping findings are reported for depression, albeit with a stronger focus on prefrontal activation following treatment. No studies directly comparing neural pathways of psychotherapy between anxiety and depression were detected. Consensus is accumulating for an overarching role of the anterior cingulate cortex in modulating treatment response across disorders. When aiming to quantify clinical utility, the need for single-subject predictions is increasingly recognized and predictions based on machine learning approaches show high translational potential. Present findings encourage the search for predictors providing clinically meaningful information for single patients. However, independent validation as a crucial prerequisite for clinical use is still needed. Identifying nonresponders a priori creates the need for alternative treatment options that can be developed based on an improved understanding of those neural mechanisms underlying effective interventions.

  2. Extrapolating cetacean densities to quantitatively assess human impacts on populations in the high seas.

    PubMed

    Mannocci, Laura; Roberts, Jason J; Miller, David L; Halpin, Patrick N

    2017-06-01

    As human activities expand beyond national jurisdictions to the high seas, there is an increasing need to consider anthropogenic impacts to species inhabiting these waters. The current scarcity of scientific observations of cetaceans in the high seas impedes the assessment of population-level impacts of these activities. We developed plausible density estimates to facilitate a quantitative assessment of anthropogenic impacts on cetacean populations in these waters. Our study region extended from a well-surveyed region within the U.S. Exclusive Economic Zone into a large region of the western North Atlantic sparsely surveyed for cetaceans. We modeled densities of 15 cetacean taxa with available line transect survey data and habitat covariates and extrapolated predictions to sparsely surveyed regions. We formulated models to reduce the extent of extrapolation beyond covariate ranges, and constrained them to model simple and generalizable relationships. To evaluate confidence in the predictions, we mapped where predictions were made outside sampled covariate ranges, examined alternate models, and compared predicted densities with maps of sightings from sources that could not be integrated into our models. Confidence levels in model results depended on the taxon and geographic area and highlighted the need for additional surveying in environmentally distinct areas. With application of necessary caution, our density estimates can inform management needs in the high seas, such as the quantification of potential cetacean interactions with military training exercises, shipping, fisheries, and deep-sea mining and be used to delineate areas of special biological significance in international waters. Our approach is generally applicable to other marine taxa and geographic regions for which management will be implemented but data are sparse. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  3. Dissolved oxygen content prediction in crab culture using a hybrid intelligent method

    PubMed Central

    Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang

    2016-01-01

    A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model based on the radial basis function neural networks (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an optimal improved particle swarm optimization (IPSO) is developed. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for achieving nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds. PMID:27270206

  4. Dissolved oxygen content prediction in crab culture using a hybrid intelligent method.

    PubMed

    Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang

    2016-06-08

    A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model based on the radial basis function neural networks (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an optimal improved particle swarm optimization (IPSO) is developed. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for achieving nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds.
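
    In this record the IPSO component plays the role of a hyperparameter search for the LSSVM. The abstract does not specify the improvements, but the standard particle swarm optimizer that IPSO extends can be sketched as follows; the sphere objective stands in for the (much costlier) cross-validated LSSVM error, and all constants are illustrative:

```python
import random

def pso_minimize(f, dim, iters=60, particles=15, seed=1):
    """Minimal particle swarm sketch: each particle tracks its personal
    best and is pulled toward both it and the swarm's global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-3, 3) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration constants
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):  # update personal best
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)        # update global best
    return gbest

sphere = lambda x: sum(v * v for v in x)   # stand-in objective
best = pso_minimize(sphere, dim=2)
```

    In the paper's pipeline each particle would instead encode the LSSVM regularization and kernel parameters, with f returning validation error on the fused sensor data.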

  5. Improving the forecast for biodiversity under climate change.

    PubMed

    Urban, M C; Bocedi, G; Hendry, A P; Mihoub, J-B; Pe'er, G; Singer, A; Bridle, J R; Crozier, L G; De Meester, L; Godsoe, W; Gonzalez, A; Hellmann, J J; Holt, R D; Huth, A; Johst, K; Krug, C B; Leadley, P W; Palmer, S C F; Pantel, J H; Schmitz, A; Zollner, P A; Travis, J M J

    2016-09-09

    New biological models are incorporating the realistic processes underlying biological responses to climate change and other human-caused disturbances. However, these more realistic models require detailed information, which is lacking for most species on Earth. Current monitoring efforts mainly document changes in biodiversity, rather than collecting the mechanistic data needed to predict future changes. We describe and prioritize the biological information needed to inform more realistic projections of species' responses to climate change. We also highlight how trait-based approaches and adaptive modeling can leverage sparse data to make broader predictions. We outline a global effort to collect the data necessary to better understand, anticipate, and reduce the damaging effects of climate change on biodiversity. Copyright © 2016, American Association for the Advancement of Science.

  6. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    PubMed

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. 
For example, random-effects meta-analyses of the performance of the 'Gail 2' model showed the average C statistic was 0.63 (95% CI 0.59-0.67), and the expected/observed ratio of events varied considerably across studies (95% prediction interval for E/O ratio when the model was applied in practice was 0.75-1.19). There is a need for models with better predictive performance but, given the large amount of work already conducted, further improvement of existing models based on conventional risk factors is perhaps unlikely. Research to identify new risk factors with large additionally predictive ability is therefore needed, alongside clearer reporting and continual validation of new models as they develop.
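
    The two performance statistics pooled in this meta-analysis, the C statistic and the expected/observed (E/O) event ratio, can be computed as below. The predicted risks and outcomes are invented for illustration, not from any of the 17 reviewed models:

```python
def c_statistic(risks, outcomes):
    """Concordance: probability that a randomly chosen case was assigned
    a higher predicted risk than a randomly chosen non-case (ties = 0.5)."""
    cases = [r for r, y in zip(risks, outcomes) if y == 1]
    controls = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = concordant = 0
    for c in cases:
        for k in controls:
            pairs += 1
            concordant += 1.0 if c > k else 0.5 if c == k else 0.0
    return concordant / pairs

def eo_ratio(risks, outcomes):
    """Expected/observed events: calibration-in-the-large.
    E/O < 1 means the model under-predicts the event count."""
    return sum(risks) / sum(outcomes)

risks    = [0.1, 0.3, 0.2, 0.6, 0.4, 0.8]   # hypothetical predicted risks
outcomes = [0,   0,   1,   1,   0,   1]     # 1 = developed breast cancer
c = c_statistic(risks, outcomes)
eo = eo_ratio(risks, outcomes)
```

    A C statistic of 0.63, as pooled for the 'Gail 2' model, means only modest discrimination: a case outranks a non-case just 63% of the time, versus 50% for chance.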

  7. Predicting operator workload during system design

    NASA Technical Reports Server (NTRS)

    Aldrich, Theodore B.; Szabo, Sandra M.

    1988-01-01

    A workload prediction methodology was developed in response to the need to measure workloads associated with operation of advanced aircraft. The application of the methodology will involve: (1) conducting mission/task analyses of critical mission segments and assigning estimates of workload for the sensory, cognitive, and psychomotor workload components of each task identified; (2) developing computer-based workload prediction models using the task analysis data; and (3) exercising the computer models to produce predictions of crew workload under varying automation and/or crew configurations. Critical issues include reliability and validity of workload predictors and selection of appropriate criterion measures.

  8. Validation of the Registry to Evaluate Early and Long-Term Pulmonary Arterial Hypertension Disease Management (REVEAL) pulmonary hypertension prediction model in a unique population and utility in the prediction of long-term survival.

    PubMed

    Cogswell, Rebecca; Kobashigawa, Erin; McGlothlin, Dana; Shaw, Robin; De Marco, Teresa

    2012-11-01

    The Registry to Evaluate Early and Long-Term Pulmonary Arterial Hypertension (PAH) Disease Management (REVEAL) model was designed to predict 1-year survival in patients with PAH. Multivariate prediction models need to be evaluated in cohorts distinct from the derivation set to determine external validity. In addition, limited data exist on the utility of this model in the prediction of long-term survival. REVEAL model performance was assessed to predict 1-year and 5-year outcomes, defined as survival or composite survival or freedom from lung transplant, in 140 patients with PAH. The validation cohort had a higher proportion of human immunodeficiency virus (7.9% vs 1.9%, p < 0.0001), methamphetamine use (19.3% vs 4.9%, p < 0.0001), and portal hypertension PAH (16.4% vs 5.1%, p < 0.0001) compared with the development cohort. The C-index of the model to predict survival was 0.765 at 1 year and 0.712 at 5 years of follow-up. The C-index of the model to predict composite survival or freedom from lung transplant was 0.805 and 0.724 at 1 and 5 years of follow-up, respectively. Prediction by the model, however, was weakest among patients with intermediate-risk predicted survival. The REVEAL model had adequate discrimination to predict 1-year survival in this small but clinically distinct validation cohort. Although the model also had predictive ability out to 5 years, prediction was limited among patients of intermediate risk, suggesting our prediction methods can still be improved. Copyright © 2012. Published by Elsevier Inc.

  9. Cellular automata model for use with real freeway data

    DOT National Transportation Integrated Search

    2002-01-01

    The exponential rate of increase in freeway traffic is expanding the need for accurate and : realistic methods to model and predict traffic flow. Traffic modeling and simulation facilitates an : examination of both microscopic and macroscopic views o...
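
    The record does not name the automaton used, but the canonical cellular automaton for freeway traffic is the Nagel-Schreckenberg model. A minimal single-lane sketch on a circular road (all parameters illustrative): each car accelerates toward a speed limit, brakes to the gap ahead, randomly slows, then moves.

```python
import random

def nasch_step(cars, length, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg CA.
    cars: dict position -> velocity, on a circular road of `length` cells."""
    positions = sorted(cars)
    new_cars = {}
    for i, x in enumerate(positions):
        v = min(cars[x] + 1, vmax)                            # 1. accelerate
        gap = (positions[(i + 1) % len(positions)] - x - 1) % length
        v = min(v, gap)                                       # 2. brake to the gap ahead
        if v > 0 and rng.random() < p_slow:
            v -= 1                                            # 3. random slowdown
        new_cars[(x + v) % length] = v                        # 4. move
    return new_cars

rng = random.Random(42)
road_len = 100
cars = {x: 0 for x in range(0, 40, 2)}    # 20 cars parked at density 0.2
for _ in range(50):
    cars = nasch_step(cars, road_len, rng=rng)
```

    Despite these four simple local rules, the model reproduces macroscopic phenomena such as spontaneous jam formation, which is what makes CA models attractive for calibration against real freeway data.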

  10. Habitat features and predictive habitat modeling for the Colorado chipmunk in southern New Mexico

    USGS Publications Warehouse

    Rivieccio, M.; Thompson, B.C.; Gould, W.R.; Boykin, K.G.

    2003-01-01

    Two subspecies of Colorado chipmunk (state threatened and federal species of concern) occur in southern New Mexico: Tamias quadrivittatus australis in the Organ Mountains and T. q. oscuraensis in the Oscura Mountains. We developed a GIS model of potentially suitable habitat based on vegetation and elevation features, evaluated site classifications of the GIS model, and determined vegetation and terrain features associated with chipmunk occurrence. We compared GIS model classifications with actual vegetation and elevation features measured at 37 sites. At 60 sites we measured 18 habitat variables regarding slope, aspect, tree species, shrub species, and ground cover. We used logistic regression to analyze habitat variables associated with chipmunk presence/absence. All (100%) 37 sample sites (28 predicted suitable, 9 predicted unsuitable) were classified correctly by the GIS model regarding elevation and vegetation. For 28 sites predicted suitable by the GIS model, 18 sites (64%) appeared visually suitable based on habitat variables selected from logistic regression analyses, of which 10 sites (36%) were specifically predicted as suitable habitat via logistic regression. We detected chipmunks at 70% of sites deemed suitable via the logistic regression models. Shrub cover, tree density, plant proximity, presence of logs, and presence of rock outcrop were retained in the logistic model for the Oscura Mountains; litter, shrub cover, and grass cover were retained in the logistic model for the Organ Mountains. Evaluation of predictive models illustrates the need for multi-stage analyses to best judge performance. Microhabitat analyses indicate prospective needs for different management strategies between the subspecies. Sensitivities of each population of the Colorado chipmunk to natural and prescribed fire suggest that partial burnings of areas inhabited by Colorado chipmunks in southern New Mexico may be beneficial. 
These partial burnings may later help avoid a fire that could substantially reduce habitat of chipmunks over a mountain range.
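The presence/absence analysis described above can be illustrated with a minimal logistic regression fit. This is a sketch only: the two habitat variables (shrub cover and tree density) come from the abstract, but all site values and the gradient-ascent fitting routine are invented for illustration.

```python
import math

# Toy presence/absence data: [shrub cover %, tree density] -> chipmunk detected?
# All site values are invented, not the study's measurements.
sites = [([60.0, 8.0], 1), ([55.0, 7.0], 1), ([20.0, 2.0], 0),
         ([15.0, 1.0], 0), ([50.0, 6.0], 1), ([25.0, 3.0], 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(data, lr=0.01, epochs=5000):
    """Fit weights and intercept by stochastic gradient ascent on the
    log-likelihood, after standardizing the features."""
    n = len(data[0][0])
    means = [sum(x[j] for x, _ in data) / len(data) for j in range(n)]
    sds = [(sum((x[j] - means[j]) ** 2 for x, _ in data) / len(data)) ** 0.5
           for j in range(n)]
    scale = lambda x: [(x[j] - means[j]) / sds[j] for j in range(n)]
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x_raw, y in data:
            x = scale(x_raw)
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)
            for j in range(n):
                w[j] += lr * (y - p) * x[j]
            b += lr * (y - p)
    return lambda x_raw: sigmoid(
        sum(wj * xj for wj, xj in zip(w, scale(x_raw))) + b)

predict = fit_logistic(sites)
# Probability of presence for a high-cover site vs a sparse site.
print(predict([58.0, 7.5]) > 0.5, predict([18.0, 1.5]) > 0.5)
```

The fitted model classifies a site as suitable when the predicted probability exceeds 0.5, mirroring the study's suitable/unsuitable site predictions.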

  11. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
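The core idea above, propagating only a small deterministic set of sigma points through the EOL simulation instead of many Monte Carlo samples, can be sketched in a few lines. The damage-state function `eol` and all numbers below are hypothetical; the scalar unscented transform shown is the standard 2n + 1 sigma-point form, not the authors' implementation.

```python
import math, random

# Hypothetical nonlinear "EOL simulation": remaining life as a function of
# the current damage state x (more damage -> shorter remaining life).
def eol(x):
    return 100.0 * math.exp(-x)

def unscented_transform(mean, var, f, kappa=2.0):
    """Approximate the mean/variance of f(X) for scalar X ~ (mean, var)
    by propagating only 2n + 1 = 3 sigma points through f."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    points = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)]
    ys = [f(p) for p in points]
    y_mean = sum(w * yv for w, yv in zip(weights, ys))
    y_var = sum(w * (yv - y_mean) ** 2 for w, yv in zip(weights, ys))
    return y_mean, y_var

ut_mean, ut_var = unscented_transform(0.5, 0.01, eol)

# Monte Carlo reference using many more simulations.
random.seed(0)
samples = [eol(random.gauss(0.5, 0.1)) for _ in range(100_000)]
mc_mean = sum(samples) / len(samples)
print(round(ut_mean, 2), round(mc_mean, 2))  # the two should agree closely
```

Three simulations approximate the EOL mean and variance that 100,000 Monte Carlo runs estimate, which is the source of the computational savings the paper describes.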

  12. Prediction of Metabolism of Drugs using Artificial Intelligence: How far have we reached?

    PubMed

    Kumar, Rajnish; Sharma, Anju; Siddiqui, Mohammed Haris; Tiwari, Rajesh Kumar

    2016-01-01

    Information about drug metabolism is an essential component of drug development. Modeling drug metabolism requires identification of the enzymes involved, the rate and extent of metabolism, the sites of metabolism, etc. There have been continuous attempts to predict the metabolism of drugs using artificial intelligence in an effort to reduce the attrition rate of drug candidates entering preclinical and clinical trials. Currently, a number of predictive models for metabolism are available, built using support vector machines, artificial neural networks, Bayesian classifiers, etc. There is an urgent need to review their progress so far and address the existing challenges in the prediction of metabolism. In this attempt, we present the models currently available in the literature and some of the critical issues regarding prediction of drug metabolism.

  13. Surgical Risk Preoperative Assessment System (SURPAS): II. Parsimonious Risk Models for Postoperative Adverse Outcomes Addressing Need for Laboratory Variables and Surgeon Specialty-specific Models.

    PubMed

    Meguid, Robert A; Bronsert, Michael R; Juarez-Colunga, Elizabeth; Hammermeister, Karl E; Henderson, William G

    2016-07-01

    To develop parsimonious prediction models for postoperative mortality, overall morbidity, and 6 complication clusters applicable to a broad range of surgical operations in adult patients. Quantitative risk assessment tools are not routinely used for preoperative patient assessment, shared decision making, informed consent, and preoperative patient optimization, likely due in part to the burden of data collection and the complexity of incorporation into routine surgical practice. Multivariable forward selection stepwise logistic regression analyses were used to develop predictive models for 30-day mortality, overall morbidity, and 6 postoperative complication clusters, using 40 preoperative variables from 2,275,240 surgical cases in the American College of Surgeons National Surgical Quality Improvement Program data set, 2005 to 2012. For the mortality and overall morbidity outcomes, prediction models were compared with and without preoperative laboratory variables, and generic models (based on all of the data from 9 surgical specialties) were compared with specialty-specific models. In each model, the cumulative c-index was used to examine the contribution of each added predictor variable. C-indexes, Hosmer-Lemeshow analyses, and Brier scores were used to compare discrimination and calibration between models. For the mortality and overall morbidity outcomes, the prediction models without the preoperative laboratory variables performed as well as the models with the laboratory variables, and the generic models performed as well as the specialty-specific models. The c-indexes were 0.938 for mortality, 0.810 for overall morbidity, and for the 6 complication clusters ranged from 0.757 for infectious to 0.897 for pulmonary complications. Across the 8 prediction models, the first 7 to 11 variables entered accounted for at least 99% of the c-index of the full model (using up to 28 nonlaboratory predictor variables). 
Our results suggest that it will be possible to develop parsimonious models to predict 8 important postoperative outcomes for a broad surgical population, without the need for surgeon specialty-specific models or inclusion of laboratory variables.
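The parsimony finding above, that a few variables entered first capture nearly all of the full model's discrimination, suggests a generic forward-selection-with-cutoff sketch. The variable names, contribution values, the additive `c_index_proxy`, and the 97% cutoff below are all invented for illustration (the paper's criterion was 99% of the full-model c-index).

```python
def forward_select(variables, score, tol=0.97):
    """Greedy forward selection: at each step add the variable that most
    improves the score, stopping once the model reaches `tol` of the
    score obtained with every variable included."""
    full_score = score(variables)
    chosen, remaining = [], list(variables)
    while remaining and score(chosen) < tol * full_score:
        best = max(remaining, key=lambda v: score(chosen + [v]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Hypothetical additive contribution of each preoperative variable to the
# c-index; names and values are invented for illustration.
contrib = {"ASA class": 0.30, "age": 0.20, "albumin": 0.15,
           "functional status": 0.12, "emergency": 0.10,
           "creatinine": 0.08, "WBC": 0.05}

def c_index_proxy(vars_):
    # Stand-in for refitting the model and computing its c-index.
    return 0.5 + 0.45 * sum(contrib[v] for v in vars_)

selected = forward_select(list(contrib), c_index_proxy)
print(len(selected), selected[0])
```

In this toy setup the procedure stops after six of the seven candidates, dropping the weakest contributor, which is the same mechanism that let the authors discard laboratory variables without losing discrimination.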

  14. Proposals for enhanced health risk assessment and stratification in an integrated care scenario

    PubMed Central

    Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep

    2016-01-01

    Objectives Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. Settings The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Participants Responsible teams for regional data management in the five ACT regions. Primary and secondary outcome measures We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. Results There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. 
Conclusions The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. PMID:27084274

  15. Configuration of the thermal landscape determines thermoregulatory performance of ectotherms

    PubMed Central

    Sears, Michael W.; Angilletta, Michael J.; Schuler, Matthew S.; Borchert, Jason; Dilliplane, Katherine F.; Stegman, Monica; Rusch, Travis W.; Mitchell, William A.

    2016-01-01

    Although most organisms thermoregulate behaviorally, biologists still cannot easily predict whether mobile animals will thermoregulate in natural environments. Current models fail because they ignore how the spatial distribution of thermal resources constrains thermoregulatory performance over space and time. To overcome this limitation, we modeled the spatially explicit movements of animals constrained by access to thermal resources. Our models predict that ectotherms thermoregulate more accurately when thermal resources are dispersed throughout space than when these resources are clumped. This prediction was supported by thermoregulatory behaviors of lizards in outdoor arenas with known distributions of environmental temperatures. Further, simulations showed how the spatial structure of the landscape qualitatively affects responses of animals to climate. Biologists will need spatially explicit models to predict impacts of climate change on local scales. PMID:27601639
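A minimal proxy for the paper's central claim, that dispersed thermal resources support more accurate thermoregulation than clumped ones, is the mean distance an animal must travel to reach a patch. The 1-D lattice and patch layouts below are invented; the actual study used spatially explicit movement simulations.

```python
# For the same total amount of thermal resource, dispersed patches keep
# every point of a 1-D landscape close to a usable temperature, while a
# clumped layout leaves large costly gaps.
def mean_distance_to_patch(patches, size=100):
    return sum(min(abs(cell - p) for p in patches)
               for cell in range(size)) / size

dispersed = set(range(5, 100, 10))   # ten patches spread evenly
clumped = set(range(45, 55))         # the same ten patches in one block
print(mean_distance_to_patch(dispersed), mean_distance_to_patch(clumped))
```

With identical resource totals, the dispersed layout leaves the average cell only a few steps from a patch while the clumped layout leaves it roughly an order of magnitude farther, which is the spatial constraint the movement models formalize.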

  16. Response of hydrology to climate change in the southern Appalachian mountains using Bayesian inference

    Treesearch

    Wei Wu; James S. Clark; James M. Vose

    2012-01-01

    Predicting the long-term consequences of climate change for hydrologic processes has been limited by the need to accommodate uncertainties in the hydrological measurements used for calibration, and to account for uncertainties both in the models that ingest those calibrations and in the climate predictions that serve as the basis for hydrological predictions. We implemented...

  17. Prioritizing CD4 Count Monitoring in Response to ART in Resource-Constrained Settings: A Retrospective Application of Prediction-Based Classification

    PubMed Central

    Liu, Yan; Li, Xiaohong; Johnson, Margaret; Smith, Collette; Kamarulzaman, Adeeba bte; Montaner, Julio; Mounzer, Karam; Saag, Michael; Cahn, Pedro; Cesar, Carina; Krolewiecki, Alejandro; Sanne, Ian; Montaner, Luis J.

    2012-01-01

    Background Global programs of anti-HIV treatment depend on sustained laboratory capacity to assess treatment initiation thresholds and treatment response over time. Currently, there is no valid alternative to CD4 count testing for monitoring immunologic responses to treatment, but laboratory cost and capacity limit access to CD4 testing in resource-constrained settings. Thus, methods to prioritize patients for CD4 count testing could improve treatment monitoring by optimizing resource allocation. Methods and Findings Using a prospective cohort of HIV-infected patients (n = 1,956) monitored upon antiretroviral therapy initiation in seven clinical sites with distinct geographical and socio-economic settings, we retrospectively apply a novel prediction-based classification (PBC) modeling method. The model uses repeatedly measured biomarkers (white blood cell count and lymphocyte percent) to predict CD4+ T cell outcome through first-stage modeling and subsequent classification based on clinically relevant thresholds (CD4+ T cell count of 200 or 350 cells/µl). The algorithm correctly classified 90% (cross-validation estimate = 91.5%, standard deviation [SD] = 4.5%) of CD4 count measurements <200 cells/µl in the first year of follow-up; if laboratory testing is applied only to patients predicted to be below the 200-cells/µl threshold, we estimate a potential savings of 54.3% (SD = 4.2%) in CD4 testing capacity. A capacity savings of 34% (SD = 3.9%) is predicted using a CD4 threshold of 350 cells/µl. Similar results were obtained over the 3 y of follow-up available (n = 619). Limitations include a need for future economic healthcare outcome analysis, a need for assessment of extensibility beyond the 3-y observation time, and the need to assign a false positive threshold. 
Conclusions Our results support the use of PBC modeling as a triage point at the laboratory, lessening the need for laboratory-based CD4+ T cell count testing; implementation of this tool could help optimize the use of laboratory resources, directing CD4 testing towards higher-risk patients. However, further prospective studies and economic analyses are needed to demonstrate that the PBC model can be effectively applied in clinical settings. Please see later in the article for the Editors' Summary PMID:22529752
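The triage logic above can be sketched as a two-stage rule: predict CD4 from cheaper biomarkers, then direct laboratory testing only to patients predicted below the clinical threshold. The linear prediction rule, its 0.30 coefficient, and the patient values below are invented; the study's first-stage model used repeated biomarker measurements.

```python
# First-stage model: a hypothetical linear rule mapping white blood cell
# count (cells/ul) and lymphocyte percent to a predicted CD4 count; the
# 0.30 coefficient (CD4 share of lymphocytes) is illustrative only.
def predict_cd4(wbc, lymph_pct):
    return wbc * (lymph_pct / 100.0) * 0.30

# (wbc, lymph_pct, true_cd4) for an invented patient panel.
patients = [(5200, 30, 480), (3600, 16, 185), (6100, 35, 650),
            (4200, 15, 170), (7000, 40, 820), (3500, 20, 210)]

THRESHOLD = 200  # cells/ul
flagged = [p for p in patients if predict_cd4(p[0], p[1]) < THRESHOLD]
savings = 1 - len(flagged) / len(patients)
print(f"CD4 testing directed to {len(flagged)} of {len(patients)} patients "
      f"({savings:.0%} of testing capacity saved)")
```

The capacity saving is simply the fraction of patients not flagged for laboratory testing, the quantity the study estimates at 54.3% for the 200-cells/µl threshold.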

  18. Is variety a spice of (an active) life?: perceived variety, exercise behavior, and the mediating role of autonomous motivation.

    PubMed

    Sylvester, Benjamin D; Standage, Martyn; Ark, Tavinder K; Sweet, Shane N; Crocker, Peter R; Zumbo, Bruno D; Beauchamp, Mark R

    2014-10-01

    In this study, we examined whether perceived variety in exercise prospectively predicts unique variance in exercise behavior when examined alongside satisfaction of the three basic psychological needs (for competence, relatedness, and autonomy) embedded within self-determination theory (Ryan & Deci, 2002), through the mediating role of autonomous and controlled motivation. A convenience sample of community adults (N = 363) completed online questionnaires twice over a 6-week period. The results of structural equation modeling showed perceived variety and satisfaction of the needs for competence and relatedness to be unique indirect positive predictors of exercise behavior (through autonomous motivation) 6 weeks later. In addition, satisfaction of the need for autonomy was found to negatively predict controlled motivation. Perceived variety in exercise complemented satisfaction of the needs for competence, relatedness, and autonomy in predicting motivation and (indirectly) exercise behavior, and may act as a salient mechanism in the prediction of autonomous motivation and behavior in exercise settings.

  19. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to build a predictive model of the controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models through the application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
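The system-identification step, fitting a low-order dynamic model to step-change data and then inverting it to choose a control move, can be sketched for a first-order model. The model structure, the "true" parameters used to generate data, and the set point below are assumptions for illustration, not the authors' identified model.

```python
# Generate step-response data from a hypothetical first-order process
#   y[k+1] = a*y[k] + b*u[k]
# (u: galactose feed, y: %galactosylation); a_true/b_true are invented.
a_true, b_true = 0.8, 0.5
u = [0.0] * 5 + [2.0] * 10          # serialized step change in the input
y = [10.0]
for k in range(len(u) - 1):
    y.append(a_true * y[k] + b_true * u[k])

# Least-squares identification of (a, b) via the 2x2 normal equations.
syy = sum(v * v for v in y[:-1])
suu = sum(v * v for v in u[:-1])
syu = sum(yv * uv for yv, uv in zip(y[:-1], u[:-1]))
sy1y = sum(y1 * y0 for y1, y0 in zip(y[1:], y[:-1]))
sy1u = sum(y1 * u0 for y1, u0 in zip(y[1:], u[:-1]))
det = syy * suu - syu * syu
a_hat = (sy1y * suu - sy1u * syu) / det
b_hat = (sy1u * syy - sy1y * syu) / det

# One-step control move: pick the input that steers y to the set point.
setpoint = 12.0
u_next = (setpoint - a_hat * y[-1]) / b_hat
print(round(a_hat, 3), round(b_hat, 3), round(u_next, 2))
```

A model predictive controller generalizes this last step by optimizing the input trajectory over a horizon subject to constraints, but the identified step-response model is the prerequisite in both cases.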

  20. The effects of autonomy-supportive coaching, need satisfaction, and self-perceptions on initiative and identity in youth swimmers.

    PubMed

    Coatsworth, J Douglas; Conroy, David E

    2009-03-01

    This study tested a sequential process model linking youth sport coaching climates (perceived coach behaviors and perceived need satisfaction) to youth self-perceptions (perceived competence and global self-esteem) and youth development outcomes (initiative, identity reflection, identity exploration). A sample of 119 youth between the ages of 10 and 18 who participated in a community-directed summer swim league completed questionnaires over the course of the 7-week season. Results indicated that coaches' autonomy support, particularly via process-focused praise, predicted youth competence need satisfaction and relatedness need satisfaction in the coaching relationship. Youth competence need satisfaction predicted self-esteem indirectly via perceived competence. Finally, self-esteem predicted identity reflection, and perceived competence predicted both identity reflection and initiative. Effects of age, sex, and perceptions of direct contact with the coach were not significant. Findings suggest that the quality of the coaching climate is an important predictor of the developmental benefits of sport participation and that one pathway by which the coaching climate has its effect on initiative and identity reflection is through developing youth self-perceptions.

  1. Using ensemble rainfall predictions in a countrywide flood forecasting model in Scotland

    NASA Astrophysics Data System (ADS)

    Cranston, M. D.; Maxey, R.; Tavendale, A. C. W.; Buchanan, P.

    2012-04-01

    Improving flood predictions for all sources of flooding is at the centre of flood risk management policy in Scotland. With the introduction of the Flood Risk Management (Scotland) Act providing a new statutory basis for SEPA's flood warning responsibilities, the pressure to deliver hydrological science developments in support of this legislation has increased. Specifically, flood forecasting capabilities need to develop to support reducing the impact of flooding through the provision of actively disseminated, reliable and timely flood warnings. Flood forecasting in Scotland has developed significantly in recent years (Cranston and Tavendale, 2012). The development of hydrological models to predict flooding at a catchment scale has relied upon the application of rainfall runoff models utilising raingauge, radar and quantitative precipitation forecasts at short lead times (less than 6 hours). Single, deterministic forecasts based on highly uncertain rainfall predictions have led to the greatest operational difficulties when communicating flood risk with emergency responders; the emergence of probability-based estimates therefore offers the greatest opportunity for managing uncertain predictions. This paper presents the operational application of a physical-conceptual distributed hydrological model on a countrywide basis across Scotland. Developed by CEH Wallingford for SEPA in 2011, Grid-to-Grid (G2G) principally runs in deterministic mode and employs radar and raingauge estimates of rainfall together with weather model predictions to produce forecast river flows, as gridded time series at a resolution of 1 km and for up to 5 days ahead (Cranston, et al., 2012). However, the G2G model is now being run operationally using ensemble predictions of rainfall from the MOGREPS-R system to provide probabilistic flood forecasts.
By presenting a range of flood predictions on a national scale through this approach, hydrologists are now able to consider an objective measure of the likelihood of flooding impacts to help with risk based emergency communication.
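In the simplest reading of the approach above, a probabilistic flood forecast from an ensemble is the fraction of members whose forecast flow exceeds a threshold at a grid cell. The member flows and threshold below are invented for illustration.

```python
# Ensemble of rainfall-driven flow forecasts at one grid cell (m^3/s);
# member values and the flood threshold are invented.
members_flow = [410.0, 385.0, 520.0, 470.0, 398.0, 455.0, 610.0, 372.0]
flood_threshold = 450.0

prob = sum(f > flood_threshold for f in members_flow) / len(members_flow)
print(f"probability of exceeding threshold: {prob:.0%}")
```

Expressing the forecast as a likelihood rather than a single trace is what lets responders weigh flood risk against the cost of acting on a false alarm.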

  2. Modelling of the 10-micrometer natural laser emission from the mesospheres of Mars and Venus

    NASA Technical Reports Server (NTRS)

    Deming, D.; Mumma, M. J.

    1983-01-01

    The NLTE radiative transfer problem is solved to obtain the 00°1 vibrational state population. This model successfully reproduces the existing center-to-limb observations, although higher spatial resolution observations are needed for a definitive test. The model also predicts total fluxes which are close to the observed values. The strength of the emission is predicted to be closely related to the instantaneous near-IR solar heating rate.

  3. Modeling of the 10-micron natural laser emission from the mesospheres of Mars and Venus

    NASA Technical Reports Server (NTRS)

    Deming, D.; Mumma, M. J.

    1983-01-01

    The NLTE radiative transfer problem is solved to obtain the 00°1 vibrational state population. This model successfully reproduces the existing center-to-limb observations, although higher spatial resolution observations are needed for a definitive test. The model also predicts total fluxes which are close to the observed values. The strength of the emission is predicted to be closely related to the instantaneous near-IR solar heating rate.

  4. Benchmarking novel approaches for modelling species range dynamics

    PubMed Central

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H.; Moore, Kara A.; Zimmermann, Niklaus E.

    2016-01-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species’ range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species’ response to climate change but also emphasise several needs for further model and data improvement. 
We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. PMID:26872305

  5. Benchmarking novel approaches for modelling species range dynamics.

    PubMed

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. 
We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. © 2016 John Wiley & Sons Ltd.

  6. Real-world use of the risk-need-responsivity model and the level of service/case management inventory with community-supervised offenders.

    PubMed

    Dyck, Heather L; Campbell, Mary Ann; Wershler, Julie L

    2018-06-01

    The risk-need-responsivity model (RNR; Bonta & Andrews, 2017) has become a leading approach to effective offender case management, but field tests of this model are still required. The present study first assessed the predictive validity of the RNR-informed Level of Service/Case Management Inventory (LS/CMI; Andrews, Bonta, & Wormith, 2004) with a sample of Atlantic Canadian male and female community-supervised provincial offenders (N = 136). Next, the case management plans prepared from these LS/CMI results were analyzed for adherence to the principles of risk, need, and responsivity. As expected, the LS/CMI was a strong predictor of general recidivism for both males (area under the curve = .75, 95% confidence interval [.66, .85]), and especially females (area under the curve = .94, 95% confidence interval [.84, 1.00]), over an average 3.42-year follow-up period. The LS/CMI was predictive of time to recidivism, with lower risk cases taking longer to reoffend than higher risk cases. Despite the robust predictive validity of the LS/CMI, case management plans developed by probation officers generally reflected poor adherence to the RNR principles. These findings highlight the need for better training on how to transfer risk appraisal information from valid risk tools to case plans to better meet the best-practice principles of risk, need, and responsivity for criminal behavior risk reduction. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
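The area-under-the-curve figures reported above have a direct probabilistic reading: the AUC is the probability that a randomly chosen recidivist scores higher on the instrument than a randomly chosen non-recidivist, with ties counting half. A minimal concordance computation on invented scores:

```python
# AUC as a concordance probability over all positive/negative score pairs.
def auc(scores_pos, scores_neg):
    pairs = [(p, n) for p in scores_pos for n in scores_neg]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs)
    return wins / len(pairs)

reoffended = [28, 31, 24, 35, 22]       # hypothetical LS/CMI total scores
desisted = [15, 20, 24, 12, 18, 10]
print(round(auc(reoffended, desisted), 2))
```

An AUC of .75 therefore means that three times out of four the tool ranks a true recidivist above a non-recidivist, which is why it is read as a measure of discrimination rather than calibration.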

  7. Estimating the probability of survival of individual shortleaf pine (Pinus echinata mill.) trees

    Treesearch

    Sudip Shrestha; Thomas B. Lynch; Difei Zhang; James M. Guldin

    2012-01-01

    A survival model is needed in a forest growth system that predicts the survival of trees on an individual basis or on a stand basis (Gertner, 1989). An individual-tree modeling approach is one of the better methods available for predicting growth and yield, as it provides essential information about particular tree species: tree size, tree quality and tree present status...

  8. The development of a probabilistic approach to forecast coastal change

    USGS Publications Warehouse

    Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.

    2011-01-01

    This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.
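The Bayesian approach above can be sketched with a toy discrete network: parent nodes (here, wave energy and beach width) condition the probability of a shoreline-change outcome, and observing a parent updates the forecast. The structure and all probabilities below are invented, not the study's trained network.

```python
# Toy discrete network: wave energy and beach width condition the
# probability of large shoreline retreat; all probabilities are invented.
p_wave = {"high": 0.4, "low": 0.6}
p_width = {"narrow": 0.5, "wide": 0.5}
p_retreat = {  # P(large retreat | wave_energy, beach_width)
    ("high", "narrow"): 0.85, ("high", "wide"): 0.55,
    ("low", "narrow"): 0.35, ("low", "wide"): 0.10,
}

# Prior predictive probability (marginalize over both parents).
prior = sum(p_wave[w] * p_width[b] * p_retreat[(w, b)]
            for w in p_wave for b in p_width)

# Forecast update after a pre-storm survey observes a narrow beach.
posterior = sum(p_wave[w] * p_retreat[(w, "narrow")] for w in p_wave)
print(round(prior, 3), round(posterior, 3))
```

Conditioning on observed morphology shifts the predicted probability of retreat, which is how such a network produces transect-by-transect post-storm forecasts with attached uncertainty.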

  9. Examination of a sociocultural model of excessive exercise among male and female adolescents.

    PubMed

    White, James; Halliwell, Emma

    2010-06-01

    There is substantial evidence that sociocultural pressures and body image disturbances can lead to disordered eating, yet few studies have examined their impact on excessive exercise. The study adapted a sociocultural model for disordered eating to predict excessive exercise using data from boys and girls in early adolescence (N=421). Perceived sociocultural pressures to lose weight and build muscle, body image disturbance and appearance investment were associated with a compulsive need to exercise. Adolescents' investment in appearance and body image disturbance fully mediated the relationship between sociocultural pressures and a compulsive need for exercise. There was no support for the mediational model in predicting adolescents' frequency or duration of exercise. Results support the sociocultural model as an explanatory model for excessive exercise, but suggest appearance investment and body image disturbance are important mediators of sociocultural pressures. 2010 Elsevier Ltd. All rights reserved.

  10. Altruism in the wild: when affiliative motives to help positive people overtake empathic motives to help the distressed.

    PubMed

    Hauser, David J; Preston, Stephanie D; Stansfield, R Brent

    2014-06-01

    Psychological theories of human altruism suggest that helping results from an evolved tendency in caregiving mammals to respond to distress or need with empathy and sympathy. However, theories from biology, economics, and social psychology demonstrate that social animals also evolved to affiliate with and help desirable social partners. These models make different predictions about the affect of those we should prefer to help. Empathic models predict a preference to help sad, distressed targets in need, while social affiliative models predict a preference for happy, positive, successful targets. We compared these predictions in 3 field studies that measured the tendency to help sad, happy, and neutral confederates in a real-world, daily context: holding the door for a stranger in public. People consistently held the door more for happy over sad or neutral targets. To allow empathic motivations to compete more strongly against social affiliative ones, a 4th study examined a more consequential form of aid for hypothetical hospital patients in clear need. These conditions enhanced the preference to help a sad over a happy patient, because sadness made the patient appear sicker and in greater need. However, people still preferred the happy patient when the aid required a direct social interaction, attesting to the strength of social affiliation motives, even for sick patients. Theories of prosocial behavior should place greater emphasis on the role of social affiliation in motivating aid, particularly in everyday interpersonal contexts. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  11. 2016 International Land Model Benchmarking (ILAMB) Workshop Report

    NASA Technical Reports Server (NTRS)

    Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen; Lawrence, David M.; Riley, William J.; Randerson, James T.; Ahlstrom, Anders; Abramowitz, Gabriel; Baldocchi, Dennis D.; Best, Martin J.

    2016-01-01

    As Earth system models (ESMs) become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of terrestrial biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry-climate feedbacks and ecosystem processes in these models are essential for reducing the acknowledged substantial uncertainties in 21st century climate change projections.

  12. 2016 International Land Model Benchmarking (ILAMB) Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen

    As Earth system models become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century.

  13. Multiscale modeling and simulation of embryogenesis for in silico predictive toxicology (WC9)

    EPA Science Inventory

    Translating big data from alternative and HTS platforms into hazard identification and risk assessment is an important need for predictive toxicology and for elucidating adverse outcome pathways (AOPs) in developmental toxicity. Understanding how chemical disruption of molecular ...

  14. Prediction of energy expenditure and physical activity in preschoolers

    USDA-ARS?s Scientific Manuscript database

    Accurate, nonintrusive, and feasible methods are needed to predict energy expenditure (EE) and physical activity (PA) levels in preschoolers. Herein, we validated cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on accelerometry and heart rate (HR) ...

  15. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)

    PubMed Central

    Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models while taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input-output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain. PMID:29202125
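
    The PMML schema itself is XML, but the quantities a PMML 4.3 GPR model must encode (kernel hyperparameters, training data, and the posterior mean and variance behind its confidence bounds) can be sketched in a few lines of NumPy; the RBF kernel and hyperparameter values below are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length ** 2)

def gpr_predict(x_train, y_train, x_new, noise=1e-2):
    """GP posterior mean and variance: the quantities behind PMML 4.3's
    confidence bounds for a GPR model."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_new, x_train)
    Kss = rbf(x_new, x_new)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.maximum(np.diag(cov), 0.0)  # clip tiny negative variances

x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x)
mu, var = gpr_predict(x, y, np.array([np.pi / 2]))
lower, upper = mu - 1.96 * np.sqrt(var), mu + 1.96 * np.sqrt(var)  # 95% bounds
```

    A PMML consumer reconstructing a GaussianProcessModel evaluates essentially these expressions from the stored kernel type, hyperparameters, and training data.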

  16. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).

    PubMed

    Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in Predictive Model Markup Language (PMML). PMML is an Extensible Markup Language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models while taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distributions for predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input-output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.

  17. Predicting Vandalism in a General Youth Sample via the HEW Youth Development Model's Community Program Impact Scales, Age, and Sex.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The former HEW National Strategy for Youth Development model was a community-based planning and procedural tool to enhance youth development and to prevent delinquency through a process of youth needs assessment, needs-targeted programs, and program impact evaluation. The program's 12 Impact Scales have been found to have acceptable reliabilities, substantial…

  18. High-Throughput Physiologically Based Toxicokinetic Models for ToxCast Chemicals

    EPA Science Inventory

    Physiologically based toxicokinetic (PBTK) models aid in predicting exposure doses needed to create tissue concentrations equivalent to those identified as bioactive by ToxCast. We have implemented four empirical and physiologically-based toxicokinetic (TK) models within a new R ...

  19. Climate change and maize yield in Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Hong; Twine, Tracy E.; Girvetz, Evan

    Climate is changing across the world, including the major maize-growing state of Iowa in the USA. To maintain crop yields, farmers will need a suite of adaptation strategies, and choice of strategy will depend on how the local to regional climate is expected to change. Here we predict how maize yield might change through the 21st century as compared with late 20th century yields across Iowa, USA, a region representing ideal climate and soils for maize production that contributes substantially to the global maize economy. To account for climate model uncertainty, we drive a dynamic ecosystem model with output from six climate models and two future climate forcing scenarios. Despite a wide range in the predicted amount of warming and change to summer precipitation, all simulations predict a decrease in maize yields from the late 20th century to the middle and late 21st century ranging from 15% to 50%. Linear regression of all models predicts a 6% state-averaged yield decrease for every 1°C increase in warm season average air temperature. When the influence of moisture stress on crop growth is removed from the model, yield decreases either remain the same or are reduced, depending on predicted changes in warm season precipitation. Lastly, our results suggest that even if maize were to receive all the water it needed, under the strongest climate forcing scenario yields will decline by 10-20% by the end of the 21st century.

  20. Climate change and maize yield in Iowa

    DOE PAGES

    Xu, Hong; Twine, Tracy E.; Girvetz, Evan

    2016-05-24

    Climate is changing across the world, including the major maize-growing state of Iowa in the USA. To maintain crop yields, farmers will need a suite of adaptation strategies, and choice of strategy will depend on how the local to regional climate is expected to change. Here we predict how maize yield might change through the 21st century as compared with late 20th century yields across Iowa, USA, a region representing ideal climate and soils for maize production that contributes substantially to the global maize economy. To account for climate model uncertainty, we drive a dynamic ecosystem model with output from six climate models and two future climate forcing scenarios. Despite a wide range in the predicted amount of warming and change to summer precipitation, all simulations predict a decrease in maize yields from the late 20th century to the middle and late 21st century ranging from 15% to 50%. Linear regression of all models predicts a 6% state-averaged yield decrease for every 1°C increase in warm season average air temperature. When the influence of moisture stress on crop growth is removed from the model, yield decreases either remain the same or are reduced, depending on predicted changes in warm season precipitation. Lastly, our results suggest that even if maize were to receive all the water it needed, under the strongest climate forcing scenario yields will decline by 10-20% by the end of the 21st century.

  1. Predicting the role of assistive technologies in the lives of people with dementia using objective care recipient factors.

    PubMed

    Czarnuch, Stephen; Ricciardelli, Rose; Mihailidis, Alex

    2016-07-20

    The population of people with dementia is not homogeneous. People with dementia exhibit a wide range of needs, each characterized by diverse factors including age, sex, ethnicity, and place of residence. These needs and characterizing factors may influence the applicability, and ultimately the acceptance, of assistive technologies developed to support the independence of people with dementia. Accordingly, predicting the needs of users before developing the technologies may increase the applicability and acceptance of assistive technologies. Current methods of prediction rely on the difficult collection of subjective, potentially invasive information. We propose a method of prediction that uses objective, unobtrusive, easy to collect information to help inform the development of assistive technologies. We develop a set of models that can predict the level of independence of people with dementia during 20 activities of daily living using simple, objective information. Using data collected from a Canadian survey conducted with caregivers of people with dementia, we create an ordered logistic regression model for each of the twenty daily tasks in the Bristol ADL scale. Data collected from 430 Canadian caregivers of people with dementia were analyzed to reveal: most care recipients were mothers or husbands, married, living in private housing with their caregivers, English-speaking, Canadian born, clinically diagnosed with dementia 1 to 6 years prior to the study, and were dependent on their caregiver. Next, we developed models that use 13 factors to predict a person with dementia's ability to complete the 20 Bristol activities of daily living independently. The 13 factors include caregiver relation, age, marital status, place of residence, language, housing type, proximity to caregiver, service use, informal primary caregiver, diagnosis of Alzheimer's disease or dementia, time since diagnosis, and level of dependence on caregiver. 
The resulting models predicted the aggregate level of independence correctly for 88 of 100 total response categories, marginally for nine, and incorrectly for three. Objective, easy-to-collect information can predict caregiver-reported level of task independence for a person with dementia. Knowledge of task independence can then inform the development of assistive technologies for people with dementia, improving their applicability and acceptance.
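
    The paper fits one ordered (proportional-odds) logistic regression per Bristol ADL task. As a sketch of that model class, the following fits an ordered logit with a single predictor to synthetic data by maximum likelihood; the data-generating values and category labels are invented for illustration, not taken from the study:

```python
import numpy as np
from scipy.optimize import minimize

def ordered_logit_nll(params, x, y):
    """Negative log-likelihood of a 3-category proportional-odds model with one
    predictor; cut2 = cut1 + exp(delta) keeps the cutpoints ordered."""
    beta, cut1, delta = params
    cuts = np.array([cut1, cut1 + np.exp(delta)])
    cdf = 1.0 / (1.0 + np.exp(-(cuts[None, :] - (beta * x)[:, None])))  # P(y <= k)
    probs = np.column_stack([cdf[:, 0], cdf[:, 1] - cdf[:, 0], 1.0 - cdf[:, 1]])
    return -np.sum(np.log(probs[np.arange(len(y)), y] + 1e-12))

# Synthetic data: a latent score thresholded into three ordered levels.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
latent = 1.5 * x + rng.logistic(size=500)
y = np.digitize(latent, [-1.0, 1.0])  # 0 = dependent, 1 = partial, 2 = independent

fit = minimize(ordered_logit_nll, x0=[0.0, -1.0, 0.0], args=(x, y))
beta_hat = fit.x[0]  # should recover roughly the true slope of 1.5
```

    In the study, the predictor vector would hold the 13 objective factors and the outcome the ordered independence level for one ADL task.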

  2. The Contribution of Mathematical Modeling to Understanding Dynamic Aspects of Rumen Metabolism

    PubMed Central

    Bannink, André; van Lingen, Henk J.; Ellis, Jennifer L.; France, James; Dijkstra, Jan

    2016-01-01

    All mechanistic rumen models cover the main drivers of variation in rumen function, which are feed intake, the differences between feedstuffs and feeds in their intrinsic rumen degradation characteristics, and fractional outflow rate of fluid and particulate matter. Dynamic modeling approaches are best suited to the prediction of more nuanced responses in rumen metabolism, and represent the dynamics of the interactions between substrates and micro-organisms and inter-microbial interactions. The concepts of dynamics are discussed for the case of rumen starch digestion as influenced by starch intake rate and frequency of feed intake, and for the case of fermentation of fiber in the large intestine. Adding representations of new functional classes of micro-organisms (i.e., with new characteristics from the perspective of whole rumen function) in rumen models only delivers new insights if complemented by the dynamics of their interactions with other functional classes. Rumen fermentation conditions have to be represented due to their profound impact on the dynamics of substrate degradation and microbial metabolism. Although the importance of rumen pH is generally acknowledged, more emphasis is needed on predicting its variation as well as variation in the processes that underlie rumen fluid dynamics. The rumen wall has an important role in adapting to rapid changes in the rumen environment, clearing of volatile fatty acids (VFA), and maintaining rumen pH within limits. Dynamics of rumen wall epithelia and their role in VFA absorption needs to be better represented in models that aim to predict rumen responses across nutritional or physiological states. For a detailed prediction of rumen N balance there is merit in a dynamic modeling approach compared to the static approaches adopted in current protein evaluation systems. 
Improvement is needed on previous attempts to predict rumen VFA profiles, and this should be pursued by introducing factors that relate more to microbial metabolism. For rumen model construction, data on rumen microbiomes are preferably coupled with knowledge consolidated in rumen models instead of relying on correlations with rather general aspects of treatment or animal. This helps to prevent the disregard of basic principles and underlying mechanisms of whole rumen function. PMID:27933039
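
    The dynamics the review emphasizes, interacting substrate and microbial pools with fractional outflow, can be illustrated with a deliberately minimal two-pool model (all parameter values are assumptions for illustration; real mechanistic rumen models carry many more pools and interactions):

```python
def rumen_toy(hours=24.0, dt=0.01, intake=2.0, k_out=0.05,
              mu_max=0.3, Ks=1.0, yield_m=0.2):
    """Euler integration of a toy substrate (S) / microbial biomass (M) model:
       dS/dt = intake - mu_max*S/(Ks+S)*M - k_out*S   (degradation + outflow)
       dM/dt = yield_m*mu_max*S/(Ks+S)*M - k_out*M    (growth + washout)"""
    S, M = 5.0, 1.0
    for _ in range(int(hours / dt)):
        uptake = mu_max * S / (Ks + S) * M  # Monod-type degradation rate
        S += dt * (intake - uptake - k_out * S)
        M += dt * (yield_m * uptake - k_out * M)
    return S, M

S24, M24 = rumen_toy()  # pool sizes after 24 h of constant feeding
```

    Even this caricature shows the dynamic couplings the review argues for: substrate availability drives microbial growth, microbial mass feeds back on degradation, and fractional outflow acts on both pools.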

  3. Physiologically Based Pharmacokinetic Modeling in Lead Optimization. 1. Evaluation and Adaptation of GastroPlus To Predict Bioavailability of Medchem Series.

    PubMed

    Daga, Pankaj R; Bolger, Michael B; Haworth, Ian S; Clark, Robert D; Martin, Eric J

    2018-03-05

    When medicinal chemists need to improve bioavailability (%F) within a chemical series during lead optimization, they synthesize new series members with systematically modified properties, mainly by following experience and general rules of thumb. More quantitative models that predict %F of proposed compounds from chemical structure alone have proven elusive. Global empirical %F quantitative structure-property relationship (QSPR) models perform poorly, and projects have too little data to train local %F QSPR models. Mechanistic oral absorption and physiologically based pharmacokinetic (PBPK) models simulate the dissolution, absorption, systemic distribution, and clearance of a drug in preclinical species and humans. Attempts to build global PBPK models based purely on calculated inputs have not achieved the <2-fold average error needed to guide lead optimization. In this work, local GastroPlus PBPK models are instead customized for individual medchem series. The key innovation was building a local QSPR for a numerically fitted effective intrinsic clearance (CL_loc). All inputs are subsequently computed from structure alone, so the models can be applied in advance of synthesis. Training CL_loc on the first 15-18 rat %F measurements gave adequate predictions, with clear improvements up to about 30 measurements, and incremental improvements beyond that.

  4. Quantifying the Dynamics of Field Cancerization in Tobacco-Related Head and Neck Cancer: A Multiscale Modeling Approach.

    PubMed

    Ryser, Marc D; Lee, Walter T; Ready, Neal E; Leder, Kevin Z; Foo, Jasmine

    2016-12-15

    High rates of local recurrence in tobacco-related head and neck squamous cell carcinoma (HNSCC) are commonly attributed to unresected fields of precancerous tissue. Because they are not easily detectable at the time of surgery without additional biopsies, there is a need for noninvasive methods to predict the extent and dynamics of these fields. Here, we developed a spatial stochastic model of tobacco-related HNSCC at the tissue level and calibrated the model using a Bayesian framework and population-level incidence data from the Surveillance, Epidemiology, and End Results (SEER) registry. Probabilistic model analyses were performed to predict the field geometry at time of diagnosis, and model predictions of age-specific recurrence risks were tested against outcome data from SEER. The calibrated models predicted a strong dependence of the local field size on age at diagnosis, with a doubling of the expected field diameter between ages at diagnosis of 50 and 90 years, respectively. Similarly, the probability of harboring multiple, clonally unrelated fields at the time of diagnosis was found to increase substantially with patient age. On the basis of these findings, we hypothesized a higher recurrence risk in older than in younger patients when treated by surgery alone; we successfully tested this hypothesis using age-stratified outcome data. Further clinical studies are needed to validate the model predictions in a patient-specific setting. This work highlights the importance of spatial structure in models of epithelial carcinogenesis and suggests that patient age at diagnosis may be a critical predictor of the size and multiplicity of precancerous lesions. Cancer Res; 76(24); 7078-88. ©2016 AACR. ©2016 American Association for Cancer Research.

  5. Quantifying the dynamics of field cancerization in tobacco-related head and neck cancer: a multi-scale modeling approach

    PubMed Central

    Ryser, Marc D.; Lee, Walter T.; Ready, Neal E.; Leder, Kevin Z.; Foo, Jasmine

    2017-01-01

    High rates of local recurrence in tobacco-related head and neck squamous cell carcinoma (HNSCC) are commonly attributed to unresected fields of precancerous tissue. Since they are not easily detectable at the time of surgery without additional biopsies, there is a need for non-invasive methods to predict the extent and dynamics of these fields. Here we developed a spatial stochastic model of tobacco-related HNSCC at the tissue level and calibrated the model using a Bayesian framework and population-level incidence data from the Surveillance, Epidemiology, and End Results (SEER) registry. Probabilistic model analyses were performed to predict the field geometry at time of diagnosis, and model predictions of age-specific recurrence risks were tested against outcome data from SEER. The calibrated models predicted a strong dependence of the local field size on age at diagnosis, with a doubling of the expected field diameter between ages at diagnosis of 50 and 90 years, respectively. Similarly, the probability of harboring multiple, clonally unrelated fields at the time of diagnosis was found to increase substantially with patient age. Based on these findings, we hypothesized a higher recurrence risk in older compared to younger patients when treated by surgery alone; we successfully tested this hypothesis using age-stratified outcome data. Further clinical studies are needed to validate the model predictions in a patient-specific setting. This work highlights the importance of spatial structure in models of epithelial carcinogenesis, and suggests that patient age at diagnosis may be a critical predictor of the size and multiplicity of precancerous lesions. Major Findings Patient age at diagnosis was found to be a critical predictor of the size and multiplicity of precancerous lesions. This finding challenges the current one-size-fits-all approach to surgical excision margins. PMID:27913438

  6. Race, Genetic West African Ancestry, and Prostate Cancer Prediction by PSA in Prospectively Screened High-Risk Men

    PubMed Central

    Giri, Veda N.; Egleston, Brian; Ruth, Karen; Uzzo, Robert G.; Chen, David Y.T.; Buyyounouski, Mark; Raysor, Susan; Hooker, Stanley; Torres, Jada Benn; Ramike, Teniel; Mastalski, Kathleen; Kim, Taylor Y.; Kittles, Rick

    2008-01-01

    Introduction “Race-specific” PSA needs evaluation in men at high-risk for prostate cancer (PCA) for optimizing early detection. Baseline PSA and longitudinal prediction for PCA was examined by self-reported race and genetic West African (WA) ancestry in the Prostate Cancer Risk Assessment Program, a prospective high-risk cohort. Materials and Methods Eligibility criteria are age 35–69 years, FH of PCA, African American (AA) race, or BRCA1/2 mutations. Biopsies have been performed at low PSA values (<4.0 ng/mL). WA ancestry was discerned by genotyping 100 ancestry informative markers. Cox proportional hazards models evaluated baseline PSA, self-reported race, and genetic WA ancestry. Cox models were used for 3-year predictions for PCA. Results 646 men (63% AA) were analyzed. Individual WA ancestry estimates varied widely among self-reported AA men. “Race-specific” differences in baseline PSA were not found by self-reported race or genetic WA ancestry. Among men with ≥ 1 follow-up visit (405 total, 54% AA), three-year prediction for PCA with a PSA of 1.5–4.0 ng/mL was higher in AA men with age in the model (p=0.025) compared to EA men. Hazard ratios of PSA for PCA were also higher by self-reported race (1.59 for AA vs. 1.32 for EA, p=0.04). There was a trend for increasing prediction for PCA with increasing genetic WA ancestry. Conclusions “Race-specific” PSA may need to be redefined as higher prediction for PCA at any given PSA in AA men. Large-scale studies are needed to confirm if genetic WA ancestry explains these findings to make progress in personalizing PCA early detection. PMID:19240249

  7. Immunogenicity of therapeutic proteins: the use of animal models.

    PubMed

    Brinks, Vera; Jiskoot, Wim; Schellekens, Huub

    2011-10-01

    Immunogenicity of therapeutic proteins lowers patient well-being and drastically increases therapeutic costs. Preventing immunogenicity is an important issue to consider when developing novel therapeutic proteins and applying them in the clinic. Animal models are increasingly used to study immunogenicity of therapeutic proteins. They are employed as predictive tools to assess different aspects of immunogenicity during drug development and have become vital in studying the mechanisms underlying immunogenicity of therapeutic proteins. However, the use of animal models needs critical evaluation. Because of species differences, predictive value of such models is limited, and mechanistic studies can be restricted. This review addresses the suitability of animal models for immunogenicity prediction and summarizes the insights in immunogenicity that they have given so far.

  8. [Prediction of total nitrogen and alkali hydrolysable nitrogen content in loess using hyperspectral data based on correlation analysis and partial least squares regression].

    PubMed

    Liu, Xiu-ying; Wang, Li; Chang, Qing-rui; Wang, Xiao-xing; Shang, Yan

    2015-07-01

    Wuqi County of Shaanxi Province, where vegetation recovery measures have been carried out for years, was taken as the study area. A total of 100 loess samples from 24 different profiles were collected. Total nitrogen (TN) and alkali hydrolysable nitrogen (AHN) contents of the soil samples were analyzed, and the samples were scanned in the visible/near-infrared (VNIR) region of 350-2500 nm in the laboratory. Calibration models relating TN and AHN contents to the VNIR spectra were developed based on correlation analysis (CA) and partial least squares regression (PLS), and were validated with independent samples. The results indicated that the optimum model for predicting TN of loess was established using the first derivative of reflectance, while the best model for predicting AHN was established using normal derivative spectra. The optimum TN model could effectively predict TN in loess from 0 to 40 cm, but the optimum AHN model could only roughly predict AHN at the same depth. This study provides a good method for rapidly predicting TN of loess where vegetation recovery measures have been adopted, but prediction of AHN needs further study.
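
    The PLS calibration step can be sketched with a minimal single-response NIPALS implementation on synthetic "spectra"; the data below are invented for illustration, whereas the paper's models were built on measured VNIR reflectance:

```python
import numpy as np

def pls1_fit(X, y, n_comp=3):
    """Minimal PLS1 (NIPALS): returns coefficients B plus the training means,
    so that y_hat = (X - x_mean) @ B + y_mean."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - q * t                 # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return B, X.mean(0), y.mean()

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 30))                          # synthetic "spectra"
y = 2.0 * X[:, 5] + X[:, 12] + rng.normal(scale=0.1, size=60)
B, xm, ym = pls1_fit(X, y, n_comp=5)
y_hat = (X - xm) @ B + ym
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

    PLS is suited to this setting because spectral bands are many and highly collinear relative to the number of soil samples, which defeats ordinary least squares.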

  9. Accuracy Analysis of a Box-wing Theoretical SRP Model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui

    2016-07-01

    For the Beidou satellite navigation system (BDS), a high-accuracy solar radiation pressure (SRP) model is necessary for high-precision applications, especially as the global BDS constellation is completed, and the accuracy of the BDS broadcast ephemeris needs to be improved. We therefore established a box-wing theoretical SRP model with fine structural detail that includes the conical shadow factors of the Earth and Moon. We verified this SRP model using the GPS Block IIF satellites, with calculations performed on data from the PRN 1, 24, 25, and 27 satellites. The results show that the physical SRP model achieves higher accuracy in precise orbit determination (POD) and orbit prediction for GPS IIF satellites than the Bern empirical model, with a 3D RMS of orbit of about 20 centimeters. POD accuracy is similar for the two models, but prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day, and 7-day orbit predictions; the longer the prediction arc, the more significant the improvement. Orbit prediction accuracies with the physical SRP model for 1-day, 3-day, and 7-day arcs are 0.4 m, 2.0 m, and 10.0 m respectively, versus 0.9 m, 5.5 m, and 30 m with the Bern empirical model. We applied this method to the BDS and derived an SRP model for the Beidou satellites, which we then tested and verified with one month of Beidou data. Initial results show that the model is good but needs more data for verification and improvement. The orbit residual RMS is similar to that of our empirical force model, which only estimates forces in the along-track and cross-track directions and a y-bias, but the orbit overlap and SLR observation evaluations show some improvement: the remaining empirical force is reduced significantly for the present Beidou constellation.

  10. Is there a `universal' dynamic zero-parameter hydrological model? Evaluation of a dynamic Budyko model in US and India

    NASA Astrophysics Data System (ADS)

    Patnaik, S.; Biswal, B.; Sharma, V. C.

    2017-12-01

    River flow varies greatly in space and time, and the single biggest challenge for hydrologists and ecologists around the world is the fact that most rivers are either ungauged or poorly gauged. Although it is relatively easy to predict the long-term average flow of a river using the `universal' zero-parameter Budyko model, lack of data hinders short-term flow prediction at ungauged locations using traditional hydrological models, as they require observed flow data for calibration. Flow prediction in ungauged basins thus requires a dynamic `zero-parameter' hydrological model. One way to achieve this is to regionalize a dynamic hydrological model's parameters; however, a zero-parameter dynamic model based on regionalization is not `universal'. An alternative attempt was made recently to develop a zero-parameter dynamic model by defining an instantaneous dryness index as a function of antecedent rainfall and solar energy inputs with the help of a decay function and using the original Budyko function. The model was tested first in 63 US catchments and later in 50 Indian catchments; the median Nash-Sutcliffe efficiency (NSE) was found to be close to 0.4 in both cases. Although improvements need to be incorporated before the model can be used for reliable prediction, the main aim of this study was rather to understand hydrological processes. The overall results here seem to suggest that the dynamic zero-parameter Budyko model is `universal': natural catchments around the world are strikingly similar to each other in the way they respond to hydrologic inputs. We thus need to focus more on utilizing catchment similarities in hydrological modelling instead of over-parameterizing our models.
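
    The approach described, an instantaneous dryness index built from decay-weighted antecedent forcing and passed through the classical Budyko curve, can be sketched as follows (the decay rate and the synthetic forcing series are illustrative assumptions, not the paper's calibration):

```python
import numpy as np

def budyko(phi):
    """Classical Budyko curve: evaporative index E/P as a function of the
    dryness index phi = PET/P."""
    return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def dynamic_dryness(pet, rain, k=0.05):
    """Instantaneous dryness index from decay-weighted antecedent PET and
    rainfall; the decay rate k (per time step) is an assumed parameter."""
    weights = np.exp(-k * np.arange(len(rain))[::-1])  # recent inputs weigh most
    return float((pet * weights).sum() / max((rain * weights).sum(), 1e-9))

rng = np.random.default_rng(2)
rain = rng.exponential(5.0, size=365)   # daily rainfall (mm), synthetic
pet = np.full(365, 4.0)                 # daily potential evapotranspiration (mm)
phi = dynamic_dryness(pet, rain)
runoff_ratio = 1.0 - budyko(phi)        # Q/P = 1 - E/P under Budyko
```

    The curve has the right limits, E/P approaches phi when energy-limited and 1 when water-limited, which is what lets the model run with no calibrated parameters beyond the assumed decay function.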

  11. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs.

    PubMed

    Bayliss, Elizabeth A; Powers, J David; Ellis, Jennifer L; Barrow, Jennifer C; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for being in the top 25 percent of cost for receiving care, then applied cluster-analytic techniques to identify distinct high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0-4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included high-morbidity individuals with prior utilization, depression risk, and financial constraints; high-morbidity, previously uninsured individuals with few financial constraints; and relatively healthy, previously insured individuals with medication needs. Applying sequential predictive modeling and cluster-analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery.
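
    The sequential strategy, first scoring the probability of top-quartile cost and then clustering the high-risk group, can be sketched on synthetic stand-ins for BHQ items (the features, weights, and thresholds below are invented for illustration, not the study's fitted model):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
# Synthetic stand-ins for BHQ items (not the real questionnaire):
morbidity = rng.integers(0, 5, n).astype(float)   # chronic-condition count
prior_ed = rng.integers(0, 2, n).astype(float)    # ED use in prior year
uninsured = rng.integers(0, 2, n).astype(float)   # lacked prior insurance
X = np.column_stack([morbidity, prior_ed, uninsured])

# Step 1: score the probability of top-quartile cost (weights are invented).
risk = 1.0 / (1.0 + np.exp(-(0.8 * morbidity + 1.2 * prior_ed
                             + 0.5 * uninsured - 3.0)))
high_risk = X[risk >= np.quantile(risk, 0.75)]

# Step 2: a minimal k-means to surface subgroups within the high-cost quartile.
def kmeans(data, k=2, iters=50, seed=0):
    r = np.random.default_rng(seed)
    centers = data[r.choice(len(data), k, replace=False)].copy()
    for _ in range(iters):
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = data[labels == j].mean(0)
    return labels, centers

labels, centers = kmeans(high_risk)
```

    In the study the first step was a fitted predictive model and the clusters were interpreted clinically (e.g., high morbidity with financial constraints); here both steps are only schematic.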

  12. Mechanisms controlling primary and new production in a global ecosystem model - Part I: Validation of the biological simulation

    NASA Astrophysics Data System (ADS)

    Popova, E. E.; Coward, A. C.; Nurser, G. A.; de Cuevas, B.; Fasham, M. J. R.; Anderson, T. R.

    2006-12-01

    A global general circulation model coupled to a simple six-compartment ecosystem model is used to study the extent to which global variability in primary and export production can be realistically predicted on the basis of advanced parameterizations of upper mixed layer physics, without recourse to introducing extra complexity in model biology. The "K profile parameterization" (KPP) scheme employed, combined with 6-hourly external forcing, is able to capture short-term periodic and episodic events such as diurnal cycling and storm-induced deepening. The model realistically reproduces various features of global ecosystem dynamics that have been problematic in previous global modelling studies, using a single generic parameter set. The realistic simulation of deep convection in the North Atlantic, and lack of it in the North Pacific and Southern Oceans, leads to good predictions of chlorophyll and primary production in these contrasting areas. Realistic levels of primary production are predicted in the oligotrophic gyres due to high frequency external forcing of the upper mixed layer (accompanying paper Popova et al., 2006) and novel parameterizations of zooplankton excretion. Good agreement is shown between model and observations at various JGOFS time series sites: BATS, KERFIX, Papa and HOT. One exception is the northern North Atlantic where lower grazing rates are needed, perhaps related to the dominance of mesozooplankton there. The model is therefore not globally robust in the sense that additional parameterizations are needed to realistically simulate ecosystem dynamics in the North Atlantic. Nevertheless, the work emphasises the need to pay particular attention to the parameterization of mixed layer physics in global ocean ecosystem modelling as a prerequisite to increasing the complexity of ecosystem models.

  13. Prediction of material strength and fracture of glass using the SPHINX smooth particle hydrodynamics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.A.; Wingate, C.A.

    1994-08-01

    The design of many military devices involves numerical predictions of the material strength and fracture of brittle materials. The materials of interest include ceramics, which are used in armor packages; glass, which is used in truck and jeep windshields and in helicopters; and rock and concrete, which are used in underground bunkers. As part of a program to develop advanced hydrocode design tools, the authors have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. The authors have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass, and data from tungsten rods impacting glass. Since fractured glass properties, which are needed in the model, are not available, the authors did sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data.

  14. Microbial Community Metabolic Modeling: A Community Data-Driven Network Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela

    Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors as a result of interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community, when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. Resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.

  15. A CBR-Based and MAHP-Based Customer Value Prediction Model for New Product Development

    PubMed Central

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In a fierce market environment, an enterprise that wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, to better meet customer needs, there are still some deficiencies in their model, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case-based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use actual but averaged influencing-factor effectiveness in simulation; at the same time, C&M-CVPM uses a dynamic customer transition probability, which is closer to reality. This study not only introduces the realization of CBR and MAHP but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment. PMID:25162050
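
The MAHP aggregation step the abstract refers to can be illustrated with the row-geometric-mean weighting of a pairwise comparison matrix, the multiplicative counterpart of classical AHP eigenvector weighting. The 3-criterion reciprocal matrix below is hypothetical:

```python
import math

def mahp_weights(pairwise):
    """Criterion weights from a reciprocal pairwise comparison matrix via
    row geometric means, normalized to sum to 1 (multiplicative AHP style)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison matrix (entry [i][j] = importance of i over j).
matrix = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = mahp_weights(matrix)
```

For this consistent matrix the weights come out in the ratio 4:2:1; in the paper such weights would feed the customer-value scoring rather than be an end in themselves.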


  17. Need for Affect and Attitudes Toward Drugs: The Mediating Role of Values.

    PubMed

    Lins de Holanda Coelho, Gabriel; H P Hanel, Paul; Vilar, Roosevelt; P Monteiro, Renan; Gouveia, Valdiney V; R Maio, Gregory

    2018-05-04

    Human values and affective traits have been found to predict attitudes toward the use of different types of drugs (e.g., alcohol, marijuana, and other illegal drugs). In this study (N = 196, mean age = 23.09), we aimed to gain a more comprehensive understanding of these predictors of attitudes toward drug use in a mediated structural equation model, providing a better overview of a possible motivational path that leads to such risky behavior. Specifically, we predicted and found that the relations between need for affect and attitudes toward drug use were mediated by excitement values. Results also showed that excitement values and need for affect positively predicted attitudes toward the use of drugs, whereas normative values predicted them negatively. The pattern of results remained the same when we investigated attitudes toward alcohol, marijuana, or illegal drugs separately. Overall, the findings indicate that emotions operate via excitement and normative values to influence risk behavior.

  18. Ground-water models for water resources planning

    USGS Publications Warehouse

    Moore, John E.

    1980-01-01

    In the past decade hydrologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the groundwater system. These models have been used to provide information and predictions for water managers. Too frequently, groundwater was neglected in water-resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface water supplies. Now, however, with newly developed digital groundwater models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last 10 years from simple one-layer flow models to three-dimensional simulations of groundwater flow which may include solute transport, heat transport, effects of land subsidence, and encroachment of salt water. This paper illustrates, through case histories, how predictive groundwater models have provided the information needed for the sound planning and management of water resources in the United States. (USGS)

  19. 4th Annual Predictive Toxicology Summit 2012.

    PubMed

    Cui, Zhanfeng

    2013-08-01

    This meeting report presents a brief summary of the 4th Annual Predictive Toxicology Summit 2012, held on 15-16 February 2012 in London. The majority of presentations came from global pharmaceutical companies, although small and medium enterprise (SME) and academic researchers were represented too. Major regulatory bodies were also present. The summit was considered a good learning opportunity to catch up on recent advances in predictive toxicology. Predictive toxicology has become increasingly important for social, economic, and scientific reasons. Technological developments are rapid, but there is a gulf between the technology developers and the pharmaceutical end users; hence, early engagement is desirable. Stem cell-derived cell-based assays as well as three-dimensional in vitro tissue/organ models are within reach now, but much needs to be done to optimise and validate the developed protocols and products. The field of predictive toxicology needs fundamental research of an interdisciplinary nature, which requires much-needed trained personnel and funding.

  20. Practical Meteor Stream Forecasting

    NASA Technical Reports Server (NTRS)

    Cooke, William J.; Suggs, Robert M.

    2003-01-01

    Inspired by the recent Leonid meteor storms, researchers have made great strides in our ability to predict enhanced meteor activity. However, the necessary calibration of the meteor stream models with Earth-based ZHRs (Zenith Hourly Rates) has placed emphasis on the terran observer, and meteor activity predictions are published in a manner that reflects this emphasis. As a consequence, many predictions are often unusable by the satellite community, which has the most at stake and the greatest interest in meteor forecasting. This paper suggests that stream modelers need to pay more attention to the needs of this community and publish not just durations and times of maxima for Earth, but everything needed to characterize the meteor stream in and out of the plane of the ecliptic, which, at a minimum, consists of the location of maximum stream density (ZHR) and the functional form of the density decay with distance from this point. It is also suggested that some of the terminology associated with meteor showers may need to be more strictly defined in order to eliminate the perception that meteor scientists are crying wolf. The term "outburst" is especially problematic, as it usually denotes an enhancement by a factor of 2 or more to researchers, but conveys the notion of a sky filled with meteors to satellite operators and the public. Experience has also taught that predicted ZHRs often lead to public disappointment, as these values vastly overestimate what is seen.
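
One minimal way to publish "the functional form of the density decay" that the paper asks for is a peak rate plus a falloff law with distance from the point of maximum density. The Gaussian form and the parameter values below are illustrative assumptions, not a model taken from the paper:

```python
import math

def zhr_at_offset(zhr_max, r_au, sigma_au):
    """Stream activity at distance r_au (AU) from the point of maximum density,
    assuming a Gaussian falloff: ZHR(r) = ZHR_max * exp(-(r/sigma)^2).
    Both the Gaussian shape and sigma_au are illustrative assumptions."""
    return zhr_max * math.exp(-(r_au / sigma_au) ** 2)

# Hypothetical storm: peak ZHR 1000 at the node, 0.001 AU characteristic width.
activity_on_axis = zhr_at_offset(1000.0, 0.0, 0.001)
activity_offset = zhr_at_offset(1000.0, 0.0015, 0.001)
```

A satellite operator could evaluate such a function at the spacecraft's actual offset from the stream axis, in or out of the ecliptic plane, rather than relying on an Earth-centric ZHR.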

  1. Are prediction models for Lynch syndrome valid for probands with endometrial cancer?

    PubMed

    Backes, Floor J; Hampel, Heather; Backes, Katherine A; Vaccarello, Luis; Lewandowski, George; Bell, Jeffrey A; Reid, Gary C; Copeland, Larry J; Fowler, Jeffrey M; Cohn, David E

    2009-01-01

    Currently, three prediction models are used to predict a patient's risk of having Lynch syndrome (LS). These models have been validated in probands with colorectal cancer (CRC), but not in probands presenting with endometrial cancer (EMC). Thus, the aim was to determine the performance of these prediction models in women with LS presenting with EMC. Probands with EMC and LS were identified. Personal and family history was entered into three prediction models, PREMM(1,2), MMRpro, and MMRpredict. Probabilities of mutations in the mismatch repair genes were recorded. Accurate prediction was defined as a model predicting at least a 5% chance of a proband carrying a mutation. From 562 patients prospectively enrolled in a clinical trial of patients with EMC, 13 (2.2%) were shown to have LS. Nine patients had a mutation in MSH6, three in MSH2, and one in MLH1. MMRpro predicted that 3 of 9 patients with an MSH6, 3 of 3 with an MSH2, and 1 of 1 patient with an MLH1 mutation could have LS. For MMRpredict, EMC coded as "proximal CRC" predicted 5 of 5, and as "distal CRC" 3 of 5. PREMM(1,2) predicted that 4 of 4 with an MLH1 or MSH2 could have LS. Prediction of LS in probands presenting with EMC using current models for probands with CRC works reasonably well. Further studies are needed to develop models that include questions specific to patients with EMC with a greater age range, as well as placing increased emphasis on prediction of LS in probands with MSH6 mutations.

  2. Model of spacecraft atomic oxygen and solar exposure microenvironments

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Pippin, H. G.

    1993-01-01

    Computer models of environmental conditions in Earth orbit are needed for the following reasons: (1) derivation of material performance parameters from orbital test data, (2) evaluation of spacecraft hardware designs, (3) prediction of material service life, and (4) scheduling spacecraft maintenance. To meet these needs, Boeing has developed programs for modeling atomic oxygen (AO) and solar radiation exposures. The model allows determination of AO and solar ultraviolet (UV) radiation exposures for spacecraft surfaces (1) in arbitrary orientations with respect to the direction of spacecraft motion, (2) over all ranges of solar conditions, and (3) for any mission duration. The models have been successfully applied to prediction of experiment environments on the Long Duration Exposure Facility (LDEF) and for analysis of selected hardware designs for deployment on other spacecraft. The work on these models has been reported at previous LDEF conferences. Since publication of these reports, a revision has been made to the AO calculation for LDEF, and further work has been done on the microenvironments model for solar exposure.

  3. Evaluation of methods for predicting rail-highway crossing hazards.

    DOT National Transportation Integrated Search

    1986-01-01

    The need for improvement at a rail/highway crossing typically is based on the Expected Accident Rate (EAR) in conjunction with other criteria carrying lesser weight. In recent years new models for assessing the need for improvements have been develop...

  4. Stochastic Short-term High-resolution Prediction of Solar Irradiance and Photovoltaic Power Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Olama, Mohammed M.; Dong, Jin

    The increased penetration of solar photovoltaic (PV) energy sources into electric grids has increased the need for accurate modeling and prediction of solar irradiance and power production. Existing modeling and prediction techniques focus on long-term low-resolution prediction over minutes to years. This paper examines the stochastic modeling and short-term high-resolution prediction of solar irradiance and PV power output. We propose a stochastic state-space model to characterize the behaviors of solar irradiance and PV power output. This prediction model is suitable for the development of optimal power controllers for PV sources. A filter-based expectation-maximization and Kalman filtering mechanism is employed to estimate the parameters and states in the state-space model. The mechanism results in a finite dimensional filter which only uses the first and second order statistics. The structure of the scheme contributes to a direct prediction of the solar irradiance and PV power output without any linearization process or simplifying assumptions of the signal's model. This enables the system to accurately predict small as well as large fluctuations of the solar signals. The mechanism is recursive allowing the solar irradiance and PV power to be predicted online from measurements. The mechanism is tested using solar irradiance and PV power measurement data collected locally in our lab.
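
A scalar sketch of the filter-based prediction idea: a Kalman filter producing recursive one-step-ahead predictions from noisy measurements. The random-walk state model and the noise variances q and r are illustrative assumptions; the paper's actual model is multivariate with parameters identified by expectation-maximization:

```python
def kalman_predict(measurements, q=0.01, r=0.25):
    """Scalar Kalman filter for the assumed random-walk model
    x_k = x_{k-1} + w_k (var q),  y_k = x_k + v_k (var r).
    Returns the one-step-ahead predictions for measurements[1:]."""
    x, p = measurements[0], 1.0          # initial state estimate and covariance
    preds = []
    for y in measurements[1:]:
        x_pred, p_pred = x, p + q        # time update (prediction step)
        preds.append(x_pred)
        k = p_pred / (p_pred + r)        # Kalman gain
        x = x_pred + k * (y - x_pred)    # measurement update
        p = (1 - k) * p_pred
    return preds
```

Run online, each new irradiance or PV power sample updates the state and yields the next prediction, mirroring the recursive mechanism described in the abstract.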

  5. A Comparison of Combustor-Noise Models

    NASA Technical Reports Server (NTRS)

    Hultgren, Lennart S.

    2012-01-01

    The present status of combustor-noise prediction in the NASA Aircraft Noise Prediction Program (ANOPP) [1] for current-generation (N) turbofan engines is summarized. Several semi-empirical models for turbofan combustor noise are discussed, including best methods for near-term updates to ANOPP. An alternate turbine-transmission factor [2] will appear as a user-selectable option in the combustor-noise module GECOR in the next release. The three-spectrum model proposed by Stone et al. [3] for GE turbofan-engine combustor noise is discussed and compared with ANOPP predictions for several relevant cases. Based on the results presented herein and in their report [3], it is recommended that the application of this fully empirical combustor-noise prediction method be limited to situations involving only General-Electric turbofan engines. Long-term needs and challenges for the N+1 through N+3 time frame are discussed. Because the impact of other propulsion-noise sources continues to be reduced due to turbofan design trends, advances in noise-mitigation techniques, and expected aircraft configuration changes, the relative importance of core noise is expected to greatly increase in the future. The noise-source structure in the combustor, including the indirect one, and the effects of the propagation path through the engine and exhaust nozzle need to be better understood. In particular, the acoustic consequences of the expected trends toward smaller, highly efficient gas-generator cores and low-emission fuel-flexible combustors need to be fully investigated, since future designs are quite likely to fall outside the parameter space of existing (semi-empirical) prediction tools.

  6. Reducing usage of the computational resources by event driven approach to model predictive control

    NASA Astrophysics Data System (ADS)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time, optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computational resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.
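
The event-driven idea can be sketched as a loop that re-runs the expensive MPC optimization only when the state has drifted past a threshold since the last solve, reusing the previous input otherwise. The stub control law and threshold below are assumptions for illustration, not the paper's method:

```python
def control_loop(states, threshold=0.5):
    """Event-driven MPC pattern: solve only on significant state change.
    mpc_solve is a stand-in for the optimization; the -0.5*x feedback law
    and the 0.5 threshold are hypothetical."""
    def mpc_solve(x):
        return -0.5 * x                  # placeholder for the costly QP solve
    u, x_last, solves = 0.0, None, 0
    inputs = []
    for x in states:
        if x_last is None or abs(x - x_last) > threshold:
            u, x_last, solves = mpc_solve(x), x, solves + 1
        inputs.append(u)                 # otherwise reuse the previous input
    return inputs, solves
```

The solve counter makes the computational saving visible: for a slowly drifting state, most samples reuse the last input and the solver runs only occasionally.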

  7. Validation of Automated Prediction of Blood Product Needs Algorithm Processing Continuous Non Invasive Vital Signs Streams (ONPOINT4)

    DTIC Science & Technology

    2018-01-25

    Report front-matter excerpt (full abstract not available). Performing organization: University of Maryland, Baltimore; R Adams Cowley Shock Trauma Center, Baltimore, MD 21201. The report concerns the M2 bleeding risk index, built from the revised trauma score, the shock index (= heart rate / systolic blood pressure), and the assessment of blood consumption, with sections covering transfusion prediction model evaluation in special subsets (model stress test) and feature sets and model stability.
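
The shock index mentioned in the record is a simple ratio of vital signs; a minimal sketch (the 0.9 flag threshold is a commonly cited value, given here only as an illustrative assumption):

```python
def shock_index(heart_rate, systolic_bp):
    """Shock index as defined in the report: heart rate / systolic blood pressure."""
    return heart_rate / systolic_bp

# Values above roughly 0.9 are often treated as concerning (illustrative threshold).
elevated = shock_index(120, 80) > 0.9
```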

  8. Development of Kinetic Mechanisms for Next-Generation Fuels and CFD Simulation of Advanced Combustion Engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitz, William J.; McNenly, Matt J.; Whitesides, Russell

    Predictive chemical kinetic models are needed to represent next-generation fuel components and their mixtures with conventional gasoline and diesel fuels. These kinetic models will allow the prediction of the effect of alternative fuel blends in CFD simulations of advanced spark-ignition and compression-ignition engines. Enabled by kinetic models, CFD simulations can be used to optimize fuel formulations for advanced combustion engines so that maximum engine efficiency, fossil fuel displacement goals, and low pollutant emission goals can be achieved.
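
Elementary reaction rates in such kinetic mechanisms are conventionally expressed in Arrhenius form, k = A·exp(−Ea/RT); a minimal sketch with illustrative parameter values (not taken from any specific mechanism):

```python
import math

def arrhenius(a, ea_j_per_mol, temp_k, r=8.314):
    """Arrhenius rate constant k = A * exp(-Ea / (R T)).
    a: pre-exponential factor; ea_j_per_mol: activation energy (J/mol)."""
    return a * math.exp(-ea_j_per_mol / (r * temp_k))

# Hypothetical reaction: higher temperature gives a faster rate constant.
k_800 = arrhenius(1e10, 1e5, 800.0)
k_1000 = arrhenius(1e10, 1e5, 1000.0)
```

A full mechanism for a fuel blend evaluates thousands of such expressions per species balance inside the CFD solver, which is why reduced, well-validated mechanisms matter for engine simulation.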

  9. Vibration Response Models of a Stiffened Aluminum Plate Excited by a Shaker

    NASA Technical Reports Server (NTRS)

    Cabell, Randolph H.

    2008-01-01

    Numerical models of structural-acoustic interactions are of interest to aircraft designers and the space program. This paper describes a comparison between two energy finite element codes, a statistical energy analysis code, a structural finite element code, and the experimentally measured response of a stiffened aluminum plate excited by a shaker. Different methods for modeling the stiffeners and the power input from the shaker are discussed. The results show that the energy codes (energy finite element and statistical energy analysis) accurately predicted the measured mean square velocity of the plate. In addition, predictions from an energy finite element code had the best spatial correlation with measured velocities. However, predictions from a considerably simpler, single subsystem, statistical energy analysis model also correlated well with the spatial velocity distribution. The results highlight a need for further work to understand the relationship between modeling assumptions and the prediction results.

  10. Prediction of physical workload in reduced gravity environments

    NASA Technical Reports Server (NTRS)

    Goldberg, Joseph H.

    1987-01-01

    The background, development, and application of a methodology to predict human energy expenditure and physical workload in low gravity environments, such as a Lunar or Martian base, are described. Based on a validated model to predict energy expenditures in Earth-based industrial jobs, the model relies on an elemental analysis of the proposed job. Because the job itself need not physically exist, many alternative job designs may be compared in their physical workload. The feasibility of using the model for prediction of low gravity work was evaluated by lowering body and load weights, while maintaining basal energy expenditure. Comparison of model results was made both with simulated low gravity energy expenditure studies and with reported Apollo 14 Lunar EVA expenditure. Prediction accuracy was very good for walking and for cart pulling on slopes less than 15 deg, but the model underpredicted the most difficult work conditions. This model was applied to example core sampling and facility construction jobs, as presently conceptualized for a Lunar or Martian base. Resultant energy expenditures and suggested work-rest cycles were well within the range of moderate work difficulty. Future model development requirements were also discussed.

  11. Model predictive control of the solid oxide fuel cell stack temperature with models based on experimental data

    NASA Astrophysics Data System (ADS)

    Pohjoranta, Antti; Halinen, Matias; Pennanen, Jari; Kiviaho, Jari

    2015-03-01

    Generalized predictive control (GPC) is applied to control the maximum temperature in a solid oxide fuel cell (SOFC) stack and the temperature difference over the stack. GPC is a model predictive control method and the models utilized in this work are ARX-type (autoregressive with extra input), multiple input-multiple output, polynomial models that were identified from experimental data obtained from experiments with a complete SOFC system. The proposed control is evaluated by simulation with various input-output combinations, with and without constraints. A comparison with conventional proportional-integral-derivative (PID) control is also made. It is shown that if only the stack maximum temperature is controlled, a standard PID controller can be used to obtain output performance comparable to that obtained with the significantly more complex model predictive controller. However, in order to control the temperature difference over the stack, both the stack minimum and the maximum temperature need to be controlled and this cannot be done with a single PID controller. In such a case the model predictive controller provides a feasible and effective solution.
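
The ARX structure underlying the GPC controller can be sketched as a one-step-ahead predictor: the next output is a weighted sum of past outputs and past inputs. The coefficient values in the usage below are illustrative, not identified from SOFC data:

```python
def arx_predict(y_hist, u_hist, a_coefs, b_coefs):
    """One-step-ahead ARX prediction:
    y[k] = sum_i a_i * y[k-i] + sum_j b_j * u[k-j],
    where y_hist/u_hist are past outputs/inputs, newest last."""
    return (sum(a * y for a, y in zip(a_coefs, reversed(y_hist))) +
            sum(b * u for b, u in zip(b_coefs, reversed(u_hist))))

# Hypothetical single-input example: a = [0.5, 0.25] on y[k-1], y[k-2];
# b = [2.0] on u[k-1].
y_next = arx_predict([1.0, 2.0], [0.5], [0.5, 0.25], [2.0])
```

In GPC such predictions are rolled forward over a horizon and an input sequence is chosen to minimize a cost on predicted tracking error; identifying the multiple input-multiple output coefficients from stack experiments is the step the paper performs on real data.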

  12. Stochastic Modeling and Global Warming Trend Extraction For Ocean Acoustic Travel Times.

    DTIC Science & Technology

    1995-01-06

    consideration and that these models cannot currently be relied upon by themselves to predict global warming. Experimental data is most certainly needed, not only to measure global warming itself, but to help improve the ocean models themselves. (AN)

  13. Updates to In-Line Calculation of Photolysis Rates

    EPA Science Inventory

    How photolysis rates are calculated affects ozone and aerosol concentrations predicted by the CMAQ model and the model's run-time. The standard configuration of CMAQ uses the inline option that calculates photolysis rates by solving the radiative transfer equation for the needed ...

  14. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through the model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment with a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: marginal likelihoods repeatedly estimated by TIE show significantly less variability than those estimated by the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially BMA-TIE. The number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
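
Two of the four estimators compared (AME and HME) have simple Monte Carlo forms: AME averages the likelihood over draws from the prior, while HME takes the harmonic mean of the likelihood over draws from the posterior. A minimal sketch, assuming the likelihood values have already been evaluated at the sampled parameter sets:

```python
import statistics

def arithmetic_mean_estimator(prior_likelihoods):
    """AME: marginal likelihood as the mean likelihood over prior samples."""
    return statistics.fmean(prior_likelihoods)

def harmonic_mean_estimator(posterior_likelihoods):
    """HME: harmonic mean of likelihoods over posterior samples."""
    return len(posterior_likelihoods) / sum(1.0 / l for l in posterior_likelihoods)
```

Each likelihood evaluation hides a full model run, which is exactly the cost the paper's sparse-grid surrogates are built to avoid; thermodynamic integration (the estimator the paper favors) requires additional tempered chains and is not sketched here.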

  15. Highway Safety Manual applied in Missouri - freeway/software : research summary.

    DOT National Transportation Integrated Search

    2016-03-01

    AASHTO's Highway Safety Manual (HSM) includes models for freeway segments, speed-change lanes (transitional areas between mainline and ramps), ramps, and interchange terminals. These predictive models for freeway interchanges need to be cal...

  16. Predictive models and prognostic factors for upper tract urothelial carcinoma: a comprehensive review of the literature.

    PubMed

    Mbeutcha, Aurélie; Mathieu, Romain; Rouprêt, Morgan; Gust, Kilian M; Briganti, Alberto; Karakiewicz, Pierre I; Shariat, Shahrokh F

    2016-10-01

    In the context of customized patient care for upper tract urothelial carcinoma (UTUC), decision-making could be facilitated by risk assessment and prediction tools. The aim of this study was to provide a critical overview of existing predictive models and to review emerging promising prognostic factors for UTUC. A literature search of articles published in English from January 2000 to June 2016 was performed using PubMed. Studies on risk group stratification models and predictive tools in UTUC were selected, together with studies on predictive factors and biomarkers associated with advanced-stage UTUC and oncological outcomes after surgery. Various predictive tools have been described for advanced-stage UTUC assessment, disease recurrence and cancer-specific survival (CSS). Most of these models are based on well-established prognostic factors such as tumor stage, grade and lymph node (LN) metastasis, but some also integrate newly described prognostic factors and biomarkers. These new prediction tools seem to reach a high level of accuracy, but they lack external validation and decision-making analysis. The combinations of patient-, pathology- and surgery-related factors together with novel biomarkers have led to promising predictive tools for oncological outcomes in UTUC. However, external validation of these predictive models is a prerequisite before their introduction into daily practice. New models predicting response to therapy are urgently needed to allow accurate and safe individualized management in this heterogeneous disease.

  17. Prediction and error of baldcypress stem volume from stump diameter

    Treesearch

    Bernard R. Parresol

    1998-01-01

    The need to estimate the volume of removals occurs for many reasons, such as in trespass cases, severance tax reports, and post-harvest assessments. A logarithmic model is presented for prediction of baldcypress total stem cubic foot volume using stump diameter as the independent variable. Because the error of prediction is as important as the volume estimate, the...
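
The logarithmic prediction form described above can be sketched as a log-log relation between stump diameter and total stem volume. The coefficients below are hypothetical placeholders, not the fitted baldcypress values from the paper:

```python
import math

def stem_volume(stump_diameter, b0=-2.0, b1=2.5):
    """Log-log (logarithmic) volume prediction: ln(V) = b0 + b1 * ln(D),
    so V = exp(b0) * D**b1. Coefficients are hypothetical, for illustration."""
    return math.exp(b0 + b1 * math.log(stump_diameter))
```

In practice the model is fitted by regressing ln(volume) on ln(stump diameter) over measured trees, and, as the abstract stresses, the associated prediction error is reported alongside the point estimate.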

  18. Understanding Quality of Life in Adults with Spinal Cord Injury Via SCI-Related Needs and Secondary Complications.

    PubMed

    Sweet, Shane N; Noreau, Luc; Leblond, Jean; Dumont, Frédéric S

    2014-01-01

    Understanding the factors that can predict greater quality of life (QoL) is important for adults with spinal cord injury (SCI), given that they report lower levels of QoL than the general population. To build a conceptual model linking SCI-related needs, secondary complications, and QoL in adults with SCI. Prior to testing the conceptual model, we aimed to develop and evaluate the factor structure for both SCI-related needs and secondary complications. Individuals with a traumatic SCI (N = 1,137) responded to an online survey measuring 13 SCI-related needs, 13 secondary complications, and the Life Satisfaction Questionnaire to assess QoL. The SCI-related needs and secondary complications were conceptualized into factors, tested with a confirmatory factor analysis, and subsequently evaluated in a structural equation model to predict QoL. The confirmatory factor analysis supported a 2-factor model for SCI-related needs, χ²(61, N = 1,137) = 250.40, P < .001, comparative fit index (CFI) = .93, root mean square error of approximation (RMSEA) = .05, standardized root mean square residual (SRMR) = .04, and for 11 of the 13 secondary complications, χ²(44, N = 1,137) = 305.67, P < .001, CFI = .91, RMSEA = .060, SRMR = .033. The final 2 secondary complications were kept as observed constructs. In the structural model, both vital and personal development unmet SCI-related needs (β = -.22 and -.20, P < .05, respectively) and the neuro-physiological systems factor (β = -.45, P < .05) were negatively related with QoL. Identifying unmet SCI-related needs of individuals with SCI and preventing or managing secondary complications are essential to their QoL.

  19. Plant water potential improves prediction of empirical stomatal models.

    PubMed

    Anderegg, William R L; Wolf, Adam; Arango-Velez, Adriana; Choat, Brendan; Chmura, Daniel J; Jansen, Steven; Kolb, Thomas; Li, Shan; Meinzer, Frederick; Pita, Pilar; Resco de Dios, Víctor; Sperry, John S; Wolfe, Brett T; Pacala, Stephen

    2017-01-01

    Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well tested under drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models consistently over-predict stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and the consequent impairment of plant water transport will improve predictions during drought conditions, and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.
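    As an illustration of the class of empirical models under test, a minimal sketch of a Medlyn-style unified stomatal optimization model, plus a crude water-potential down-regulation of the kind the paper argues for. The sigmoid form and all parameter values here are assumptions for illustration, not the authors' fitted model:

```python
import math

def medlyn_gs(A, Ca, D, g0=0.01, g1=4.0):
    """Stomatal conductance from the Medlyn et al. (2011) empirical model.
    A: net assimilation (umol m-2 s-1), Ca: ambient CO2 (umol mol-1),
    D: vapor pressure deficit (kPa). g0, g1 are illustrative parameters."""
    return g0 + 1.6 * (1.0 + g1 / math.sqrt(D)) * A / Ca

def gs_with_psi(A, Ca, D, psi, psi50=-2.0, slope=2.0):
    """Down-weight conductance as leaf water potential psi (MPa) declines.
    The sigmoid and its parameters are illustrative assumptions only."""
    frac = 1.0 / (1.0 + math.exp(slope * (psi50 - psi)))
    return medlyn_gs(A, Ca, D) * frac
```

For example, conductance predicted at a leaf water potential of -4 MPa is sharply lower than at 0 MPa, mimicking the drought correction the paper finds necessary.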

  20. Prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy: a systematic review and external validation study.

    PubMed

    Hilkens, N A; Algra, A; Greving, J P

    2016-01-01

    ESSENTIALS: Prediction models may help to identify patients at high risk of bleeding on antiplatelet therapy. We identified existing prediction models for bleeding and validated them in patients with cerebral ischemia. Five prediction models were identified, all of which had some methodological shortcomings. Performance in patients with cerebral ischemia was poor. Background Antiplatelet therapy is widely used in secondary prevention after a transient ischemic attack (TIA) or ischemic stroke. Bleeding is the main adverse effect of antiplatelet therapy and is potentially life threatening. Identification of patients at increased risk of bleeding may help target antiplatelet therapy. This study sought to identify existing prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy and to evaluate their performance in patients with cerebral ischemia. We systematically searched PubMed and Embase for existing prediction models up to December 2014. The methodological quality of the included studies was assessed with the CHARMS checklist. Prediction models were externally validated in the European Stroke Prevention Study 2, comprising 6602 patients with a TIA or ischemic stroke. We assessed the discrimination and calibration of the included prediction models. Five prediction models were identified, of which two were developed in patients with previous cerebral ischemia. Three studies assessed major bleeding, one intracerebral hemorrhage, and one gastrointestinal bleeding. None of the studies met all criteria of good quality. External validation showed poor discriminative performance, with c-statistics ranging from 0.53 to 0.64, and poor calibration. A limited number of prediction models are available that predict intracranial hemorrhage or major bleeding in patients on antiplatelet therapy. The methodological quality of the models varied, but was generally low. Predictive performance in patients with cerebral ischemia was poor. To reliably predict the risk of bleeding in patients with cerebral ischemia, a prediction model developed according to current methodological standards is needed. © 2015 International Society on Thrombosis and Haemostasis.
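    The c-statistic used to judge discrimination is a concordance probability (equivalent to the AUROC): the chance that a randomly chosen patient who bled was assigned a higher predicted risk than one who did not. A minimal sketch with made-up risks and outcomes:

```python
def c_statistic(risks, outcomes):
    """Concordance: P(risk of a random event case > risk of a random non-case),
    counting ties as half. Equivalent to the area under the ROC curve."""
    pos = [r for r, y in zip(risks, outcomes) if y == 1]
    neg = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = [(p > n) + 0.5 * (p == n) for p in pos for n in neg]
    return sum(pairs) / len(pairs)

# Toy external validation: predicted bleeding risks vs. observed events
risks    = [0.02, 0.08, 0.03, 0.10, 0.05, 0.04]
outcomes = [0,    1,    0,    1,    0,    1]
print(c_statistic(risks, outcomes))  # 8/9 ≈ 0.89
```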

  1. Antecedents of perceived coach autonomy supportive and controlling behaviors: coach psychological need satisfaction and well-being.

    PubMed

    Stebbings, Juliette; Taylor, Ian M; Spray, Christopher M

    2011-04-01

    Within the self-determination theory (Deci & Ryan, 2000) framework, research has considered the consequences of coaches' autonomy supportive and controlling behaviors on various athlete outcomes (e.g., motivation and performance). The antecedents of such behaviors, however, have received little attention. Coaches (N = 443) from a variety of sports and competitive levels completed a self-report questionnaire to assess their psychological need satisfaction, well-being and perceived interpersonal behaviors toward their athletes. Structural equation modeling demonstrated that coaches' competence and autonomy need satisfaction positively predicted their levels of psychological well-being, as indexed by positive affect and subjective vitality. In turn, coaches' psychological well-being positively predicted their perceived autonomy support toward their athletes, and negatively predicted their perceived controlling behaviors. Overall, the results highlight the importance of coaching contexts that facilitate coaches' psychological need satisfaction and well-being, thereby increasing the likelihood of adaptive coach interpersonal behavior toward athletes.

  2. A Novel Modelling Approach for Predicting Forest Growth and Yield under Climate Change.

    PubMed

    Ashraf, M Irfan; Meng, Fan-Rui; Bourque, Charles P-A; MacLean, David A

    2015-01-01

    Global climate is changing due to increasing anthropogenic emissions of greenhouse gases. Forest managers need growth and yield models that can be used to predict future forest dynamics during the transition period of present-day forests under a changing climatic regime. In this study, we developed a forest growth and yield model that can be used to predict individual-tree growth under current and projected future climatic conditions. The model was constructed by integrating historical tree growth records with predictions from an ecological process-based model using neural networks. The new model predicts basal area (BA) and volume growth for individual trees in pure or mixed species forests. For model development, tree-growth data under current climatic conditions were obtained using over 3000 permanent sample plots from the Province of Nova Scotia, Canada. Data to reflect tree growth under a changing climatic regime were projected with JABOWA-3 (an ecological process-based model). Model validation with designated data produced model efficiencies of 0.82 and 0.89 in predicting individual-tree BA and volume growth. Model efficiency is a relative index of model performance, where 1 indicates an ideal fit, while values lower than zero mean the predictions are no better than the average of the observations. Overall mean prediction error (BIAS) of the basal area and volume growth predictions was nominal (i.e., for BA: -0.0177 cm² 5-year⁻¹ and volume: 0.0008 m³ 5-year⁻¹). Model variability described by root mean squared error (RMSE) was 40.53 cm² 5-year⁻¹ in basal area prediction and 0.0393 m³ 5-year⁻¹ in volume prediction. The new modelling approach has potential to reduce uncertainties in growth and yield predictions under different climate change scenarios. This novel approach provides an avenue for forest managers to generate required information for the management of forests in transitional periods of climate change. Artificial intelligence technology has substantial potential in forest modelling.
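    Model efficiency as defined here is the Nash-Sutcliffe-style statistic; together with BIAS and RMSE it can be computed in a few lines (toy numbers, not the study's data):

```python
def model_efficiency(obs, pred):
    """1 = perfect fit; values below 0 mean predictions are worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def bias(obs, pred):
    """Mean signed prediction error."""
    return sum(p - o for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    """Root mean squared error."""
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5

obs, pred = [1.0, 2.0, 3.0], [1.0, 2.0, 4.0]
print(model_efficiency(obs, pred))  # → 0.5
```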

  3. A Novel Modelling Approach for Predicting Forest Growth and Yield under Climate Change

    PubMed Central

    Ashraf, M. Irfan; Meng, Fan-Rui; Bourque, Charles P.-A.; MacLean, David A.

    2015-01-01

    Global climate is changing due to increasing anthropogenic emissions of greenhouse gases. Forest managers need growth and yield models that can be used to predict future forest dynamics during the transition period of present-day forests under a changing climatic regime. In this study, we developed a forest growth and yield model that can be used to predict individual-tree growth under current and projected future climatic conditions. The model was constructed by integrating historical tree growth records with predictions from an ecological process-based model using neural networks. The new model predicts basal area (BA) and volume growth for individual trees in pure or mixed species forests. For model development, tree-growth data under current climatic conditions were obtained using over 3000 permanent sample plots from the Province of Nova Scotia, Canada. Data to reflect tree growth under a changing climatic regime were projected with JABOWA-3 (an ecological process-based model). Model validation with designated data produced model efficiencies of 0.82 and 0.89 in predicting individual-tree BA and volume growth. Model efficiency is a relative index of model performance, where 1 indicates an ideal fit, while values lower than zero mean the predictions are no better than the average of the observations. Overall mean prediction error (BIAS) of the basal area and volume growth predictions was nominal (i.e., for BA: -0.0177 cm² 5-year⁻¹ and volume: 0.0008 m³ 5-year⁻¹). Model variability described by root mean squared error (RMSE) was 40.53 cm² 5-year⁻¹ in basal area prediction and 0.0393 m³ 5-year⁻¹ in volume prediction. The new modelling approach has potential to reduce uncertainties in growth and yield predictions under different climate change scenarios. This novel approach provides an avenue for forest managers to generate required information for the management of forests in transitional periods of climate change. Artificial intelligence technology has substantial potential in forest modelling. PMID:26173081

  4. Accuracy of continuous noninvasive hemoglobin monitoring for the prediction of blood transfusions in trauma patients.

    PubMed

    Galvagno, Samuel M; Hu, Peter; Yang, Shiming; Gao, Cheng; Hanna, David; Shackelford, Stacy; Mackenzie, Colin

    2015-12-01

    Early detection of hemorrhagic shock is required to facilitate prompt coordination of blood component therapy delivery to the bedside and to expedite performance of lifesaving interventions. Standard physical findings and vital signs are difficult to measure during the acute resuscitation stage, and these measures are often inaccurate until patients deteriorate to a state of decompensated shock. The aim of this study is to examine a severely injured trauma patient population to determine whether a noninvasive SpHb monitor can predict the need for urgent blood transfusion (universal donor or additional urgent blood transfusion) during the first 12 h of trauma patient resuscitation. We hypothesize that trends in continuous SpHb, combined with easily derived patient-specific factors, can identify the immediate need for transfusion in trauma patients. Subjects were enrolled if directly admitted to the trauma center, >17 years of age, and with a shock index (heart rate/systolic blood pressure) >0.62. Upon admission, a Masimo Radical-7 co-oximeter sensor (Masimo Corporation, Irvine, CA) was applied, providing measurement of continuous noninvasive hemoglobin (SpHb) levels. Blood was drawn and hemoglobin concentration analyzed, and conventional pulse oximetry photoplethysmograph signals were continuously recorded. Demographic information and both prehospital and admission vital signs were collected. The primary outcome was transfusion of at least one unit of packed red blood cells within 24 h of admission. Eight regression models (C1-C8) were evaluated for the prediction of blood use by comparing the area under the receiver operating characteristic curve (AUROC) at different time intervals after admission. 711 subjects had continuous vital signs waveforms available, including heart rate (HR), SpHb, and SpO2 trends. When SpHb was monitored for 15 min, SpHb did not increase the AUROC for prediction of transfusion. The highest AUROC was recorded for model C8 (age, sex, prehospital shock index, admission HR, SpHb, and SpO2) for the prediction of blood products within the first 3 h of admission. When data from 15 min of continuous monitoring were analyzed, significant improvement in AUROC occurred as more variables were added to the model; however, the addition of SpHb to any of the models did not significantly improve the AUROC for prediction of blood use within the first 3 h of admission in comparison to analysis of conventional oximetry features. The results demonstrate that SpHb monitoring, accompanied by continuous vital signs data and adjusted for age and sex, has good accuracy for the prediction of need for transfusion; however, as an independent variable, SpHb did not enhance predictive models in comparison to features extracted from conventional pulse oximetry. Nor was shock index better than conventional oximetry at discriminating hemorrhage and predicting which casualties received blood. In this population of trauma patients, noninvasive SpHb monitoring, including both trends and absolute values, did not enhance the ability to predict the need for blood transfusion.
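    The shock index used as an enrollment criterion, and a logistic model with the general shape of model C8, can be sketched as follows. The coefficient values are placeholders for illustration, not the study's fitted estimates:

```python
import math

def shock_index(heart_rate, systolic_bp):
    """Shock index = HR / SBP; the study enrolled patients with SI > 0.62."""
    return heart_rate / systolic_bp

def transfusion_risk(age, male, si, hr, sphb, spo2, coef):
    """Hypothetical logistic model in the spirit of model C8 (age, sex,
    prehospital shock index, admission HR, SpHb, SpO2). `coef` holds
    illustrative coefficients, not the published model."""
    z = (coef["b0"] + coef["age"] * age + coef["male"] * male
         + coef["si"] * si + coef["hr"] * hr
         + coef["sphb"] * sphb + coef["spo2"] * spo2)
    return 1.0 / (1.0 + math.exp(-z))
```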

  5. Intrinsic dimensionality predicts the saliency of natural dynamic scenes.

    PubMed

    Vig, Eleonora; Dorr, Michael; Martinetz, Thomas; Barth, Erhardt

    2012-06-01

    Since visual attention-based computer vision applications have gained popularity, ever more complex, biologically inspired models seem to be needed to predict salient locations (or interest points) in naturalistic scenes. In this paper, we explore how far one can go in predicting eye movements by using only basic signal processing, such as image representations derived from efficient coding principles, and machine learning. To this end, we gradually increase the complexity of a model from simple single-scale saliency maps computed on grayscale videos to spatiotemporal multiscale and multispectral representations. Using a large collection of eye movements on high-resolution videos, supervised learning techniques fine-tune the free parameters whose addition is inevitable with increasing complexity. The proposed model, although very simple, demonstrates significant improvement in predicting salient locations in naturalistic videos over four selected baseline models and two distinct data labeling scenarios.

  6. A comprehensive pipeline for multi-resolution modeling of the mitral valve: Validation, computational efficiency, and predictive capability.

    PubMed

    Drach, Andrew; Khalighi, Amir H; Sacks, Michael S

    2018-02-01

    Multiple studies have demonstrated that the pathological geometries unique to each patient can affect the durability of mitral valve (MV) repairs. While computational modeling of the MV is a promising approach to improve surgical outcomes, the complex MV geometry precludes use of simplified models. Moreover, the lack of complete in vivo geometric information presents significant challenges in the development of patient-specific computational models. There is thus a need to determine the level of detail necessary for predictive MV models. To address this issue, we have developed a novel pipeline for building attribute-rich computational models of the MV with varying fidelity directly from in vitro imaging data. The approach combines high-resolution geometric information from loaded and unloaded states to achieve a high level of anatomic detail, followed by mapping and parametric embedding of tissue attributes to build a high-resolution, attribute-rich computational model. Subsequent lower-resolution models were then developed and evaluated by comparing the displacements and surface strains to those extracted from the imaging data. We then identified the critical levels of fidelity for building predictive MV models in the dilated and repaired states. We demonstrated that a model with a feature size of about 5 mm and a mesh size of about 1 mm was sufficient to predict the overall MV shape, stress, and strain distributions with high accuracy. However, we also noted that more detailed models were needed to simulate microstructural events. We conclude that the developed pipeline enables sufficiently complex models for biomechanical simulations of the MV in the normal, dilated, and repaired states. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Putting mechanisms into crop production models.

    PubMed

    Boote, Kenneth J; Jones, James W; White, Jeffrey W; Asseng, Senthold; Lizaso, Jon I

    2013-09-01

    Crop growth models dynamically simulate processes of C, N and water balance on daily or hourly time-steps to predict crop growth and development and at season-end, final yield. Their ability to integrate effects of genetics, environment and crop management have led to applications ranging from understanding gene function to predicting potential impacts of climate change. The history of crop models is reviewed briefly, and their level of mechanistic detail for assimilation and respiration, ranging from hourly leaf-to-canopy assimilation to daily radiation-use efficiency is discussed. Crop models have improved steadily over the past 30-40 years, but much work remains. Improvements are needed for the prediction of transpiration response to elevated CO₂ and high temperature effects on phenology and reproductive fertility, and simulation of root growth and nutrient uptake under stressful edaphic conditions. Mechanistic improvements are needed to better connect crop growth to genetics and to soil fertility, soil waterlogging and pest damage. Because crop models integrate multiple processes and consider impacts of environment and management, they have excellent potential for linking research from genomics and allied disciplines to crop responses at the field scale, thus providing a valuable tool for deciphering genotype by environment by management effects. © 2013 John Wiley & Sons Ltd.

  8. Energy and time modelling of kerbside waste collection: Changes incurred when adding source separated food waste.

    PubMed

    Edwards, Joel; Othman, Maazuza; Burn, Stewart; Crossin, Enda

    2016-10-01

    The collection of source-separated kerbside municipal food waste (SSFW) is being incentivised in Australia; however, such a collection is likely to increase the fuel and time a collection truck fleet requires. Therefore, waste managers need to determine whether the incentives outweigh the cost. With literature scarcely describing the magnitude of the increase, and local parameters playing a crucial role in accurately modelling kerbside collection, this paper develops a new general mathematical model that predicts the energy and time requirements of a collection regime whilst incorporating the unique variables of different jurisdictions. The model, Municipal solid waste collect (MSW-Collect), is validated and shown to be more accurate at predicting fuel consumption and trucks required than other common collection models. When predicting changes incurred for five different SSFW collection scenarios, results show that SSFW scenarios require an increase in fuel ranging from 1.38% to 57.59%. There is also a need for additional trucks across most SSFW scenarios tested. All SSFW scenarios are ranked and analysed with regard to fuel consumption; sensitivity analysis is conducted to test key assumptions. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  9. Multiscale modeling and distributed computing to predict cosmesis outcome after a lumpectomy

    NASA Astrophysics Data System (ADS)

    Garbey, M.; Salmon, R.; Thanoon, D.; Bass, B. L.

    2013-07-01

    Surgery for early stage breast carcinoma is either total mastectomy (complete breast removal) or surgical lumpectomy (removal of the tumor only). The lumpectomy, or partial mastectomy, is intended to preserve a breast that satisfies the woman's cosmetic, emotional and physical needs. But in a fairly large number of cases the cosmetic outcome is not satisfactory. Today, predicting that surgical outcome is essentially based on heuristics. Modeling such a complex process must encompass multiple scales, in space from cells to tissue, as well as in time, from minutes for the tissue mechanics to months for healing. The goal of this paper is to present a first step in multiscale modeling of the long-time-scale prediction of breast shape after tumor resection. This task requires coupling very different mechanical and biological models with very different computing needs. We provide a simple illustration of the application of heterogeneous distributed computing and modular software design to speed up model development. Our computational framework currently serves to test hypotheses on breast tissue healing in a pilot study with women who have elected to undergo breast-conserving therapy (BCT) and are being treated at the Methodist Hospital in Houston, TX.

  10. A perspective on sustained marine observations for climate modelling and prediction

    PubMed Central

    Dunstone, Nick J.

    2014-01-01

    Here, I examine some of the many varied ways in which sustained global ocean observations are used in numerical modelling activities. In particular, I focus on the use of ocean observations to initialize predictions in ocean and climate models. Examples are also shown of how models can be used to assess the impact of current ocean observations and to simulate that of potential new ocean observing platforms. The ocean has never been better observed than it is today, and similarly ocean models have never been as capable at representing the real ocean as they are now. However, there remain important unanswered questions that can likely only be addressed via future improvements in ocean observations. In particular, ocean observing systems need to respond to the needs of the burgeoning field of near-term climate predictions. Although new ocean observing platforms promise exciting new discoveries, there is a delicate balance to be made between their funding and that of the current ocean observing system. Here, I identify the need to secure long-term funding for ocean observing platforms as they mature from a mainly research exercise to an operational system for sustained observation over climate change time scales. At the same time, considerable progress continues to be made via ship-based observing campaigns, and I highlight some that are dedicated to addressing uncertainties in key ocean model parametrizations. The use of ocean observations to understand the prominent long time scale changes observed in the North Atlantic is another focus of this paper. The exciting first decade of monitoring of the Atlantic meridional overturning circulation by the RAPID-MOCHA array is highlighted. The use of ocean and climate models as tools to further probe the drivers of variability seen in such time series is another exciting development. I also discuss the need for a concerted combined effort from climate models and ocean observations in order to understand the current slow-down in surface global warming. PMID:25157195

  11. 5D Modelling: An Efficient Approach for Creating Spatiotemporal Predictive 3D Maps of Large-Scale Cultural Resources

    NASA Astrophysics Data System (ADS)

    Doulamis, A.; Doulamis, N.; Ioannidis, C.; Chrysouli, C.; Grammalidis, N.; Dimitropoulos, K.; Potsiou, C.; Stathopoulou, E.-K.; Ioannides, M.

    2015-08-01

    Outdoor large-scale cultural sites are highly sensitive to environmental, natural, and human-made factors, implying an imminent need for a spatio-temporal assessment to identify regions of potential cultural interest (material degradation, structuring, conservation). On the other hand, quite different actors are involved in Cultural Heritage research (archaeologists, curators, conservators, simple users), each with diverse needs. All these statements advocate that 5D modelling (3D geometry plus time plus levels of detail) is ideally required for the preservation and assessment of outdoor large-scale cultural sites, which is currently implemented as a simple aggregation of 3D digital models at different times and levels of detail. The main bottleneck of such an approach is its complexity, making 5D modelling impossible to validate in real-life conditions. In this paper, a cost-effective and affordable framework for 5D modelling is proposed, based on a spatial-temporal dependent aggregation of 3D digital models that incorporates a predictive assessment procedure to indicate which regions (surfaces) of an object should be reconstructed at higher levels of detail at subsequent time instances and which at lower ones. In this way, dynamic change history maps are created, indicating the spatial probabilities of regions needing further 3D modelling at forthcoming instances. Using these maps, a predictive assessment can be made, that is, surfaces can be localized within the objects where a high-accuracy reconstruction process needs to be activated at forthcoming time instances. The proposed 5D Digital Cultural Heritage Model (5D-DCHM) is implemented using open interoperable standards based on the CityGML framework, which also allows the description of additional semantic metadata information. Visualization aspects are also supported to allow easy manipulation, interaction, and representation of the 5D-DCHM geometry and the respective semantic information. The open-source 3DCityDB, incorporating a PostgreSQL geo-database, is used to manage and manipulate the 3D data and their semantics.

  12. Calculations on the Back of an Envelope Model: Applying Seasonal Fecundity Models to Species’ Range Limits

    EPA Science Inventory

    Most predictions of the effect of climate change on species’ ranges are based on correlations between climate and current species’ distributions. These so-called envelope models may be a good first approximation, but we need demographically mechanistic models to incorporate the ...

  13. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.
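    The serialized step-change experiments described above amount to fitting a low-order dynamic model from input-output data. A minimal sketch of least-squares identification of a first-order discrete model (variable names and data are illustrative, not the study's):

```python
def identify_first_order(u, y):
    """Fit y[k+1] = a*y[k] + b*u[k] by ordinary least squares (normal equations)."""
    n = len(y) - 1
    syy = sum(y[k] * y[k] for k in range(n))
    syu = sum(y[k] * u[k] for k in range(n))
    suu = sum(u[k] * u[k] for k in range(n))
    ty = sum(y[k + 1] * y[k] for k in range(n))
    tu = sum(y[k + 1] * u[k] for k in range(n))
    det = syy * suu - syu * syu
    a = (ty * suu - tu * syu) / det
    b = (tu * syy - ty * syu) / det
    return a, b

# Simulate a step change in an input (e.g., galactose feed, u) and the
# resulting quality-attribute response (y), then recover the parameters.
a_true, b_true = 0.8, 0.2
u = [0.0] * 3 + [1.0] * 7          # step change at k = 3
y = [0.0]
for k in range(len(u) - 1):
    y.append(a_true * y[k] + b_true * u[k])

a_hat, b_hat = identify_first_order(u, y)
```

The identified (a, b) pair can then serve as the internal model of a model predictive controller that plans future feed moves toward a set point.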

  14. Prediction of first episode of panic attack among white-collar workers.

    PubMed

    Watanabe, Akira; Nakao, Kazuhisa; Tokuyama, Madoka; Takeda, Masatoshi

    2005-04-01

    The purpose of the present study was to elucidate a longitudinal matrix of the etiology of first-episode panic attack among white-collar workers. A path model was designed for this purpose. A 5-year, open-cohort study was carried out in a Japanese company. To evaluate the risk factors associated with the onset of a first episode of panic attack, the odds ratios of a new episode of panic attack were calculated by logistic regression. The path model contained five predictor variables: gender difference, overprotection, neuroticism, lifetime history of major depression, and recent stressful life events. The logistic regression analysis indicated that a person with a lifetime history of major depression and recent stressful life events had a fivefold and a threefold higher risk of panic attacks at follow-up, respectively. The path model for the prediction of a first episode of panic attack fitted the data well. However, this model explained little of the variance in the ultimate dependent variable, the first episode of panic attack. Three predictors (neuroticism, lifetime history of major depression, and recent stressful life events) had a direct effect on the risk of a first episode of panic attack, whereas gender difference and overprotection had no direct effect. The present model could not fully predict first episodes of panic attack in white-collar workers. To build a path model that predicts the first episode of panic attack, other strong predictor variables, which were not surveyed in the present study, are needed. It is suggested that genetic variables are among these. A new path model containing genetic variables (e.g., family history) will be needed to predict the first episode of panic attack.
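    The link between logistic regression coefficients and the reported odds ratios is OR = exp(β); treating the fivefold and threefold risks as odds ratios of about 5 and 3, the implied coefficients are roughly 1.6 and 1.1. A quick check:

```python
import math

def odds_ratio(beta):
    """Odds ratio implied by a logistic regression coefficient."""
    return math.exp(beta)

# Coefficients implied by the roughly fivefold and threefold risks above
print(round(math.log(5), 2))  # → 1.61
print(round(math.log(3), 2))  # → 1.1
```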

  15. Using models to manage systems subject to sustainability indicators

    USGS Publications Warehouse

    Hill, M.C.

    2006-01-01

    Mathematical and numerical models can provide insight into sustainability indicators using relevant simulated quantities, which are referred to here as predictions. To be useful, many concerns need to be considered. Four are discussed here: (a) the mathematical and numerical accuracy of the model; (b) the accuracy of the data used in model development; (c) the information observations provide about aspects of the model important to predictions of interest, as measured using sensitivity analysis; and (d) the existence of plausible alternative models for a given system. The four issues are illustrated using examples from conservative and transport modelling and using conceptual arguments. Results suggest that ignoring these issues can produce misleading conclusions.

  16. Finite Element Model Development For Aircraft Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

    2000-01-01

    The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results.

  17. Tornado damage risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinhold, T.A.; Ellingwood, B.

    1982-09-01

    Several proposed models were evaluated for predicting tornado wind speed probabilities at nuclear plant sites as part of a program to develop statistical data on tornadoes needed for probability-based load combination analysis. A unified model was developed which synthesized the desired aspects of tornado occurrence and damage potential. The sensitivity of wind speed probability estimates to various tornado modeling assumptions is examined, and the probability distributions of tornado wind speed that are needed for load combination studies are presented.
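As a rough, hedged sketch of the point-strike reasoning such tornado hazard models formalize, the expected annual number of strikes at a site with winds above a threshold can be approximated as an occurrence rate per unit area times the fraction of tornadoes exceeding the threshold times their mean damage area. All inputs below are illustrative assumptions, not data from the report.

```python
# Hedged point-strike sketch: expected annual number of tornado strikes at a
# site with winds above some threshold. Every input value is hypothetical.
def annual_exceedance_rate(tornadoes_per_km2_yr, frac_exceeding, mean_area_km2):
    # (tornadoes per km^2 per year) x (fraction above threshold) x (mean damage area)
    return tornadoes_per_km2_yr * frac_exceeding * mean_area_km2

p = annual_exceedance_rate(1e-4, 0.05, 1.0)  # invented regional values
print(p)  # ~5e-06 strikes per year; for rates this small, this approximates the annual probability
```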

  18. Comparing an annual and daily time-step model for predicting field-scale P loss

    USDA-ARS?s Scientific Manuscript database

    Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...

  19. Universal access to HIV treatment versus universal 'test and treat': transmission, drug resistance & treatment costs.

    PubMed

    Wagner, Bradley G; Blower, Sally

    2012-01-01

    In South Africa (SA) universal access to treatment for HIV-infected individuals in need has yet to be achieved. Currently ~1 million receive treatment, but an additional 1.6 million are in need. It is being debated whether to use a universal 'test and treat' (T&T) strategy to try to eliminate HIV in SA; treatment reduces infectivity and hence transmission. Under a T&T strategy all HIV-infected individuals would receive treatment whether in need or not. This would require treating 5 million individuals almost immediately and providing treatment for several decades. We use a validated mathematical model to predict impact and costs of: (i) a universal T&T strategy and (ii) achieving universal access to treatment. Using modeling the WHO has predicted a universal T&T strategy in SA would eliminate HIV within a decade, and (after 40 years) cost ~$10 billion less than achieving universal access. In contrast, we predict a universal T&T strategy in SA could eliminate HIV, but take 40 years and cost ~$12 billion more than achieving universal access. We determine the difference in predictions is because the WHO has under-estimated survival time on treatment and ignored the risk of resistance. We predict, after 20 years, ~2 million individuals would need second-line regimens if a universal T&T strategy is implemented versus ~1.5 million if universal access is achieved. Costs need to be realistically estimated and multiple evaluation criteria used to compare 'treatment as prevention' with other prevention strategies. Before implementing a universal T&T strategy, which may not be sustainable, we recommend striving to achieve universal access to treatment as quickly as possible. We predict achieving universal access to treatment would be a very effective 'treatment as prevention' approach and bring the HIV epidemic in SA close to elimination, preventing ~4 million infections after 20 years and ~11 million after 40 years.

  20. Highway runoff quality models for the protection of environmentally sensitive areas

    NASA Astrophysics Data System (ADS)

    Trenouth, William R.; Gharabaghi, Bahram

    2016-11-01

    This paper presents novel highway runoff quality models using artificial neural networks (ANNs) that account for site-specific highway traffic and seasonal storm-event meteorological factors to predict the event mean concentration (EMC) statistics and mean daily unit area load (MDUAL) statistics of common highway pollutants, for the design of roadside ditch treatment systems (RDTS) that protect sensitive receiving environs. A dataset of 940 monitored highway runoff events from fourteen sites located in five countries (Canada, USA, Australia, New Zealand, and China) was compiled and used to develop ANN models for the prediction of highway runoff total suspended solids (TSS) seasonal EMC statistical distribution parameters, as well as the MDUAL statistics for four different heavy metal species (Cu, Zn, Cr and Pb). TSS EMCs are needed to estimate the minimum removal efficiency the RDTS must achieve for highway runoff to meet applicable quality standards, and MDUALs are needed to calculate the minimum RDTS capacity required to ensure performance longevity.
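The design calculation the TSS EMC feeds can be sketched simply; this is a generic mass-balance illustration with hypothetical concentrations, not the paper's ANN model.

```python
# Generic design sketch (not the paper's ANN): the removal efficiency an RDTS
# must achieve so that effluent meets a water-quality standard. The influent
# EMC and the standard below are hypothetical values.
def required_removal_efficiency(influent_emc_mg_l, standard_mg_l):
    if influent_emc_mg_l <= standard_mg_l:
        return 0.0  # runoff already meets the standard
    return 1.0 - standard_mg_l / influent_emc_mg_l

print(required_removal_efficiency(120.0, 30.0))  # 0.75, i.e. 75% TSS removal needed
```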

  1. Workshop on Planning and Learning in Multi-Agent Environments

    DTIC Science & Technology

    2014-12-31

    needed for translating the physical aspects of an interaction (see Section 3.1) into the numeric utility values needed for game-theoretic...calculations. Furthermore, the game-theoretic techniques themselves will require significant enhancements. Game-theoretic solution concepts (e.g., Nash...robotics. Real-time strategy games may provide useful data for research on predictive models of adversaries, modeling long-term and short-term plans

  2. A predictive framework and review of the ecological impacts of exotic plant invasions on reptiles and amphibians.

    PubMed

    Martin, Leigh J; Murray, Brad R

    2011-05-01

    The invasive spread of exotic plants in native vegetation can pose serious threats to native faunal assemblages. This is of particular concern for reptiles and amphibians because they form a significant component of the world's vertebrate fauna, play a pivotal role in ecosystem functioning and are often neglected in biodiversity research. A framework to predict how exotic plant invasion will affect reptile and amphibian assemblages is imperative for conservation, management and the identification of research priorities. Here, we present a new predictive framework that integrates three mechanistic models. These models are based on exotic plant invasion altering: (1) habitat structure; (2) herbivory and predator-prey interactions; (3) the reproductive success of reptile and amphibian species and assemblages. We present a series of testable predictions from these models that arise from the interplay over time among three exotic plant traits (growth form, area of coverage, taxonomic distinctiveness) and six traits of reptiles and amphibians (body size, lifespan, home range size, habitat specialisation, diet, reproductive strategy). A literature review provided robust empirical evidence of exotic plant impacts on reptiles and amphibians from each of the three model mechanisms. Evidence relating to the role of body size and diet was less clear-cut, indicating the need for further research. The literature provided limited empirical support for many of the other model predictions. This was not, however, because findings contradicted our model predictions but because research in this area is sparse. In particular, the small number of studies specifically examining the effects of exotic plants on amphibians highlights the pressing need for quantitative research in this area. There is enormous scope for detailed empirical investigation of interactions between exotic plants and reptile and amphibian species and assemblages. 
The framework presented here and further testing of predictions will provide a basis for informing and prioritising environmental management and exotic plant control efforts. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.

  3. Ensuring long-term utility of the AOP framework and knowledge for multiple stakeholders

    EPA Science Inventory

    1.Introduction There is a need to increase the development and implementation of predictive approaches to support chemical safety assessment. These predictive approaches feature generation of data from tools such as computational models, pathway-based in vitro assays, and short-t...

  4. MODEL OF PHYTOPLANKTON COMPETITION FOR LIMITING AND NONLIMITING NUTRIENTS: IMPLICATIONS FOR DEVELOPMENT OF ESTUARINE AND NEARSHORE MANAGEMENT SCHEMES

    EPA Science Inventory

    The global increase of noxious bloom occurrences has increased the need for phytoplankton management schemes. Such schemes require the ability to predict phytoplankton succession. Equilibrium Resources Competition theory, which is popular for predicting succession in lake systems...

  5. Emission of pesticides into the air

    USGS Publications Warehouse

    Van Den, Berg; Kubiak, R.; Benjey, W.G.; Majewski, M.S.; Yates, S.R.; Reeves, G.L.; Smelt, J.H.; Van Der Linden, A. M. A.

    1999-01-01

    During and after the application of a pesticide in agriculture, a substantial fraction of the dosage may enter the atmosphere and be transported over varying distances downwind of the target. The rate and extent of the emission during application, predominantly as spray particle drift, depend primarily on the application method (equipment and technique), the formulation and environmental conditions, whereas the emission after application depends primarily on the properties of the pesticide, soils, crops and environmental conditions. The fraction of the dosage that misses the target area may be high in some cases, and more experimental data on this loss term are needed for various application types and weather conditions. Such data are necessary to test spray drift models and to support further model development and verification. Following application, the emission of soil fumigants and soil-incorporated pesticides into the air can be measured and computed with reasonable accuracy, but further model development is needed to improve the reliability of the model predictions. For soil-surface-applied pesticides, reliable measurement methods are available, but there is not yet a reliable model; further model development is required and must be verified by field experiments. Few data are available on pesticide volatilization from plants, and more field experiments are needed to study the fate processes on the plants. Once this information is available, a model needs to be developed to predict the volatilization of pesticides from plants, which, again, should be verified with field measurements. For regional emission estimates, a link is needed between data on the temporal and spatial pattern of pesticide use and a geographical information system for crops and soils with their characteristics.

  6. Bayesian averaging over Decision Tree models for trauma severity scoring.

    PubMed

    Schetinin, V; Jakaite, L; Krzanowski, W

    2018-01-01

    Health care practitioners analyse possible risks of misleading decisions and need to estimate and quantify uncertainty in predictions. We have examined the "gold" standard of screening a patient's conditions for predicting survival probability, based on logistic regression modelling, which is used in trauma care for clinical purposes and quality audit. This methodology is based on theoretical assumptions about data and uncertainties. Models induced within such an approach have exposed a number of problems, providing unexplained fluctuation of predicted survival and low accuracy of estimating uncertainty intervals within which predictions are made. Bayesian method, which in theory is capable of providing accurate predictions and uncertainty estimates, has been adopted in our study using Decision Tree models. Our approach has been tested on a large set of patients registered in the US National Trauma Data Bank and has outperformed the standard method in terms of prediction accuracy, thereby providing practitioners with accurate estimates of the predictive posterior densities of interest that are required for making risk-aware decisions. Copyright © 2017 Elsevier B.V. All rights reserved.
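The averaging step itself can be sketched in a few lines: weight each tree's survival prediction by its posterior model probability and report the ensemble spread as an uncertainty estimate. The numbers below are toy values, not the paper's trees or posteriors.

```python
# Toy sketch of Bayesian averaging over models: a posterior-weighted mean of
# model predictions, with the weighted spread as an uncertainty estimate.
def bayesian_average(predictions, posteriors):
    total = sum(posteriors)
    weights = [p / total for p in posteriors]          # normalize posteriors
    mean = sum(w * x for w, x in zip(weights, predictions))
    sd = sum(w * (x - mean) ** 2 for w, x in zip(weights, predictions)) ** 0.5
    return mean, sd

# Three hypothetical decision trees predicting survival probability,
# with hypothetical posterior model probabilities:
mean, sd = bayesian_average([0.90, 0.85, 0.70], [0.5, 0.3, 0.2])
```

A wide spread signals disagreement among plausible models, which is exactly the uncertainty a risk-aware decision needs to see.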

  7. Predicting students' physical activity and health-related well-being: a prospective cross-domain investigation of motivation across school physical education and exercise settings.

    PubMed

    Standage, Martyn; Gillison, Fiona B; Ntoumanis, Nikos; Treasure, Darren C

    2012-02-01

    A three-wave prospective design was used to assess a model of motivation guided by self-determination theory (Ryan & Deci, 2008) spanning the contexts of school physical education (PE) and exercise. The outcome variables examined were health-related quality of life (HRQoL), physical self-concept (PSC), and 4 days of objectively assessed estimates of activity. Secondary school students (n = 494) completed questionnaires at three separate time points and were familiarized with how to use a sealed pedometer. Results of structural equation modeling supported a model in which perceptions of autonomy support from a PE teacher positively predicted PE-related need satisfaction (autonomy, competence, and relatedness). Competence predicted PSC, whereas relatedness predicted HRQoL. Autonomy and competence positively predicted autonomous motivation toward PE, which in turn positively predicted autonomous motivation toward exercise. Autonomous motivation toward exercise positively predicted 4-day pedometer step count, HRQoL, and PSC. Results of multisample structural equation modeling supported gender invariance. Suggestions for future work are discussed.

  8. The First Attempt at Non-Linear in Silico Prediction of Sampling Rates for Polar Organic Chemical Integrative Samplers (POCIS)

    PubMed Central

    2016-01-01

    Modeling and prediction of polar organic chemical integrative sampler (POCIS) sampling rates (Rs) for 73 compounds using artificial neural networks (ANNs) is presented for the first time. Two models were constructed: the first was developed ab initio using a genetic algorithm (GSD-model) to shortlist 24 descriptors covering constitutional, topological, geometrical and physicochemical properties and the second model was adapted for Rs prediction from a previous chromatographic retention model (RTD-model). Mechanistic evaluation of descriptors showed that models did not require comprehensive a priori information to predict Rs. Average predicted errors for the verification and blind test sets were 0.03 ± 0.02 L d–1 (RTD-model) and 0.03 ± 0.03 L d–1 (GSD-model) relative to experimentally determined Rs. Prediction variability in replicated models was the same or less than for measured Rs. Networks were externally validated using a measured Rs data set of six benzodiazepines. The RTD-model performed best in comparison to the GSD-model for these compounds (average absolute errors of 0.0145 ± 0.008 L d–1 and 0.0437 ± 0.02 L d–1, respectively). Improvements to generalizability of modeling approaches will be reliant on the need for standardized guidelines for Rs measurement. The use of in silico tools for Rs determination represents a more economical approach than laboratory calibrations. PMID:27363449

  9. CaPTHUS scoring model in primary hyperparathyroidism: can it eliminate the need for ioPTH testing?

    PubMed

    Elfenbein, Dawn M; Weber, Sara; Schneider, David F; Sippel, Rebecca S; Chen, Herbert

    2015-04-01

    The CaPTHUS model was reported to have a positive predictive value of 100 % to correctly predict single-gland disease in patients with primary hyperparathyroidism, thus obviating the need for intraoperative parathyroid hormone (ioPTH) testing. We sought to apply the CaPTHUS scoring model in our patient population and assess its utility in predicting long-term biochemical cure. We retrospectively reviewed all parathyroidectomies for primary hyperparathyroidism performed at our university hospital from 2003 to 2012. We routinely perform ioPTH testing. Biochemical cure was defined as a normal calcium level at 6 months. A total of 1,421 patients met the inclusion criteria: 78 % of patients had a single adenoma at the time of surgery, 98 % had a normal serum calcium at 1 week postoperatively, and 96 % had a normal serum calcium level 6 months postoperatively. Using the CaPTHUS scoring model, 307 patients (22.5 %) had a score of ≥ 3, with a positive predictive value of 91 % for single adenoma. A CaPTHUS score of ≥ 3 had a positive predictive value of 98 % for biochemical cure at 1 week as well as at 6 months. In our population, where ioPTH testing is used routinely to guide use of bilateral exploration, patients with a preoperative CaPTHUS score of ≥ 3 had good long-term biochemical cure rates. However, the model predicted a single adenoma in only 91 % of cases. If minimally invasive parathyroidectomy without ioPTH testing had been done for these patients, the cure rate would have dropped from 98 % to an unacceptable 89 %. Even in these patients with high CaPTHUS scores, multigland disease is present in almost 10 %, and ioPTH testing is necessary.
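The positive predictive value quoted above follows directly from confusion counts; the sketch below uses hypothetical counts chosen only to mimic the reported ~91 %, not the study's raw data.

```python
# Sketch of the metric in the abstract: positive predictive value from
# confusion counts. The split of the 307 high-score patients into 279 true
# positives and 28 false positives is invented to mimic the reported PPV.
def positive_predictive_value(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value(279, 28)
print(round(ppv, 2))  # 0.91
```

The abstract's point is visible in these numbers: even at 91 % PPV, roughly 1 in 10 high-score patients would be operated on under a wrong single-adenoma assumption without ioPTH testing.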

  10. The importance of different frequency bands in predicting subcutaneous glucose concentration in type 1 diabetic patients.

    PubMed

    Lu, Yinghui; Gribok, Andrei V; Ward, W Kenneth; Reifman, Jaques

    2010-08-01

    We investigated the relative importance and predictive power of different frequency bands of subcutaneous glucose signals for the short-term (0-50 min) forecasting of glucose concentrations in type 1 diabetic patients with data-driven autoregressive (AR) models. The study data consisted of minute-by-minute glucose signals collected from nine deidentified patients over a five-day period using continuous glucose monitoring devices. AR models were developed using single and pairwise combinations of frequency bands of the glucose signal and compared with a reference model including all bands. The results suggest that: for open-loop applications, there is no need to explicitly represent exogenous inputs, such as meals and insulin intake, in AR models; models based on a single-frequency band, with periods between 60-120 min and 150-500 min, yield good predictive power (error <3 mg/dL) for prediction horizons of up to 25 min; models based on pairs of bands produce predictions that are indistinguishable from those of the reference model as long as the 60-120 min period band is included; and AR models can be developed on signals of short length (approximately 300 min), i.e., ignoring long circadian rhythms, without any detriment in prediction accuracy. Together, these findings provide insights into efficient development of more effective and parsimonious data-driven models for short-term prediction of glucose concentrations in diabetic patients.
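As a hedged sketch of the data-driven AR machinery described above (not the authors' exact implementation), the pure-Python example below fits an AR(2) model by ordinary least squares to a synthetic series and recovers its coefficients.

```python
# Hedged sketch: fit an AR(p) model by least squares and make a one-step
# forecast. The series here is synthetic, not real CGM data.
import random

def solve(a, b):
    """Gauss-Jordan elimination for a small linear system a x = b."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))  # partial pivot
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit_ar(series, p):
    """Least-squares AR(p) coefficients via the normal equations."""
    rows = [series[t - p:t][::-1] for t in range(p, len(series))]  # [x_{t-1}, ..., x_{t-p}]
    y = series[p:]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(p)]
    return solve(xtx, xty)

def predict_next(series, coeffs):
    """One-step-ahead forecast from the most recent p values."""
    return sum(c * x for c, x in zip(coeffs, series[::-1]))

# Synthetic AR(2) series: x_t = 1.5 x_{t-1} - 0.7 x_{t-2} + noise
random.seed(0)
data = [0.0, 0.0]
for _ in range(500):
    data.append(1.5 * data[-1] - 0.7 * data[-2] + random.gauss(0, 0.1))

coeffs = fit_ar(data, 2)          # recovers approximately [1.5, -0.7]
forecast = predict_next(data, coeffs)
```

Longer horizons are obtained by iterating the one-step forecast, feeding each prediction back in as the newest value.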

  11. Reevaluation of a walleye (Sander vitreus) bioenergetics model

    USGS Publications Warehouse

    Madenjian, Charles P.; Wang, Chunfang

    2013-01-01

    Walleye (Sander vitreus) is an important sport fish throughout much of North America, and walleye populations support valuable commercial fisheries in certain lakes as well. Using a corrected algorithm for balancing the energy budget, we reevaluated the performance of the Wisconsin bioenergetics model for walleye in the laboratory. Walleyes were fed rainbow smelt (Osmerus mordax) in four laboratory tanks each day during a 126-day experiment. Feeding rates ranged from 1.4 to 1.7 % of walleye body weight per day. Based on a statistical comparison of bioenergetics model predictions of monthly consumption with observed monthly consumption, we concluded that the bioenergetics model estimated food consumption by walleye without any significant bias. Similarly, based on a statistical comparison of bioenergetics model predictions of weight at the end of the monthly test period with observed weight, we concluded that the bioenergetics model predicted walleye growth without any detectable bias. In addition, the bioenergetics model predictions of cumulative consumption over the 126-day experiment differed from observed cumulative consumption by less than 10 %. Although additional laboratory and field testing will be needed to fully evaluate model performance, based on our laboratory results, the Wisconsin bioenergetics model for walleye appears to be providing unbiased predictions of food consumption.

  12. Calibration and validation of toxicokinetic-toxicodynamic models for three neonicotinoids and some aquatic macroinvertebrates.

    PubMed

    Focks, Andreas; Belgers, Dick; Boerwinkel, Marie-Claire; Buijse, Laura; Roessink, Ivo; Van den Brink, Paul J

    2018-05-01

    Exposure patterns in ecotoxicological experiments often do not match the exposure profiles for which a risk assessment needs to be performed. This limitation can be overcome by using toxicokinetic-toxicodynamic (TKTD) models for the prediction of effects under time-variable exposure. For the use of TKTD models in the environmental risk assessment of chemicals, it is required to calibrate and validate the model for specific compound-species combinations. In this study, the survival of macroinvertebrates after exposure to neonicotinoid insecticides was modelled using TKTD models from the General Unified Threshold models of Survival (GUTS) framework. The models were calibrated on existing survival data from acute or chronic tests under a static exposure regime. Validation experiments were performed for two sets of species-compound combinations: one set focussed on the sensitivity of multiple species to a single compound, imidacloprid, and the other set on the effects of multiple compounds, i.e., the three neonicotinoids imidacloprid, thiacloprid and thiamethoxam, on the survival of a single species, the mayfly Cloeon dipterum. The calibrated models were used to predict survival over time, including uncertainty ranges, for the different time-variable exposure profiles used in the validation experiments. From the comparison between observed and predicted survival, it appeared that the accuracy of the model predictions was acceptable for four of the five tested species in the multiple-species data set. For compounds such as neonicotinoids, which are known to have the potential to show increased toxicity under prolonged exposure, the calibration and validation of TKTD models for survival should ideally consider calibration data from both acute and chronic tests.

  13. Detection of carbon monoxide trends in the presence of interannual variability

    NASA Astrophysics Data System (ADS)

    Strode, Sarah A.; Pawson, Steven

    2013-11-01

    Changes in fossil fuel emissions are a major driver of changes in atmospheric CO, but detection of trends in CO from anthropogenic sources is complicated by the presence of large interannual variability (IAV) in biomass burning. We use a multiyear model simulation of CO with year-specific biomass burning to predict the number of years needed to detect the impact of changes in Asian anthropogenic emissions on downwind regions. Our study includes two cases for changing anthropogenic emissions: a stepwise change of 15% and a linear trend of 3% yr-1. We first examine how well the model reproduces the observed IAV of CO over the North Pacific, since this variability impacts the time needed to detect significant anthropogenic trends. The modeled IAV over the North Pacific correlates well with that seen from the Measurements of Pollution in the Troposphere (MOPITT) instrument but underestimates the magnitude of the variability. The model predicts that a 3% yr-1 trend in Asian anthropogenic emissions would lead to a statistically significant trend in CO surface concentration in the western United States within 12 years, and accounting for Siberian boreal biomass-burning emissions greatly reduces the number of years needed for trend detection. Combining the modeled trend with the observed MOPITT variability at 500 hPa, we estimate that the 3% yr-1 trend could be detectable in satellite observations over Asia in approximately a decade. Our predicted timescales for trend detection highlight the importance of long-term measurements of CO from satellites.
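One standard way to frame the "years needed to detect a trend" question is the Weatherhead et al. (1998) approximation for a linear trend buried in autocorrelated noise; this is a generic sketch with invented inputs, not necessarily the method used in the paper.

```python
# Generic sketch (not necessarily this paper's method): approximate number
# of years needed to detect a linear trend at ~95% confidence in noise with
# standard deviation noise_sd and lag-1 autocorrelation phi
# (Weatherhead et al., 1998).
import math

def years_to_detect(trend_per_yr, noise_sd, phi):
    return (3.3 * (noise_sd / abs(trend_per_yr))
            * math.sqrt((1 + phi) / (1 - phi))) ** (2.0 / 3.0)

# Invented inputs: a 3%/yr trend against interannual variability of 6%
# with modest year-to-year memory.
n = years_to_detect(0.03, 0.06, 0.3)
```

The formula makes the abstract's qualitative point explicit: detection time grows with both the noise-to-trend ratio and the autocorrelation of the variability.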

  14. Reductions in global biodiversity loss predicted from conservation spending.

    PubMed

    Waldron, Anthony; Miller, Daniel C; Redding, Dave; Mooers, Arne; Kuhn, Tyler S; Nibbelink, Nate; Roberts, J Timmons; Tobias, Joseph A; Gittleman, John L

    2017-11-16

    Halting global biodiversity loss is central to the Convention on Biological Diversity and United Nations Sustainable Development Goals, but success to date has been very limited. A critical determinant of success in achieving these goals is the financing that is committed to maintaining biodiversity; however, financing decisions are hindered by considerable uncertainty over the likely impact of any conservation investment. For greater effectiveness, we need an evidence-based model that shows how conservation spending quantitatively reduces the rate of biodiversity loss. Here we demonstrate such a model, and empirically quantify how conservation investment reduced biodiversity loss in 109 countries (signatories to the Convention on Biological Diversity and Sustainable Development Goals), by a median average of 29% per country between 1996 and 2008. We also show that biodiversity changes in signatory countries can be predicted with high accuracy, using a dual model that balances the effects of conservation investment against those of economic, agricultural and population growth (human development pressures). Decision-makers can use this model to forecast the improvement that any proposed biodiversity budget would achieve under various scenarios of human development pressure, and then compare these forecasts to any chosen policy target. We find that the impact of spending decreases as human development pressures grow, which implies that funding may need to increase over time. The model offers a flexible tool for balancing the Sustainable Development Goals of human development and maintaining biodiversity, by predicting the dynamic changes in conservation finance that will be needed as human development proceeds.

  16. Modeling patients' acceptance of provider-delivered e-health.

    PubMed

    Wilson, E Vance; Lankton, Nancy K

    2004-01-01

    Health care providers are beginning to deliver a range of Internet-based services to patients; however, it is not clear which of these e-health services patients need or desire. The authors propose that patients' acceptance of provider-delivered e-health can be modeled in advance of application development by measuring the effects of several key antecedents to e-health use and applying models of acceptance developed in the information technology (IT) field. This study tested three theoretical models of IT acceptance among patients who had recently registered for access to provider-delivered e-health. An online questionnaire administered items measuring perceptual constructs from the IT acceptance models (intrinsic motivation, perceived ease of use, perceived usefulness/extrinsic motivation, and behavioral intention to use e-health) and five hypothesized antecedents (satisfaction with medical care, health care knowledge, Internet dependence, information-seeking preference, and health care need). Responses were collected and stored in a central database. All tested IT acceptance models performed well in predicting patients' behavioral intention to use e-health. Antecedent factors of satisfaction with provider, information-seeking preference, and Internet dependence uniquely predicted constructs in the models. Information technology acceptance models provide a means to understand which aspects of e-health are valued by patients and how this may affect future use. In addition, antecedents to the models can be used to predict e-health acceptance in advance of system development.

  17. Evaluation of snow modeling with Noah and Noah-MP land surface models in NCEP GFS/CFS system

    NASA Astrophysics Data System (ADS)

    Dong, J.; Ek, M. B.; Wei, H.; Meng, J.

    2017-12-01

    The land surface serves as the lower boundary forcing in the global forecast system (GFS) and climate forecast system (CFS), simulating interactions between the land and the atmosphere. Understanding the underlying land model physics is key to improving weather and seasonal prediction skill. Upgrades in land model physics (e.g., release of newer versions of a land model), different land initializations, changes in the parameterization schemes used in the land model (e.g., land physical parameterization options), and changes in how the land impact is handled (e.g., a physics ensemble approach) all make it necessary to re-run climate prediction experiments to examine their impact. The current NASA LIS (version 7) integrates NOAA operational land surface and hydrological models (NCEP's Noah, versions 2.7.1 to 3.6, and the future Noah-MP), high-resolution satellite and observational data, and land data assimilation tools. The newer versions of the Noah LSM used in operational models have a variety of enhancements compared with older versions, and Noah-MP allows for different physics parameterization options whose choice can have a large impact on the physical processes underlying seasonal predictions. These impacts need to be re-examined before implementation in NCEP operational systems. A set of offline numerical experiments driven by GFS forecast forcing has been conducted to evaluate the impact of snow modeling against daily Global Historical Climatology Network (GHCN) observations.

  18. Can the Everyday Technology Use Questionnaire predict overall functional level among older adults with mild cognitive impairment or mild-stage Alzheimer's disease? - A pilot study.

    PubMed

    Ryd, Charlotta; Nygård, Louise; Malinowsky, Camilla; Öhman, Annika; Kottorp, Anders

    2017-03-01

    The number of older adults living with mild cognitive impairment (MCI) or mild-stage Alzheimer's disease (AD) is increasing and they are often expected to live in their own homes without support, despite limited ability to perform daily life activities. The Everyday Technology Use Questionnaire (ETUQ) has proven able to separate these groups and might also have potential to predict overall functional level (need of assistance in daily life activities) among them. To investigate whether the ETUQ can predict overall functional level among older adults with MCI or mild-stage AD. Participants were older adults with a mean age of 76 years with MCI (n = 28) or mild-stage AD (n = 39). A three-step scale indicating (i) independence, (ii) need for minimal assistance or (iii) need for moderate to maximal assistance in daily life was dichotomised in two ways and used as outcome variables in two logistic regression models. Predictors in both models were perceived ability to use everyday technology (ET) and the amount of relevant everyday technologies measured by the ETUQ. Ethical approval was obtained from the regional Ethical Committee. Perceived ability to use ET discriminated individuals who were independent or in need of minimal support from those in need of moderate to maximal assistance (OR = 1.82, p < 0.01, 95% confidence interval 1.76-2.82). The amount of relevant everyday technologies discriminated individuals who were independent from those in need of assistance at any level (OR = 1.39, p < 0.01, 95% confidence interval 1.11-1.75). Both perceived ability to use ET and the amount of relevant everyday technologies had potential to predict overall function, but at different levels. The findings support the predictive validity of the ETUQ and suggest further research for the development of clinical cut-off criteria. © 2016 Nordic College of Caring Science.
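The odds ratios and confidence intervals above are exponentiated logistic regression coefficients; the sketch below shows that conversion, with an invented standard error chosen only so the point estimate matches the reported OR of 1.82.

```python
# Sketch of where a reported OR and Wald 95% CI come from: OR = exp(beta),
# CI = exp(beta ± 1.96*SE). The standard error here is purely illustrative.
import math

def odds_ratio_ci(beta, se, z=1.96):
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# beta chosen so the point estimate matches the reported OR of 1.82;
# se = 0.11 is an invented value, not from the study.
or_, lo, hi = odds_ratio_ci(math.log(1.82), 0.11)
```

An OR of 1.82 per scale unit means each one-unit gain in perceived ability to use ET multiplies the odds of the better functional outcome by about 1.8.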

  19. Assessment of Protein Side-Chain Conformation Prediction Methods in Different Residue Environments

    PubMed Central

    Peterson, Lenna X.; Kang, Xuejiao; Kihara, Daisuke

    2016-01-01

    Computational prediction of side-chain conformation is an important component of protein structure prediction. Accurate side-chain prediction is crucial for practical applications of protein structure models that need atomic detailed resolution such as protein and ligand design. We evaluated the accuracy of eight side-chain prediction methods in reproducing the side-chain conformations of experimentally solved structures deposited to the Protein Data Bank. Prediction accuracy was evaluated for a total of four different structural environments (buried, surface, interface, and membrane-spanning) in three different protein types (monomeric, multimeric, and membrane). Overall, the highest accuracy was observed for buried residues in monomeric and multimeric proteins. Notably, side-chains at protein interfaces and membrane-spanning regions were better predicted than surface residues even though the methods did not all use multimeric and membrane proteins for training. Thus, we conclude that the current methods are as practically useful for modeling protein docking interfaces and membrane-spanning regions as for modeling monomers. PMID:24619909

  20. A Critical Plane-energy Model for Multiaxial Fatigue Life Prediction of Homogeneous and Heterogeneous Materials

    NASA Astrophysics Data System (ADS)

    Wei, Haoyang

    A new critical plane-energy model is proposed in this thesis for multiaxial fatigue life prediction of homogeneous and heterogeneous materials. A brief review of existing methods, especially critical plane-based and energy-based methods, is given first. Special focus is on one critical plane approach that has been shown to work for both brittle and ductile metals; its key idea is to automatically change the critical plane orientation with respect to different materials and stress states. One potential drawback of that model is that it needs an empirical calibration parameter for non-proportional multiaxial loadings, since only the strain terms are used and out-of-phase hardening cannot be considered. The energy-based model using the critical plane concept is proposed with the help of the Mroz-Garud hardening rule to explicitly include the effect of non-proportional hardening under cyclic fatigue loadings. Thus, the empirical calibration for non-proportional loading is not needed, since the out-of-phase hardening is naturally included in the stress calculation. The model predictions are compared with experimental data from the open literature, and it is shown that the proposed model works for both proportional and non-proportional loadings without the empirical calibration. Next, the model is extended to fatigue analysis of heterogeneous materials by integration with the finite element method. Fatigue crack initiation in representative volumes of heterogeneous materials is analyzed using the developed critical plane-energy model, with special focus on the microstructure effect on multiaxial fatigue life predictions. Conclusions and directions for future work are drawn from the proposed study.

  1. Comparison of simplified models in the prediction of two phase flow in pipelines

    NASA Astrophysics Data System (ADS)

    Jerez-Carrizales, M.; Jaramillo, J. E.; Fuentes, D.

    2014-06-01

    Prediction of two-phase flow in pipelines is a common task in engineering. It is a complex phenomenon, and many models have been developed to find an approximate solution to the problem. Some older models, such as the Hagedorn & Brown (HB) model, have been highlighted by many authors as giving very good performance, and many modifications have been applied to this method to improve its predictions. In this work, two simplified models based on empiricism (HB and Mukherjee & Brill, MB) are considered. A mechanistic model (AN), which is based on the physics of the phenomenon but still requires correlations known as closure relations, is also used. Moreover, a steady-state, flow-pattern-dependent drift flux model (HK) is implemented. The implementation of these methods was tested against published data in the scientific literature for vertical upward flows. Furthermore, the predictive performance of the four models is compared against a well from Campo Escuela Colorado. The differences among the four models are smaller than their differences from the experimental data from this well.

  2. Re-Examining the Relationship between Need for Cognition and Creativity: Predicting Creative Problem Solving across Multiple Domains

    ERIC Educational Resources Information Center

    Watts, Logan L.; Steele, Logan M.; Song, Hairong

    2017-01-01

    Prior studies have demonstrated inconsistent findings with regard to the relationship between need for cognition and creativity. In our study, measurement issues were explored as a potential source of these inconsistencies. Structural equation modeling techniques were used to examine the factor structure underlying the 18-item need for cognition…

  3. Arthroplasty Utilization in the United States is Predicted by Age-Specific Population Groups.

    PubMed

    Bashinskaya, Bronislava; Zimmerman, Ryan M; Walcott, Brian P; Antoci, Valentin

    2012-01-01

    Osteoarthritis is a common indication for hip and knee arthroplasty. An accurate assessment of current trends in healthcare utilization as they relate to arthroplasty may predict the needs of a growing elderly population in the United States. First, incidence data were queried from the United States Nationwide Inpatient Sample from 1993 to 2009, and patients undergoing total knee and hip arthroplasty were identified. The United States Census Bureau was then queried for population data over the same study period, as well as for future projections. Arthroplasty incidence followed linear regression models driven by the population group >64 years for both hip and knee procedures. Projections for procedure incidence in the year 2050 based on these models were calculated to be 1,859,553 cases (hip) and 4,174,554 cases (knee). The need for hip and knee arthroplasty is expected to grow significantly in the upcoming years, given population growth predictions.

  4. International Land Model Benchmarking (ILAMB) Workshop Report, Technical Report DOE/SC-0186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M.; Koven, Charles D.; Kappel-Aleks, Gretchen

    2016-11-01

    As Earth system models become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century.

  5. Prediction of dynamical systems by symbolic regression

    NASA Astrophysics Data System (ADS)

    Quade, Markus; Abel, Markus; Shafi, Kamran; Niven, Robert K.; Noack, Bernd R.

    2016-07-01

    We study the modeling and prediction of dynamical systems based on conventional models derived from measurements. Such algorithms are highly desirable in situations where the underlying dynamics are hard to model from physical principles or simplified models need to be found. We focus on symbolic regression methods as a part of machine learning. These algorithms are capable of learning an analytically tractable model from data, a highly valuable property. Symbolic regression methods can be considered as generalized regression methods. We investigate two particular algorithms, the so-called fast function extraction which is a generalized linear regression algorithm, and genetic programming which is a very general method. Both are able to combine functions in a certain way such that a good model for the prediction of the temporal evolution of a dynamical system can be identified. We illustrate the algorithms by finding a prediction for the evolution of a harmonic oscillator based on measurements, by detecting an arriving front in an excitable system, and as a real-world application, the prediction of solar power production based on energy production observations at a given site together with the weather forecast.

  6. Molecular determinants of blood-brain barrier permeation.

    PubMed

    Geldenhuys, Werner J; Mohammad, Afroz S; Adkins, Chris E; Lockman, Paul R

    2015-01-01

    The blood-brain barrier (BBB) is a microvascular unit which selectively regulates the permeability of drugs to the brain. With the rise in CNS drug targets and diseases, there is a need to be able to accurately predict a priori which compounds in a company database should be pursued for favorable properties. In this review, we will explore the different computational tools available today, as well as underpin these to the experimental methods used to determine BBB permeability. These include in vitro models and the in vivo models that yield the dataset we use to generate predictive models. Understanding how these models were experimentally derived informs their accurate, predictive use in balancing activity against BBB distribution.

  7. Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly

    The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent of this effort is to test and assess the model's behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether a predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations about the predicted change. One purpose of this effort is to determine whether model changes are needed in order to improve its behavior qualitatively and quantitatively.

  8. The importance of measuring growth in response to intervention models: Testing a core assumption✩

    PubMed Central

    Schatschneider, Christopher; Wagner, Richard K.; Crawford, Elizabeth C.

    2011-01-01

    A core assumption of response to instruction or intervention (RTI) models is the importance of measuring growth in achievement over time in response to effective instruction or intervention. Many RTI models actively monitor growth for identifying individuals who need different levels of intervention. A large-scale (N=23,438), two-year longitudinal study of first grade children was carried out to compare the predictive validity of measures of achievement status, growth in achievement, and their combination for predicting future reading achievement. The results indicate that under typical conditions, measures of growth do not make a contribution to prediction that is independent of measures of achievement status. These results question the validity of a core assumption of RTI models. PMID:22224065

  9. Molecular determinants of blood–brain barrier permeation

    PubMed Central

    Geldenhuys, Werner J; Mohammad, Afroz S; Adkins, Chris E; Lockman, Paul R

    2015-01-01

    The blood–brain barrier (BBB) is a microvascular unit which selectively regulates the permeability of drugs to the brain. With the rise in CNS drug targets and diseases, there is a need to be able to accurately predict a priori which compounds in a company database should be pursued for favorable properties. In this review, we will explore the different computational tools available today, as well as underpin these to the experimental methods used to determine BBB permeability. These include in vitro models and the in vivo models that yield the dataset we use to generate predictive models. Understanding how these models were experimentally derived informs their accurate, predictive use in balancing activity against BBB distribution. PMID:26305616

  10. Predicting category intuitiveness with the rational model, the simplicity model, and the generalized context model.

    PubMed

    Pothos, Emmanuel M; Bailey, Todd M

    2009-07-01

    Naïve observers typically perceive some groupings for a set of stimuli as more intuitive than others. The problem of predicting category intuitiveness has been historically considered the remit of models of unsupervised categorization. In contrast, this article develops a measure of category intuitiveness from one of the most widely supported models of supervised categorization, the generalized context model (GCM). Considering different category assignments for a set of instances, the authors asked how well the GCM can predict the classification of each instance on the basis of all the other instances. The category assignment that results in the smallest prediction error is interpreted as the most intuitive for the GCM; the authors refer to this way of applying the GCM as the "unsupervised GCM". The authors systematically compared predictions of category intuitiveness from the unsupervised GCM and two models of unsupervised categorization: the simplicity model and the rational model. The unsupervised GCM compared favorably with the simplicity model and the rational model. This success of the unsupervised GCM illustrates that the distinction between supervised and unsupervised categorization may need to be reconsidered. However, no model emerged as clearly superior, indicating that there is more work to be done in understanding and modeling category intuitiveness.
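The procedure described in this record (score each candidate partition by how well every instance is predicted from all the others, and call the lowest-error partition the most intuitive) can be sketched in a few lines. This is a minimal exemplar-similarity model with a single sensitivity parameter, not the authors' full GCM, which also includes attention weights and response scaling; the toy stimuli and labels below are hypothetical.

```python
import numpy as np

def gcm_loo_error(X, labels, c=1.0):
    """Leave-one-out prediction error of a candidate partition under a minimal
    exemplar (GCM-style) model: each item is classified from the summed
    exponential similarity exp(-c * distance) to all *other* items per category."""
    X, labels = np.asarray(X, float), np.asarray(labels)
    err = 0.0
    for i in range(len(X)):
        sim = np.exp(-c * np.linalg.norm(X - X[i], axis=1))
        sim[i] = 0.0  # exclude the item itself from its own prediction
        per_cat = {k: sim[labels == k].sum() for k in set(labels)}
        # error = 1 - predicted probability of the item's assigned category
        err += 1.0 - per_cat[labels[i]] / sum(per_cat.values())
    return err / len(X)

# two tight clusters: the partition that separates them should score lower error
X = [[0, 0], [0.2, 0.1], [0.1, 0.2], [5, 5], [5.2, 5.1], [5.1, 5.2]]
intuitive = [0, 0, 0, 1, 1, 1]
arbitrary = [0, 1, 0, 1, 0, 1]
e_intuitive = gcm_loo_error(X, intuitive)
e_arbitrary = gcm_loo_error(X, arbitrary)
```

Ranking every candidate assignment by this error is the "unsupervised" use of a supervised model that the article describes.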

  11. Criminogenic Needs, Substance Use, and Offending among Rural Stimulant Users

    PubMed Central

    Timko, Christine; Booth, Brenda M.; Han, Xiaotong; Schultz, Nicole R.; Blonigen, Daniel M.; Wong, Jessie J.; Cucciare, Michael A.

    2017-01-01

    There is a need to understand the determinants of both substance use and criminal activity in rural areas in order to design appropriate treatment interventions for these linked problems. The present study drew on a predominant model used to assess and treat offenders -- the Risk-Need-Responsivity (RNR) model -- to examine risk factors for substance use and criminal activity in a rural drug using sample. This study extends the RNR model’s focus on offenders to assessing rural-dwelling individuals using stimulants (N=462). We examined substance use and criminal justice outcomes at 6-month (91%) and 3-year (79%) follow-ups, and used Generalized Estimating Equations to examine the extent to which RNR criminogenic need factors at baseline predicted outcomes at follow-ups. Substance use and criminal justice outcomes improved at six months, and even more at three years, post-baseline. As expected, higher risk was associated with poorer outcomes. Antisocial personality patterns and procriminal attitudes at baseline predicted poorer legal and drug outcomes measured at subsequent follow-ups. In contrast, less connection to antisocial others and fewer work difficulties predicted lower alcohol problem severity, but more frequent alcohol use. Engagement in social-recreational activities was associated with fewer subsequent arrests and less severe alcohol and drug problems. The RNR model’s criminogenic need factors predicted drug use and crime-related outcomes among rural residents. Services adapted to rural settings that target these factors, such as telehealth and other technology-based resources, may hasten improvement on both types of outcomes among drug users. PMID:29051795

  12. Revisiting the Holy Grail: using plant functional traits to understand ecological processes.

    PubMed

    Funk, Jennifer L; Larson, Julie E; Ames, Gregory M; Butterfield, Bradley J; Cavender-Bares, Jeannine; Firn, Jennifer; Laughlin, Daniel C; Sutton-Grier, Ariana E; Williams, Laura; Wright, Justin

    2017-05-01

    One of ecology's grand challenges is developing general rules to explain and predict highly complex systems. Understanding and predicting ecological processes from species' traits has been considered a 'Holy Grail' in ecology. Plant functional traits are increasingly being used to develop mechanistic models that can predict how ecological communities will respond to abiotic and biotic perturbations and how species will affect ecosystem function and services in a rapidly changing world; however, significant challenges remain. In this review, we highlight recent work and outstanding questions in three areas: (i) selecting relevant traits; (ii) describing intraspecific trait variation and incorporating this variation into models; and (iii) scaling trait data to community- and ecosystem-level processes. Over the past decade, there have been significant advances in the characterization of plant strategies based on traits and trait relationships, and the integration of traits into multivariate indices and models of community and ecosystem function. However, the utility of trait-based approaches in ecology will benefit from efforts that demonstrate how these traits and indices influence organismal, community, and ecosystem processes across vegetation types, which may be achieved through meta-analysis and enhancement of trait databases. Additionally, intraspecific trait variation and species interactions need to be incorporated into predictive models using tools such as Bayesian hierarchical modelling. Finally, existing models linking traits to community and ecosystem processes need to be empirically tested for their applicability to be realized. © 2016 Cambridge Philosophical Society.

  13. Particulate Matter Emissions for Fires in the Palmetto-Gallberry Fuel Type

    Treesearch

    Darold E. Ward

    1983-01-01

    Fire management specialists in the southeastern United States needing guides for predicting or assessing particulate matter emission factors, emission rates, and heat release rate can use the models presented in this paper for making these predictions as a function of flame length in the palmetto-gallberry fuel type.

  14. Predicting Successful Mathematics Remediation among Latina/o Students

    ERIC Educational Resources Information Center

    Crisp, Gloria; Reyes, Nicole Alia Salis; Doran, Erin

    2017-01-01

    This study examines Latina/o students' remedial math needs and outcomes. Data were drawn from a national sample of Latina/o students. Hierarchical generalized linear modeling techniques were used to predict three successful remediation outcomes. Results highlight the importance of providing financial aid and academic support to Latina/o students,…

  15. Questioning the Faith - Models and Prediction in Stream Restoration (Invited)

    NASA Astrophysics Data System (ADS)

    Wilcock, P.

    2013-12-01

    River management and restoration demand prediction at and beyond our present ability. Management questions, framed appropriately, can motivate fundamental advances in science, although the connection between research and application is not always easy, useful, or robust. Why is that? This presentation considers the connection between models and management, a connection that requires critical and creative thought on both sides. Essential challenges for managers include clearly defining project objectives and accommodating uncertainty in any model prediction. Essential challenges for the research community include matching the appropriate model to project duration, space, funding, information, and social constraints and clearly presenting answers that are actually useful to managers. Better models do not lead to better management decisions or better designs if the predictions are not relevant to and accepted by managers. In fact, any prediction may be irrelevant if the need for prediction is not recognized. The predictive target must be developed in an active dialog between managers and modelers. This relationship, like any other, can take time to develop. For example, large segments of stream restoration practice have remained resistant to models and prediction because the foundational tenet - that channels built to a certain template will be able to transport the supplied sediment with the available flow - has no essential physical connection between cause and effect. Stream restoration practice can be steered in a predictive direction in which project objectives are defined as predictable attributes and testable hypotheses. If stream restoration design is defined in terms of the desired performance of the channel (static or dynamic, sediment surplus or deficit), then channel properties that provide these attributes can be predicted and a basis exists for testing approximations, models, and predictions.

  16. Improving predictions of carbon fluxes in the tropics under climatic changes using ED2

    NASA Astrophysics Data System (ADS)

    Feng, X.; Uriarte, M.

    2016-12-01

    Tropical forests play a critical role in the exchange of carbon between land and atmosphere, highlighting the urgency of understanding the effects of climate change on these ecosystems. The most optimistic predictions of climate models indicate that global mean temperatures will increase by up to 2 °C, with some tropical regions experiencing extreme heat. Drought- and heat-induced tree mortality will accelerate the release of carbon to the atmosphere, creating a positive feedback that greatly exacerbates global warming. Thus, under a warmer and drier climate, tropical forests may become net sources, rather than sinks, of carbon. Earth system models have not reached a consensus on the magnitude and direction of climate change impacts on tropical forests, calling into question the reliability of their predictions. There is thus an immediate need to improve the representation of tropical forests in earth system models to make robust predictions. The goal of our study is to quantify the responses of tropical forests to climate variability and improve the predictive capacity of terrestrial ecosystem models. We have collected species-specific physiological and functional trait data from 144 tree species in a Puerto Rican rainforest to parameterize the Ecosystem Demography model (ED2). The large amount of data generated by this research will lead to better validation and lower uncertainty in future model predictions. To best represent the forest landscape in ED2, all trees were assigned to three plant functional types (PFTs): early, mid, and late successional species. Trait data for each PFT were synthesized in a Bayesian meta-analytical model, and posterior distributions of traits were used to parameterize the ED2 model. Model predictions show that biomass production of the late successional PFT (118.89 ton/ha) was consistently higher than that of the mid (71.33 ton/ha) and early (13.21 ton/ha) PFTs; however, the mid successional PFT had the highest contribution to NPP over the modeled period. Under the future drought scenario, tropical forest biomass is reduced by 30%, turning the tropics into a carbon source. Ensemble runs were conducted to construct error estimates around model forecasts, to compare modeled and observed aboveground biomass, and to identify which processes and tree species need further study.

  17. Bayesian Modeling of Exposure and Airflow Using Two-Zone Models

    PubMed Central

    Zhang, Yufen; Banerjee, Sudipto; Yang, Rui; Lungu, Claudiu; Ramachandran, Gurumurthy

    2009-01-01

    Mathematical modeling is being increasingly used as a means for assessing occupational exposures. However, predicting exposure in real settings is constrained by lack of quantitative knowledge of exposure determinants. Validation of models in occupational settings is, therefore, a challenge. Not only do the model parameters need to be known, the models also need to predict the output with some degree of accuracy. In this paper, a Bayesian statistical framework is used for estimating model parameters and exposure concentrations for a two-zone model. The model predicts concentrations in a zone near the source and far away from the source as functions of the toluene generation rate, the air ventilation rate through the chamber, and the airflow between near and far fields. The framework combines prior or expert information on the physical model with the observed data. The framework is applied to simulated data as well as data obtained from experiments conducted in a chamber, in which toluene vapors are generated from a source under different conditions of airflow direction, the presence of a mannequin, and simulated body heat of the mannequin. The Bayesian framework accounts for uncertainty in measurement as well as in the unknown rate of airflow between the near and far fields. The results show that estimates of the interzonal airflow are always close to the estimated equilibrium solutions, which implies that the method works efficiently. The predictions of near-field concentration for both the simulated and real data show good concordance with the true values, indicating that the two-zone model assumptions agree with reality to a large extent and that the model is suitable for predicting the contaminant concentration. Comparison of the estimated model and its margin of error with the experimental data thus enables validation of the physical model assumptions. The approach illustrates how exposure models and information on model parameters, together with knowledge of the uncertainty and variability in these quantities, can be used to provide better estimates not only of model outputs but also of model parameters. PMID:19403840
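The steady-state limit of the two-zone (near-field/far-field) model referenced in this record has a simple closed form; the sketch below uses the standard textbook equations with illustrative units and numbers, and is not the paper's Bayesian estimation code.

```python
def two_zone_steady_state(G, Q, beta):
    """Steady-state concentrations of the standard two-zone exposure model.

    G    -- contaminant generation rate at the source (mg/min)
    Q    -- ventilation rate through the room/chamber (m^3/min)
    beta -- interzonal airflow between near and far fields (m^3/min)
    Returns (C_near, C_far) in mg/m^3.
    """
    c_far = G / Q              # far field: whole emission diluted by room ventilation
    c_near = c_far + G / beta  # near field: adds the interzonal dilution term G/beta
    return c_near, c_far

# e.g. a 100 mg/min toluene source, 10 m^3/min ventilation, 5 m^3/min interzonal flow
c_near, c_far = two_zone_steady_state(100.0, 10.0, 5.0)
```

At steady state the near-field concentration exceeds the far-field one by exactly G/beta, which is why the interzonal airflow estimate (the quantity the paper's Bayesian framework treats as uncertain) dominates near-field exposure predictions.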

  18. Genomic selection accuracies within and between environments and small breeding groups in white spruce.

    PubMed

    Beaulieu, Jean; Doerksen, Trevor K; MacKay, John; Rainville, André; Bousquet, Jean

    2014-12-02

    Genomic selection (GS) may improve selection response over conventional pedigree-based selection if markers capture more detailed information than pedigrees in recently domesticated tree species and/or make it more cost effective. Genomic prediction accuracies using 1748 trees and 6932 SNPs representative of as many distinct gene loci were determined for growth and wood traits in white spruce, within and between environments and breeding groups (BG), each with an effective size of Ne ≈ 20. Marker subsets were also tested. Model fits and/or cross-validation (CV) prediction accuracies for ridge regression (RR) and the least absolute shrinkage and selection operator models approached those of pedigree-based models. With strong relatedness between CV sets, prediction accuracies for RR within environment and BG were high for wood (r = 0.71-0.79) and moderately high for growth (r = 0.52-0.69) traits, in line with trends in heritabilities. For both classes of traits, these accuracies achieved between 83% and 92% of those obtained with phenotypes and pedigree information. Prediction into untested environments remained moderately high for wood (r ≥ 0.61) but dropped significantly for growth (r ≥ 0.24) traits, emphasizing the need to phenotype in all test environments and model genotype-by-environment interactions for growth traits. Removing relatedness between CV sets sharply decreased prediction accuracies for all traits and subpopulations, falling near zero between BGs with no known shared ancestry. For marker subsets, similar patterns were observed but with lower prediction accuracies. Given the need for high relatedness between CV sets to obtain good prediction accuracies, we recommend building GS models for prediction within the same breeding population only. Breeding groups could be merged to build genomic prediction models as long as the total effective population size does not exceed 50 individuals, in order to obtain prediction accuracy as high as that obtained in the present study. A number of markers limited to a few hundred would not negatively impact prediction accuracies, but these could decrease more rapidly over generations. The most promising short-term approach for genomic selection would likely be the selection of superior individuals within large full-sib families vegetatively propagated to implement multiclonal forestry.
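The ridge-regression (RR) genomic prediction with cross-validated accuracy measured as the correlation r between predicted and observed phenotypes, as used in the study above, can be sketched on synthetic SNP data. The marker count, effect sizes, and ridge penalty below are illustrative choices, not the study's values, and no attempt is made to model relatedness between CV sets.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n_trees, n_snps = 300, 1000
# genotypes coded as 0/1/2 minor-allele counts
X = rng.integers(0, 3, size=(n_trees, n_snps)).astype(float)
true_effects = rng.normal(0, 0.05, n_snps)          # small polygenic effects
y = X @ true_effects + rng.normal(0, 1.0, n_trees)  # phenotype = genetics + noise

# 5-fold cross-validation: predict each fold from a model fit on the others
preds = np.empty(n_trees)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = Ridge(alpha=50.0).fit(X[train], y[train])
    preds[test] = model.predict(X[test])

# predictive accuracy r, the statistic reported in the abstract
accuracy = np.corrcoef(preds, y)[0, 1]
```

Because every individual here is unrelated noise plus shared marker effects, this sketch corresponds to the within-population setting; the abstract's point is that accuracy collapses when CV folds are split along ancestry lines, which would require simulating family structure.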

  19. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies

    DOE PAGES

    Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.

    2016-06-06

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.

  20. Whole body acid-base modeling revisited.

    PubMed

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or formally validated. To improve understanding of acid-base modeling, we reformulated this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature gastrointestinal alkali absorption could likewise be expressed in terms of urine excretions. With a few assumptions, this expression of net acid balance proved arithmetically identical to minus the urine charge, whereby under developing acidosis urine was predicted to acquire a net negative charge. Because the literature already mentions unexplained negative urine charges, we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge did become negative as acidosis developed. Hence, we conclude that the conventional model is problematic, since it predicts what is physiologically impossible. A new model for whole body acid-base balance that does not have impossible implications is therefore needed, as are new experimental studies to account for the charge imbalance in urine during development of acidosis. Copyright © 2017 the American Physiological Society.

  1. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies.

    PubMed

    DeRolph, Christopher R; Schramm, Michael P; Bevelhimer, Mark S

    2016-10-01

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multi-faceted explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements are functions of a range of factors, from biophysical to socio-political. Project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.

  3. A personalized medicine approach to the design of dry powder inhalers: Selecting the optimal amount of bypass.

    PubMed

    Kopsch, Thomas; Murnane, Darragh; Symons, Digby

    2017-08-30

    In dry powder inhalers (DPIs), the patient's inhalation manoeuvre strongly influences the release of drug. Drug release from a DPI may also be influenced by the size of any air bypass incorporated in the device: if the amount of bypass is high, less air flows through the entrainment geometry and the release rate is lower. In this study we propose to reduce intra- and inter-patient variation in drug release by controlling the amount of air bypass in a DPI. We propose a fast computational method that predicts how much bypass is needed for a specified drug delivery rate for a particular patient. This method uses a meta-model constructed from multiphase computational fluid dynamics (CFD) simulations. The meta-model is applied in an optimization framework to predict the amount of bypass needed for drug delivery similar to a desired target release behaviour. The meta-model was successfully validated by comparing its predictions to results from additional CFD simulations. The optimization framework has been applied to identify the optimal amount of bypass needed, for fictitious sample inhalation manoeuvres, to deliver a target powder release profile for two patients. Copyright © 2017 Elsevier B.V. All rights reserved.
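    The optimization loop described here can be sketched very simply: a meta-model maps bypass fraction to release rate for a patient's flow, and the bypass minimizing the deviation from a target release is selected. The linear meta-model and all numbers below are illustrative assumptions standing in for the paper's CFD-derived meta-model:

    ```python
    # Hypothetical meta-model: more bypass -> less air through the entrainment
    # geometry -> lower release rate (assumed linear response, for illustration).
    def release_rate(bypass, patient_flow):
        entrainment_flow = patient_flow * (1.0 - bypass)
        return 0.8 * entrainment_flow

    def optimal_bypass(patient_flow, target_release, candidates=None):
        # Grid search over candidate bypass fractions, a crude stand-in for the
        # paper's optimization framework.
        if candidates is None:
            candidates = [i / 100 for i in range(0, 96)]
        return min(candidates,
                   key=lambda b: (release_rate(b, patient_flow) - target_release) ** 2)

    best = optimal_bypass(patient_flow=60.0, target_release=24.0)
    ```

    With the assumed linear meta-model, halving the entrainment flow halves the release, so a patient with a stronger inhalation manoeuvre is assigned a larger bypass to hit the same target.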

  4. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  5. NOAA Climate Program Office Contributions to National ESPC

    NASA Astrophysics Data System (ADS)

    Higgins, W.; Huang, J.; Mariotti, A.; Archambault, H. M.; Barrie, D.; Lucas, S. E.; Mathis, J. T.; Legler, D. M.; Pulwarty, R. S.; Nierenberg, C.; Jones, H.; Cortinas, J. V., Jr.; Carman, J.

    2016-12-01

    NOAA is one of five federal agencies (DOD, DOE, NASA, NOAA, and NSF) which signed an updated charter in 2016 to partner on the National Earth System Prediction Capability (ESPC). Situated within NOAA's Office of Oceanic and Atmospheric Research (OAR), NOAA Climate Program Office (CPO) programs contribute significantly to the National ESPC goals and activities. This presentation will provide an overview of CPO contributions to National ESPC. First, we will discuss selected CPO research and transition activities that directly benefit the ESPC coupled model prediction capability, including: the North American Multi-Model Ensemble (NMME) seasonal prediction system; the Subseasonal Experiment (SubX) project to test real-time subseasonal ensemble prediction systems; and improvements to the NOAA operational Climate Forecast System (CFS), including software infrastructure and data assimilation. Next, we will show how CPO's foundational research activities are advancing future ESPC capabilities. Highlights will include: the Tropical Pacific Observing System (TPOS), to provide the basis for predicting climate on subseasonal to decadal timescales; Subseasonal-to-Seasonal (S2S) processes and predictability studies, to improve understanding, modeling, and prediction of the MJO; an Arctic Research Program, to address urgent needs for advancing monitoring and prediction capabilities in this major area of concern; and advances toward building an experimental multi-decadal prediction system through studies of the Atlantic Meridional Overturning Circulation (AMOC). Finally, CPO has embraced Integrated Information Systems (IISs) that build on the innovation of programs such as the National Integrated Drought Information System (NIDIS) to develop and deliver end-to-end environmental information for key societal challenges (e.g., extreme heat; coastal flooding). These contributions will help the National ESPC better understand and address societal needs and decision support requirements.

  6. Progress in Earth System Modeling since the ENIAC Calculation

    NASA Astrophysics Data System (ADS)

    Fung, I.

    2009-05-01

    The success of the first numerical weather prediction experiment on the ENIAC computer in 1950 was hinged on the expansion of the meteorological observing network, which led to theoretical advances in atmospheric dynamics and subsequently the implementation of the simplified equations on the computer. This paper briefly reviews the progress in Earth System Modeling and climate observations, and suggests a strategy to sustain and expand the observations needed to advance climate science and prediction.

  7. Consistent Simulation Framework for Efficient Mass Discharge and Source Depletion Time Predictions of DNAPL Contaminants in Heterogeneous Aquifers Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Koch, J.

    2014-12-01

    Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both delicate and indispensable. We investigate the questions: what is a meaningful level of model complexity, and how can an efficient model framework be obtained that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to what degree the desired model predictions are sensitive to simplifications often found in the literature. With this we identify that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically-based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
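    The random-walk particle-tracking idea mentioned above can be sketched in one dimension: each particle is advected by the mean velocity and perturbed by a Gaussian step whose variance encodes dispersion. The paper's model is three-dimensional and coupled; the parameters below are illustrative only:

    ```python
    import random

    # Minimal 1-D random-walk particle tracking for advective-dispersive solute
    # transport (illustrative stand-in for the paper's 3-D coupled model).
    def track_particles(n, steps, dt, velocity, dispersion, seed=0):
        rng = random.Random(seed)
        sigma = (2.0 * dispersion * dt) ** 0.5  # dispersive step std. dev.
        positions = [0.0] * n
        for _ in range(steps):
            positions = [x + velocity * dt + rng.gauss(0.0, sigma)
                         for x in positions]
        return positions

    xs = track_particles(n=500, steps=100, dt=0.1, velocity=1.0, dispersion=0.01)
    mean_x = sum(xs) / len(xs)  # plume center, near velocity * elapsed time
    ```

    Repeating the run over many random aquifer realizations is what turns point predictions into the probability density functions of mass discharge the abstract describes.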

  8. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    USGS Publications Warehouse

    Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.
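    The 10-fold cross-validation used above to guard against over-fitting can be sketched generically. A simple least-squares line stands in for the Bayesian spatial models, and the data are synthetic; only the fold/train/test bookkeeping is the point:

    ```python
    # Generic k-fold cross-validation sketch (illustrative stand-in model and data).
    def kfold_indices(n, k):
        folds = [[] for _ in range(k)]
        for i in range(n):
            folds[i % k].append(i)
        return folds

    def fit_line(xs, ys):
        # Ordinary least squares for y = a + b*x.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return my - b * mx, b

    def cv_rmse(xs, ys, k=10):
        errs = []
        for fold in kfold_indices(len(xs), k):
            train = [i for i in range(len(xs)) if i not in fold]
            a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
            errs += [(ys[i] - (a + b * xs[i])) ** 2 for i in fold]
        return (sum(errs) / len(errs)) ** 0.5

    rmse = cv_rmse([float(i) for i in range(20)], [2.0 * i + 1.0 for i in range(20)])
    ```

    Held-out error, not in-sample fit, is what exposed the over-fitted specifications in the study: a model can improve goodness of fit while its cross-validated RMSE worsens.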

  9. Evaluation of wave runup predictions from numerical and parametric models

    USGS Publications Warehouse

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
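    The assimilated prediction above, a weighted average of the parameterized model and the numerical simulation, reduces error variance when each prediction is weighted by its inverse error variance. The predictions and variances below are illustrative assumptions, not values from the study:

    ```python
    # Inverse-variance weighting of two runup predictors (illustrative numbers).
    def combine(pred_a, pred_b, var_a, var_b):
        wa, wb = 1.0 / var_a, 1.0 / var_b
        return (wa * pred_a + wb * pred_b) / (wa + wb)

    def combined_variance(var_a, var_b):
        # Always below min(var_a, var_b) for independent errors.
        return 1.0 / (1.0 / var_a + 1.0 / var_b)

    runup = combine(pred_a=2.1, pred_b=1.7, var_a=0.04, var_b=0.09)
    var_c = combined_variance(0.04, 0.09)
    ```

    The combined estimate leans toward the lower-variance predictor, which matches the study's finding that the weighted average outperformed either model alone.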

  10. Predictive Monitoring for Improved Management of Glucose Levels

    PubMed Central

    Reifman, Jaques; Rajaraman, Srinivasan; Gribok, Andrei; Ward, W. Kenneth

    2007-01-01

    Background Recent developments and expected near-future improvements in continuous glucose monitoring (CGM) devices provide opportunities to couple them with mathematical forecasting models to produce predictive monitoring systems for early, proactive glycemia management of diabetes mellitus patients before glucose levels drift to undesirable levels. This article assesses the feasibility of data-driven models to serve as the forecasting engine of predictive monitoring systems. Methods We investigated the capabilities of data-driven autoregressive (AR) models to (1) capture the correlations in glucose time-series data, (2) make accurate predictions as a function of prediction horizon, and (3) be made portable from individual to individual without any need for model tuning. The investigation is performed by employing CGM data from nine type 1 diabetic subjects collected over a continuous 5-day period. Results With CGM data serving as the gold standard, AR model-based predictions of glucose levels assessed over nine subjects with Clarke error grid analysis indicated that, for a 30-minute prediction horizon, individually tuned models yield 97.6 to 100.0% of data in the clinically acceptable zones A and B, whereas cross-subject, portable models yield 95.8 to 99.7% of data in zones A and B. Conclusions This study shows that, for a 30-minute prediction horizon, data-driven AR models provide sufficiently accurate and clinically acceptable estimates of glucose levels for timely, proactive therapy and should be considered as the modeling engine for predictive monitoring of patients with type 1 diabetes mellitus. It also suggests that AR models can be made portable from individual to individual with minor performance penalties, while greatly reducing the burden associated with model tuning and data collection for model development. PMID:19885110
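    The autoregressive forecasting engine can be sketched with an AR(1) model fit by least squares on a demeaned series and iterated forward over the horizon. The study used higher-order AR models on real CGM data; the toy series and the 5-minute sampling assumption below are illustrative stand-ins:

    ```python
    # Minimal AR(1) fit-and-forecast sketch (toy data, not CGM measurements).
    def fit_ar1(series):
        m = sum(series) / len(series)
        d = [x - m for x in series]
        phi = (sum(a * b for a, b in zip(d[1:], d[:-1]))
               / sum(x * x for x in d[:-1]))
        return m, phi

    def forecast(last_value, m, phi, steps):
        # Iterate the fitted recursion x_t = phi * x_{t-1} around the mean.
        x = last_value - m
        preds = []
        for _ in range(steps):
            x *= phi
            preds.append(m + x)
        return preds

    glucose = [100.0 + (-1) ** i for i in range(50)]  # toy alternating series
    m, phi = fit_ar1(glucose)
    # Assuming 5-minute CGM samples, a 30-minute horizon is 6 steps ahead.
    preds = forecast(glucose[-1], m, phi, steps=6)
    ```

    Portability across patients amounts to freezing `m` and `phi` from one subject's fit and applying `forecast` to another subject's series, which is exactly the cross-subject experiment the abstract reports.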

  11. Seismic Velocity Gradients Across the Transition Zone

    NASA Astrophysics Data System (ADS)

    Escalante, C.; Cammarano, F.; de Koker, N.; Piazzoni, A.; Wang, Y.; Marone, F.; Dalton, C.; Romanowicz, B.

    2006-12-01

    One-D elastic velocity models derived from mineral physics do a notoriously poor job at predicting the velocity gradients in the upper mantle transition zone, as well as some other features of models derived from seismological data. During the 2006 CIDER summer program, we computed Vs and Vp velocity profiles in the upper mantle based on three different mineral physics approaches: two approaches based on the minimization of Gibbs Free Energy (Stixrude and Lithgow-Bertelloni, 2005; Piazzoni et al., 2006) and one obtained by using experimentally determined phase diagrams (Weidner and Wang, 1998). The profiles were compared by assuming a vertical temperature profile and two end-member compositional models, the pyrolite model of Ringwood (1979) and the piclogite model of Anderson and Bass (1984). The predicted seismic profiles, which are significantly different from each other, primarily due to different choices of properties of single minerals and their extrapolation with temperature, are tested against a global dataset of P and S travel times and spheroidal and toroidal normal mode eigenfrequencies. All the models derived using a potential temperature of 1600K predict seismic velocities that are too slow in the upper mantle, suggesting the need to use a colder geotherm. The velocity gradient in the transition zone is somewhat better for piclogite than for pyrolite, possibly indicating the need to increase Ca content. The presence of stagnant slabs in the transition zone is a possible explanation for the need for 1) colder temperature and 2) increased Ca content. Future improvements in seismic profiles obtained from mineral physics will arise from better knowledge of elastic properties of upper mantle constituents and aggregates at high temperature and pressure, a better understanding of differences between thermodynamic models, and possibly the effect of water through and on Q. High-resolution seismic constraints on velocity jumps at 400 and 660 km also need to be included.

  12. Genetic determinants of freckle occurrence in the Spanish population: Towards ephelides prediction from human DNA samples.

    PubMed

    Hernando, Barbara; Ibañez, Maria Victoria; Deserio-Cuesta, Julio Alberto; Soria-Navarro, Raquel; Vilar-Sastre, Inca; Martinez-Cadenas, Conrado

    2018-03-01

    Prediction of human pigmentation traits, one of the most differentiable externally visible characteristics among individuals, from biological samples represents a useful tool in the field of forensic DNA phenotyping. In spite of freckling being a relatively common pigmentation characteristic in Europeans, little is known about the genetic basis of this largely genetically determined phenotype in southern European populations. In this work, we explored the predictive capacity of eight freckle and sunlight sensitivity-related genes in 458 individuals (266 non-freckled controls and 192 freckled cases) from Spain. Four loci were associated with freckling (MC1R, IRF4, ASIP and BNC2), and female sex was also found to be a predictive factor for having a freckling phenotype in our population. After identifying the most informative genetic variants responsible for human ephelides occurrence in our sample set, we developed a DNA-based freckle prediction model using a multivariate regression approach. Once developed, the capabilities of the prediction model were tested by a repeated 10-fold cross-validation approach. The proportion of correctly predicted individuals using the DNA-based freckle prediction model was 74.13%. The implementation of sex into the DNA-based freckle prediction model slightly improved the overall prediction accuracy by 2.19% (76.32%). Further evaluation of the newly-generated prediction model was performed by assessing the model's performance in a new cohort of 212 Spanish individuals, reaching a classification success rate of 74.61%. Validation of this prediction model may be carried out in larger populations, including samples from different European populations. Further research to validate and improve this newly-generated freckle prediction model will be needed before its forensic application. Together with DNA tests already validated for eye and hair colour prediction, this freckle prediction model may lead to a substantially more detailed physical description of unknown individuals from DNA found at the crime scene. Copyright © 2017 Elsevier B.V. All rights reserved.
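    A multivariate (logistic) prediction model of this kind turns genotypes plus sex into a probability and then a class call. The coefficients, intercept, and allele counts below are made-up placeholders, not the published model; only the logit-to-probability mechanics are standard:

    ```python
    import math

    # Sketch of a logistic freckle-prediction model (hypothetical coefficients).
    def predict_freckles(features, coefs, intercept, threshold=0.5):
        logit = intercept + sum(b * x for b, x in zip(coefs, features))
        prob = 1.0 / (1.0 + math.exp(-logit))  # logistic link
        return prob, prob >= threshold

    # Hypothetical risk-allele counts for MC1R, IRF4, ASIP, BNC2 plus sex (1=female).
    prob, freckled = predict_freckles(
        features=[2, 1, 0, 1, 1],
        coefs=[0.9, 0.5, 0.4, 0.3, 0.4],
        intercept=-2.0,
    )
    ```

    Repeated 10-fold cross-validation of such a model, as in the study, reports the fraction of held-out individuals whose class call matches their observed phenotype.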

  13. Software Tools for Weed Seed Germination Modeling

    USDA-ARS?s Scientific Manuscript database

    The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...

  14. Dynamic Bayesian Networks for Student Modeling

    ERIC Educational Resources Information Center

    Kaser, Tanja; Klingler, Severin; Schwing, Alexander G.; Gross, Markus

    2017-01-01

    Intelligent tutoring systems adapt the curriculum to the needs of the individual student. Therefore, an accurate representation and prediction of student knowledge is essential. Bayesian Knowledge Tracing (BKT) is a popular approach for student modeling. The structure of BKT models, however, makes it impossible to represent the hierarchy and…

  15. The Effects of Autonomy-supportive Coaching, Need Satisfaction and Self-Perceptions on Initiative and Identity in Youth Swimmers

    PubMed Central

    Coatsworth, J. Douglas; Conroy, David E.

    2015-01-01

    This study tested a sequential process model linking youth sport coaching climates (perceived coach behaviors and perceived need satisfaction) to youth self-perceptions (perceived competence and global self-esteem) and youth development outcomes (initiative, identity reflection, identity exploration). A sample of 119 youth between the ages 10–18 who participated in a community-directed summer swim league completed questionnaires over the course of the seven-week season. Results indicated that coaches’ autonomy support, particularly via process-focused praise, predicted youth competence and relatedness need satisfaction in the coaching relationship. Youth competence need satisfaction predicted self-esteem indirectly via perceived competence. Finally, self-esteem predicted identity reflection and perceived competence predicted both identity reflection and initiative. Effects of age, sex, and perceptions of direct contact with the coach were not significant. Findings suggest that the quality of the coaching climate is an important predictor of the developmental benefits of sport participation and that one pathway by which the coaching climate has its effect on initiative and identity reflection is through developing youth self-perceptions. PMID:19271821

  16. Thermal Vacuum Test Correlation of a Zero Propellant Load Case Thermal Capacitance Propellant Gauging Analytical Model

    NASA Technical Reports Server (NTRS)

    Mckim, Stephen A.

    2016-01-01

    This thesis describes the development and correlation of a thermal model that forms the foundation of a thermal capacitance spacecraft propellant load estimator. Specific details of creating the thermal model for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale spacecraft using ANSYS and the correlation process implemented are presented. The thermal model was correlated to within plus or minus 3 degrees Celsius of the thermal vacuum test data, and was determined sufficient to make future propellant predictions on MMS. The model was also found to be relatively sensitive to uncertainties in applied heat flux and mass knowledge of the tank. More work is needed to improve temperature predictions in the upper hemisphere of the propellant tank, where predictions were found to be 2 to 2.5 degrees Celsius lower than the test data. A road map for applying the model to predict propellant loads on the actual MMS spacecraft toward its end of life in 2017-2018 is also presented.
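    The thermal-capacitance gauging idea can be reduced to a back-of-envelope lumped model: a known heat input and the observed temperature rise give the total thermal capacitance, and subtracting the dry tank's contribution leaves the propellant mass. All numbers and the lumped-capacitance simplification below are illustrative assumptions, not values from the thesis:

    ```python
    # Lumped thermal-capacitance sketch: Q = (C_tank + m * cp) * dT, solved for m.
    # (Hypothetical parameters; the actual estimator uses a correlated FE model.)
    def propellant_mass(heat_joules, temp_rise_k, cp_prop, tank_capacitance):
        total_capacitance = heat_joules / temp_rise_k   # J/K absorbed per kelvin
        return (total_capacitance - tank_capacitance) / cp_prop

    m = propellant_mass(heat_joules=3.0e5, temp_rise_k=5.0,
                        cp_prop=1700.0, tank_capacitance=2.0e4)
    ```

    The sensitivity the abstract notes follows directly: errors in the applied heat flux (`heat_joules`) or in the dry-tank capacitance propagate linearly into the inferred mass.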

  17. Prognostic models for renal cell carcinoma recurrence: external validation in a Japanese population.

    PubMed

    Utsumi, Takanobu; Ueda, Takeshi; Fukasawa, Satoshi; Komaru, Atsushi; Sazuka, Tomokazu; Kawamura, Koji; Imamoto, Takashi; Nihei, Naoki; Suzuki, Hiroyoshi; Ichikawa, Tomohiko

    2011-09-01

    The aim of the present study was to compare the accuracy of three prognostic models in predicting recurrence-free survival among Japanese patients who underwent nephrectomy for non-metastatic renal cell carcinoma (RCC). Patients originated from two centers: Chiba University Hospital (n = 152) and Chiba Cancer Center (n = 65). The following data were collected: age, sex, clinical presentation, Eastern Cooperative Oncology Group performance status, surgical technique, 1997 tumor-node-metastasis stage, clinical and pathological tumor size, histological subtype, disease recurrence, and progression. Three western models, including Yaycioglu's model, Cindolo's model and Kattan's nomogram, were used to predict recurrence-free survival. The predictive accuracy of these models was validated using Harrell's concordance index. Concordance indexes were 0.795 and 0.745 for Kattan's nomogram, 0.700 and 0.634 for Yaycioglu's model, and 0.700 and 0.634 for Cindolo's model, respectively. Furthermore, the constructed calibration plots of Kattan's nomogram overestimated the predicted probability of recurrence-free survival after 5 years compared with the actual probability. Our findings suggest that, despite working better than the other predictive tools, Kattan's nomogram needs to be used with caution when applied to Japanese patients who have undergone nephrectomy for non-metastatic RCC. © 2011 The Japanese Urological Association.
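    Harrell's concordance index used above can be sketched directly: among usable pairs (where one subject had an observed event before the other's follow-up time), it is the fraction in which the higher predicted risk failed first. The toy data are illustrative; tie handling here is deliberately simple:

    ```python
    # Simplified Harrell's C-index for right-censored survival data.
    def concordance_index(times, events, risks):
        usable = 0
        concordant = 0.0
        n = len(times)
        for i in range(n):
            for j in range(n):
                # Pair usable when subject i had an observed event before time j.
                if events[i] == 1 and times[i] < times[j]:
                    usable += 1
                    if risks[i] > risks[j]:
                        concordant += 1.0
                    elif risks[i] == risks[j]:
                        concordant += 0.5  # ties get half credit
        return concordant / usable

    # Perfectly ranked toy cohort: highest predicted risk recurs first.
    c = concordance_index(times=[1, 2, 3, 4], events=[1, 1, 1, 1],
                          risks=[0.9, 0.7, 0.4, 0.1])
    ```

    A C-index of 0.5 is chance-level ranking and 1.0 is perfect, which puts the reported 0.634 to 0.795 range in context.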

  18. PREDICTING CLINICALLY DIAGNOSED DYSENTERY INCIDENCE OBTAINED FROM MONTHLY CASE REPORTING BASED ON METEOROLOGICAL VARIABLES IN DALIAN, LIAONING PROVINCE, CHINA, 2005-2011 USING A DEVELOPED MODEL.

    PubMed

    An, Qingyu; Yao, Wei; Wu, Jun

    2015-03-01

    This study describes our development of a model to predict the incidence of clinically diagnosed dysentery in Dalian, Liaoning Province, China, using time series analysis. The model was developed using the seasonal autoregressive integrated moving average (SARIMA). Spearman correlation analysis was conducted to explore the relationship between meteorological variables and the incidence of clinically diagnosed dysentery. The meteorological variables which significantly correlated with the incidence of clinically diagnosed dysentery were then used as covariables in the model, which incorporated the monthly incidence of clinically diagnosed dysentery from 2005 to 2010 in Dalian. After model development, a simulation was conducted for the year 2011 and the results of this prediction were compared with the real observed values. The model performed best when the temperature data for the preceding month was used to predict clinically diagnosed dysentery during the following month. The developed model was effective and reliable in predicting the incidence of clinically diagnosed dysentery for most but not all months, and may be a useful tool for dysentery disease control and prevention, but further studies are needed to fine tune the model.
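    The key lag structure reported above, using the preceding month's temperature to predict the following month's incidence, can be sketched with a plain lag-1 regression. The paper fits a full SARIMA with meteorological covariates; the least-squares stand-in and synthetic numbers below are illustrative only:

    ```python
    # Lag-1 regression sketch: incidence[t] ~ intercept + slope * temperature[t-1].
    def fit_lagged(temps, incidence, lag=1):
        xs = temps[:-lag]
        ys = incidence[lag:]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return my - slope * mx, slope

    temps = [5.0, 7.0, 3.0, 8.0, 6.0, 9.0, 4.0, 10.0]       # synthetic monthly means
    incidence = [12.0] + [2.0 * t + 3.0 for t in temps[:-1]]  # built-in lag-1 link
    intercept, slope = fit_lagged(temps, incidence)
    next_month = intercept + slope * temps[-1]  # forecast from latest temperature
    ```

    Validating such a model as the study did means fitting on 2005-2010, forecasting 2011 month by month, and comparing forecasts with the observed case counts.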

  19. An Arrhenius-type viscosity function to model sintering using the Skorohod Olevsky viscous sintering model within a finite element code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewsuk, Kevin Gregory; Arguello, Jose Guadalupe, Jr.; Reiterer, Markus W.

    2006-02-01

    The ease and ability to predict sintering shrinkage and densification with the Skorohod-Olevsky viscous sintering (SOVS) model within a finite-element (FE) code have been improved with the use of an Arrhenius-type viscosity function. The need for a better viscosity function was identified by evaluating SOVS model predictions made using a previously published polynomial viscosity function. Predictions made using the original, polynomial viscosity function do not accurately reflect experimentally observed sintering behavior. To more easily and better predict sintering behavior using FE simulations, a thermally activated viscosity function based on creep theory was used with the SOVS model. In comparison with the polynomial viscosity function, SOVS model predictions made using the Arrhenius-type viscosity function are more representative of experimentally observed viscosity and sintering behavior. Additionally, the effects of changes in heating rate on densification can easily be predicted with the Arrhenius-type viscosity function. Another attribute of the Arrhenius-type viscosity function is that it provides the potential to link different sintering models. For example, the apparent activation energy, Q, for densification used in the construction of the master sintering curve for a low-temperature cofire ceramic dielectric has been used as the apparent activation energy for material flow in the Arrhenius-type viscosity function to predict heating rate-dependent sintering behavior using the SOVS model.
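    An Arrhenius-type (thermally activated) viscosity has the standard form eta(T) = eta0 * exp(Q / (R * T)), so viscosity falls steeply as temperature rises. The parameter values below are illustrative assumptions, not those fitted in the paper:

    ```python
    import math

    R = 8.314  # gas constant, J/(mol K)

    # Arrhenius-type viscosity: thermally activated flow with apparent
    # activation energy Q (hypothetical eta0 and Q values).
    def viscosity(temp_k, eta0, activation_energy):
        return eta0 * math.exp(activation_energy / (R * temp_k))

    low_t = viscosity(900.0, eta0=1e-3, activation_energy=3.0e5)
    high_t = viscosity(1100.0, eta0=1e-3, activation_energy=3.0e5)
    # Viscosity drops with temperature, driving densification during sintering.
    ```

    Because the same apparent activation energy Q appears in the master sintering curve, fitting Q once lets the two models be linked, as the abstract notes.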

  20. Consensus models to predict endocrine disruption for all ...

    EPA Pesticide Factsheets

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular, their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). The Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to be a demonstration of the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32464 unique structures for one specific molecular target – the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity, and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an exte
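    The simplest form of the consensus step is a majority vote over the individual models' binary activity calls for each chemical. CERAPP's published consensus used a more elaborate weighted scoring; the sketch below is a simplified stand-in:

    ```python
    # Majority-vote consensus over per-model binary calls
    # (1 = predicted estrogen-receptor active, 0 = inactive).
    def consensus(predictions):
        return 1 if 2 * sum(predictions) > len(predictions) else 0

    call = consensus([1, 1, 0, 1, 0])  # 3 of 5 models say active
    ```

    Applied per chemical across the 32464-structure universe, such a consensus damps the errors of any single QSAR or docking model, which is the rationale the abstract gives for combining them.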

  1. From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2009-12-01

    Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires a high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.

  2. Subseasonal-to-Seasonal Science and Prediction Initiatives of the NOAA MAPP Program

    NASA Astrophysics Data System (ADS)

    Archambault, H. M.; Barrie, D.; Mariotti, A.

    2016-12-01

    There is great practical interest in developing predictions beyond the 2-week weather timescale. Scientific communities have historically organized themselves around the weather and climate problems, but the subseasonal-to-seasonal (S2S) timescale range overall is recognized as new territory for which a concerted shared effort is needed. For instance, the climate community, as part of programs like CLIVAR, has historically tackled coupled phenomena and modeling, keys to harnessing predictability on longer timescales. In contrast, the weather community has focused on synoptic dynamics, higher-resolution modeling, and enhanced model initialization, of importance at the shorter timescales and especially for the prediction of extremes. The processes and phenomena specific to timescales between weather and climate require a unified approach to science, modeling, and predictions. Internationally, the WWRP/WCRP S2S Prediction Project is a promising catalyzer for these types of activities. Among the various contributing U.S. research programs, the Modeling, Analysis, Predictions and Projections (MAPP) program, as part of the NOAA Climate Program Office, has launched coordinated research and transition activities that help to meet the agency's goals to fill the weather-to-climate prediction gap and will contribute to advance international goals. This presentation will describe ongoing MAPP program S2S science and prediction initiatives, specifically the MAPP S2S Task Force and the SubX prediction experiment.

  3. CALCULATION OF PHYSICOCHEMICAL PROPERTIES FOR ENVIRONMENTAL MODELING

    EPA Science Inventory

    Recent trends in environmental regulatory strategies dictate that EPA will rely heavily on predictive modeling to carry out the increasingly complex array of exposure and risk assessments necessary to develop scientifically defensible regulations. In response to this need, resea...

  4. A comparison of hydrologic models for ecological flows and water availability

    Treesearch

    Peter V. Caldwell; Jonathan G. Kennen; Ge Sun; Julie E. Kiang; Jon B. Butcher; Michele C. Eddy; Lauren E. Hay; Jacob H. LaFontaine; Ernie F. Hain; Stacy A. C. Nelson; Steve G. McNulty

    2015-01-01

    Robust hydrologic models are needed to help manage water resources for healthy aquatic ecosystems and reliable water supplies for people, but there is a lack of comprehensive model comparison studies that quantify differences in streamflow predictions among model applications developed to answer management questions. We assessed differences in daily streamflow...

  5. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project

    PubMed Central

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a Gartner 2012 report, only 75%–80% of IT projects are successful. Customer satisfaction should be considered as a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle rather than waiting for lessons to be learnt and taking reactive steps. This paper gives the practical applicability of using predictive models and illustrates the use of these models in a project to predict system testing defects, thus helping to reduce residual defects. PMID:26495427

  6. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    PubMed

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a Gartner 2012 report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered as a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. Focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle rather than waiting for lessons to be learnt and taking reactive steps. This paper gives the practical applicability of using predictive models and illustrates the use of these models in a project to predict system testing defects, thus helping to reduce residual defects.

  7. Using dynamic population simulations to extend resource selection analyses and prioritize habitats for conservation

    USGS Publications Warehouse

    Heinrichs, Julie; Aldridge, Cameron L.; O'Donnell, Michael; Schumaker, Nathan

    2017-01-01

    Prioritizing habitats for conservation is a challenging task, particularly for species with fluctuating populations and seasonally dynamic habitat needs. Although the use of resource selection models to identify and prioritize habitat for conservation is increasingly common, their ability to characterize important long-term habitats for dynamic populations is variable. To examine how habitats might be prioritized differently if resource selection was directly and dynamically linked with population fluctuations and movement limitations among seasonal habitats, we constructed a spatially explicit individual-based model for a dramatically fluctuating population requiring temporally varying resources. Using greater sage-grouse (Centrocercus urophasianus) in Wyoming as a case study, we used resource selection function (RSF) maps to guide seasonal movement and habitat selection, but emergent population dynamics and simulated movement limitations modified long-term habitat occupancy. We compared priority habitats in RSF maps to long-term simulated habitat use. We examined the circumstances under which the explicit consideration of movement limitations, in combination with population fluctuations and trends, are likely to alter predictions of important habitats. In doing so, we assessed the future occupancy of protected areas under alternative population and habitat conditions. Habitat prioritizations based on resource selection models alone predicted high use in isolated parcels of habitat and in areas with low connectivity among seasonal habitats. In contrast, results based on more biologically informed simulations emphasized central and connected areas near high-density populations, sometimes predicted to have low selection value. Dynamic models of habitat use can provide additional biological realism that can extend, and in some cases contradict, habitat use predictions generated from short-term or static resource selection analyses.
The explicit inclusion of population dynamics and movement propensities via spatial simulation modeling frameworks may provide an informative means of predicting long-term habitat use, particularly for fluctuating populations with complex seasonal habitat needs. Importantly, our results indicate the possible need to consider habitat selection models as a starting point rather than the common end point for refining and prioritizing habitats for protection for cyclic and highly variable populations.

  8. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, after the number of notified cases was the highest among the developed countries in 2006. Thus, there is a need to develop a model or tool to predict the number of campylobacteriosis cases accurately, as the Microbial Risk Assessment Model previously used for this purpose failed to do so. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data has additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction by the additive ARIMA with intervention was slightly better than that of the Holt-Winters multiplicative method for the annual total in year 2010, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide a good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic.
© 2013 Blackwell Verlag GmbH.
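
    For readers unfamiliar with the multiplicative Holt-Winters method compared above, a minimal pure-Python sketch for monthly data follows. The smoothing parameters are fixed illustrative values, not the optimized ones a study like this would fit (e.g. via statsmodels):

```python
def holt_winters_mult(y, m=12, alpha=0.3, beta=0.05, gamma=0.1, horizon=12):
    """Multiplicative Holt-Winters: level + trend, seasonal factors multiply."""
    # initialize level, trend, and seasonal indices from the first two seasons
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2
    season = [y[i] / level for i in range(m)]
    for t in range(m, len(y)):
        last_level = level
        level = alpha * y[t] / season[t % m] + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * y[t] / level + (1 - gamma) * season[t % m]
    # h-step-ahead forecasts reuse the last fitted seasonal indices
    n = len(y)
    return [(level + (h + 1) * trend) * season[(n + h) % m]
            for h in range(horizon)]
```

    On a series with a rising trend and a repeating monthly pattern, the twelve forecasts reproduce both the growth and the seasonal shape, which is why the method suits notified-case counts with strong seasonality.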

  9. Correlation of Wissler Human Thermal Model Blood Flow and Shiver Algorithms

    NASA Technical Reports Server (NTRS)

    Bue, Grant; Makinen, Janice; Cognata, Thomas

    2010-01-01

    The Wissler Human Thermal Model (WHTM) is a thermal math model of the human body that has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. The model has been shown to predict core temperature and skin temperatures higher and lower, respectively, than in tests of subjects in a crew escape suit working in controlled hot environments. Conversely, the model predicts core temperature and skin temperatures lower and higher, respectively, than in tests of lightly clad subjects immersed in cold water conditions. The blood flow algorithms of the model have been investigated to allow for more and less flow, respectively, for the cold and hot cases. These changes in the model have yielded better correlation of skin and core temperatures in the cold and hot cases. The algorithm for onset of shiver did not need to be modified to achieve good agreement in cold immersion simulations.

  10. Pretest predictions for the response of a 1:8-scale steel LWR containment building model to static overpressurization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clauss, D.B.

    The analyses used to predict the behavior of a 1:8-scale model of a steel LWR containment building to static overpressurization are described and results are presented. Finite strain, large displacement, and nonlinear material properties were accounted for using finite element methods. Three-dimensional models were needed to analyze the penetrations, which included operable equipment hatches, personnel lock representations, and a constrained pipe. It was concluded that the scale model would fail due to leakage caused by large deformations of the equipment hatch sleeves. 13 refs., 34 figs., 1 tab.

  11. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model.

    PubMed

    Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term.
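
    The RIM is described as an additive model of weighted factors. A sketch of that scoring scheme follows; the two most important factor names come from the abstract, but all weights, the remaining factor names, and the ratings are invented for illustration, not the validated values from the study:

```python
def readiness_score(weights, ratings):
    """Weighted additive score: sum of weight_i * rating_i (weights sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[f] * ratings[f] for f in weights)

weights = {  # hypothetical relative importances, NOT the study's weights
    "organizational motivation": 0.25,
    "meeting user needs": 0.20,
    "leadership support": 0.15,
    "resources": 0.12,
    "IT infrastructure": 0.10,
    "staff skills": 0.10,
    "external environment": 0.08,
}
ratings = {f: 0.5 for f in weights}  # ratings on a 0-1 scale
ratings["organizational motivation"] = 0.9
ratings["meeting user needs"] = 0.8
score = readiness_score(weights, ratings)
```

    An organization scoring high on the heavily weighted factors ends up well above the mid-scale baseline, which is the behavior a readiness cut-off score would exploit.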

  12. Comparison of prediction methods for octanol-air partition coefficients of diverse organic compounds.

    PubMed

    Fu, Zhiqiang; Chen, Jingwen; Li, Xuehua; Wang, Ya'nan; Yu, Haiying

    2016-04-01

    The octanol-air partition coefficient (KOA) is needed for assessing multimedia transport and bioaccumulability of organic chemicals in the environment. As experimental determination of KOA for various chemicals is costly and laborious, development of KOA estimation methods is necessary. We investigated three methods for KOA prediction: conventional quantitative structure-activity relationship (QSAR) models based on molecular structural descriptors, group contribution models based on atom-centered fragments, and a novel model that predicts KOA via the solvation free energy from the air to the octanol phase (ΔGO(0)), with a collection of 939 experimental KOA values for 379 compounds at different temperatures (263.15-323.15 K) as validation or training sets. The developed models were evaluated with the OECD guidelines on QSAR model validation and applicability domain (AD) description. Results showed that although the ΔGO(0) model is theoretically sound and has a broad AD, its prediction accuracy is the poorest. The QSAR models perform better than the group contribution models, and have similar predictability and accuracy to the conventional method that estimates KOA from the octanol-water partition coefficient and Henry's law constant. One QSAR model, which can predict KOA at different temperatures, was recommended for applications such as assessing the long-range transport potential of chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
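
    Predicting KOA at different temperatures typically exploits the fact that log KOA is close to linear in 1/T (van 't Hoff-type behavior). A least-squares sketch on invented measurements, not data from the paper:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# hypothetical log10 KOA measurements for one compound at several T (K)
temps = [263.15, 283.15, 303.15, 323.15]
log_koa = [9.8, 8.9, 8.1, 7.4]
a, b = fit_line([1.0 / T for T in temps], log_koa)

def predict(T):
    """Interpolate/extrapolate log10 KOA in temperature."""
    return a + b / T
```

    The positive slope b reflects that partitioning into octanol strengthens as temperature falls, which is why temperature coverage in the training set matters for this kind of model.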

  13. Non-animal assessment of skin sensitization hazard: Is an integrated testing strategy needed, and if so what should be integrated?

    PubMed

    Roberts, David W; Patlewicz, Grace

    2018-01-01

    There is an expectation that to meet regulatory requirements, and avoid or minimize animal testing, integrated approaches to testing and assessment will be needed that rely on assays representing key events (KEs) in the skin sensitization adverse outcome pathway. Three non-animal assays have been formally validated and regulatory adopted: the direct peptide reactivity assay (DPRA), the KeratinoSens™ assay and the human cell line activation test (h-CLAT). There have been many efforts to develop integrated approaches to testing and assessment with the "two out of three" approach attracting much attention. Here a set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the three individual non-animal assays, their binary combinations and the "two out of three" approach in predicting skin sensitization potential. The most predictive approach was to use both the DPRA and h-CLAT as follows: (1) perform DPRA - if positive, classify as sensitizing, and (2) if negative, perform h-CLAT - a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 85% (local lymph node assay) and 93% (human) of non-sensitizer predictions were correct, whereas the "two out of three" approach had 69% (local lymph node assay) and 79% (human) of non-sensitizer predictions correct. The findings are consistent with the argument, supported by published quantitative mechanistic models that only the first KE needs to be modeled. All three assays model this KE to an extent. The value of using more than one assay depends on how the different assays compensate for each other's technical limitations. Copyright © 2017 John Wiley & Sons, Ltd.
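
    The two strategies compared above are simple decision rules. A sketch, with assay outcomes represented as booleans (True = positive); this mirrors the sequential rule stated in the abstract and the "two out of three" majority rule it was compared against:

```python
def sequential_classification(dpra_positive, hclat_positive=None):
    """Sequential strategy: DPRA first; a DPRA negative is resolved by h-CLAT."""
    if dpra_positive:
        return "sensitizer"
    return "sensitizer" if hclat_positive else "non-sensitizer"

def two_out_of_three(dpra, keratinosens, hclat):
    """Majority vote across DPRA, KeratinoSens, and h-CLAT."""
    return "sensitizer" if sum([dpra, keratinosens, hclat]) >= 2 else "non-sensitizer"
```

    Note the sequential rule never reaches a "non-sensitizer" call on a single negative, which is one reason its negative predictions were more reliable in this data set.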

  14. Plans for Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Ballmann, Josef; Bhatia, Kumar; Blades, Eric; Boucke, Alexander; Chwalowski, Pawel; Dietz, Guido; Dowell, Earl; Florance, Jennifer P.; Hansen, Thorsten; et al.

    2011-01-01

    This paper summarizes the plans for the first Aeroelastic Prediction Workshop. The workshop is designed to assess the state of the art of computational methods for predicting unsteady flow fields and aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify computational and experimental areas needing additional research and development. Three subject configurations have been chosen from existing wind tunnel data sets where there is pertinent experimental data available for comparison. For each case chosen, the wind tunnel testing was conducted using forced oscillation of the model at specified frequencies.

  15. A comparison between the observed and predicted Fe II spectrum in different plasmas

    NASA Astrophysics Data System (ADS)

    Johansson, S.

    This paper gives a survey of the spectral distribution of emission lines of Fe II, predicted from a single atomic model. The observed differences between the recorded and the predicted spectrum are discussed in terms of deficiencies of the model and interactions within the emitting plasma. A number of illustrative examples of unexpected features with applications to astrophysics are given. Selective population, due to charge transfer and resonant photo-excitation, is elucidated. The future need for more laboratory data for Fe II, as regards energy levels and line classification, is also discussed.

  16. 20171015 - Integrating Toxicity, Toxicokinetic, and Exposure Data for Risk-based Chemical Alternatives Assessment (ISES)

    EPA Science Inventory

    In order to predict the margin between the dose needed for adverse chemical effects and actual human exposure rates, data on hazard, exposure, and toxicokinetics are needed. In vitro methods, biomonitoring, and mathematical modeling have provided initial estimates for many extant...

  17. Near-wall k-epsilon turbulence modeling

    NASA Technical Reports Server (NTRS)

    Mansour, N. N.; Kim, J.; Moin, P.

    1987-01-01

    The flow fields from a turbulent channel simulation are used to compute the budgets for the turbulent kinetic energy (k) and its dissipation rate (epsilon). Data from boundary layer simulations are used to analyze the dependence of the eddy-viscosity damping function on the Reynolds number and the distance from the wall. The computed budgets are used to test existing near-wall turbulence models of the k-epsilon type. It was found that the turbulent transport models should be modified in the vicinity of the wall. It was also found that existing models for the different terms in the epsilon-budget are adequate in the region away from the wall, but need modification near the wall. The channel flow is computed using a k-epsilon model with an eddy-viscosity damping function from the data and no damping functions in the epsilon-equation. These computations show that the k-profile can be adequately predicted, but to correctly predict the epsilon-profile, damping functions in the epsilon-equation are needed.
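
    For context, a classic van Driest-style eddy-viscosity damping function is sketched below. This is the textbook form with A+ = 26 as an assumed constant; the paper above derives its damping behavior from simulation data rather than from this formula:

```python
import math

def van_driest_damping(y_plus, a_plus=26.0):
    """Damping factor for eddy viscosity near a wall.

    Tends to 0 at the wall (y+ = 0) and to 1 far from it; a_plus = 26
    is the classic van Driest constant, used here as an assumption.
    """
    return 1.0 - math.exp(-y_plus / a_plus)
```

    Functions of this shape suppress the modeled eddy viscosity in the viscous sublayer, which is the near-wall correction the budget comparisons in this record are probing.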

  18. Predicting needs for nursing home admission - does sense of coherence delay nursing home admission in care dependent older people? A longitudinal study.

    PubMed

    Thygesen, Elin; Saevareid, Hans Inge; Lindstrom, Torill Christine; Nygaard, Harald A; Engedal, Knut

    2009-03-01

    Objectives.  This study examined predisposing, enabling and need variables (Andersen's Behavioral Model) influencing the need for nursing home admission (NHA) in older people receiving home nursing care. In particular, the potential role of coping ability, measured as 'sense of coherence' (SOC), was studied. Design, sample, and measurements.  A survey with baseline- and follow-up data after a 2-year period was undertaken with 208 patients aged 75+. The measures used were: gender, education, age, social visits, SOC, social provision scale (SPS), self-rated health (SRH), general health questionnaire (GHQ), clinical dementia rating (CDR), Barthel activities of daily living (ADL) index, and registered illnesses (RI). A Cox proportional model was used to examine factors that could explain risk of NHA. Results.  Measures with predictive properties were Barthel ADL index, SPS, SRH, and gender. SOC, along with subjective health complaints, general health questionnaire, RI and social visits did not predict NHA. Conclusions.  It is concluded that the patients' subjective evaluations of both their health and perceived social support were important predictors of future NHA needs, and should be seriously taken into consideration, along with the more commonly used objective measures of ADL and CDR. © 2008 The Authors. Journal compilation © 2008 Blackwell Publishing Ltd.
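
    A fitted Cox model scores relative risk through the exponentiated linear predictor. A sketch of that scoring step follows; the predictor names echo the abstract's significant variables (ADL, social provision, self-rated health, gender), but every coefficient value is invented for illustration:

```python
import math

def relative_hazard(coefs, covariates):
    """Hazard relative to a baseline subject (all covariates zero):
    exp(sum of beta_i * x_i)."""
    return math.exp(sum(coefs[k] * covariates.get(k, 0.0) for k in coefs))

coefs = {  # hypothetical log-hazard coefficients, NOT from the study
    "barthel_adl_deficit": 0.04,   # per point of ADL deficit
    "low_social_provision": 0.50,
    "poor_self_rated_health": 0.60,
    "male": 0.30,
}
high_risk = relative_hazard(coefs, {"barthel_adl_deficit": 10,
                                    "low_social_provision": 1,
                                    "poor_self_rated_health": 1,
                                    "male": 1})
low_risk = relative_hazard(coefs, {})  # baseline subject
```

    Each exp(beta) is a hazard ratio, so subjective measures entering with nonzero coefficients raise the predicted risk of nursing home admission multiplicatively.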

  19. Landscape structure and management alter the outcome of a pesticide ERA: Evaluating impacts of endocrine disruption using the ALMaSS European Brown Hare model.

    PubMed

    Topping, Chris J; Dalby, Lars; Skov, Flemming

    2016-01-15

    There is a gradual change towards explicitly considering landscapes in regulatory risk assessment. To realise the objective of developing representative scenarios for risk assessment it is necessary to know how detailed a landscape representation is needed to generate a realistic risk assessment, and indeed how to generate such landscapes. This paper evaluates the contribution of landscape and farming components to a model based risk assessment of a fictitious endocrine disruptor on hares. In addition, we present methods and code examples for generation of landscape structures and farming simulation from data collected primarily for EU agricultural subsidy support and GIS map data. Ten different Danish landscapes were generated and the ERA carried out for each landscape using two different assumed toxicities. The results showed negative impacts in all cases, but the extent and form in terms of impacts on abundance or occupancy differed greatly between landscapes. A meta-model was created, predicting impact from landscape and farming characteristics. Scenarios based on all combinations of farming and landscape for five landscapes representing extreme and middle impacts were created. The meta-models developed from the 10 real landscapes failed to predict impacts for these 25 scenarios. Landscape, farming, and the emergent density of hares all influenced the results of the risk assessment considerably. The study indicates that prediction of a reasonable worst case scenario is difficult from structural, farming or population metrics; rather the emergent properties generated from interactions between landscape, management and ecology are needed. Meta-modelling may also fail to predict impacts, even when restricting inputs to combinations of those used to create the model. Future ERA may therefore need to make use of multiple scenarios representing a wide range of conditions to avoid locally unacceptable risks. 
This approach could now be feasible Europe-wide given the landscape generation methods presented.

  20. Testing the Predictive Power of Coulomb Stress on Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Woessner, J.; Lombardi, A.; Werner, M. J.; Marzocchi, W.

    2009-12-01

    Empirical and statistical models of clustered seismicity are usually strongly stochastic and perceived to be uninformative in their forecasts, since only marginal distributions are used, such as the Omori-Utsu and Gutenberg-Richter laws. In contrast, so-called physics-based aftershock models, based on seismic rate changes calculated from Coulomb stress changes and rate-and-state friction, make more specific predictions: anisotropic stress shadows and multiplicative rate changes. We test the predictive power of models based on Coulomb stress changes against statistical models, including the popular Short Term Earthquake Probabilities and Epidemic-Type Aftershock Sequences models: We score and compare retrospective forecasts on the aftershock sequences of the 1992 Landers, USA, the 1997 Colfiorito, Italy, and the 2008 Selfoss, Iceland, earthquakes. To quantify predictability, we use likelihood-based metrics that test the consistency of the forecasts with the data, including modified and existing tests used in prospective forecast experiments within the Collaboratory for the Study of Earthquake Predictability (CSEP). Our results indicate that a statistical model performs best. Moreover, two Coulomb model classes seem unable to compete: Models based on deterministic Coulomb stress changes calculated from a given fault-slip model, and those based on fixed receiver faults. One model of Coulomb stress changes does perform well and sometimes outperforms the statistical models, but its predictive information is diluted, because of uncertainties included in the fault-slip model. Our results suggest that models based on Coulomb stress changes need to incorporate stochastic features that represent model and data uncertainty.
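
    The likelihood-based metrics mentioned above typically score a forecast of expected event rates per space-time bin against observed counts under a Poisson assumption. A minimal sketch with invented rates and counts (not data from the study):

```python
import math

def poisson_log_likelihood(rates, counts):
    """Joint Poisson log-likelihood of observed bin counts given forecast
    rates: sum over bins of n*ln(lambda) - lambda - ln(n!)."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))
```

    A forecast whose rates match the observed counts scores a higher log-likelihood than one that badly misallocates rate across bins, which is the basis for ranking competing aftershock models.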

  1. Turbulent flow separation in three-dimensional asymmetric diffusers

    NASA Astrophysics Data System (ADS)

    Jeyapaul, Elbert

    2011-12-01

    Turbulent three-dimensional flow separation is more complicated than 2-D, and the physics of the flow is not well understood. Turbulent flow separation is nearly independent of the Reynolds number, and separation in 3-D occurs at singular points and along convergence lines emanating from these points. Most engineering turbulence research is driven by the need to gain knowledge of the flow field that can be used to improve modeling predictions. This work is motivated by the need for a detailed study of 3-D separation in asymmetric diffusers, to understand the separation phenomena using eddy-resolving simulation methods, assess the predictability of existing RANS turbulence models, and propose modeling improvements. The Cherry diffuser has been used as a benchmark. All existing linear eddy-viscosity RANS models (k-omega SST, k-epsilon, and v2-f) fail in predicting such flows, placing separation on the wrong side. The geometry has a doubly-sloped wall, with the other two walls orthogonal to each other and aligned with the diffuser inlet, giving the diffuser an asymmetry. The top and side flare angles are different, giving rise to a different pressure gradient in each transverse direction. Eddy-resolving simulations using the scale-adaptive simulation (SAS) and large eddy simulation (LES) methods have been used to predict separation in the benchmark diffuser and validated. A series of diffusers with the same configuration has been generated, each having the same streamwise pressure gradient and parametrized only by the inlet aspect ratio. The RANS models were put to the test and the flow physics explored using the SAS-generated flow field. The RANS models indicate a transition in the separation surface from the top sloped wall to the side sloped wall at an inlet aspect ratio much lower than observed in the LES results. This over-sensitivity of RANS models to transverse pressure gradients is due to the lack of anisotropy in the linear Reynolds stress formulation.
The complexity of the flow separation is due to the effects of lateral straining, streamline curvature, secondary flow of the second kind, and the transverse pressure gradient on turbulence. Resolving these effects is possible with anisotropy-resolving turbulence models such as the explicit algebraic Reynolds stress model (EARSM). This model has provided accurate predictions of streamwise and transverse velocity; however, the wall pressure is under-predicted. An improved EARSM model is developed by correcting the coefficients, which predicts a more accurate wall pressure. There remains scope for improvement of this model, by including convective effects and the dynamics of velocity gradient invariants.

  2. The oral health condition and treatment needs assessment of nursing home residents in Flanders (Belgium).

    PubMed

    Janssens, B; Vanobbergen, J; Petrovic, M; Jacquet, W; Schols, J M G A; De Visschere, L

    2017-09-01

    A study was conducted of nursing home residents with limited access to regular oral health care services to evaluate their oral health status, to perform an assessment of the need for oral treatment and to determine the possible predicting value of age, gender, care dependency and income level on their oral health status and treatment needs. Three experienced dentists collected clinical oral health data with a mobile dental unit in 23 nursing homes. Socio-demographic data were extracted from the residents' records in the nursing home. Besides the descriptive and bivariate analysis, a general linear mixed model analysis was also performed with the nursing home as random effect. The study sample consisted of 1,226 residents with a mean age of 83.9 years, of which 41.9% were edentulous. The mean D₃MFt in the dentate group was 24.5 and 77% needed extractions or fillings. In the group of residents wearing removable dentures, 36.9% needed repair, rebasing or renewal of the denture. The mixed model analysis demonstrated that with each year a resident gets older, the oral health outcomes get worse and that men have worse oral health and higher treatment needs than women. However, the level of income and care dependency had a less extensive role in predicting the oral health outcomes. The nursing home residents presented a poor overall oral health status and high dental and prosthetic treatment needs. Gender and age were important predicting variables for the oral health outcomes. Copyright© 2017 Dennis Barber Ltd.

  3. Reverse engineering of legacy agricultural phenology modeling system

USDA-ARS's Scientific Manuscript database

A program which implements predictive phenology modeling is a valuable tool for growers and scientists. Such a program was created in the late 1980s by the creators of general phenology modeling as proof of their techniques. However, this first program could not continue to meet the needs of the fi...

  4. Wheat mill stream properties for discrete element method modeling

USDA-ARS's Scientific Manuscript database

    A discrete phase approach based on individual wheat kernel characteristics is needed to overcome the limitations of previous statistical models and accurately predict the milling behavior of wheat. As a first step to develop a discrete element method (DEM) model for the wheat milling process, this s...

  5. Dynamic Evaluation of a Regional Air Quality Model: Assessing the Emissions-Induced Weekly Ozone Cycle

    EPA Science Inventory

    Air quality models are used to predict changes in pollutant concentrations resulting from envisioned emission control policies. Recognizing the need to assess the credibility of air quality models in a policy-relevant context, we perform a dynamic evaluation of the community Mult...

  6. Modeling chlorophyll-a and dissolved oxygen concentration in tropical floodplain lakes (Paraná River, Brazil).

    PubMed

    Rocha, R R A; Thomaz, S M; Carvalho, P; Gomes, L C

    2009-06-01

The need for prediction is widely recognized in limnology. In this study, data from 25 lakes of the Upper Paraná River floodplain were used to build models to predict chlorophyll-a and dissolved oxygen concentrations. Akaike's information criterion (AIC) was used for model selection. Models were validated with independent data obtained in the same lakes in 2001. Predictor variables that significantly explained chlorophyll-a concentration were pH, electrical conductivity, total seston (positive correlations) and nitrate (negative correlation). This model explained 52% of chlorophyll variability. Variables that significantly explained dissolved oxygen concentration were pH, lake area and nitrate (all positive correlations); water temperature and electrical conductivity were negatively correlated with oxygen. This model explained 54% of oxygen variability. Validation with independent data showed that both models had the potential to predict algal biomass and dissolved oxygen concentration in these lakes. These findings suggest that multiple regression models are valuable and practical tools for understanding the dynamics of ecosystems and that predictive limnology may still be considered a powerful approach in aquatic ecology.
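The AIC-based selection step described above can be sketched with a toy calculation. The lake data, variable names, and effect sizes below are synthetic stand-ins, not the Paraná River measurements; the point is only the mechanics of fitting candidate regressions and keeping the lowest-AIC model.

```python
import math
import random

def fit_ols(x, y):
    """One-predictor ordinary least squares: returns (intercept, slope, RSS)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return a, b, rss

def aic(rss, n, k):
    """Gaussian-likelihood AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

# Synthetic "lakes": chlorophyll-a depends on conductivity, not on temperature.
rng = random.Random(7)
conductivity = [rng.uniform(30, 80) for _ in range(25)]
temperature = [rng.uniform(20, 32) for _ in range(25)]
chla = [0.4 * c + rng.gauss(0, 2) for c in conductivity]

n = len(chla)
candidates = {}
for name, x in [("conductivity", conductivity), ("temperature", temperature)]:
    _, _, rss = fit_ols(x, chla)
    candidates[name] = aic(rss, n, k=2)  # k = 2: slope + intercept

best = min(candidates, key=candidates.get)
print(best)  # the lower-AIC predictor wins
```

Lower AIC balances goodness of fit against the number of parameters, which is why it is a common criterion when screening many candidate predictor sets.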

  7. Ground-water models for water resource planning

    USGS Publications Warehouse

    Moore, J.E.

    1983-01-01

    In the past decade hydrogeologists have emphasized the development of computer-based mathematical models to aid in the understanding of flow, the transport of solutes, transport of heat, and deformation in the ground-water system. These models have been used to provide information and predictions for water managers. Too frequently, ground-water was neglected in water resource planning because managers believed that it could not be adequately evaluated in terms of availability, quality, and effect of development on surface-water supplies. Now, however, with newly developed digital ground-water models, effects of development can be predicted. Such models have been used to predict hydrologic and quality changes under different stresses. These models have grown in complexity over the last ten years from simple one-layer models to three-dimensional simulations of ground-water flow, which may include solute transport, heat transport, effects of land subsidence, and encroachment of saltwater. Case histories illustrate how predictive ground-water models have provided the information needed for the sound planning and management of water resources in the USA. ?? 1983 D. Reidel Publishing Company.

  8. Two-Equation Low-Reynolds-Number Turbulence Modeling of Transitional Boundary Layer Flows Characteristic of Gas Turbine Blades. Ph.D. Thesis. Final Contractor Report

    NASA Technical Reports Server (NTRS)

    Schmidt, Rodney C.; Patankar, Suhas V.

    1988-01-01

The use of low-Reynolds-number (LRN) forms of the k-epsilon turbulence model in predicting transitional boundary layer flow characteristic of gas turbine blades is developed. The research presented consists of: (1) an evaluation of two existing models; (2) the development of a modification to current LRN models; and (3) the extensive testing of the proposed model against experimental data. The prediction characteristics and capabilities of the Jones-Launder (1972) and Lam-Bremhorst (1981) LRN k-epsilon models are evaluated with respect to the prediction of transition on flat plates. Next, the mechanism by which the models simulate transition is considered and the need for additional constraints is discussed. Finally, the transition predictions of a new model are compared with a wide range of different experiments, including transitional flows with free-stream turbulence under conditions of flat plate constant velocity, flat plate constant acceleration, flat plate with strongly variable acceleration, and flow around turbine blade test cascades. In general, the calculation procedure yields good agreement with most of the experiments.

  9. Data-Based Predictive Control with Multirate Prediction Step

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan S.

    2010-01-01

Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes the current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or of a physical system where no explicit model exists. If the output data happen to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without needing to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is that its computational requirements increase with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with a multirate prediction step. One result is a reduced influence of the prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
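The core data-based idea, deriving the predictive model purely from input-output records, can be sketched minimally. The sketch below assumes a hypothetical noise-free first-order single-input plant and fits it by least squares, then uses the identified coefficients for multi-step-ahead prediction; it deliberately omits the paper's multirate receding-horizon controller.

```python
import random

# Simulate I/O data from an "unknown" first-order plant y[k+1] = a*y[k] + b*u[k].
a_true, b_true = 0.9, 0.5
rng = random.Random(0)
u = [rng.uniform(-1, 1) for _ in range(200)]
y = [0.0]
for k in range(199):
    y.append(a_true * y[k] + b_true * u[k])

# Least-squares fit of (a, b) from the data alone (2x2 normal equations).
syy = sum(y[k] * y[k] for k in range(199))
syu = sum(y[k] * u[k] for k in range(199))
suu = sum(u[k] * u[k] for k in range(199))
sy1y = sum(y[k + 1] * y[k] for k in range(199))
sy1u = sum(y[k + 1] * u[k] for k in range(199))
det = syy * suu - syu * syu
a_hat = (sy1y * suu - sy1u * syu) / det
b_hat = (sy1u * syy - sy1y * syu) / det

def predict(y0, inputs):
    """Multi-step-ahead prediction using only the identified model."""
    yk = y0
    out = []
    for uk in inputs:
        yk = a_hat * yk + b_hat * uk
        out.append(yk)
    return out

print(a_hat, b_hat)  # recovered plant coefficients
```

With noise-free data the least-squares fit recovers the plant exactly; with disturbed data the same machinery yields the best predictor in the least-squares sense, which is what the controller gains are then built on.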

  10. Simple Decision-Analytic Functions of the AUC for Ruling Out a Risk Prediction Model and an Added Predictor.

    PubMed

    Baker, Stuart G

    2018-02-01

When using risk prediction models, an important consideration is weighing performance against the cost (monetary and harms) of ascertaining predictors. The minimum test tradeoff (MTT) for ruling out a model is the minimum number of all-predictor ascertainments per correct prediction to yield a positive overall expected utility. The MTT for ruling out an added predictor is the minimum number of added-predictor ascertainments per correct prediction to yield a positive overall expected utility. An approximation to the MTT for ruling out a model is 1 / [P · H(AUC_Model)], where H(AUC) = AUC − {(1 − AUC)/2}^(1/2), AUC is the area under the receiver operating characteristic (ROC) curve, and P is the probability of the predicted event in the target population. An approximation to the MTT for ruling out an added predictor is 1 / [P · {H(AUC_Model 2) − H(AUC_Model 1)}], where Model 2 includes an added predictor relative to Model 1. The latter approximation requires the Tangent Condition: that the true positive rate at the point on the ROC curve with a slope of 1 is larger for Model 2 than for Model 1. These approximations are suitable for back-of-the-envelope calculations. For example, in a study predicting the risk of invasive breast cancer, Model 2 adds to the predictors in Model 1 a set of 7 single nucleotide polymorphisms (SNPs). Based on the AUCs and the Tangent Condition, an MTT of 7200 was computed, which indicates that 7200 sets of SNPs are needed for every correct prediction of breast cancer to yield a positive overall expected utility. If ascertaining the SNPs costs $500, this MTT suggests that SNP ascertainment is not likely worthwhile for this risk prediction.
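Under the approximations stated in the abstract, the MTT really is a back-of-the-envelope quantity. The sketch below implements both formulas; the event probability and AUC values are hypothetical illustrations, not the figures from the cited breast cancer study.

```python
import math

def h(auc):
    """H(AUC) = AUC - sqrt((1 - AUC) / 2).
    H(0.5) = 0 (no discrimination), H(1.0) = 1 (perfect)."""
    return auc - math.sqrt((1.0 - auc) / 2.0)

def mtt_model(p, auc):
    """Approximate MTT for ruling out a whole model:
    p = event probability in the target population."""
    return 1.0 / (p * h(auc))

def mtt_added(p, auc1, auc2):
    """Approximate MTT for ruling out an added predictor
    (requires the Tangent Condition and AUC2 > AUC1)."""
    return 1.0 / (p * (h(auc2) - h(auc1)))

# Hypothetical values: 10% event probability, AUC improves 0.60 -> 0.65.
print(round(mtt_added(0.10, 0.60, 0.65)))
```

A large MTT means many costly ascertainments per correct prediction, which is the basis for the "not likely worthwhile" judgment in the SNP example.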

  11. Spousal autonomy support, need satisfaction, and well-being in individuals with chronic pain: A longitudinal study.

    PubMed

    Uysal, Ahmet; Ascigil, Esra; Turunc, Gamze

    2017-04-01

    The present research examined the effect of spousal autonomy support on the need satisfaction and well-being of individuals with chronic pain. Married individuals with a diagnosed musculoskeletal chronic pain condition (N = 109) completed a baseline questionnaire and a follow-up questionnaire after a 6-month time period. Cross-lagged analyses indicated that spousal autonomy support predicted increases in basic need satisfaction, and need satisfaction predicted increases in well-being. Moreover, the analyses in the opposite direction were not significant. Similarly, cross-lagged analyses were more supportive of the direction from pain intensity to lower well-being, rather than well-being to pain intensity. Finally, we tested a longitudinal structural model using pain intensity and spousal autonomy support as the predictors, basic needs as the mediator, and well-being as the outcome. The model provided a good fit to the data. Results showed that spousal autonomy support had a positive effect on the need satisfaction and well-being of individuals with chronic pain, independent of pain intensity. These findings extend self-determination theory to the chronic pain context and lay the groundwork for future chronic pain studies using the self-determination theory framework.

  12. Proposals for enhanced health risk assessment and stratification in an integrated care scenario.

    PubMed

    Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep

    2016-04-15

    Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Responsible teams for regional data management in the five ACT regions. We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. 
Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  13. Predicting the effects of unmodeled dynamics on an aircraft flight control system design using eigenspace assignment

    NASA Technical Reports Server (NTRS)

    Johnson, Eric N.; Davidson, John B.; Murphy, Patrick C.

    1994-01-01

    When using eigenspace assignment to design an aircraft flight control system, one must first develop a model of the plant. Certain questions arise when creating this model as to which dynamics of the plant need to be included in the model and which dynamics can be left out or approximated. The answers to these questions are important because a poor choice can lead to closed-loop dynamics that are unpredicted by the design model. To alleviate this problem, a method has been developed for predicting the effect of not including certain dynamics in the design model on the final closed-loop eigenspace. This development provides insight as to which characteristics of unmodeled dynamics will ultimately affect the closed-loop rigid-body dynamics. What results from this insight is a guide for eigenstructure control law designers to aid them in determining which dynamics need or do not need to be included and a new way to include these dynamics in the flight control system design model to achieve a required accuracy in the closed-loop rigid-body dynamics. The method is illustrated for a lateral-directional flight control system design using eigenspace assignment for the NASA High Alpha Research Vehicle (HARV).

  14. Modeling and Predicting the Stress Relaxation of Composites with Short and Randomly Oriented Fibers

    PubMed Central

    Obaid, Numaira; Sain, Mohini

    2017-01-01

    The addition of short fibers has been experimentally observed to slow the stress relaxation of viscoelastic polymers, producing a change in the relaxation time constant. Our recent study attributed this effect of fibers on stress relaxation behavior to the interfacial shear stress transfer at the fiber-matrix interface. This model explained the effect of fiber addition on stress relaxation without the need to postulate structural changes at the interface. In our previous study, we developed an analytical model for the effect of fully aligned short fibers, and the model predictions were successfully compared to finite element simulations. However, in most industrial applications of short-fiber composites, fibers are not aligned, and hence it is necessary to examine the time dependence of viscoelastic polymers containing randomly oriented short fibers. In this study, we propose an analytical model to predict the stress relaxation behavior of short-fiber composites where the fibers are randomly oriented. The model predictions were compared to results obtained from Monte Carlo finite element simulations, and good agreement between the two was observed. The analytical model provides an excellent tool to accurately predict the stress relaxation behavior of randomly oriented short-fiber composites. PMID:29053601

  15. A prediction model of drug-induced ototoxicity developed by an optimal support vector machine (SVM) method.

    PubMed

    Zhou, Shu; Li, Guo-Bo; Huang, Lu-Yi; Xie, Huan-Zhang; Zhao, Ying-Lan; Chen, Yu-Zong; Li, Lin-Li; Yang, Sheng-Yong

    2014-08-01

Drug-induced ototoxicity, as a toxic side effect, is an important issue that needs to be considered in drug discovery. Nevertheless, current experimental methods used to evaluate drug-induced ototoxicity are often time-consuming and expensive, indicating that they are not suitable for a large-scale evaluation of drug-induced ototoxicity in the early stage of drug discovery. In this investigation, we thus established an effective computational prediction model of drug-induced ototoxicity using an optimal support vector machine (SVM) method, GA-CG-SVM. Three GA-CG-SVM models were developed based on three training sets containing agents bearing different risk levels of drug-induced ototoxicity. For comparison, models based on naïve Bayesian (NB) and recursive partitioning (RP) methods were also built on the same training sets. Among all the prediction models, the GA-CG-SVM model II showed the best performance, offering prediction accuracies of 85.33% and 83.05% on two independent test sets, respectively. Overall, the good performance of the GA-CG-SVM model II indicates that it could be used for the prediction of drug-induced ototoxicity in the early stage of drug discovery. Copyright © 2014 Elsevier Ltd. All rights reserved.
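The GA-CG-SVM of this abstract wraps a genetic algorithm and grid search around SVM parameter selection; reproducing that pipeline is beyond a sketch, but the underlying classifier can be illustrated. Below is a minimal linear soft-margin SVM trained by Pegasos-style stochastic subgradient descent on toy 2-D data; all data and parameter values are hypothetical and stand in for chemical descriptors.

```python
import random

def train_linear_svm(xs, ys, lam=0.01, epochs=200, seed=1):
    """Pegasos-style subgradient descent on the hinge loss.
    xs: list of feature vectors; ys: labels in {-1, +1}."""
    rng = random.Random(seed)
    d = len(xs[0])
    w, b = [0.0] * d, 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(xs)), len(xs)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = ys[i] * (sum(wj * xj for wj, xj in zip(w, xs[i])) + b)
            w = [(1 - eta * lam) * wj for wj in w]  # regularization shrink
            if margin < 1:  # hinge-loss subgradient step
                w = [wj + eta * ys[i] * xj for wj, xj in zip(w, xs[i])]
                b += eta * ys[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy, linearly separable data: class +1 near (2, 2), class -1 near (-2, -2).
rng = random.Random(3)
xs = [[2 + rng.gauss(0, 0.5), 2 + rng.gauss(0, 0.5)] for _ in range(20)] + \
     [[-2 + rng.gauss(0, 0.5), -2 + rng.gauss(0, 0.5)] for _ in range(20)]
ys = [1] * 20 + [-1] * 20
w, b = train_linear_svm(xs, ys)
acc = sum(predict(w, b, x) == y for x, y in zip(xs, ys)) / len(xs)
print(acc)
```

In the paper's setting, the GA and grid search would tune hyperparameters such as the regularization strength (here `lam`) and a kernel width; this sketch keeps the kernel linear and the parameters fixed.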

  16. Transition Heat Transfer Modeling Based on the Characteristics of Turbulent Spots

    NASA Technical Reports Server (NTRS)

    Simon, Fred; Boyle, Robert

    1998-01-01

While turbulence models are being developed which show promise for simulating the transition region on a turbine blade or vane, it is believed that the best approach, with the greatest potential for practical use, is the use of models which incorporate the physics of the turbulent spots present in the transition region. This type of modeling results in the prediction of transition region intermittency which, when incorporated in turbulence models, gives a good to excellent prediction of the transition region heat transfer. Some models are presented which show how turbulent spot characteristics and behavior can be employed to predict the effect of pressure gradient and Mach number on the transition region. The models predict the spot formation rate, which is needed, in addition to the transition onset location, in the Narasimha concentrated-breakdown intermittency equation. A simplified approach is taken for modeling turbulent spot growth and interaction in the transition region which utilizes the turbulent spot variables governing transition length and spot generation rate. The models are expressed in terms of spot spreading angle, dimensionless spot velocity, dimensionless spot area, disturbance frequency and Mach number. The models are used in conjunction with a computer code to predict the effects of pressure gradient and Mach number on the transition region and are compared with VKI experimental turbine data.
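The Narasimha concentrated-breakdown form mentioned above can be sketched as follows. The onset location and the combined spot parameter (folding spot formation rate and propagation into a single coefficient with units of 1/length²) are illustrative values, not taken from the VKI data.

```python
import math

def intermittency(x, x_t, n_sigma):
    """Narasimha concentrated-breakdown intermittency:
    gamma(x) = 1 - exp(-n_sigma * (x - x_t)^2) for x > x_t, else 0.
    x_t: transition onset location; n_sigma: combined spot formation
    rate / spot propagation parameter (1/length^2)."""
    if x <= x_t:
        return 0.0
    return 1.0 - math.exp(-n_sigma * (x - x_t) ** 2)

# Illustrative values only: onset at x = 0.3 m, n_sigma = 50 m^-2.
for i in range(11):
    x = i / 10.0
    print(f"x = {x:.1f} m, gamma = {intermittency(x, 0.3, 50.0):.3f}")
```

Intermittency rises from 0 at onset toward 1 downstream; heat transfer in the transition region is then blended between laminar and turbulent predictions weighted by gamma.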

  17. Impact of modellers' decisions on hydrological a priori predictions

    NASA Astrophysics Data System (ADS)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2013-07-01

The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families, and their modelling experience differed widely. The prediction exercise was organized in three steps: (1) for the 1st prediction, modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than is usually available for a priori predictions in ungauged catchments); they did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd, improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data, by charging them pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modellers' decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3, when the modellers made use of the additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd steps, the progress in prediction quality could be evaluated in relation to individual modelling experience and the costs of added information.
We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as adding data for improving parameters needed to satisfy model requirements.

  18. Bridging the gap between theoretical ecology and real ecosystems: modeling invertebrate community composition in streams.

    PubMed

    Schuwirth, Nele; Reichert, Peter

    2013-02-01

    For the first time, we combine concepts of theoretical food web modeling, the metabolic theory of ecology, and ecological stoichiometry with the use of functional trait databases to predict the coexistence of invertebrate taxa in streams. We developed a mechanistic model that describes growth, death, and respiration of different taxa dependent on various environmental influence factors to estimate survival or extinction. Parameter and input uncertainty is propagated to model results. Such a model is needed to test our current quantitative understanding of ecosystem structure and function and to predict effects of anthropogenic impacts and restoration efforts. The model was tested using macroinvertebrate monitoring data from a catchment of the Swiss Plateau. Even without fitting model parameters, the model is able to represent key patterns of the coexistence structure of invertebrates at sites varying in external conditions (litter input, shading, water quality). This confirms the suitability of the model concept. More comprehensive testing and resulting model adaptations will further increase the predictive accuracy of the model.
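A minimal caricature of such a mechanistic survival-or-extinction calculation is sketched below, assuming a single taxon with constant growth, respiration, and death rates and a logistic crowding term. The real model's rates are functions of environmental factors (litter input, shading, water quality) and taxon traits; the constants and threshold here are hypothetical.

```python
def simulate_taxon(growth, respiration, death, b0=1.0, dt=0.01, t_end=50.0,
                   extinct_below=1e-3, carrying_capacity=100.0):
    """Euler integration of dB/dt = (growth - respiration - death) * B * (1 - B/K).
    Returns final biomass, or 0.0 if the taxon falls below the
    extinction threshold during the simulation."""
    b = b0
    for _ in range(int(t_end / dt)):
        net = growth - respiration - death
        b += dt * net * b * (1.0 - b / carrying_capacity)
        if b < extinct_below:
            return 0.0  # extinct
    return b

print(simulate_taxon(growth=0.8, respiration=0.3, death=0.2))  # persists
print(simulate_taxon(growth=0.3, respiration=0.3, death=0.2))  # goes extinct
```

The qualitative output (coexistence pattern: which taxa persist under which external conditions) is what the monitoring data can test, even when absolute biomasses are uncertain.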

  19. Evaluation of a Mysis bioenergetics model

    USGS Publications Warehouse

    Chipps, S.R.; Bennett, D.H.

    2002-01-01

    Direct approaches for estimating the feeding rate of the opossum shrimp Mysis relicta can be hampered by variable gut residence time (evacuation rate models) and non-linear functional responses (clearance rate models). Bioenergetics modeling provides an alternative method, but the reliability of this approach needs to be evaluated using independent measures of growth and food consumption. In this study, we measured growth and food consumption for M. relicta and compared experimental results with those predicted from a Mysis bioenergetics model. For Mysis reared at 10??C, model predictions were not significantly different from observed values. Moreover, decomposition of mean square error indicated that 70% of the variation between model predictions and observed values was attributable to random error. On average, model predictions were within 12% of observed values. A sensitivity analysis revealed that Mysis respiration and prey energy density were the most sensitive parameters affecting model output. By accounting for uncertainty (95% CLs) in Mysis respiration, we observed a significant improvement in the accuracy of model output (within 5% of observed values), illustrating the importance of sensitive input parameters for model performance. These findings help corroborate the Mysis bioenergetics model and demonstrate the usefulness of this approach for estimating Mysis feeding rate.
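The mean-square-error decomposition used in this evaluation (partitioning MSE into mean, slope, and random components, with the random share quantifying irreducible error) can be sketched as follows; the predicted and observed values are hypothetical.

```python
import math

def mse_decomposition(pred, obs):
    """Partition MSE into mean (bias), slope, and random components:
    MSE = (Pbar - Obar)^2 + (s_P - r*s_O)^2 + (1 - r^2)*s_O^2,
    using population standard deviations. The identity is exact."""
    n = len(pred)
    pbar = sum(pred) / n
    obar = sum(obs) / n
    sp = math.sqrt(sum((p - pbar) ** 2 for p in pred) / n)
    so = math.sqrt(sum((o - obar) ** 2 for o in obs) / n)
    r = sum((p - pbar) * (o - obar) for p, o in zip(pred, obs)) / (n * sp * so)
    mse = sum((p - o) ** 2 for p, o in zip(pred, obs)) / n
    mean_c = (pbar - obar) ** 2          # systematic offset
    slope_c = (sp - r * so) ** 2         # systematic slope deviation
    random_c = (1 - r ** 2) * so ** 2    # random error share
    return mse, mean_c, slope_c, random_c

# Hypothetical predicted vs observed growth/consumption values.
pred = [3.1, 4.0, 5.2, 6.1, 7.3]
obs = [3.0, 4.4, 5.0, 6.5, 7.0]
mse, mean_c, slope_c, random_c = mse_decomposition(pred, obs)
print(mse, mean_c + slope_c + random_c)  # the three components sum to the MSE
```

A large random component (as in the 70% reported here) indicates that the remaining disagreement is noise rather than systematic model bias, which supports corroboration of the model.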

  20. Predictive Models for Carcinogenicity and Mutagenicity ...

    EPA Pesticide Factsheets

    Mutagenicity and carcinogenicity are endpoints of major environmental and regulatory concern. These endpoints are also important targets for development of alternative methods for screening and prediction due to the large number of chemicals of potential concern and the tremendous cost (in time, money, animals) of rodent carcinogenicity bioassays. Both mutagenicity and carcinogenicity involve complex, cellular processes that are only partially understood. Advances in technologies and generation of new data will permit a much deeper understanding. In silico methods for predicting mutagenicity and rodent carcinogenicity based on chemical structural features, along with current mutagenicity and carcinogenicity data sets, have performed well for local prediction (i.e., within specific chemical classes), but are less successful for global prediction (i.e., for a broad range of chemicals). The predictivity of in silico methods can be improved by improving the quality of the data base and endpoints used for modelling. In particular, in vitro assays for clastogenicity need to be improved to reduce false positives (relative to rodent carcinogenicity) and to detect compounds that do not interact directly with DNA or have epigenetic activities. New assays emerging to complement or replace some of the standard assays include VitotoxTM, GreenScreenGC, and RadarScreen. The needs of industry and regulators to assess thousands of compounds necessitate the development of high-t

  1. Application of remote sensing for prediction and detection of thermal pollution

    NASA Technical Reports Server (NTRS)

    Veziroglu, T. N.; Lee, S. S.

    1974-01-01

    The first phase is described of a three year project for the development of a mathematical model for predicting thermal pollution by use of remote sensing measurements. A rigid-lid model was developed, and results were obtained for different wind conditions at Biscayne Bay in South Florida. The design of the measurement system was completed, and instruments needed for the first stage of experiment were acquired, tested, and calibrated. A preliminary research flight was conducted.

  2. Understanding and predicting profile structure and parametric scaling of intrinsic rotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, W. X.; Grierson, B. A.; Ethier, S.

    2017-08-10

This study reports on a recent advance in developing physical understanding and a first-principles-based model for predicting intrinsic rotation profiles in magnetic fusion experiments. It is shown for the first time that turbulent fluctuation-driven residual stress (a non-diffusive component of momentum flux) along with diffusive momentum flux can account for both the shape and magnitude of the observed intrinsic toroidal rotation profile. Both the turbulence intensity gradient and zonal flow E×B shear are identified as major contributors to the generation of the k ∥-asymmetry needed for the residual stress generation. The model predictions of core rotation based on global gyrokinetic simulations agree well with the experimental measurements of main ion toroidal rotation for a set of DIII-D ECH discharges. The validated model is further used to investigate the characteristic dependence of residual stress and intrinsic rotation profile structure on the multi-dimensional parametric space covering the turbulence type, q-profile structure, and up-down asymmetry in magnetic geometry with the goal of developing the physics understanding needed for rotation profile control and optimization. It is shown that in the flat-q profile regime, intrinsic rotations driven by ITG and TEM turbulence are in the opposite direction (i.e., intrinsic rotation reverses). The predictive model also produces reversed intrinsic rotation for plasmas with weak and normal shear q-profiles.

  3. Improved predictions of atmospheric icing in Norway

    NASA Astrophysics Data System (ADS)

    Engdahl, Bjørg Jenny; Nygaard, Bjørn Egil; Thompson, Gregory; Bengtsson, Lisa; Berntsen, Terje

    2017-04-01

Atmospheric icing of ground structures is a problem in cold-climate locations such as Norway. During the 2013/2014 winter season, two major power lines in southern Norway suffered severe damage due to ice loads exceeding their design values by two to three times. Better methods are needed to estimate the ice loads that affect various infrastructure, and better models are needed to improve the prediction of severe icing events. The Wind, Ice and Snow Loads Impact on Infrastructure and the Natural Environment (WISLINE) project was initiated to address this problem and to explore how a changing climate may affect ice loads in Norway. Creating better forecasts of icing requires a proper simulation of supercooled liquid water (SLW). Preliminary results show that the operational numerical weather prediction model (HARMONIE-AROME) at MET-Norway generates considerably lower values of SLW than the WRF model when run with the Thompson microphysics scheme. Therefore, we are piecewise implementing specific processes found in the Thompson scheme into the AROME model and testing the resulting impacts on the prediction of SLW and structural icing. Both idealized and real icing cases are carried out to test the newly modified AROME microphysics scheme. Besides conventional observations, a unique set of specialized instrumentation for icing measurements is used for validation. Initial results of this investigation will be presented at the conference.

  4. Feature selection through validation and un-censoring of endovascular repair survival data for predicting the risk of re-intervention.

    PubMed

    Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong

    2017-08-03

    Feature selection (FS) process is essential in the medical area as it reduces the effort and time needed for physicians to measure unnecessary features. Choosing useful variables is a difficult task with the presence of censoring which is the unique characteristic in survival analysis. Most survival FS methods depend on Cox's proportional hazard model; however, machine learning techniques (MLT) are preferred but not commonly used due to censoring. Techniques that have been proposed to adopt MLT to perform FS with survival data cannot be used with the high level of censoring. The researcher's previous publications proposed a technique to deal with the high level of censoring. It also used existing FS techniques to reduce dataset dimension. However, in this paper a new FS technique was proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. In this paper, a FS technique based on artificial neural network (ANN) MLT is proposed to deal with highly censored Endovascular Aortic Repair (EVAR). Survival data EVAR datasets were collected during 2004 to 2010 from two vascular centers in order to produce a final stable model. They contain almost 91% of censored patients. The proposed approach used a wrapper FS method with ANN to select a reduced subset of features that predict the risk of EVAR re-intervention after 5 years to patients from two different centers located in the United Kingdom, to allow it to be potentially applied to cross-centers predictions. The proposed model is compared with the two popular FS techniques; Akaike and Bayesian information criteria (AIC, BIC) that are used with Cox's model. The final model outperforms other methods in distinguishing the high and low risk groups; as they both have concordance index and estimated AUC better than the Cox's model based on AIC, BIC, Lasso, and SCAD approaches. 
These models have p-values lower than 0.05, meaning that the risk groups can be separated significantly and patients who would need re-intervention can be correctly predicted. The proposed approach will save the time and effort physicians spend collecting unnecessary variables. The final reduced model was able to predict the long-term risk of aortic complications after EVAR. This predictive model can help clinicians decide patients' future observation plans.
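
    A wrapper FS method like the one described evaluates candidate feature subsets with the learner itself rather than by a filter criterion. As a minimal sketch of the idea (not the authors' ANN-based implementation), the greedy forward-selection loop below adds, at each step, the feature that most improves a caller-supplied `score` function; the function name and toy scorer are illustrative.

```python
def forward_select(features, score, max_features=None):
    """Greedy wrapper feature selection: repeatedly add the candidate
    feature that most improves the score of the current subset, and stop
    when no candidate helps (or when max_features is reached)."""
    selected, remaining = [], list(features)
    best = score(selected)
    while remaining and (max_features is None or len(selected) < max_features):
        # Score every one-feature extension of the current subset.
        gains = [(score(selected + [f]), f) for f in remaining]
        top_score, top_feature = max(gains)
        if top_score <= best:
            break  # no candidate improves the wrapped model
        selected.append(top_feature)
        remaining.remove(top_feature)
        best = top_score
    return selected, best
```

    In the authors' setting, `score` would train the ANN on the candidate subset and return a validation measure such as the concordance index.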

  5. Regional analysis of drought and heat impacts on forests: current and future science directions.

    PubMed

    Law, Beverly E

    2014-12-01

    Accurate assessments of forest response to current and future climate and human actions are needed at regional scales. Predicting future impacts on forests will require improved analysis of species-level adaptation, resilience, and vulnerability to mortality. Land system models can be enhanced by creating trait-based groupings of species that better represent climate sensitivity, such as risk of hydraulic failure from drought. This emphasizes the need for more coordinated in situ and remote sensing observations to track changes in ecosystem function, and to improve model inputs, spatio-temporal diagnosis, and predictions of future conditions, including implications of actions to mitigate climate change. © 2014 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  6. An improved method for predicting the lightning performance of high and extra-high-voltage substation shielding

    NASA Astrophysics Data System (ADS)

    Vinh, T.

    1980-08-01

    There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering usage. In this study, analytical lightning models were developed along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high-voltage substations from direct strokes.

  7. Predicting survival of de novo metastatic breast cancer in Asian women: systematic review and validation study.

    PubMed

    Miao, Hui; Hartman, Mikael; Bhoo-Pathy, Nirmala; Lee, Soo-Chin; Taib, Nur Aishah; Tan, Ern-Yu; Chan, Patrick; Moons, Karel G M; Wong, Hoong-Seam; Goh, Jeremy; Rahim, Siti Mastura; Yip, Cheng-Har; Verkooijen, Helena M

    2014-01-01

    In Asia, up to 25% of breast cancer patients present with distant metastases at diagnosis. Given the heterogeneous survival probabilities of de novo metastatic breast cancer, individual outcome prediction is challenging. The aim of the study is to identify existing prognostic models for patients with de novo metastatic breast cancer and validate them in Asia. We performed a systematic review to identify prediction models for metastatic breast cancer. Models were validated in 642 women with de novo metastatic breast cancer registered between 2000 and 2010 in the Singapore Malaysia Hospital Based Breast Cancer Registry. Survival curves for low, intermediate and high-risk groups according to each prognostic score were compared by log-rank test and discrimination of the models was assessed by concordance statistic (C-statistic). We identified 16 prediction models, seven of which were for patients with brain metastases only. Performance status, estrogen receptor status, metastatic site(s) and disease-free interval were the most common predictors. We were able to validate nine prediction models. The capacity of the models to discriminate between poor and good survivors varied from poor to fair with C-statistics ranging from 0.50 (95% CI, 0.48-0.53) to 0.63 (95% CI, 0.60-0.66). The discriminatory performance of existing prediction models for de novo metastatic breast cancer in Asia is modest. Development of an Asian-specific prediction model is needed to improve prognostication and guide decision making.
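
    The concordance statistic used here to assess discrimination can be computed directly for a binary outcome: it is the fraction of (event, non-event) pairs in which the event receives the higher risk score, with ties counted as one half. A minimal sketch (function name illustrative; survival C-statistics additionally handle censoring, which is omitted here):

```python
def c_statistic(scores, outcomes):
    """Concordance statistic for a binary outcome: the fraction of
    (event, non-event) pairs in which the event gets the higher score;
    tied scores count one half. 0.5 = chance, 1.0 = perfect."""
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = len(events) * len(nonevents)
    concordant = sum(1.0 if e > n else 0.5 if e == n else 0.0
                     for e in events for n in nonevents)
    return concordant / pairs
```

    A C-statistic of 0.50, as seen for the weakest model above, means the score ranks a randomly chosen poor survivor above a good survivor no better than a coin flip.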

  8. Assessment of quantitative structure-activity relationship of toxicity prediction models for Korean chemical substance control legislation

    PubMed Central

    Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai

    2015-01-01

    Objectives For successful adoption of legislation controlling the registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. In particular, methods for predicting the toxicities of chemical substances during acquisition of the required data ultimately become an economical way to deal with new substances in the future. Although the need for such methods is gradually increasing, the required information about their reliability and applicability range has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured 10 representative QSAR-based prediction models, and the information about them, that can make predictions about substances expected to be regulated. We used the models to predict and confirm the usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and the relevant data, we prepared methods for quantifying scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared the adequacies of the models using the Alternative Non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals (REACH) Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. 
Based on these data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368

  9. Predictability of the Ningaloo Niño/Niña

    PubMed Central

    Doi, Takeshi; Behera, Swadhin K.; Yamagata, Toshio

    2013-01-01

    The seasonal prediction of the coastal oceanic warm event off West Australia, recently named the Ningaloo Niño, is explored by use of a state-of-the-art ocean-atmosphere coupled general circulation model. The Ningaloo Niño/Niña, which generally matures in austral summer, is found to be predictable two seasons ahead. In particular, the unprecedented extreme warm event in February 2011 was successfully predicted 9 months in advance. The successful prediction of the Ningaloo Niño is mainly due to the high prediction skill of La Niña in the Pacific. However, the model's tendency to underestimate the event's early evolution and peak amplitude needs to be addressed. Since the Ningaloo Niño/Niña has potential impacts on regional societies and industries through extreme events, the present success of its prediction may encourage development of an early warning system. PMID:24100593

  10. A needs index for mental health care.

    PubMed

    Glover, G R; Robin, E; Emami, J; Arabscheibani, G R

    1998-02-01

    The study aimed to develop a mental illness needs index to help local managers, district purchasers and national policy makers in allocating resources. Formulae were developed by regression analysis using 1991 census data to predict the period prevalence of acute psychiatric admission from electoral wards. Census variables used were chosen on the basis of an established association with mental illness rates. Data from one English Health Service region were analysed for patterns common to wards at hospital catchment area level and patterns common to district health authorities at regional level. The North East Thames region was chosen as the setting for the study, with 7096 patients being admitted during 1991. In most, but not all, catchment areas reasonable prediction of the pattern of admission prevalence was possible using the variables chosen. However, different population characteristics predicted admission prevalence in rural and urban areas. Prediction methods based on one or two variables are thus unlikely to work in both settings. A Mental Illness Needs Index (MINI) based on social isolation, poverty, unemployment, permanent sickness and temporary and insecure housing predicted differences in admission prevalence between wards at catchment area level better than Jarman's Underprivileged Area (UPA) score [1], and between districts at regional level better than the UPA score and comparably to the York Psychiatric Index [2] (adjusted r² at regional level: MINI 0.82, UPA 0.53, York index 0.70). District admission prevalence rates vary by a factor of three between rural and inner city areas; this difference may not fully reflect the variation in the cost of providing care. It did not prove possible to incorporate factors related to bed availability in the models used; reasons for this are discussed. Data covering other aspects of mental health care in addition to hospital admission are needed for more satisfactory modelling.
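
    An index of this kind can be assembled, in simplified form, as a weighted sum of standardized ward-level census variables, with the weights taken from a regression fit. The sketch below (variable names and weights purely illustrative, not the published MINI coefficients) z-scores each variable across wards and combines them:

```python
def needs_index(wards, weights):
    """Composite needs score per ward: a weighted sum of z-scored
    census variables. `wards` is a list of dicts mapping variable name
    to its raw value; `weights` maps variable name to its weight."""
    names = list(weights)
    n = len(wards)
    mean = {v: sum(w[v] for w in wards) / n for v in names}
    sd = {v: (sum((w[v] - mean[v]) ** 2 for w in wards) / n) ** 0.5
          for v in names}
    # Standardize each variable across wards, then combine.
    return [sum(weights[v] * (w[v] - mean[v]) / sd[v] for v in names)
            for w in wards]
```

    A regression of admission prevalence on the standardized variables would supply the weights; wards then rank by the resulting score.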

  11. Volunteering for Job Enrichment: A Test of Expectancy Theory Predictions

    ERIC Educational Resources Information Center

    Giles, William F.

    1977-01-01

    In order to test predictions derived from an expectancy theory model developed by E. E. Lawler, measures of higher-order need satisfaction, locus of control, and intrinsic motivation were obtained from 252 female assembly line workers. Implications of the results for placement of individuals in enriched jobs are discussed. (Editor/RK)

  12. GM(1,N) method for the prediction of anaerobic digestion system and sensitivity analysis of influential factors.

    PubMed

    Ren, Jingzheng

    2018-01-01

    The anaerobic digestion process has been recognized as a promising way for waste treatment and energy recovery in a sustainable way. Modelling of the anaerobic digestion system is significantly important for effectively and accurately controlling, adjusting, and predicting the system for higher methane yield. The GM(1,N) approach, which needs neither a mechanistic model nor a large number of samples, was employed to model the anaerobic digestion system to predict methane yield. To illustrate the proposed model, a case of anaerobic digestion of municipal solid waste for methane yield was studied, and the results demonstrate that the GM(1,N) model can effectively simulate the anaerobic digestion system in cases of poor information, with less computational expense. Copyright © 2017 Elsevier Ltd. All rights reserved.
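
    GM(1,N) couples a target series with N-1 driving series; as a hedged sketch of the grey-modelling idea only, the code below implements the single-variable special case GM(1,1): cumulate the series, fit the grey equation's parameters a and b by least squares, and forecast from the continuous-time solution. Function and variable names are illustrative.

```python
import math

def gm11_forecast(x0, horizon):
    """Fit GM(1,1) to a short positive series and forecast `horizon` steps.
    x1 is the cumulative sum of x0; a and b solve x0[k] = -a*z[k] + b by
    least squares, where z[k] is the mean of consecutive x1 values."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    u, y, m = [-v for v in z], x0[1:], n - 1
    # Closed-form least squares for y = a*u + b (normal equations).
    su, suu = sum(u), sum(v * v for v in u)
    sy, suy = sum(y), sum(ui * yi for ui, yi in zip(u, y))
    det = suu * m - su * su
    a = (m * suy - su * sy) / det
    b = (suu * sy - su * suy) / det
    c = b / a
    # Continuous-time solution of the grey equation, then difference back.
    x1_hat = [(x0[0] - c) * math.exp(-a * k) + c for k in range(n + horizon)]
    return [x1_hat[k] - x1_hat[k - 1] for k in range(n, n + horizon)]
```

    The full GM(1,N) model replaces the constant b with a weighted sum of the cumulated driving series (substrate, temperature, etc.), fitted the same way.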

  13. A predictive model for failure properties of thermoset resins

    NASA Technical Reports Server (NTRS)

    Caruthers, James M.; Bowles, Kenneth J.

    1989-01-01

    A predictive model for the three-dimensional failure behavior of engineering polymers has been developed in a recent NASA-sponsored research program. This model acknowledges the underlying molecular deformation mechanisms and thus accounts for the effects of different chemical compositions, crosslink density, functionality of the curing agent, etc., on the complete nonlinear stress-strain response including yield. The material parameters required by the model can be determined from test-tube quantities of a new resin in only a few days. Thus, we can obtain a first-order prediction of the applicability of a new resin for an advanced aerospace application without synthesizing the large quantities of material needed for failure testing. This technology will effect order-of-magnitude reductions in the time and expense required to develop new engineering polymers.

  14. Sensor-model prediction, monitoring and in-situ control of liquid RTM advanced fiber architecture composite processing

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D.; Kingsley, P.; Hart, S.; Loos, A.; Hasko, G.; Dexter, B.

    1992-01-01

    In-situ frequency dependent electromagnetic sensors (FDEMS) and the Loos resin transfer model have been used to select and control the processing properties of an epoxy resin during liquid pressure RTM impregnation and cure. Once correlated with viscosity and degree of cure, the FDEMS sensor monitors and the RTM processing model predicts the reaction advancement of the resin, viscosity and the impregnation of the fabric. This provides a direct means for predicting, monitoring, and controlling the liquid RTM process in-situ in the mold throughout the fabrication process and the effects of time, temperature, vacuum and pressure. Most importantly, the FDEMS-sensor model system has been developed to make intelligent decisions, thereby automating the liquid RTM process and removing the need for operator direction.

  15. Predicting Human Preferences Using the Block Structure of Complex Social Networks

    PubMed Central

    Guimerà, Roger; Llorente, Alejandro; Moro, Esteban; Sales-Pardo, Marta

    2012-01-01

    With ever-increasing available data, predicting individuals' preferences and helping them locate the most relevant information has become a pressing need. Understanding and predicting preferences is also important from a fundamental point of view, as part of what has been called a “new” computational social science. Here, we propose a novel approach based on stochastic block models, which have been developed by sociologists as plausible models of complex networks of social interactions. Our model is in the spirit of predicting individuals' preferences based on the preferences of others but, rather than fitting a particular model, we rely on a Bayesian approach that samples over the ensemble of all possible models. We show that our approach is considerably more accurate than leading recommender algorithms, with major relative improvements between 38% and 99% over industry-level algorithms. Besides, our approach sheds light on decision-making processes by identifying groups of individuals that have consistently similar preferences, and enabling the analysis of the characteristics of those groups. PMID:22984533
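
    In the spirit of the block-model approach (though the paper samples over the ensemble of all possible partitions rather than fixing one), a single-partition sketch predicts an unseen rating as the mean of observed ratings between the user's group and the item's group. Names below are illustrative.

```python
from collections import defaultdict

def fit_block_predictor(ratings, user_group, item_group):
    """Given one fixed partition of users and items into groups, return a
    predictor that scores an unseen (user, item) pair by the mean of the
    observed ratings between the two groups."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for (u, i), r in ratings.items():
        key = (user_group[u], item_group[i])
        sums[key] += r
        counts[key] += 1
    def predict(u, i):
        key = (user_group[u], item_group[i])
        return sums[key] / counts[key]
    return predict
```

    The Bayesian version averages such predictions over many sampled partitions, weighting each by how well it explains the observed ratings.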

  16. The EST Model for Predicting Progressive Damage and Failure of Open Hole Bending Specimens

    NASA Technical Reports Server (NTRS)

    Joseph, Ashith P. K.; Waas, Anthony M.; Pineda, Evan J.

    2016-01-01

    Progressive damage and failure in open hole composite laminate coupons subjected to flexural loading is modeled using Enhanced Schapery Theory (EST). Previous studies have demonstrated that EST can accurately predict the strength of open hole coupons under remote tensile and compressive loading states. This homogenized modeling approach uses single composite shell elements to represent the entire laminate in the thickness direction and significantly reduces computational cost. Therefore, when delaminations are not of concern or are active in the post-peak regime, the version of EST presented here is a good engineering tool for predicting deformation response. Standard coupon-level tests provide all the input data needed for the model, and they are interpreted in conjunction with finite element (FE) based simulations. Open hole bending test results of three different IM7/8552 carbon fiber composite layups agree well with EST predictions. The model accurately captures the curvature change and deformation localization in the specimen at and after the catastrophic load drop.

  17. Early Prediction of Intensive Care Unit-Acquired Weakness: A Multicenter External Validation Study.

    PubMed

    Witteveen, Esther; Wieske, Luuk; Sommers, Juultje; Spijkstra, Jan-Jaap; de Waard, Monique C; Endeman, Henrik; Rijkenberg, Saskia; de Ruijter, Wouter; Sleeswijk, Mengalvio; Verhamme, Camiel; Schultz, Marcus J; van Schaik, Ivo N; Horn, Janneke

    2018-01-01

    An early diagnosis of intensive care unit-acquired weakness (ICU-AW) is often not possible due to impaired consciousness. To avoid a diagnostic delay, we previously developed a prediction model, based on single-center data from 212 patients (development cohort), to predict ICU-AW at 2 days after ICU admission. The objective of this study was to investigate the external validity of the original prediction model in a new, multicenter cohort and, if necessary, to update the model. Newly admitted ICU patients who were mechanically ventilated at 48 hours after ICU admission were included. Predictors were prospectively recorded, and the outcome ICU-AW was defined by an average Medical Research Council score <4. In the validation cohort, consisting of 349 patients, we analyzed performance of the original prediction model by assessment of calibration and discrimination. Additionally, we updated the model in this validation cohort. Finally, we evaluated a new prediction model based on all patients of the development and validation cohort. Of 349 analyzed patients in the validation cohort, 190 (54%) developed ICU-AW. Both model calibration and discrimination of the original model were poor in the validation cohort. The area under the receiver operating characteristics curve (AUC-ROC) was 0.60 (95% confidence interval [CI]: 0.54-0.66). Model updating methods improved calibration but not discrimination. The new prediction model, based on all patients of the development and validation cohort (total of 536 patients) had a fair discrimination, AUC-ROC: 0.70 (95% CI: 0.66-0.75). The previously developed prediction model for ICU-AW showed poor performance in a new independent multicenter validation cohort. Model updating methods improved calibration but not discrimination. The newly derived prediction model showed fair discrimination. This indicates that early prediction of ICU-AW is still challenging and needs further attention.
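
    One standard model-updating step of the kind mentioned above is recalibration-in-the-large: shift the model's intercept on the logit scale until the mean predicted risk matches the observed event rate in the new cohort. A sketch, assuming predicted probabilities strictly between 0 and 1 and solving for the shift by bisection (names illustrative):

```python
import math

def recalibrate_intercept(probs, outcomes, tol=1e-8):
    """Recalibration-in-the-large: find the logit-scale shift `delta`
    that makes the mean predicted risk equal the observed event rate.
    mean_pred(delta) is monotone in delta, so bisection converges."""
    def mean_pred(delta):
        return sum(1.0 / (1.0 + math.exp(-(math.log(p / (1 - p)) + delta)))
                   for p in probs) / len(probs)
    target = sum(outcomes) / len(outcomes)
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_pred(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    As the abstract notes, such updating fixes calibration but cannot improve discrimination: a uniform shift leaves the ranking of patients, and hence the AUC-ROC, unchanged.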

  18. Comparing niche- and process-based models to reduce prediction uncertainty in species range shifts under climate change.

    PubMed

    Morin, Xavier; Thuiller, Wilfried

    2009-05-01

    Obtaining reliable predictions of species range shifts under climate change is a crucial challenge for ecologists and stakeholders. At the continental scale, niche-based models have been widely used in the last 10 years to predict the potential impacts of climate change on species distributions all over the world, although these models do not include any mechanistic relationships. In contrast, species-specific, process-based predictions remain scarce at the continental scale. This is regrettable because to secure relevant and accurate predictions it is always desirable to compare predictions derived from different kinds of models applied independently to the same set of species and using the same raw data. Here we compare predictions of range shifts under climate change scenarios for 2100 derived from niche-based models with those of a process-based model for 15 North American boreal and temperate tree species. A general pattern emerged from our comparisons: niche-based models tend to predict a stronger level of extinction and a greater proportion of colonization than the process-based model. This result likely arises because niche-based models do not take phenotypic plasticity and local adaptation into account. Nevertheless, as the two kinds of models rely on different assumptions, their complementarity is revealed by common findings. Both modeling approaches highlight a major potential limitation on species tracking their climatic niche because of migration constraints and identify similar zones where species extirpation is likely. Such convergent predictions from models built on very different principles provide a useful way to offset uncertainties at the continental scale. This study shows that the use in concert of both approaches with their own caveats and advantages is crucial to obtain more robust results and that comparisons among models are needed in the near future to gain accuracy regarding predictions of range shifts under climate change.

  19. Climate-based models for pulsed resources improve predictability of consumer population dynamics: outbreaks of house mice in forest ecosystems.

    PubMed

    Holland, E Penelope; James, Alex; Ruscoe, Wendy A; Pech, Roger P; Byrom, Andrea E

    2015-01-01

    Accurate predictions of the timing and magnitude of consumer responses to episodic seeding events (masts) are important for understanding ecosystem dynamics and for managing outbreaks of invasive species generated by masts. While models relating consumer populations to resource fluctuations have been developed successfully for a range of natural and modified ecosystems, a critical gap that needs addressing is better prediction of resource pulses. A recent model used change in summer temperature from one year to the next (ΔT) for predicting masts for forest and grassland plants in New Zealand. We extend this climate-based method in the framework of a model for consumer-resource dynamics to predict invasive house mouse (Mus musculus) outbreaks in forest ecosystems. Compared with previous mast models based on absolute temperature, the ΔT method for predicting masts resulted in an improved model for mouse population dynamics. There was also a threshold effect of ΔT on the likelihood of an outbreak occurring. The improved climate-based method for predicting resource pulses and consumer responses provides a straightforward rule of thumb for determining, with one year's advance warning, whether management intervention might be required in invaded ecosystems. The approach could be applied to consumer-resource systems worldwide where climatic variables are used to model the size and duration of resource pulses, and may have particular relevance for ecosystems where global change scenarios predict increased variability in climatic events.
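
    The ΔT rule of thumb can be written down directly: flag a possible mast-driven outbreak when summer temperature rises sufficiently from one year to the next. The threshold value and year alignment below are illustrative assumptions, not the published parameters.

```python
def outbreak_warnings(summer_temps, threshold=1.0):
    """For each consecutive pair of years, flag the following year as an
    outbreak risk when delta-T = T(y) - T(y-1) exceeds the threshold.
    Returns one boolean per year from the second onward."""
    return [t1 - t0 > threshold
            for t0, t1 in zip(summer_temps, summer_temps[1:])]
```

    Because the trigger depends only on two past summers, the flag is available with the one-year advance warning the authors emphasize.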

  20. A neighborhood statistics model for predicting stream pathogen indicator levels.

    PubMed

    Pandey, Pramod K; Pasternack, Gregory B; Majumder, Mahbubul; Soupir, Michelle L; Kaiser, Mark S

    2015-03-01

    Because elevated levels of water-borne Escherichia coli in streams are a leading cause of water quality impairments in the U.S., water-quality managers need tools for predicting aqueous E. coli levels. Presently, E. coli levels may be predicted using complex mechanistic models that have a high degree of unchecked uncertainty or simpler statistical models. To assess spatio-temporal patterns of in-stream E. coli levels, herein we measured E. coli, a pathogen indicator, at 16 sites (at four different times) within the Squaw Creek watershed, Iowa, and subsequently, the Markov Random Field model was exploited to develop a neighborhood statistics model for predicting in-stream E. coli levels. Two observed covariates, local water temperature (degrees Celsius) and mean cross-sectional depth (meters), were used as inputs to the model. Predictions of E. coli levels in the water column were compared with independent observational data collected from 16 in-stream locations. The results revealed that spatio-temporal averages of predicted and observed E. coli levels were extremely close. Approximately 66% of individual predicted E. coli concentrations were within a factor of 2 of the observed values. In only one event, the difference between prediction and observation was beyond one order of magnitude. The mean of all predicted values at 16 locations was approximately 1% higher than the mean of the observed values. The approach presented here will be useful for assessing in-stream contamination, such as pathogen/pathogen-indicator levels, at the watershed scale.
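
    The factor-of-2 agreement measure reported above is straightforward to compute; a small helper (name illustrative, values assumed positive, as concentrations are):

```python
def fraction_within_factor(predicted, observed, factor=2.0):
    """Share of prediction-observation pairs that agree within a
    multiplicative factor: the larger divided by the smaller must
    not exceed `factor`."""
    ok = sum(1 for p, o in zip(predicted, observed)
             if max(p, o) / min(p, o) <= factor)
    return ok / len(predicted)
```

    Multiplicative agreement suits quantities like bacterial counts that span orders of magnitude, where an additive error band would be meaningless.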

  1. Semiparametric Identification of Human Arm Dynamics for Flexible Control of a Functional Electrical Stimulation Neuroprosthesis

    PubMed Central

    Schearer, Eric M.; Liao, Yu-Wei; Perreault, Eric J.; Tresch, Matthew C.; Memberg, William D.; Kirsch, Robert F.; Lynch, Kevin M.

    2016-01-01

    We present a method to identify the dynamics of a human arm controlled by an implanted functional electrical stimulation neuroprosthesis. The method uses Gaussian process regression to predict shoulder and elbow torques given the shoulder and elbow joint positions and velocities and the electrical stimulation inputs to muscles. We compare the accuracy of torque predictions of nonparametric, semiparametric, and parametric model types. The most accurate of the three model types is a semiparametric Gaussian process model that combines the flexibility of a black box function approximator with the generalization power of a parameterized model. The semiparametric model predicted torques during stimulation of multiple muscles with errors less than 20% of the total muscle torque and passive torque needed to drive the arm. The identified model allows us to define an arbitrary reaching trajectory and approximately determine the muscle stimulations required to drive the arm along that trajectory. PMID:26955041
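
    A semiparametric predictor of this kind adds a flexible correction to a parametric mean. As a simplified stand-in for the Gaussian process residual model (a kernel smoother rather than a full GP posterior, with a 1-D input and illustrative names), the sketch below fits residuals of a given linear mean and corrects new predictions with them:

```python
import math

def semiparametric_fit(xs, ys, slope, intercept, bw=1.0):
    """Semiparametric predictor: a parametric linear mean plus a Gaussian
    kernel smoother on the training residuals. `bw` is the kernel
    bandwidth; small bw tracks residuals closely, large bw smooths."""
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    def predict(x):
        w = [math.exp(-0.5 * ((x - xi) / bw) ** 2) for xi in xs]
        correction = sum(wi * ri for wi, ri in zip(w, resid)) / sum(w)
        return slope * x + intercept + correction
    return predict
```

    The appeal mirrors the paper's finding: the parametric part generalizes beyond the training data while the nonparametric part soaks up what the rigid-body model misses.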

  2. Damage and strength of composite materials: Trends, predictions, and challenges

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1994-01-01

    Research on damage mechanisms and ultimate strength of composite materials relevant to scaling issues will be addressed in this viewgraph presentation. The use of fracture mechanics and Weibull statistics to predict scaling effects for the onset of isolated damage mechanisms will be highlighted. The ability of simple fracture mechanics models to predict trends that are useful in parametric or preliminary designs studies will be reviewed. The limitations of these simple models for complex loading conditions will also be noted. The difficulty in developing generic criteria for the growth of these mechanisms needed in progressive damage models to predict strength will be addressed. A specific example for a problem where failure is a direct consequence of progressive delamination will be explored. A damage threshold/fail-safety concept for addressing composite damage tolerance will be discussed.

  3. Deep Visual Attention Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation in free-viewing scenes with an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have substantially improved human attention prediction, there is still a need to improve CNN-based attention models by efficiently leveraging multi-scale features. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is fed directly into multi-level layers, instead of previous approaches that provide supervision only at the output layer and propagate it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly decreases the redundancy of previous approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.

  4. Multimethod Prediction of Physical Parent-Child Aggression Risk in Expectant Mothers and Fathers with Social Information Processing Theory

    PubMed Central

    Rodriguez, Christina M.; Smith, Tamika L.; Silvia, Paul J.

    2015-01-01

    The Social Information Processing (SIP) model postulates that parents undergo a series of stages in implementing physical discipline that can escalate into physical child abuse. The current study utilized a multimethod approach to investigate whether SIP factors can predict risk of parent-child aggression (PCA) in a diverse sample of expectant mothers and fathers. SIP factors of PCA attitudes, negative child attributions, reactivity, and empathy were considered as potential predictors of PCA risk; additionally, analyses considered whether personal history of PCA predicted participants’ own PCA risk through its influence on their attitudes and attributions. Findings indicate that, for both mothers and fathers, history influenced attitudes but not attributions in predicting PCA risk, and attitudes and attributions predicted PCA risk; empathy and reactivity predicted negative child attributions for expectant mothers, but only reactivity significantly predicted attributions for expectant fathers. Path models for expectant mothers and fathers were remarkably similar. Overall, the findings provide support for major aspects of the SIP model. Continued work is needed in studying the progression of these factors across time for both mothers and fathers as well as the inclusion of other relevant ecological factors to the SIP model. PMID:26631420

  5. The Use of High Performance Computing (HPC) to Strengthen the Development of Army Systems

    DTIC Science & Technology

    2011-11-01

    accurately predicting the supersonic Magnus effect about spinning cones, ogive-cylinders, and boat-tailed afterbodies. This work led to the successful... successful computer model of the proposed product or system, one can then build prototypes on the computer and study the effects on the performance of... needed. The NRC report discusses the requirements for effective use of such computing power. One needs “models, algorithms, software, hardware

  6. [Study on the ARIMA model application to predict echinococcosis cases in China].

    PubMed

    En-Li, Tan; Zheng-Feng, Wang; Wen-Ce, Zhou; Shi-Zhu, Li; Yan, Lu; Lin, Ai; Yu-Chun, Cai; Xue-Jiao, Teng; Shun-Xian, Zhang; Zhi-Sheng, Dang; Chun-Li, Yang; Jia-Xu, Chen; Wei, Hu; Xiao-Nong, Zhou; Li-Guang, Tian

    2018-02-26

    To predict the monthly reported echinococcosis cases in China with the autoregressive integrated moving average (ARIMA) model, so as to provide a reference for prevention and control of echinococcosis. SPSS 24.0 software was used to construct ARIMA models based on the monthly reported echinococcosis case time series from 2007 to 2015 and from 2007 to 2014, respectively, and the accuracies of the two models were compared. The model based on the monthly reported cases from 2007 to 2015 was ARIMA (1, 0, 0)(1, 1, 0)₁₂; the relative error between reported and predicted cases was -13.97%, with AR(1) = 0.367 (t = 3.816, P < 0.001), SAR(1) = -0.328 (t = -3.361, P = 0.001), and Ljung-Box Q = 14.119 (df = 16, P = 0.590). The model based on the monthly reported cases from 2007 to 2014 was ARIMA (1, 0, 0)(1, 0, 1)₁₂; the relative error between reported and predicted cases was 0.56%, with AR(1) = 0.413 (t = 4.244, P < 0.001), SAR(1) = 0.809 (t = 9.584, P < 0.001), SMA(1) = 0.356 (t = 2.278, P = 0.025), and Ljung-Box Q = 18.924 (df = 15, P = 0.217). Different time series of the same infectious disease may thus yield different ARIMA models. Whether accumulating more data and shortening the prediction horizon reduces the average relative error remains to be verified. The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted and optimized continuously according to the accumulated data; meanwhile, full consideration should be given to the intensity of related infectious disease reporting work (such as disease censuses and special investigations).
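
    The AR(1) component of such a model can be sketched in a few lines: estimate the autoregressive coefficient by least squares on the lag-1 regression of the mean-centered series, then iterate the recursion for forecasts. This ignores the differencing and seasonal terms of the full seasonal ARIMA models above; names are illustrative.

```python
def ar1_forecast(series, steps):
    """Fit AR(1): x_t - mu = phi * (x_{t-1} - mu) + e_t by least squares
    on the mean-centered series, then iterate the recursion for forecasts
    (the noise term is set to its mean, zero)."""
    mu = sum(series) / len(series)
    c = [x - mu for x in series]
    phi = (sum(c[t] * c[t - 1] for t in range(1, len(c)))
           / sum(c[t - 1] ** 2 for t in range(1, len(c))))
    out, last = [], c[-1]
    for _ in range(steps):
        last = phi * last
        out.append(mu + last)
    return out
```

    Production ARIMA fitting (as in SPSS or `statsmodels`) additionally estimates moving-average and seasonal terms by maximum likelihood, but the lag-regression idea is the same.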

  7. Prediction of biodiversity hotspots in the Anthropocene: The case of veteran oaks.

    PubMed

    Skarpaas, Olav; Blumentrath, Stefan; Evju, Marianne; Sverdrup-Thygeson, Anne

    2017-10-01

    Over the past centuries, humans have transformed large parts of the biosphere, and there is a growing need to understand and predict the distribution of biodiversity hotspots influenced by the presence of humans. Our basic hypothesis is that human influence in the Anthropocene is ubiquitous, and we predict that biodiversity hotspot modeling can be improved by addressing three challenges raised by the increasing ecological influence of humans: (i) anthropogenically modified responses to individual ecological factors, (ii) fundamentally different processes and predictors in landscape types shaped by different land use histories, and (iii) a multitude and complexity of natural and anthropogenic processes that may require many predictors and even multiple models in different landscape types. We modeled the occurrence of veteran oaks in Norway, and found, in accordance with our basic hypothesis and predictions, that humans influence the distribution of veteran oaks throughout their range, but in different ways in forests and open landscapes. In forests, geographical and topographic variables related to the oak niche are still important, but the occurrence of veteran oaks is shifted toward steeper slopes, where logging is difficult. In open landscapes, land cover variables are more important, and veteran oaks are more common toward the north than expected from the fundamental oak niche. In both landscape types, multiple predictor variables representing ecological and human-influenced processes were needed to build a good model, and several models performed almost equally well. Models accounting for the different anthropogenic influences on landscape structure and processes consistently performed better than models based exclusively on natural biogeographical and ecological predictors.
Thus, our results for veteran oaks clearly illustrate the challenges to distribution modeling raised by the ubiquitous influence of humans, even in a moderately populated region, but also show that predictions can be improved by explicitly addressing these anthropogenic complexities.

  8. Models for predicting disinfection byproduct (DBP) formation in drinking waters: a chronological review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-07-01

    Disinfection for the supply of safe drinking water forms a variety of known and unknown byproducts through reactions between the disinfectants and natural organic matter. Chronic exposure to disinfection byproducts through the ingestion of drinking water, inhalation and dermal contact during regular indoor activities (e.g., showering, bathing, cooking) may pose cancer and non-cancer risks to human health. Since their discovery in drinking water in 1974, numerous studies have presented models to predict DBP formation in drinking water. To date, more than 48 scientific publications have reported 118 models to predict DBP formation in drinking waters. These models were developed through laboratory and field-scale experiments using raw, pretreated and synthetic waters. This paper aims to review DBP predictive models, analyze the model variables, assess the model advantages and limitations, and to determine their applicability to different water supply systems. The paper identifies the current challenges and future research needs to better control DBP formation. Finally, important directions for future research are recommended to protect human health and to follow the best management practices.

  9. Combined heat transfer and kinetic models to predict cooking loss during heat treatment of beef meat.

    PubMed

    Kondjoyan, Alain; Oillic, Samuel; Portanguen, Stéphane; Gros, Jean-Bernard

    2013-10-01

    A heat transfer model was used to simulate the temperature in 3 dimensions inside the meat. This model was combined with first-order kinetic models to predict cooking losses. Identification of the parameters of the kinetic models and first validations were performed in a water bath. Afterwards, the performance of the combined model was determined in a fan-assisted oven under different air/steam conditions. Accurate knowledge of the heat transfer coefficient values and consideration of the retraction of the meat pieces are needed for the prediction of meat temperature. This is important since the temperature at the center of the product is often used to determine the cooking time. The combined model was also able to predict cooking losses from meat pieces of different sizes and subjected to different air/steam conditions. It was found that under the studied conditions, most of the water loss comes from the juice expelled by protein denaturation and contraction and not from evaporation. Copyright © 2013 Elsevier Ltd. All rights reserved.
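
The structure of such a combined model can be sketched as a heat-transfer step feeding a first-order kinetic step. Everything below is a minimal illustration: a lumped-capacitance approximation stands in for the authors' 3-D heat transfer model, and all parameter values (heating time constant, Arrhenius rate constant, equilibrium loss fraction) are invented assumptions, not the identified ones.

```python
import math

def simulate(T_bath=80.0, T0=5.0, tau=600.0, k_ref=2e-3, Ea=80e3,
             T_ref=70.0, X_eq=0.30, dt=1.0, t_end=3600.0):
    """Lumped-capacitance meat temperature T(t) coupled with a first-order
    approach of the cooking-loss fraction X toward an equilibrium X_eq."""
    R = 8.314                      # gas constant, J/(mol K)
    T, X, t = T0, 0.0, 0.0
    while t < t_end:
        # Heat-transfer step: exponential approach to the bath temperature.
        T += dt * (T_bath - T) / tau
        # Arrhenius-type temperature dependence of the rate constant.
        k = k_ref * math.exp(-Ea / R * (1.0 / (T + 273.15) - 1.0 / (T_ref + 273.15)))
        # First-order kinetics: dX/dt = k * (X_eq - X).
        X += dt * k * (X_eq - X)
        t += dt
    return T, X

T_end, loss = simulate()
print(f"core temperature after 1 h: {T_end:.1f} C, cooking-loss fraction: {loss:.3f}")
```

The coupling direction matters: temperature drives the kinetics, which is why the abstract stresses accurate heat transfer coefficients before loss prediction.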

  10. Rheumatology in the community of Madrid: current availability of rheumatologists and future needs using a predictive model.

    PubMed

    Lázaro y De Mercado, Pablo; Blasco Bravo, Antonio Javier; Lázaro y De Mercado, Ignacio; Castañeda, Santos; López Robledillo, Juan Carlos

    2013-01-01

    To: 1) describe the distribution of the public sector rheumatologists; 2) identify variables on which the workload in Rheumatology depends; and 3) build a predictive model of the need for rheumatologists over the next 10 years, in the Community of Madrid (CM). The information was obtained through structured questionnaires sent to all services/units of Rheumatology of public hospitals in the CM. The population figures, current and forecasted, were obtained from the National Statistics Institute. A predictive model was built based on information about the current and foreseeable supply, current and foreseeable demand, and the assumptions and criteria used to match supply with demand. The underlying uncertainty in the model was assessed by sensitivity analysis. In the CM in 2011 there were 150 staff rheumatologists and 49 residents in 27 centers, which is equivalent to one rheumatologist for every 33,280 inhabitants in the general population, and one for every 4,996 inhabitants over 65 years. To keep the 2011 level of care in 2021 for the general population, it would be necessary to train more residents or hire more rheumatologists in scenarios of demand increases higher than 15%. However, to keep the level of care for the population over 65 years of age it would be necessary to train more residents or hire more specialists even without increased demand. The model developed may be very useful for planning human resource needs in Rheumatology with CM policy makers in the coming years. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  11. Differences in care burden of patients undergoing dialysis in different centres in the netherlands.

    PubMed

    de Kleijn, Ria; Uyl-de Groot, Carin; Hagen, Chris; Diepenbroek, Adry; Pasker-de Jong, Pieternel; Ter Wee, Piet

    2017-06-01

    A classification model was developed to simplify planning of personnel at dialysis centres. This model predicted the care burden based on dialysis characteristics. However, patient characteristics and different dialysis centre categories might also influence the amount of care time required. To determine if there is a difference in care burden between different categories of dialysis centres and if specific patient characteristics predict nursing time needed for patient treatment. An observational study. Two hundred and forty-two patients from 12 dialysis centres. In 12 dialysis centres, nurses filled out the classification list per patient and completed a form with patient characteristics. Nephrologists filled out the Charlson Comorbidity Index. Independent observers clocked the time nurses spent on separate steps of the dialysis for each patient. Dialysis centres were categorised into four types. Data were analysed using regression models. In contrast to other dialysis centres, academic centres needed 14 minutes more care time per patient per dialysis treatment than predicted in the classification model. No patient characteristics were found that influenced this difference. The only patient characteristic that predicted the time required was gender, with more time required to treat women. Gender did not affect the difference between measured and predicted care time. Differences in care burden were observed between academic and other centres, with more time required for treatment in academic centres. Contribution of patient characteristics to the time difference was minimal. The only patient characteristics that predicted care time were previous transplantation, which reduced the time required, and gender, with women requiring more care time. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  12. Feeding modes in stream salmonid population models: Is drift feeding the whole story?

    Treesearch

    Bret Harvey; Steve Railsback

    2014-01-01

    Drift-feeding models are essential components of broader models that link stream habitat to salmonid populations and community dynamics. But is an additional feeding mode needed for understanding and predicting salmonid population responses to streamflow and other environmental factors? We addressed this question by applying two versions of the individual-based model...

  13. Enhancing model prediction reliability through improved soil representation and constrained model auto-calibration - A paired watershed study

    USDA-ARS's Scientific Manuscript database

    Process based and distributed watershed models possess a large number of parameters that are not directly measured in field and need to be calibrated through matching modeled in-stream fluxes with monitored data. Recently, there have been waves of concern about the reliability of this common practic...

  14. Prediction of drinking water intake by dairy cows.

    PubMed

    Appuhamy, J A D R N; Judy, J V; Kebreab, E; Kononoff, P J

    2016-09-01

    Mathematical models that predict water intake by drinking, also known as free water intake (FWI), are useful in understanding water supply needed by animals on dairy farms. The majority of extant mathematical models for predicting FWI of dairy cows have been developed with data sets representing similar experimental conditions, not evaluated with modern cows, and often require dry matter intake (DMI) data, which may not be routinely available. The objectives of the study were to (1) develop a set of new empirical models for predicting FWI of lactating and dry cows with and without DMI using literature data, and (2) evaluate the new and the extant models using an independent set of FWI measurements made on modern cows. Random effect meta-regression analyses were conducted using 72 and 188 FWI treatment means with and without dietary electrolyte and daily mean ambient temperature (TMP) records, respectively, for lactating cows, and 19 FWI treatment means for dry cows. Milk yield, DMI, body weight, days in milk, dietary macro-nutrient contents, an aggregate milliequivalent concentration of dietary sodium and potassium (NaK), and TMP were used as potential covariates to the models. A model having positive relationships of DMI, dietary dry matter (DM%), and CP (CP%) contents, NaK, and TMP explained 76% of variability in FWI treatment means of lactating cows. When challenged on an independent data set (n=261), the model more accurately predicted FWI [root mean square prediction error as a percentage of average observed value (RMSPE%)=14.4%] compared with a model developed without NaK and TMP (RMSPE%=17.3%), and all extant models (RMSPE%≥15.7%). A model without DMI included positive relationships of milk yield, DM%, NaK, TMP, and days in milk, and explained 63% of variability in the FWI treatment means and performed well (RMSPE%=17.9%), when challenged on the independent data. 
New models for dry cows included positive relationships of DM% and TMP along with DMI or body weight. The new models with and without DMI explained 75 and 54% of the variability in FWI treatment means of dry cows and had RMSPE% of 12.8 and 15.2%, respectively, when evaluated with the literature data. The study offers a set of empirical models that can assist in determining drinking water needs of dairy farms. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
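
RMSPE%, the evaluation metric quoted above, is simply the root mean square prediction error scaled by the mean observed value. A minimal sketch with made-up intake numbers (not the study's data):

```python
import math

def rmspe_percent(observed, predicted):
    """Root mean square prediction error as a percentage of the mean
    observed value (RMSPE%)."""
    n = len(observed)
    rmspe = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmspe / (sum(observed) / n)

# Hypothetical daily free water intake (kg/d), observed vs. model-predicted.
obs  = [80.0, 95.0, 110.0, 70.0, 120.0]
pred = [85.0, 90.0, 100.0, 78.0, 118.0]
print(f"RMSPE% = {rmspe_percent(obs, pred):.1f}%")
```

Scaling by the observed mean is what lets the abstract compare models across lactating and dry cows, whose absolute intakes differ.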

  15. AIR QUALITY MODELING OF HAZARDOUS POLLUTANTS: CURRENT STATUS AND FUTURE DIRECTIONS

    EPA Science Inventory

    The paper presents a review of current air toxics modeling applications and discusses possible advanced approaches. Many applications require the ability to predict hot spots from industrial sources or large roadways that are needed for community health and Environmental Justice...

  16. Regression models for predicting peak and continuous three-dimensional spinal loads during symmetric and asymmetric lifting tasks.

    PubMed

    Fathallah, F A; Marras, W S; Parnianpour, M

    1999-09-01

    Most biomechanical assessments of spinal loading during industrial work have focused on estimating peak spinal compressive forces under static and sagittally symmetric conditions. The main objective of this study was to explore the potential of feasibly predicting three-dimensional (3D) spinal loading in industry from various combinations of trunk kinematics, kinetics, and subject-load characteristics. The study used spinal loading, predicted by a validated electromyography-assisted model, from 11 male participants who performed a series of symmetric and asymmetric lifts. Three classes of models were developed: (a) models using workplace, subject, and trunk motion parameters as independent variables (kinematic models); (b) models using workplace, subject, and measured moments variables (kinetic models); and (c) models incorporating workplace, subject, trunk motion, and measured moments variables (combined models). The results showed that peak 3D spinal loading during symmetric and asymmetric lifting were predicted equally well using all three types of regression models. Continuous 3D loading was predicted best using the combined models. When the use of such models is infeasible, the kinematic models can provide adequate predictions. Finally, lateral shear forces (peak and continuous) were consistently underestimated using all three types of models. The study demonstrated the feasibility of predicting 3D loads on the spine under specific symmetric and asymmetric lifting tasks without the need for collecting EMG information. However, further validation and development of the models should be conducted to assess and extend their applicability to lifting conditions other than those presented in this study. Actual or potential applications of this research include exposure assessment in epidemiological studies, ergonomic intervention, and laboratory task assessment.

  17. A view on thermodynamics of concentrated electrolytes: Modification necessity for electrostatic contribution of osmotic coefficient

    NASA Astrophysics Data System (ADS)

    Sahu, Jyoti; Juvekar, Vinay A.

    2018-05-01

    Prediction of the osmotic coefficient of concentrated electrolytes is needed in a wide variety of industrial applications. There is a need to correctly segregate the electrostatic contribution to osmotic coefficient from nonelectrostatic contribution. This is achieved in a rational way in this work. Using the Robinson-Stokes-Glueckauf hydrated ion model to predict non-electrostatic contribution to the osmotic coefficient, it is shown that hydration number should be independent of concentration so that the observed linear dependence of osmotic coefficient on electrolyte concentration in high concentration range could be predicted. The hydration number of several electrolytes (LiCl, NaCl, KCl, MgCl2, and MgSO4) has been estimated by this method. The hydration number predicted by this model shows correct dependence on temperature. It is also shown that the electrostatic contribution to osmotic coefficient is underpredicted by the Debye-Hückel theory at concentration beyond 0.1 m. The Debye-Hückel theory is modified by introducing a concentration dependent hydrated ionic size. Using the present analysis, it is possible to correctly estimate the electrostatic contribution to the osmotic coefficient, beyond the range of validation of the D-H theory. This would allow development of a more fundamental model for electrostatic interaction at high electrolyte concentrations.
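
For orientation, the electrostatic (Debye-Hückel) contribution discussed above can be written in its limiting-law form for the osmotic coefficient; this is a standard textbook expression with Pitzer's slope, stated here as context rather than taken from the paper:

```latex
% Limiting-law electrostatic contribution to the osmotic coefficient
% of a single electrolyte of ionic strength I (molality basis):
\phi - 1 = -A_{\phi}\,\lvert z_{+} z_{-}\rvert\,\sqrt{I},
\qquad A_{\phi} \approx 0.392\ \mathrm{kg^{1/2}\,mol^{-1/2}}\ \text{at } 25\,^{\circ}\mathrm{C}
```

Consistent with the abstract, this form is reliable only up to roughly 0.1 m; the paper's modification replaces the fixed ionic size with a concentration-dependent hydrated ionic size to extend the electrostatic term to higher molalities.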

  18. [Validity of self-perceived dental caries as a diagnostic test and associated factors in adults].

    PubMed

    Haikal, Desirée Sant'Ana; Roberto, Luana Leal; Martins, Andréa Maria Eleutério de Barros Lima; Paula, Alfredo Maurício Batista de; Ferreira, Efigênia Ferreira E

    2017-08-21

    This study aimed to analyze the validity of self-perceived dental caries and associated factors in a sample of 795 adults (35-44 years). The dependent variable was self-perceived dental caries, and the independent variables were combined in blocks. Three logistic regression models were fitted: (1) all adults; (2) adults with a formal diagnosis of caries; and (3) adults without such a diagnosis. Self-perceived dental caries showed 77.7% sensitivity, 58% specificity, 65% accuracy, 52% positive predictive value, and 81% negative predictive value. In Model 1, self-perceived dental caries was associated with time of use of dental services, access to information, flossing, formal diagnosis of caries, self-perceived need for treatment, toothache, and dissatisfaction with oral health and general health. In Model 2, self-perceived dental caries was associated with time of use of dental services, self-perceived need for treatment, and dissatisfaction with oral health and general health. In Model 3, self-perceived dental caries was associated with time of use of dental services, access to information, flossing, self-perceived need for treatment, and dissatisfaction with oral health. Self-perceived dental caries showed limited utility as a diagnostic method.
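
The five validity measures reported above all derive from one 2x2 table of self-perception against the formal diagnosis. The counts below are hypothetical, chosen only so the resulting percentages land near the reported profile:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard validity measures from a 2x2 table of
    self-perception (test) vs. formal diagnosis (reference)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "PPV":         tp / (tp + fp),
        "NPV":         tn / (tn + fn),
    }

# Hypothetical counts, picked to fall near the reported
# 77.7 / 58 / 65 / 52 / 81% profile (the study's n was 795).
m = diagnostic_metrics(tp=70, fp=64, fn=20, tn=88)
for name, value in m.items():
    print(f"{name}: {value:.1%}")
```

The low PPV paired with a high NPV is the pattern behind the abstract's conclusion: a negative self-perception is more trustworthy than a positive one.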

  19. Fish species of greatest conservation need in wadeable Iowa streams: current status and effectiveness of Aquatic Gap Program distribution models

    USGS Publications Warehouse

    Sindt, Anthony R.; Pierce, Clay; Quist, Michael C.

    2012-01-01

    Effective conservation of fish species of greatest conservation need (SGCN) requires an understanding of species–habitat relationships and distributional trends. Thus, modeling the distribution of fish species across large spatial scales may be a valuable tool for conservation planning. Our goals were to evaluate the status of 10 fish SGCN in wadeable Iowa streams and to test the effectiveness of Iowa Aquatic Gap Analysis Project (IAGAP) species distribution models. We sampled fish assemblages from 86 wadeable stream segments in the Mississippi River drainage of Iowa during 2009 and 2010 to provide contemporary, independent fish species presence–absence data. The frequencies of occurrence in stream segments where species were historically documented varied from 0.0% for redfin shiner Lythrurus umbratilis to 100.0% for American brook lamprey Lampetra appendix, with a mean of 53.0%, suggesting that the status of Iowa fish SGCN is highly variable. Cohen's kappa values and other model performance measures were calculated by comparing field-collected presence–absence data with IAGAP model–predicted presences and absences for 12 fish SGCN. Kappa values varied from 0.00 to 0.50, with a mean of 0.15. The models only predicted the occurrences of banded darter Etheostoma zonale, southern redbelly dace Phoxinus erythrogaster, and longnose dace Rhinichthys cataractae more accurately than would be expected by chance. Overall, the accuracy of the twelve models was low, with a mean correct classification rate of 58.3%. Poor model performance probably reflects the difficulties associated with modeling the distribution of rare species and the inability of the large-scale habitat variables used in IAGAP models to explain the variation in fish species occurrences.
Our results highlight the importance of quantifying the confidence in species distribution model predictions with an independent data set and the need for long-term monitoring to better understand the distributional trends and habitat associations of fish SGCN.
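
Cohen's kappa, used above to score the IAGAP models, compares observed presence-absence agreement with the agreement expected by chance alone. A sketch with invented counts for 86 stream segments:

```python
def cohens_kappa(a, b, c, d):
    """Kappa from a 2x2 table: a = predicted and observed present,
    b = predicted only, c = observed only, d = both absent."""
    n = a + b + c + d
    po = (a + d) / n                                          # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2     # chance agreement
    return (po - pe) / (1 - pe)

# Invented counts for one species across 86 segments.
k = cohens_kappa(a=20, b=15, c=10, d=41)
print(f"kappa = {k:.2f}")
```

Note that overall accuracy (a + d)/n can look respectable even when kappa is near zero, which is why the abstract reports both a 58.3% correct classification rate and a mean kappa of only 0.15.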

  20. Effect of Material Thermo-viscoplastic Modeling on the Prediction of Forming Limit Curves of Aluminum Alloy 5086

    NASA Astrophysics Data System (ADS)

    Chu, Xingrong; Leotoing, Lionel; Guines, Dominique; Ragneau, Eric

    2015-09-01

    A solution to improve the formability of aluminum alloy sheets is to investigate warm forming processes. The optimization of forming process parameters needs a precise evaluation of material properties and sheet metal formability under actual operating conditions. Based on the analytical M-K theory, a finite element (FE) M-K model was proposed to predict forming limit curves (FLCs) at different temperatures and strain rates. The influences of the initial imperfection value (f0) and the material thermo-viscoplastic model on the FLCs are discussed in this work. The flow stresses of AA5086 were characterized by uniaxial tensile tests at different temperatures (20, 150, and 200 °C) and equivalent strain rates (0.0125, 0.125, and 1.25 s-1). Three types of hardening models (power law model, saturation model, and mixed model) were proposed and adapted to correlate the experimental flow stresses. The three hardening models were implemented into the FE M-K model in order to predict FLCs for different forming conditions. The predicted limit strains are very sensitive to the thermo-viscoplastic modeling of AA5086 and to the calibration of the initial geometrical imperfection which controls the onset of necking.
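
The three hardening-model families named above are commonly written as follows; these are generic textbook forms with standard symbols, not the calibrated expressions from the paper:

```latex
% Power-law (Hollomon/Ludwik-type) hardening with strain-rate sensitivity:
\sigma = K(T)\,\varepsilon^{\,n(T)}\,\dot{\varepsilon}^{\,m(T)}

% Saturation (Voce-type) hardening toward a saturation stress \sigma_s:
\sigma = \sigma_{s} - \bigl(\sigma_{s} - \sigma_{0}\bigr)\,e^{-C\varepsilon}

% Mixed model: a weighted combination of the two contributions:
\sigma = \alpha\,\sigma_{\mathrm{power}} + (1 - \alpha)\,\sigma_{\mathrm{sat}}
```

Because the M-K analysis localizes necking where hardening can no longer offset thinning, small differences between these fitted forms propagate directly into the predicted limit strains, which is the sensitivity the abstract reports.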

  1. A combined ultrasound and clinical scoring model for the prediction of peripartum complications in pregnancies complicated by placenta previa.

    PubMed

    Yoon, So-Yeon; You, Ji Yeon; Choi, Suk-Joo; Oh, Soo-Young; Kim, Jong-Hwa; Roh, Cheong-Rae

    2014-09-01

    To generate a combined ultrasound and clinical model predictive for peripartum complications in pregnancies complicated by placenta previa. This study included 110 singleton pregnant women with placenta previa delivered by cesarean section (CS) from July 2011 to November 2013. We prospectively collected ultrasound and clinical data before CS and observed the occurrence of blood transfusion, uterine artery embolization and cesarean hysterectomy. We formulated a scoring model including type of previa (0: partialis, 2: totalis), lacunae (0: none, 1: 1-3, 2: 4-6, 3: whole), uteroplacental hypervascularity (0: normal, 1: moderate, 2: severe), multiparity (0: no, 1: yes), history of CS (0: none, 1: once, 2: ≥ twice) and history of placenta previa (0: no, 1: yes) to predict the risk of peripartum complications. In our study population, the risks of perioperative transfusion, uterine artery embolization, and cesarean hysterectomy were 26.4%, 1.8% and 6.4%, respectively. The type of previa, lacunae, uteroplacental hypervascularity, parity, history of CS, and history of placenta previa were associated with complications in univariable analysis. However, no factor was independently predictive for any complication in exact logistic regression analysis. Using the scoring model, we found that total score significantly correlated with perioperative transfusion, cesarean hysterectomy and composite complication (p<0.0001, Cochran-Armitage test). Notably, all patients with total score ≥7 needed cesarean hysterectomy. When total score was ≥6, three fourths of patients needed blood transfusion. This combined scoring model may provide useful information for prediction of peripartum complications in women with placenta previa. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
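
The additive scoring model can be sketched directly from the item weights listed in the abstract. The dictionary keys and value labels below are our own, and the example patient is invented:

```python
# Item weights as listed in the abstract (score range 0-11).
SCORE_ITEMS = {
    "previa_type":      {"partialis": 0, "totalis": 2},
    "lacunae":          {"none": 0, "1-3": 1, "4-6": 2, "whole": 3},
    "hypervascularity": {"normal": 0, "moderate": 1, "severe": 2},
    "multiparity":      {"no": 0, "yes": 1},
    "prior_cesarean":   {"none": 0, "once": 1, "twice_or_more": 2},
    "prior_previa":     {"no": 0, "yes": 1},
}

def total_score(patient):
    """Sum the six item scores."""
    return sum(SCORE_ITEMS[item][value] for item, value in patient.items())

# Invented example patient.
patient = {
    "previa_type": "totalis", "lacunae": "4-6", "hypervascularity": "moderate",
    "multiparity": "yes", "prior_cesarean": "once", "prior_previa": "no",
}
score = total_score(patient)
print(f"total score = {score}")
# In the study, all patients scoring >= 7 required cesarean hysterectomy,
# and three fourths of those scoring >= 6 required transfusion.
```

An unweighted sum like this is deliberately simple; the abstract notes that no single factor was independently predictive, so the combined score carries the signal.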

  2. Variation and Grey GM(1, 1) Prediction of Melting Peak Temperature of Polypropylene During Ultraviolet Radiation Aging

    NASA Astrophysics Data System (ADS)

    Chen, K.; Y Zhang, T.; Zhang, F.; Zhang, Z. R.

    2017-12-01

    Grey system theory takes as its research object uncertain systems whose information is partly known and partly unknown, extracting useful information from the known part to reveal the system's underlying variation rules. To investigate the applicability of this data-driven modelling method to fitting and predicting the melting peak temperature (Tm) of polypropylene (PP) during ultraviolet radiation aging, the Tm of homo-polypropylene after different ultraviolet radiation exposure times, measured by differential scanning calorimetry, was fitted and predicted with the grey GM(1, 1) model. The results show that the Tm of PP declines as aging time increases, and the fitting and prediction equation obtained with the grey GM(1, 1) model is Tm = 166.567472exp(-0.00012t). The fit of this equation is excellent, and the maximum relative error between predicted and actual Tm values is 0.32%. Grey system theory needs little original data, offers high prediction accuracy, and can be used to predict the aging behaviour of PP.
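
The grey GM(1, 1) procedure named above (accumulate the series, fit the whitened equation by least squares, then difference back) can be sketched in a few lines. The data points are illustrative stand-ins for the DSC measurements, so the fitted coefficients will not match the paper's equation:

```python
import math

def gm11(x0, horizon=1):
    """Fit GM(1,1) to a positive series x0 and forecast `horizon` steps."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]               # 1-AGO accumulation
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # mean sequence
    # Least squares for the grey equation x0[k] = -a*z1[k] + b.
    m = n - 1
    s_zz, s_z = sum(z * z for z in z1), sum(z1)
    s_zy, s_y = sum(z * y for z, y in zip(z1, x0[1:])), sum(x0[1:])
    det = s_zz * m - s_z * s_z
    a = -(s_zy * m - s_z * s_y) / det      # development coefficient
    b = (s_zz * s_y - s_z * s_zy) / det    # grey input
    # Time-response function on the accumulated scale, then difference back.
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    fitted = [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + horizon)]
    return a, b, fitted

data = [166.4, 166.1, 165.9, 165.6, 165.4]   # hypothetical Tm values, deg C
a, b, fitted = gm11(data, horizon=2)
print(f"development coefficient a = {a:.5f}")
print("fitted values and 2-step forecast:", [round(v, 2) for v in fitted])
```

The abstract's equation Tm = 166.567472exp(-0.00012t) has exactly this exponential-decay shape, with the small positive development coefficient reflecting the slow decline of Tm.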

  3. Predicting mortality over different time horizons: which data elements are needed?

    PubMed

    Goldstein, Benjamin A; Pencina, Michael J; Montez-Rath, Maria E; Winkelmayer, Wolfgang C

    2017-01-01

    Electronic health records (EHRs) are a resource for "big data" analytics, containing a variety of data elements. We investigate how different categories of information contribute to prediction of mortality over different time horizons among patients undergoing hemodialysis treatment. Using LASSO (least absolute shrinkage and selection operator) regression, we derived prediction models for mortality over 7 time horizons from EHR data on older patients from a national chain of dialysis clinics, linked with administrative data. We assessed how different categories of information relate to risk assessment and compared discrete models to time-to-event models. The best-performing models used all the available data (c-statistics ranged from 0.72 to 0.76), with stronger models in the near term. While different variable groups showed different utility, exclusion of any particular group did not lead to a meaningfully different risk assessment. Discrete time models performed better than time-to-event models. Different variable groups were predictive over different time horizons, with vital signs most predictive for near-term mortality and demographics and comorbidities more important in long-term mortality. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
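
LASSO, the selection method used above, shrinks coefficients toward zero by soft-thresholding, dropping weak predictors entirely. A plain coordinate-descent sketch on toy linear-regression data follows; the study itself applied LASSO to EHR predictors of mortality, not to this synthetic setup:

```python
import random

def soft_threshold(rho, lam):
    """Soft-thresholding operator, the closed-form L1 proximal step."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, iters=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Correlation of feature j with the partial residual.
            rho = sum(
                X[i][j] * (y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j))
                for i in range(n)
            ) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / denom
    return beta

random.seed(0)
n = 200
# Only feature 0 matters; LASSO should zero out the two noise features.
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(n)]
y = [2.0 * row[0] + random.gauss(0, 0.1) for row in X]
beta = lasso_cd(X, y, lam=0.5)
print("coefficients:", [round(b, 2) for b in beta])
```

The same sparsity mechanism is what lets a model trained on hundreds of EHR variables remain interpretable: entire variable groups can receive zero weight.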

  4. Observational data needs useful for modeling the coma

    NASA Technical Reports Server (NTRS)

    Huebner, W. F.; Giguere, P. T.

    1981-01-01

    A computer model of comet comae is described; results from assumed composition of frozen gases are summarized and compared to coma observations. Restrictions on relative abundance of some frozen constituents are illustrated. Modeling, when tightly coupled to observational data, can be important for comprehensive analysis of observations, for predicting undetected molecular species and for improved understanding of coma and nucleus. To accomplish this, total gas production rates and relative elemental abundances of H:C:N:O:S are needed as a function of heliocentric distance of the comet. Also needed are relative column densities and column density profiles with well defined diaphragm range and pointing position on the coma. Production rates are less desirable since they are model dependent. Total numbers (or upper limits) of molecules in the coma and analysis of unidentified spectral lines are also needed.

  5. Internal models and prediction of visual gravitational motion.

    PubMed

    Zago, Myrka; McIntyre, Joseph; Senot, Patrice; Lacquaniti, Francesco

    2008-06-01

    Baurès et al. [Baurès, R., Benguigui, N., Amorim, M.-A., & Siegler, I. A. (2007). Intercepting free falling objects: Better use Occam's razor than internalize Newton's law. Vision Research, 47, 2982-2991] rejected the hypothesis that free-falling objects are intercepted using a predictive model of gravity. They argued instead for "a continuous guide for action timing" based on visual information updated till target capture. Here we show that their arguments are flawed, because they fail to consider the impact of sensori-motor delays on interception behaviour and the need for neural compensation of such delays. When intercepting a free-falling object, the delays can be overcome by a predictive model of the effects of gravity on target motion.

  6. The use of predictive models to optimize risk of decisions.

    PubMed

    Baranyi, József; Buss da Silva, Nathália

    2017-01-02

    The purpose of this paper is to set up a mathematical framework that risk assessors and regulators could use to quantify the "riskiness" of a particular recommendation (choice/decision). The mathematical theory introduced here can be used for decision support systems. We point out that efficient use of predictive models in decision making for food microbiology needs to consider three major points: (1) the uncertainty and variability of the information on which the decision is to be made; (2) the validity of the predictive models aiding the assessor; and (3) the cost generated by the difference between the a priori choice and the a posteriori outcome. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Spanish Validation of the Basic Psychological Needs at Work Scale: A Measure to Predict Teachers' Well-Being in the Workplace

    ERIC Educational Resources Information Center

    Abós Catalán, Ángel; Sevil Serrano, Javier; Julián Clemente, José Antonio; Martín-Albo Lucas, José; García-González, Luis

    2018-01-01

    The present study aimed to validate a Spanish-version of the Basic Psychological Needs at Work Scale (BPNWS-Sp) and to examine the associations between needs satisfaction and engagement and burnout in secondary education teachers. Using a sample of 584 secondary education teachers, the results supported the three-factor model, composite…

  8. An evaluation of the predictive performance of distributional models for flora and fauna in north-east New South Wales.

    PubMed

    Pearce, J; Ferrier, S; Scotts, D

    2001-06-01

    To use models of species distributions effectively in conservation planning, it is important to determine the predictive accuracy of such models. Extensive modelling of the distribution of vascular plant and vertebrate fauna species within north-east New South Wales has been undertaken by linking field survey data to environmental and geographical predictors using logistic regression. These models have been used in the development of a comprehensive and adequate reserve system within the region. We evaluate the predictive accuracy of models for 153 small reptile, arboreal marsupial, diurnal bird and vascular plant species for which independent evaluation data were available. The predictive performance of each model was evaluated using the relative operating characteristic curve to measure discrimination capacity. Good discrimination ability implies that a model's predictions provide an acceptable index of species occurrence. The discrimination capacity of 89% of the models was significantly better than random, with 70% of the models providing high levels of discrimination. Predictions generated by this type of modelling therefore provide a reasonably sound basis for regional conservation planning. The discrimination ability of models was highest for the less mobile biological groups, particularly the vascular plants and small reptiles. In the case of diurnal birds, poor performing models tended to be for species which occur mainly within specific habitats not well sampled by either the model development or evaluation data, highly mobile species, species that are locally nomadic or those that display very broad habitat requirements. Particular care needs to be exercised when employing models for these types of species in conservation planning.
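
The discrimination measure used above, the area under the relative operating characteristic curve, equals the probability that a randomly chosen occupied site receives a higher predicted value than a randomly chosen unoccupied site. A sketch with invented scores:

```python
def auc(pos_scores, neg_scores):
    """Mann-Whitney form of the area under the ROC curve: the fraction of
    (presence, absence) pairs ranked correctly, ties counting half."""
    wins = sum((p > q) + 0.5 * (p == q) for p in pos_scores for q in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

present = [0.9, 0.8, 0.75, 0.6, 0.55]   # predicted values at occupied sites
absent  = [0.7, 0.5, 0.4, 0.35, 0.2]    # predicted values at unoccupied sites
print(f"AUC = {auc(present, absent):.2f}")
```

An AUC of 0.5 corresponds to random discrimination, which is the baseline against which the abstract's "significantly better than random" claim for 89% of models is judged.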

  9. Comparing five modelling techniques for predicting forest characteristics

    Treesearch

    Gretchen G. Moisen; Tracey S. Frescino

    2002-01-01

    Broad-scale maps of forest characteristics are needed throughout the United States for a wide variety of forest land management applications. Inexpensive maps can be produced by modelling forest class and structure variables collected in nationwide forest inventories as functions of satellite-based information. But little work has been directed at comparing modelling...

  10. Development of an in vitro Hepatocyte Model to Investigate Chemical Mode of Action

    EPA Science Inventory

    There is a clear need to identify and characterize the potential of liver in vitro models that can be used to replace animals for mode of action analysis. Our goal is to use in vitro models for mode of action prediction which recapitulate critical cellular processes underlying in...

  11. Impact of tidal density variability on orbital and reentry predictions

    NASA Astrophysics Data System (ADS)

    Leonard, J. M.; Forbes, J. M.; Born, G. H.

    2012-12-01

    Since the first satellites entered Earth orbit in the late 1950s and early 1960s, the influences of solar and geomagnetic variability on the satellite drag environment have been studied, and parameterized in empirical density models with increasing sophistication. However, only within the past 5 years has the realization emerged that "troposphere weather" contributes significantly to the "space weather" of the thermosphere, especially during solar minimum conditions. Much of the attendant variability is attributable to upward-propagating solar tides excited by latent heating due to deep tropical convection, and by solar radiation absorption primarily by water vapor and ozone in the stratosphere and mesosphere, respectively. We know that this tidal spectrum significantly modifies the orbital (>200 km) and reentry (60-150 km) drag environments, and that these tidal components induce longitude variability not yet emulated in empirical density models. Yet, current requirements for improvements in orbital prediction make clear that further refinements to density models are needed. In this paper, the operational consequences of longitude-dependent tides are quantitatively assessed through a series of orbital and reentry predictions. We find that in-track prediction differences incurred by tidal effects are typically of order 200 ± 100 m for satellites in 400-km circular orbits and 15 ± 10 km for satellites in 200-km circular orbits for a 24-hour prediction. For an initial 200-km circular orbit, surface impact differences of order 15° ± 15° latitude are incurred. For operational problems with similar accuracy needs, a density model that includes a climatological representation of longitude-dependent tides should significantly reduce errors due to this source.

  12. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    PubMed

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective models to predict ADRs. In the current study, we compare and analyze the performance of existing computational methods for predicting ADRs, implementing and evaluating additional algorithms that were previously used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas can all be rewritten as linear models; based on this finding, we propose a new algorithm, the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
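    The role of the Jaccard coefficient can be illustrated with a toy similarity-weighted scorer: an unobserved drug-ADR pair is scored by votes from drugs with known ADR profiles, weighted by profile similarity. This is a sketch in the spirit of a weighted-profile method, with hypothetical drugs and ADRs, not the authors' implementation:

```python
# Sketch: Jaccard-weighted profile scoring for drug-ADR association.
# All drugs and ADR profiles below are hypothetical.

def jaccard(a, b):
    """Jaccard coefficient between two ADR sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def score(query_profile, known_profiles, adr):
    """Similarity-weighted vote that `adr` belongs to the query drug's profile."""
    total = sum(jaccard(query_profile, p) for p in known_profiles.values())
    if total == 0:
        return 0.0
    vote = sum(jaccard(query_profile, p)
               for p in known_profiles.values() if adr in p)
    return vote / total

known = {
    "drugA": {"nausea", "headache", "rash"},
    "drugB": {"nausea", "dizziness"},
    "drugC": {"rash", "insomnia"},
}
query = {"nausea", "headache"}
print(round(score(query, known, "rash"), 3))
```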

  13. Early prediction of intensive care unit-acquired weakness using easily available parameters: a prospective observational study.

    PubMed

    Wieske, Luuk; Witteveen, Esther; Verhamme, Camiel; Dettling-Ihnenfeldt, Daniela S; van der Schaaf, Marike; Schultz, Marcus J; van Schaik, Ivo N; Horn, Janneke

    2014-01-01

    An early diagnosis of Intensive Care Unit-acquired weakness (ICU-AW) using muscle strength assessment is not possible in most critically ill patients. We hypothesized that development of ICU-AW can be predicted reliably two days after ICU admission, using patient characteristics, early available clinical parameters, laboratory results, and medication use as predictors. Newly admitted ICU patients mechanically ventilated ≥2 days were included in this prospective observational cohort study. Manual muscle strength was measured according to the Medical Research Council (MRC) scale, when patients were awake and attentive. ICU-AW was defined as an average MRC score <4. A prediction model was developed by selecting predictors from an a priori defined set of candidate predictors, based on known risk factors. Discriminative performance of the prediction model was evaluated, validated internally, and compared to the APACHE IV and SOFA scores. Of 212 included patients, 103 developed ICU-AW. Highest lactate level, treatment with any aminoglycoside in the first two days after admission, and age were selected as predictors. The area under the receiver operating characteristic curve of the prediction model was 0.71 after internal validation. The new prediction model improved discrimination compared to the APACHE IV and SOFA scores. The new early prediction model for ICU-AW, using a set of three easily available parameters, has fair discriminative performance. This model needs external validation.
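    A three-predictor risk model of this kind (highest lactate, aminoglycoside treatment, age) can be sketched in a standard logistic framework. The coefficients below are placeholders for illustration only; the paper's fitted values are not reproduced here:

```python
import math

# Sketch of a three-predictor logistic risk score for ICU-AW.
# Coefficients are placeholders, NOT the fitted values from the study.
COEF = {"intercept": -4.0, "lactate": 0.35, "aminoglycoside": 0.9, "age": 0.04}

def icu_aw_risk(highest_lactate_mmol_l, aminoglycoside_given, age_years):
    z = (COEF["intercept"]
         + COEF["lactate"] * highest_lactate_mmol_l
         + COEF["aminoglycoside"] * (1 if aminoglycoside_given else 0)
         + COEF["age"] * age_years)
    return 1.0 / (1.0 + math.exp(-z))  # predicted probability of ICU-AW

print(round(icu_aw_risk(4.0, True, 70), 3))
```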

  14. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines

    PubMed Central

    2014-01-01

    Background It is important to predict the quality of a protein structural model before its native structure is known. The method that can predict the absolute local quality of individual residues in a single protein model is rare, yet particularly needed for using, ranking and refining protein models. Results We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them can only improve the performance when real deviations between native and model are higher than 5Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implemented a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. Conclusion SMOQ is a useful tool for protein single model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:24776231

  15. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines.

    PubMed

    Cao, Renzhi; Wang, Zheng; Wang, Yiheng; Cheng, Jianlin

    2014-04-28

    It is important to predict the quality of a protein structural model before its native structure is known. The method that can predict the absolute local quality of individual residues in a single protein model is rare, yet particularly needed for using, ranking and refining protein models. We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them can only improve the performance when real deviations between native and model are higher than 5Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implemented a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. SMOQ is a useful tool for protein single model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/.
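    The conversion of per-residue distance deviations into a single global quality score can be approximated with an S-score-style transform, a common convention in model quality assessment. Whether this matches SMOQ's exact formula is an assumption; the function below, with an assumed d0, is illustrative only:

```python
# Sketch: collapsing predicted per-residue distance deviations (in Å) into a
# global model-quality score in [0, 1] via an S-score-style transform.
# The transform and d0 = 3.0 are assumptions, not necessarily SMOQ's formula.

def global_quality(residue_deviations, d0=3.0):
    n = len(residue_deviations)
    return sum(1.0 / (1.0 + (d / d0) ** 2) for d in residue_deviations) / n

perfect = [0.0] * 5   # all residues exactly on the native structure
poor = [9.0] * 5      # all residues far from the native structure
print(round(global_quality(perfect), 2), round(global_quality(poor), 2))
```

    The d0 parameter sets how quickly a residue's contribution decays with deviation; a larger d0 is more forgiving of small errors.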

  16. Challenges in predicting climate change impacts on pome fruit phenology

    NASA Astrophysics Data System (ADS)

    Darbyshire, Rebecca; Webb, Leanne; Goodwin, Ian; Barlow, E. W. R.

    2014-08-01

    Climate projection data were applied to two commonly used pome fruit flowering models to investigate potential differences in predicted full bloom timing. The two methods, fixed thermal time and sequential chill-growth, produced different results for seven apple and pear varieties at two Australian locations. The fixed thermal time model predicted incremental advancement of full bloom, while results were mixed from the sequential chill-growth model. To further investigate how the sequential chill-growth model reacts under climate perturbed conditions, four simulations were created to represent a wider range of species physiological requirements. These were applied to five Australian locations covering varied climates. Lengthening of the chill period and contraction of the growth period was common to most results. The relative dominance of the chill or growth component tended to predict whether full bloom advanced, remained similar or was delayed with climate warming. The simplistic structure of the fixed thermal time model and the exclusion of winter chill conditions in this method indicate it is unlikely to be suitable for projection analyses. The sequential chill-growth model includes greater complexity; however, reservations in using this model for impact analyses remain. The results demonstrate that appropriate representation of physiological processes is essential to adequately predict changes to full bloom under climate perturbed conditions with greater model development needed.

  17. CERAPP: Collaborative Estrogen Receptor Activity Prediction ...

    EPA Pesticide Factsheets

    Humans are potentially exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and, thus, have the potential to be endocrine disruptors. Many of these chemicals have never been tested for their ability to interact with the estrogen receptor (ER). Risk assessors need tools to prioritize chemicals for assessment in costly in vivo tests, for instance, within the EPA Endocrine Disruptor Screening Program. Here, we describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating the efficacy of using predictive computational models on high-throughput screening data to screen thousands of chemicals against the ER. CERAPP combined multiple models developed in collaboration among 17 groups in the United States and Europe to predict ER activity of a common set of 32,464 chemical structures. Quantitative structure-activity relationship models and docking approaches were employed, mostly using a common training set of 1677 compounds provided by EPA, to build a total of 40 categorical and 8 continuous models for binding, agonist, and antagonist ER activity. All predictions were tested using an evaluation set of 7522 chemicals collected from the literature. To overcome the limitations of single models, a consensus was built weighting models using a scoring function (0 to 1) based on their accuracies. Individual model scores ranged from 0.69 to 0.85, showing…
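    The accuracy-weighted consensus described above can be sketched as a weighted vote across models, each weighted by its 0-1 accuracy score. Model names, scores, and votes below are hypothetical:

```python
# Sketch: accuracy-weighted consensus over categorical model predictions,
# in the spirit of CERAPP's 0-1 accuracy-score weighting. Hypothetical data.

def consensus(predictions, weights, threshold=0.5):
    """predictions: model -> 1 (active) / 0 (inactive); weights: model -> accuracy."""
    total = sum(weights[m] for m in predictions)
    weighted = sum(weights[m] * predictions[m] for m in predictions)
    frac = weighted / total
    return (1 if frac >= threshold else 0), frac

votes = {"qsar_1": 1, "qsar_2": 1, "docking_1": 0}   # hypothetical models
scores = {"qsar_1": 0.85, "qsar_2": 0.78, "docking_1": 0.69}
label, frac = consensus(votes, scores)
print(label, round(frac, 3))
```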

  18. Use of Fetal Magnetic Resonance Image Analysis and Machine Learning to Predict the Need for Postnatal Cerebrospinal Fluid Diversion in Fetal Ventriculomegaly.

    PubMed

    Pisapia, Jared M; Akbari, Hamed; Rozycki, Martin; Goldstein, Hannah; Bakas, Spyridon; Rathore, Saima; Moldenhauer, Julie S; Storm, Phillip B; Zarnow, Deborah M; Anderson, Richard C E; Heuer, Gregory G; Davatzikos, Christos

    2018-02-01

    Which children with fetal ventriculomegaly, or enlargement of the cerebral ventricles in utero, will develop hydrocephalus requiring treatment after birth is unclear. To determine whether extraction of multiple imaging features from fetal magnetic resonance imaging (MRI) and integration using machine learning techniques can predict which patients require postnatal cerebrospinal fluid (CSF) diversion after birth. This retrospective case-control study used an institutional database of 253 patients with fetal ventriculomegaly from January 1, 2008, through December 31, 2014, to generate a predictive model. Data were analyzed from January 1, 2008, through December 31, 2015. All 25 patients who required postnatal CSF diversion were selected and matched by gestational age with 25 patients with fetal ventriculomegaly who did not require CSF diversion (discovery cohort). The model was applied to a sample of 24 consecutive patients with fetal ventriculomegaly who underwent evaluation at a separate institution (replication cohort) from January 1, 1998, through December 31, 2007. Data were analyzed from January 1, 1998, through December 31, 2009. To generate the model, linear measurements, area, volume, and morphologic features were extracted from the fetal MRI, and a machine learning algorithm analyzed multiple features simultaneously to find the combination that was most predictive of the need for postnatal CSF diversion. Accuracy, sensitivity, and specificity of the model in correctly classifying patients requiring postnatal CSF diversion. A total of 74 patients (41 girls [55%] and 33 boys [45%]; mean [SD] gestational age, 27.0 [5.6] weeks) were included from both cohorts. In the discovery cohort, median time to CSF diversion was 6 days (interquartile range [IQR], 2-51 days), and patients with fetal ventriculomegaly who did not develop symptoms were followed up for a median of 29 months (IQR, 9-46 months). 
The model correctly classified patients who required CSF diversion with 82% accuracy, 80% sensitivity, and 84% specificity. In the replication cohort, the model achieved 91% accuracy, 75% sensitivity, and 95% specificity. Image analysis and machine learning can be applied to fetal MRI findings to predict the need for postnatal CSF diversion. The model provides prognostic information that may guide clinical management and select candidates for potential fetal surgical intervention.

  19. Degradation Prediction Model Based on a Neural Network with Dynamic Windows

    PubMed Central

    Zhang, Xinghui; Xiao, Lei; Kang, Jianshe

    2015-01-01

    Tracking degradation of mechanical components is very critical for effective maintenance decision making. Remaining useful life (RUL) estimation is a widely used form of degradation prediction. RUL prediction methods have been researched thoroughly for the case in which sufficient run-to-failure condition monitoring data are available, but for some high-reliability components it is very difficult to collect run-to-failure data, i.e., observations spanning normal operation through failure; only a limited number of condition indicators over a limited period can be used to estimate RUL. In addition, some existing prediction methods extrapolate poorly, which blocks RUL estimation: the predicted value converges to a constant or fluctuates within a narrow range. Moreover, fluctuating condition features also degrade prediction. To address these dilemmas, this paper proposes a RUL prediction model based on a neural network with dynamic windows. The model consists of three main steps: window size determination by increasing rate, change point detection, and rolling prediction. The proposed method has two dominant strengths. One is that it does not need to assume the degradation trajectory follows a particular distribution. The other is that it can adapt to variation in degradation indicators, which greatly benefits RUL prediction. Finally, the performance of the proposed RUL prediction model is validated with real field data and simulation data. PMID:25806873
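    The rolling-prediction step can be sketched with a stand-in predictor, here a least-squares line over the current window, in place of the paper's neural network; window sizing by increasing rate and change-point detection are omitted for brevity:

```python
# Sketch: rolling multi-step degradation prediction over a sliding window.
# A least-squares line stands in for the paper's neural network. Data are
# a hypothetical condition indicator, not the paper's field data.

def fit_line(ys):
    """Ordinary least squares on (0..n-1, ys); returns (slope, intercept)."""
    n = len(ys)
    xs = range(n)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def rolling_forecast(history, window, steps):
    """Predict `steps` future values; each prediction re-enters the window."""
    series = list(history)
    out = []
    for _ in range(steps):
        slope, intercept = fit_line(series[-window:])
        nxt = slope * window + intercept  # one step beyond the window
        out.append(nxt)
        series.append(nxt)
    return out

degradation = [1.0, 1.2, 1.5, 1.9, 2.4]  # hypothetical condition indicator
print([round(v, 2) for v in rolling_forecast(degradation, window=4, steps=3)])
```

    Feeding each prediction back into the window is what lets the forecast keep extrapolating an accelerating trend instead of converging to a constant.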

  20. Evaluation of sensor types and environmental controls on mapping biomass of coastal marsh emergent vegetation

    USGS Publications Warehouse

    Byrd, Kristin B.; O'Connell, Jessica L.; Di Tommaso, Stefania; Kelly, Maggi

    2014-01-01

    There is a need to quantify large-scale plant productivity in coastal marshes to understand marsh resilience to sea level rise, to help define eligibility for carbon offset credits, and to monitor impacts from land use, eutrophication and contamination. Remote monitoring of aboveground biomass of emergent wetland vegetation will help address this need. Differences in sensor spatial resolution, bandwidth, temporal frequency and cost constrain the accuracy of biomass maps produced for management applications. In addition the use of vegetation indices to map biomass may not be effective in wetlands due to confounding effects of water inundation on spectral reflectance. To address these challenges, we used partial least squares regression to select optimal spectral features in situ and with satellite reflectance data to develop predictive models of aboveground biomass for common emergent freshwater marsh species, Typha spp. and Schoenoplectus acutus, at two restored marshes in the Sacramento–San Joaquin River Delta, California, USA. We used field spectrometer data to test model errors associated with hyperspectral narrowbands and multispectral broadbands, the influence of water inundation on prediction accuracy, and the ability to develop species specific models. We used Hyperion data, Digital Globe World View-2 (WV-2) data, and Landsat 7 data to scale up the best statistical models of biomass. Field spectrometer-based models of the full dataset showed that narrowband reflectance data predicted biomass somewhat, though not significantly better than broadband reflectance data [R2 = 0.46 and percent normalized RMSE (%RMSE) = 16% for narrowband models]. However hyperspectral first derivative reflectance spectra best predicted biomass for plots where water levels were less than 15 cm (R2 = 0.69, %RMSE = 12.6%). In species-specific models, error rates differed by species (Typha spp.: %RMSE = 18.5%; S. 
acutus: %RMSE = 24.9%), likely due to the more vertical structure and deeper water habitat of S. acutus. The Landsat 7 dataset (7 images) predicted biomass slightly better than the WV-2 dataset (6 images) (R2 = 0.56, %RMSE = 20.9%, compared to R2 = 0.45, %RMSE = 21.5%). The Hyperion dataset (one image) was least successful in predicting biomass (R2 = 0.27, %RMSE = 33.5%). Shortwave infrared bands on 30 m-resolution Hyperion and Landsat 7 sensors aided biomass estimation; however managers need to weigh tradeoffs between cost, additional spectral information, and high spatial resolution that will identify variability in small, fragmented marshes common to the Sacramento–San Joaquin River Delta and elsewhere in the Western U.S.

  1. Assessment of Current Process Modeling Approaches to Determine Their Limitations, Applicability and Developments Needed for Long-Fiber Thermoplastic Injection Molded Composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Holbery, Jim; Smith, Mark T.

    2006-11-30

    This report describes the status of the current process modeling approaches to predict the behavior and flow of fiber-filled thermoplastics under injection molding conditions. Previously, models have been developed to simulate the injection molding of short-fiber thermoplastics, and an as-formed composite part or component can then be predicted that contains a microstructure resulting from the constituents’ material properties and characteristics as well as the processing parameters. Our objective is to assess these models in order to determine their capabilities and limitations, and the developments needed for long-fiber injection-molded thermoplastics (LFTs). First, the concentration regimes are summarized to facilitate the understanding of different types of fiber-fiber interaction that can occur for a given fiber volume fraction. After the formulation of the fiber suspension flow problem and the simplification leading to the Hele-Shaw approach, the interaction mechanisms are discussed. Next, the establishment of the rheological constitutive equation is presented that reflects the coupled flow/orientation nature. The decoupled flow/orientation approach is also discussed, which constitutes a good simplification for many applications involving flows in thin cavities. Finally, before outlining the necessary developments for LFTs, some applications of the current orientation model and the so-called modified Folgar-Tucker model are illustrated through the fiber orientation predictions for selected LFT samples.

  2. Forecasting the global shortage of physicians: an economic- and needs-based approach

    PubMed Central

    Liu, Jenny X; Kinfu, Yohannes; Dal Poz, Mario R

    2008-01-01

    Objective Global achievements in health may be limited by critical shortages of health-care workers. To help guide workforce policy, we estimate the future demand for, need for and supply of physicians, by WHO region, to determine where likely shortages will occur by 2015, the target date of the Millennium Development Goals. Methods Using World Bank and WHO data on physicians per capita from 1980 to 2001 for 158 countries, we employ two modelling approaches for estimating the future global requirement for physicians. A needs-based model determines the number of physicians per capita required to achieve 80% coverage of live births by a skilled health-care attendant. In contrast, our economic model identifies the number of physicians per capita that are likely to be demanded, given each country’s economic growth. These estimates are compared to the future supply of physicians projected by extrapolating the historical rate of increase in physicians per capita for each country. Findings By 2015, the global supply of physicians appears to be in balance with projected economic demand. Because our measure of need reflects the minimum level of workforce density required to provide a basic health service that is met in all but the least developed countries, the needs-based estimates predict a global surplus of physicians. However, on a regional basis, both models predict shortages for many countries in the WHO African Region in 2015, with some countries experiencing a needs-based shortage, a demand-based shortage, or both. Conclusion The type of policy intervention needed to alleviate projected shortages, such as increasing health-care training or adopting measures to discourage migration, depends on the type of shortage projected. PMID:18670663
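    The supply-side projection, extrapolating the historical rate of increase in physicians per capita and checking it against a needs-based density threshold, can be sketched as constant-rate growth. All numbers below, including the threshold, are hypothetical:

```python
# Sketch: needs-based shortage check by extrapolating physicians-per-capita
# growth, in the spirit of the paper's supply projection. All figures are
# hypothetical, including the needs-based density threshold.

def project_density(density_now, annual_growth_rate, years):
    """Extrapolate physicians per 1000 population at a constant growth rate."""
    return density_now * (1.0 + annual_growth_rate) ** years

def needs_based_shortage(density_now, annual_growth_rate, years, threshold):
    projected = project_density(density_now, annual_growth_rate, years)
    return max(0.0, threshold - projected)  # per-1000 shortfall, 0 if none

# Hypothetical country: 0.10 physicians per 1000, growing 2%/yr, 2001 -> 2015.
gap = needs_based_shortage(0.10, 0.02, 14, threshold=0.55)
print(round(gap, 3))
```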

  3. Towards a Rigorous Assessment of Systems Biology Models: The DREAM3 Challenges

    PubMed Central

    Prill, Robert J.; Marbach, Daniel; Saez-Rodriguez, Julio; Sorger, Peter K.; Alexopoulos, Leonidas G.; Xue, Xiaowei; Clarke, Neil D.; Altan-Bonnet, Gregoire; Stolovitzky, Gustavo

    2010-01-01

    Background Systems biology has embraced computational modeling in response to the quantitative nature and increasing scale of contemporary data sets. The onslaught of data is accelerating as molecular profiling technology evolves. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) is a community effort to catalyze discussion about the design, application, and assessment of systems biology models through annual reverse-engineering challenges. Methodology and Principal Findings We describe our assessments of the four challenges associated with the third DREAM conference which came to be known as the DREAM3 challenges: signaling cascade identification, signaling response prediction, gene expression prediction, and the DREAM3 in silico network challenge. The challenges, based on anonymized data sets, tested participants in network inference and prediction of measurements. Forty teams submitted 413 predicted networks and measurement test sets. Overall, a handful of best-performer teams were identified, while a majority of teams made predictions that were equivalent to random. Counterintuitively, combining the predictions of multiple teams (including the weaker teams) can in some cases improve predictive power beyond that of any single method. Conclusions DREAM provides valuable feedback to practitioners of systems biology modeling. Lessons learned from the predictions of the community provide much-needed context for interpreting claims of efficacy of algorithms described in the scientific literature. PMID:20186320

  4. Climate Information Responding to User Needs (CIRUN)

    NASA Astrophysics Data System (ADS)

    Busalacchi, A. J.

    2009-05-01

    For the past several decades many different US agencies have been involved in collecting Earth observations, e.g., NASA, NOAA, DoD, USGS, USDA. More recently, the US has led the international effort to design a Global Earth Observation System of Systems (GEOSS). Yet, there has been little substantive progress at the synthesis and integration of the various research and operational, space-based and in situ, observations. Similarly, access to such a range of observations across the atmosphere, ocean, and land surface remains fragmented. With respect to prediction of the Earth System, the US has not developed a comprehensive strategy. For climate, the US (e.g., NOAA, NASA, DoE) has taken a two-track strategy. At the more immediate time scale, coupled ocean-atmosphere models of the physical climate system have built upon the tradition of daily numerical weather prediction in order to extend the forecast window to seasonal to interannual time scales. At the century time scale, nascent Earth System models, which combine components of the physical climate system with biogeochemical cycles, are being used to provide future climate change projections in response to anticipated greenhouse gas forcings. Between these two approaches to prediction lies a key deficiency of interest to decision makers, especially as it pertains to adaptation, i.e., deterministic prediction of the Earth System at time scales from days to decades with spatial scales from global to regional. One of many obstacles to be overcome is the design of present day observation and prediction products based on user needs. To date, most such products have evolved from the technology and research "push" rather than the user or stakeholder "pull". 
In the future as planning proceeds for a national climate service, emphasis must be given to a more coordinated approach in which stakeholders' needs help design future Earth System observational and prediction products, and similarly, such products need to be tailored to provide decision support.

  5. Fine-scale habitat modeling of a top marine predator: do prey data improve predictive capacity?

    PubMed

    Torres, Leigh G; Read, Andrew J; Halpin, Patrick

    2008-10-01

    Predators and prey assort themselves relative to each other, the availability of resources and refuges, and the temporal and spatial scale of their interaction. Predictive models of predator distributions often rely on these relationships by incorporating data on environmental variability and prey availability to determine predator habitat selection patterns. This approach to predictive modeling holds true in marine systems where observations of predators are logistically difficult, emphasizing the need for accurate models. In this paper, we ask whether including prey distribution data in fine-scale predictive models of bottlenose dolphin (Tursiops truncatus) habitat selection in Florida Bay, Florida, U.S.A., improves predictive capacity. Environmental characteristics are often used as predictor variables in habitat models of top marine predators with the assumption that they act as proxies of prey distribution. We examine the validity of this assumption by comparing the response of dolphin distribution and fish catch rates to the same environmental variables. Next, the predictive capacities of four models, with and without prey distribution data, are tested to determine whether dolphin habitat selection can be predicted without recourse to describing the distribution of their prey. The final analysis determines the accuracy of predictive maps of dolphin distribution produced by modeling areas of high fish catch based on significant environmental characteristics. We use spatial analysis and independent data sets to train and test the models. Our results indicate that, due to high habitat heterogeneity and the spatial variability of prey patches, fine-scale models of dolphin habitat selection in coastal habitats will be more successful if environmental variables are used as predictor variables of predator distributions rather than relying on prey data as explanatory variables. 
However, predictive modeling of prey distribution as the response variable based on environmental variability did produce high predictive performance of dolphin habitat selection, particularly foraging habitat.

  6. Modeling long period swell in Southern California: Practical boundary conditions from buoy observations and global wave model predictions

    NASA Astrophysics Data System (ADS)

    Crosby, S. C.; O'Reilly, W. C.; Guza, R. T.

    2016-02-01

    Accurate, unbiased, high-resolution (in space and time) nearshore wave predictions are needed to drive models of beach erosion, coastal flooding, and alongshore transport of sediment, biota and pollutants. On highly sheltered shorelines, wave predictions are sensitive to the directions of onshore propagating waves, and nearshore model prediction error is often dominated by uncertainty in offshore boundary conditions. Offshore islands and shoals, and coastline curvature, create complex sheltering patterns over the 250 km span of southern California (SC) shoreline. Here, regional wave model skill in SC was compared for different offshore boundary conditions created using offshore buoy observations and global wave model hindcasts (National Oceanographic and Atmospheric Administration Wave Watch 3, WW3). Spectral ray-tracing methods were used to transform incident offshore swell (0.04-0.09 Hz) energy at high directional resolution (1 deg). Model skill is assessed for predictions (wave height, direction, and alongshore radiation stress) at 16 nearshore buoy sites between 2000 and 2009. Model skill using buoy-derived boundary conditions is higher than with WW3-derived boundary conditions. Buoy-driven nearshore model results are similar with various assumptions about the true offshore directional distribution (maximum entropy, Bayesian direct, and 2nd derivative smoothness). Two methods combining offshore buoy observations with WW3 predictions in the offshore boundary condition did not improve nearshore skill above buoy-only methods. A case example at Oceanside harbor shows strong sensitivity of alongshore sediment transport predictions to different offshore boundary conditions. Despite this uncertainty in alongshore transport magnitude, alongshore gradients in transport (e.g. the location of model accretion and erosion zones) are determined by the local bathymetry, and are similar for all predictions.

  7. Examining intrinsic versus extrinsic exercise goals: cognitive, affective, and behavioral outcomes.

    PubMed

    Sebire, Simon J; Standage, Martyn; Vansteenkiste, Maarten

    2009-04-01

    Grounded in self-determination theory (SDT), this study had two purposes: (a) examine the associations between intrinsic (relative to extrinsic) exercise goal content and cognitive, affective, and behavioral outcomes; and (b) test the mediating role of psychological need satisfaction in the Exercise Goal Content --> Outcomes relationship. Using a sample of 410 adults, hierarchical regression analysis showed relative intrinsic goal content to positively predict physical self-worth, self-reported exercise behavior, psychological well-being, and psychological need satisfaction and negatively predict exercise anxiety. Except for exercise behavior, the predictive utility of relative intrinsic goal content on the dependent variables of interest remained significant after controlling for participants' relative self-determined exercise motivation. Structural equation modeling analyses showed psychological need satisfaction to partially mediate the effect of relative intrinsic goal content on the outcome variables. Our findings support further investigation of exercise goals commensurate with the goal content perspective advanced in SDT.

  8. A perspective on sustained marine observations for climate modelling and prediction.

    PubMed

    Dunstone, Nick J

    2014-09-28

    Here, I examine some of the many varied ways in which sustained global ocean observations are used in numerical modelling activities. In particular, I focus on the use of ocean observations to initialize predictions in ocean and climate models. Examples are also shown of how models can be used both to assess the impact of current ocean observations and to simulate that of potential new ocean observing platforms. The ocean has never been better observed than it is today, and similarly ocean models have never been as capable of representing the real ocean as they are now. However, there remain important unanswered questions that can likely only be addressed via future improvements in ocean observations. In particular, ocean observing systems need to respond to the needs of the burgeoning field of near-term climate prediction. Although new ocean observing platforms promise exciting new discoveries, there is a delicate balance to be struck between their funding and that of the current ocean observing system. Here, I identify the need to secure long-term funding for ocean observing platforms as they mature from a mainly research exercise to an operational system for sustained observation over climate change time scales. At the same time, considerable progress continues to be made via ship-based observing campaigns, and I highlight some that are dedicated to addressing uncertainties in key ocean model parametrizations. The use of ocean observations to understand the prominent long-time-scale changes observed in the North Atlantic is another focus of this paper. The exciting first decade of monitoring of the Atlantic meridional overturning circulation by the RAPID-MOCHA array is highlighted. The use of ocean and climate models as tools to further probe the drivers of variability seen in such time series is another exciting development. I also discuss the need for a concerted combined effort from climate models and ocean observations in order to understand the current slow-down in surface global warming. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  9. Atmospheric drag model calibrations for spacecraft lifetime prediction

    NASA Technical Reports Server (NTRS)

    Binebrink, A. L.; Radomski, M. S.; Samii, M. V.

    1989-01-01

    Although solar activity prediction uncertainty normally dominates the decay prediction error budget for near-Earth spacecraft, the effect of drag force modeling errors for given levels of solar activity needs to be considered. Two atmospheric density models, the modified Harris-Priester model and the Jacchia-Roberts model, were analyzed for their ability to reproduce the decay histories of the Solar Mesosphere Explorer (SME) and Solar Maximum Mission (SMM) spacecraft in the 490- to 540-kilometer altitude range. Historical solar activity data were used as input to the density computations. For each spacecraft and atmospheric model, a drag scaling adjustment factor was determined for a high-solar-activity year, such that the observed annual decay in the mean semimajor axis was reproduced by an averaged variation-of-parameters (VOP) orbit propagation. The SME (SMM) calibration was performed using calendar year 1983 (1982). The resulting calibration factors differ by 20 to 40 percent from the predictions of the prelaunch ballistic coefficients. The orbit propagations for each spacecraft were extended to the middle of 1988 using the calibrated drag models. For the Jacchia-Roberts density model, the observed decay in the mean semimajor axis of SME (SMM) over the 4.5-year (5.5-year) predictive period was reproduced to within 1.5 (4.4) percent. The corresponding figure for the Harris-Priester model was 8.6 (20.6) percent. Detailed results and conclusions regarding the importance of accurate drag force modeling for lifetime predictions are presented.
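
    The calibration step described above amounts to solving for the drag scale factor that makes the propagated annual decay match observation. The toy fixed-point iteration below uses a made-up stand-in for the averaged VOP propagation, and all numbers are illustrative, not values from the paper.

```python
def propagated_decay(f, base_decay_km=6.0):
    """Stand-in for an averaged VOP propagation: annual mean semimajor-axis
    decay (km) as a function of the drag scale factor f, with a mild
    nonlinearity to mimic density feedback. Purely illustrative."""
    return base_decay_km * f * (1 + 0.05 * (f - 1))

observed_decay_km = 7.5   # hypothetical observed annual decay
f = 1.0                   # start from the prelaunch ballistic coefficient
for _ in range(20):       # fixed-point iteration on the scale factor
    f *= observed_decay_km / propagated_decay(f)
print(round(f, 3))
```

    Because the decay is nearly linear in the drag scaling, the iteration converges in a handful of steps; a resulting factor well away from 1 corresponds to the 20-40 percent departures from the prelaunch ballistic coefficients reported in the abstract.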

  10. (Q)SARs to predict environmental toxicities: current status and future needs.

    PubMed

    Cronin, Mark T D

    2017-03-22

    The current state of the art of (Quantitative) Structure-Activity Relationships ((Q)SARs) to predict environmental toxicity is assessed along with recommendations to develop these models further. The acute toxicity of compounds acting by the non-polar narcotic mechanism of action can be well predicted, however other approaches, including read-across, may be required for compounds acting by specific mechanisms of action. The chronic toxicity of compounds to environmental species is more difficult to predict from (Q)SARs, with robust data sets and more mechanistic information required. In addition, the toxicity of mixtures is little addressed by (Q)SAR approaches. Developments in environmental toxicology including Adverse Outcome Pathways (AOPs) and omics responses should be utilised to develop better, more mechanistically relevant, (Q)SAR models.

  11. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. Outline: 1. Logic and probability; 2. Probability and inference; 3. Probability and model selection; 4. Prior probabilities; 5. Probability and frequency; 6. Probability and quantum mechanics; 7. Probability and fundamentalism; 8. Probability and deception; 9. Prediction and truth.

  12. [Application of ARIMA model to predict number of malaria cases in China].

    PubMed

    Hui-Yu, H; Hua-Qin, S; Shun-Xian, Z; Lin, A I; Yan, L U; Yu-Chun, C; Shi-Zhu, L I; Xue-Jiao, T; Chun-Li, Y; Wei, H U; Jia-Xu, C

    2017-08-15

    Objective To study the application of the autoregressive integrated moving average (ARIMA) model to predict the monthly reported malaria cases in China, so as to provide a reference for prevention and control of malaria. Methods SPSS 24.0 software was used to construct ARIMA models based on the monthly reported malaria cases of the time series of 2006-2015 and 2011-2015, respectively. The data of malaria cases from January to December, 2016 were used as validation data to compare the accuracy of the two ARIMA models. Results The models of the monthly reported cases of malaria in China were ARIMA (2, 1, 1) (1, 1, 0) 12 and ARIMA (1, 0, 0) (1, 1, 0) 12, respectively. The comparison between the predictions of the two models and the actual numbers of malaria cases showed that the ARIMA model based on the data of 2011-2015 had a higher forecasting accuracy than the model based on the data of 2006-2015. Conclusion The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted continually according to the accumulated data; in addition, major changes in the epidemic characteristics of infectious diseases must be considered.
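
    The differencing-plus-autoregression idea behind a model such as ARIMA(1,1,0)(0,1,0)12 can be sketched with numpy alone. The series below is synthetic, standing in for monthly case counts; a real analysis would use a full seasonal ARIMA routine (the SPSS procedure in the paper, or e.g. statsmodels' SARIMAX), so treat this as a sketch of the mechanics only.

```python
import numpy as np

def fit_arima_110_seasonal(y, s=12):
    """Sketch of ARIMA(1,1,0)(0,1,0)_s: seasonal then first differencing,
    followed by a least-squares AR(1) fit on the stationary series."""
    z = y[s:] - y[:-s]            # seasonal difference (lag s)
    w = np.diff(z)                # first difference
    lagged, target = w[:-1], w[1:]
    return float(lagged @ target / (lagged @ lagged))  # AR(1) coefficient

def forecast_next(y, phi, s=12):
    """One-step forecast: apply AR(1), then undo both differences."""
    z = y[s:] - y[:-s]
    w_hat = phi * (z[-1] - z[-2])   # predicted next first-difference
    return (z[-1] + w_hat) + y[-s]  # undo first and seasonal differences

# synthetic monthly series with trend + annual cycle (stand-in for case counts)
rng = np.random.default_rng(0)
t = np.arange(120)
y = 50 + 0.3 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)

phi = fit_arima_110_seasonal(y)
print(round(forecast_next(y, phi), 1))
```

    The seasonal and first differences remove the annual cycle and trend; the AR coefficient is then a one-line least-squares fit, which is the core of what heavier ARIMA routines estimate.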

  13. Can spatial statistical river temperature models be transferred between catchments?

    NASA Astrophysics Data System (ADS)

    Jackson, Faye L.; Fryer, Robert J.; Hannah, David M.; Malcolm, Iain A.

    2017-09-01

    There has been increasing use of spatial statistical models to understand and predict river temperature (Tw) from landscape covariates. However, it is not financially or logistically feasible to monitor all rivers and the transferability of such models has not been explored. This paper uses Tw data from four river catchments collected in August 2015 to assess how well spatial regression models predict the maximum 7-day rolling mean of daily maximum Tw (Twmax) within and between catchments. Models were fitted for each catchment separately using (1) landscape covariates only (LS models) and (2) landscape covariates and an air temperature (Ta) metric (LS_Ta models). All the LS models included upstream catchment area and three included a river network smoother (RNS) that accounted for unexplained spatial structure. The LS models transferred reasonably to other catchments, at least when predicting relative levels of Twmax. However, the predictions were biased when mean Twmax differed between catchments. The RNS was needed to characterise and predict finer-scale spatially correlated variation. Because the RNS was unique to each catchment and thus non-transferable, predictions were better within catchments than between catchments. A single model fitted to all catchments found no interactions between the landscape covariates and catchment, suggesting that the landscape relationships were transferable. The LS_Ta models transferred less well, with particularly poor performance when the relationship with the Ta metric was physically implausible or required extrapolation outside the range of the data. A single model fitted to all catchments found catchment-specific relationships between Twmax and the Ta metric, indicating that the Ta metric was not transferable. These findings improve our understanding of the transferability of spatial statistical river temperature models and provide a foundation for developing new approaches for predicting Tw at unmonitored locations across multiple catchments and larger spatial scales.

  14. The Traditional Model Does Not Explain Attitudes Toward Euthanasia: A Web-Based Survey of the General Public in Finland.

    PubMed

    Terkamo-Moisio, Anja; Kvist, Tarja; Laitila, Teuvo; Kangasniemi, Mari; Ryynänen, Olli-Pekka; Pietilä, Anna-Maija

    2017-08-01

    The debate about euthanasia is ongoing in several countries, including Finland. However, there is a lack of information on current attitudes toward euthanasia among the general Finnish public. The traditional model for predicting individuals' attitudes toward euthanasia is based on their age, gender, educational level, and religiosity. However, a new evaluation of religiosity is needed due to the limited operationalization of this factor in previous studies. This study explores the connections between the factors of the traditional model and attitudes toward euthanasia among the general public in the Finnish context. The Finnish public's attitudes toward euthanasia have become remarkably more positive over the last decade. Further research is needed on the factors that predict euthanasia attitudes. We suggest two different explanatory models for consideration: one that emphasizes the value of individual autonomy and another that approaches euthanasia from the perspective of fears of death or the process of dying.

  15. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.

  16. Progress in Space Weather Modeling and Observations Needed to Improve the Operational NAIRAS Model Aircraft Radiation Exposure Predictions

    NASA Astrophysics Data System (ADS)

    Mertens, C. J.; Kress, B. T.; Wiltberger, M. J.; Tobiska, W.; Xu, X.

    2011-12-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a prototype operational model for predicting commercial aircraft radiation exposure from galactic and solar cosmic rays. NAIRAS predictions are currently streaming live from the project's public website, and the exposure rate nowcast is also available on the SpaceWx smartphone app for iPhone, iPad, and Android. Cosmic rays are the primary source of human exposure to high linear energy transfer radiation at aircraft altitudes, which increases the risk of cancer and other adverse health effects. Thus, the NAIRAS model addresses an important national need with broad societal, public health and economic benefits. The processes responsible for the variability in the solar wind, interplanetary magnetic field, solar energetic particle spectrum, and the dynamical response of the magnetosphere to these space environment inputs strongly influence the composition and energy distribution of the atmospheric ionizing radiation field. During the development of the NAIRAS model, new science questions were identified that must be addressed in order to obtain a more reliable and robust operational model of atmospheric radiation exposure. Addressing these science questions requires improvements in both space weather modeling and observations. The focus of this talk is to present these science questions, the proposed methodologies for addressing them, and the anticipated improvements to the operational predictions of atmospheric radiation exposure. The overarching goal of this work is to provide a decision support tool for the aviation industry that will enable an optimal balance between minimizing health risks to passengers and aircrew and minimizing costs to the airline companies.

  17. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    PubMed

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
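
    A minimal sketch of the Bayesian calibration workflow described above, with a toy first-order decay model standing in for a real digestion model: a random-walk Metropolis sampler yields a posterior for the rate parameter, and the posterior predictive combines parameter uncertainty with residual variability. All names and numbers are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(k, t, dose=10.0):
    """Toy mechanistic response: first-order outflow y = dose * exp(-k t)."""
    return dose * np.exp(-k * t)

# synthetic observations from a known rate, with residual noise
t_obs = np.linspace(0.5, 8, 10)
k_true, sigma = 0.4, 0.3
y_obs = model(k_true, t_obs) + rng.normal(0, sigma, t_obs.size)

def log_post(k):
    """Gaussian likelihood with a flat prior on k > 0."""
    if k <= 0:
        return -np.inf
    resid = y_obs - model(k, t_obs)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# random-walk Metropolis
k, lp, samples = 0.2, log_post(0.2), []
for _ in range(5000):
    cand = k + 0.05 * rng.standard_normal()
    lp_cand = log_post(cand)
    if np.log(rng.random()) < lp_cand - lp:
        k, lp = cand, lp_cand
    samples.append(k)
post = np.array(samples[1000:])  # discard burn-in

# posterior predictive at a new time point:
# parameter uncertainty plus residual variability
y_pred = model(post, 5.0) + rng.normal(0, sigma, post.size)
print(f"k: {post.mean():.2f} +/- {post.std():.2f}")
```

    The point the abstract makes is visible here: the prediction is a full distribution (`y_pred`), not a single number, so its spread reports both sources of uncertainty.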

  18. Predicting equipment needs of children with cerebral palsy using the Gross Motor Function Classification System: a cross-sectional study.

    PubMed

    Novak, Iona; Smithers-Sheedy, Hayley; Morgan, Cathy

    2012-01-01

    Children with cerebral palsy (CP) routinely use assistive equipment to improve their independence. Specialist equipment is expensive and therefore not always available to the child when needed. The aim of this study was to determine whether the assistive equipment needs of children with CP and the associated costs could be predicted. A cross-sectional study using a chart audit was completed. Two hundred forty-two children met eligibility criteria and were included in the study. Data abstracted from files pertained to the child's CP, associated impairments and assistive equipment prescribed. The findings were generated using linear regression modelling. Gross Motor Function Classification System (GMFCS) level [B = 3.01 (95% CI, 2.36-3.57), p < 0.001] and the presence of epilepsy [B = 2.35 (95% CI, 0.64-4.06), p = 0.008] predicted the prescription of assistive equipment. The more severe the gross motor function impairment, the more equipment was required and the more the equipment cost. The equipment needs of children with CP can be predicted for the duration of childhood. This information may be useful for families and for budget and service planning.

  19. Outcome Prediction in Mathematical Models of Immune Response to Infection.

    PubMed

    Mai, Manuel; Wang, Kun; Huber, Greg; Kirby, Michael; Shattuck, Mark D; O'Hern, Corey S

    2015-01-01

    Clinicians need to predict patient outcomes with high accuracy as early as possible after disease inception. In this manuscript, we show that patient-to-patient variability sets a fundamental limit on outcome prediction accuracy for a general class of mathematical models of the immune response to infection. However, accuracy can be increased at the expense of delayed prognosis. We investigate several systems of ordinary differential equations (ODEs) that model the host immune response to a pathogen load. Advantages of systems of ODEs for investigating the immune response to infection include the ability to collect data on large numbers of 'virtual patients', each with a given set of model parameters, and to obtain many time points during the course of the infection. We implement patient-to-patient variability v in the ODE models by randomly selecting the model parameters from distributions with coefficients of variation v that are centered on physiological values. We use logistic regression with one-versus-all classification to predict the discrete steady-state outcomes of the system. We find that the prediction algorithm achieves near 100% accuracy for v = 0, and the accuracy decreases with increasing v for all ODE models studied. The fact that multiple steady-state outcomes can be obtained for a given initial condition, i.e. the basins of attraction overlap in the space of initial conditions, limits the prediction accuracy for v > 0. Increasing the elapsed time of the variables used to train and test the classifier increases the prediction accuracy, while adding explicit external noise to the ODE models decreases the prediction accuracy. Our results quantify the competition between early prognosis and high prediction accuracy that is frequently encountered by clinicians.
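
    The virtual-patient experiment can be caricatured as follows: a closed-form rule with an unobserved component stands in for the ODE steady state (so outcome "basins" overlap for v > 0), and logistic regression is fitted by plain gradient descent. The near-perfect accuracy at v = 0 and the drop for v > 0 mirror the reported pattern; the parameter values, outcome rule, and thresholds are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def virtual_patients(n, v, base=np.array([1.0, 2.0, 0.5])):
    """Draw model parameters around physiological values `base` with
    coefficient of variation v; the outcome is a toy stand-in for an ODE
    steady state, with an unobserved component so outcomes overlap for v > 0."""
    theta = base * (1 + v * rng.standard_normal((n, 3)))
    hidden = v * rng.standard_normal(n)               # unobserved variability
    score = theta @ np.array([1.0, -0.4, 2.0]) + hidden
    return theta, (score > 1.2).astype(int)

def fit_logistic(X, y, steps=2000, lr=0.1):
    """Plain gradient-descent logistic regression
    (the binary case of one-versus-all classification)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

accs = {}
for v in (0.0, 0.3):
    X, y = virtual_patients(500, v)
    w = fit_logistic(X, y)
    accs[v] = float(np.mean((X @ w > 0) == y))
print(accs)
```

    With v = 0 every virtual patient is identical and prediction is trivial; with v > 0 the unobserved component makes some patients with the same observed parameters reach different outcomes, which caps the attainable accuracy exactly as the abstract argues.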

  20. Replication and extension of the dual pathway model of disordered eating: The role of fear of negative evaluation, suggestibility, rumination, and self-compassion.

    PubMed

    Maraldo, Toni M; Zhou, Wanni; Dowling, Jessica; Vander Wal, Jillon S

    2016-12-01

    The dual pathway model, a theoretical model of eating disorder development, suggests that thin ideal internalization leads to body dissatisfaction which leads to disordered eating via the dual pathways of negative affect and dietary restraint. While the dual pathway model has been a valuable guide for eating disorder prevention, greater knowledge of characteristics that predict thin ideal internalization is needed. The present study replicated and extended the dual pathway model by considering the addition of fear of negative evaluation, suggestibility, rumination, and self-compassion in a sample of community women and female university students. Results showed that fear of negative evaluation and suggestibility predicted thin ideal internalization whereas rumination and self-compassion (inversely) predicted body dissatisfaction. Negative affect was predicted by fear of negative evaluation, rumination, and self-compassion (inversely). The extended model fit the data well in both samples. Analogue and longitudinal study of these constructs is warranted in future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  2. Jet Noise Modeling for Supersonic Business Jet Application

    NASA Technical Reports Server (NTRS)

    Stone, James R.; Krejsa, Eugene A.; Clark, Bruce J.

    2004-01-01

    This document describes the development of an improved predictive model for coannular jet noise, including noise suppression modifications applicable to small supersonic-cruise aircraft such as the Supersonic Business Jet (SBJ), for NASA Langley Research Center (LaRC). For such aircraft a wide range of propulsion and integration options are under consideration. Thus there is a need for very versatile design tools, including a noise prediction model. The approach used is similar to that used with great success by the Modern Technologies Corporation (MTC) in developing a noise prediction model for two-dimensional mixer ejector (2DME) nozzles under the High Speed Research Program and in developing a more recent model for coannular nozzles over a wide range of conditions. If highly suppressed configurations are ultimately required, the 2DME model is expected to provide reasonable prediction for these smaller scales, although this has not been demonstrated. It is considered likely that more modest suppression approaches, such as dual stream nozzles featuring chevron or chute suppressors, perhaps in conjunction with inverted velocity profiles (IVP), will be sufficient for the SBJ.

  4. Prognostics Uncertainty Management with Application to Government and Industry

    NASA Technical Reports Server (NTRS)

    Celaya, Jose; Sankararaman, Shankar; Daigle, Matthew; Saxena, Abhinav; Goebel, Kai

    2014-01-01

    Predictions about the future are contingent not only on future usage but also on the quality of the models employed and the assessment of the current health state. These factors, amongst others, need to be considered to arrive at a prediction that is produced by a rigorous method yet whose confidence bounds are not prohibitively large.

  5. RESOURCE NEED AND USE OF MULTIETHNIC CAREGIVERS OF ELDERS IN THEIR HOMES

    PubMed Central

    Friedemann, Marie-Luise; Newman, Frederick L.; Buckwalter, Kathleen C.; Montgomery, Rhonda J. V.

    2013-01-01

    Aims To predict South Florida family care-givers' need for and use of informal help or formal services; specifically, to explore the predictive power of variables suggested by the Caregiver Identity Theory and the literature and to develop and test a structural model. Background In the USA, most of the care to older adults is given by family members. Care-givers make economic and social sacrifices that endanger their health. They feel burdened if they receive no assistance with their tasks; however, the services available are not sufficiently used. Design This cross-sectional correlational study was a survey of family care-givers in their homes, using standardized and/or pre-tested scales and a cognitive status test of their patients. Methods A random sample of 613 multiethnic care-givers of frail elders was recruited in home care and community agencies. The interviews occurred between 2006 and 2009. Analyses involved correlation and regression analyses and structural equation modeling. Outcome measures were need for and use of family help and formal services. Results/Findings The model yielded excellent fit indices, replicated on three random samples of 370. The patients' functional limitations yielded the strongest predictive coefficients, followed by care-giver stress. Cultural indicators played a minor role. Conclusion The lack of a link between resource need and use suggested access barriers. Important for policy makers and service providers are the delivery of high-quality services and the use of a personal and individualized approach with all ethnicities. Quality service includes understanding the care-giving situations and requires a trusting relationship with family care-givers. PMID:23980518

  6. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate, as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems lack the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
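
    The flow-based flavour of workflow wiring can be sketched in a few lines. This is not the SciLuigi or Luigi API, just an illustration of the underlying idea: tasks declare their upstream dependencies, and a runner resolves them in dependency order so that cross-validation or tuning steps never run before the data they need.

```python
from typing import Callable

class Task:
    """A named processing step with explicit upstream dependencies."""
    def __init__(self, name: str, fn: Callable, inputs: tuple = ()):
        self.name, self.fn, self.inputs = name, fn, inputs

def run(tasks):
    """Resolve the dependency graph depth-first (topological order),
    caching each task's result so shared upstream steps run once."""
    done, order = {}, []
    def visit(t):
        if t.name in done:
            return
        for dep in t.inputs:
            visit(dep)
        done[t.name] = t.fn(*(done[d.name] for d in t.inputs))
        order.append(t.name)
    for t in tasks:
        visit(t)
    return done, order

# toy three-step "modelling" pipeline (names and logic are illustrative)
load = Task("load", lambda: [1.2, 3.4, 2.2])
train = Task("train", lambda data: sum(data) / len(data), (load,))
validate = Task("validate", lambda model: abs(model - 2.0) < 1.0, (train,))
results, order = run([validate])
print(order, results["validate"])
```

    Asking only for `validate` pulls in `load` and `train` automatically, which is the property that makes such pipelines easy to rearrange as an analysis evolves.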

  7. An integrated approach to rotorcraft human factors research

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.; Hartzell, E. James; Voorhees, James W.; Bucher, Nancy M.; Shively, R. Jay

    1988-01-01

    As the potential of civil and military helicopters has increased, more complex and demanding missions in increasingly hostile environments have been required. Users, designers, and manufacturers have an urgent need for information about human behavior and function to create systems that take advantage of human capabilities, without overloading them. Because there is a large gap between what is known about human behavior and the information needed to predict pilot workload and performance in the complex missions projected for pilots of advanced helicopters, Army and NASA scientists are actively engaged in Human Factors Research at Ames. The research ranges from laboratory experiments to computational modeling, simulation evaluation, and inflight testing. Information obtained in highly controlled but simpler environments generates predictions which can be tested in more realistic situations. These results are used, in turn, to refine theoretical models, provide the focus for subsequent research, and ensure operational relevance, while maintaining predictive advantages. The advantages and disadvantages of each type of research are described along with examples of experimental results.

  8. Predicting fiber refractive index from a measured preform index profile

    NASA Astrophysics Data System (ADS)

    Kiiveri, P.; Koponen, J.; Harra, J.; Novotny, S.; Husu, H.; Ihalainen, H.; Kokki, T.; Aallos, V.; Kimmelma, O.; Paul, J.

    2018-02-01

    When producing fiber lasers and amplifiers, silica glass compositions consisting of three to six different materials are needed. Because of the varying needs of different applications, a substantial number of different glass compositions are used in active fiber structures. Often it is not possible to find material parameters for theoretical models that estimate the thermal and mechanical properties of those glass compositions. This makes it challenging to accurately predict fiber core refractive index values, even if the preform index profile is measured. Usually the desired fiber refractive index value is achieved experimentally, which is expensive. To overcome this problem, we statistically analyzed the changes between the measured preform and fiber index values. We searched for correlations that would help to predict the change in Δn from preform to fiber in a situation where the values of the glass material parameters that define the change are unknown. Our index change models were built using data collected from preforms and fibers made by the Direct Nanoparticle Deposition (DND) technology.
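
The statistical idea can be sketched as a simple regression of fiber Δn on the measured preform Δn. The data below are synthetic, and the slope/offset used to generate them are invented for illustration; the paper's DND measurement data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic preform/fiber index-difference pairs (illustrative only)
dn_preform = rng.uniform(0.005, 0.030, size=50)
true_slope, true_offset = 0.92, 0.0008          # assumed draw-induced change
dn_fiber = true_slope * dn_preform + true_offset + rng.normal(0, 1e-4, 50)

# Least-squares fit: predict fiber delta-n from the measured preform value
slope, offset = np.polyfit(dn_preform, dn_fiber, 1)

def predict(dn):
    """Predicted fiber delta-n for a measured preform delta-n."""
    return slope * dn + offset
```

In practice the fit would be built per glass-composition family, since the preform-to-fiber change depends on material parameters that this single linear model cannot capture.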

  9. Two models for identification and predicting behaviour of an induction motor system

    NASA Astrophysics Data System (ADS)

    Kuo, Chien-Hsun

    2018-01-01

    System identification, or modelling, is the process of building mathematical models of dynamical systems from the available input and output data of those systems. This paper introduces system identification using ARX (AutoRegressive with eXogenous input) and ARMAX (AutoRegressive Moving Average with eXogenous input) models. Through the identified system model, the predicted output can be compared with the measured one to help prevent motor faults from developing into a catastrophic machine failure and to avoid the unnecessary costs and delays caused by unscheduled repairs. An induction motor system is used as an example, and numerical and experimental results are shown for the identified system.
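
The ARX identification idea can be shown in a few lines: simulate a first-order ARX system, then recover its parameters by ordinary least squares. This is a minimal example with synthetic data; the paper's induction-motor data and model orders are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate a first-order ARX system: y[t] = a*y[t-1] + b*u[t-1] + e[t]
a_true, b_true, n = 0.7, 0.5, 500
u = rng.normal(size=n)                     # exogenous input
y = np.zeros(n)
for t in range(1, n):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1] + 0.01 * rng.normal()

# ARX identification: stack regressors [y[t-1], u[t-1]] and solve least squares
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
```

ARMAX additionally models the noise with a moving-average term, which makes estimation iterative rather than a single least-squares solve.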

  10. The Potential of Virtual Reality to Assess Functional Communication in Aphasia

    ERIC Educational Resources Information Center

    Garcia, Linda J.; Rebolledo, Mercedes; Metthe, Lynn; Lefebvre, Renee

    2007-01-01

    Speech-language pathologists (SLPs) who work with adults with cognitive-linguistic impairments, including aphasia, have long needed an assessment tool that predicts ability to function in the real world. In this article, it is argued that virtual reality (VR)-supported approaches can address this need. Using models of disability such as the…

  11. Model identification using stochastic differential equation grey-box models in diabetes.

    PubMed

    Duun-Henriksen, Anne Katrine; Schmidt, Signe; Røge, Rikke Meldgaard; Møller, Jonas Bech; Nørgaard, Kirsten; Jørgensen, John Bagterp; Madsen, Henrik

    2013-03-01

    The acceptance of virtual preclinical testing of control algorithms is growing and thus also the need for robust and reliable models. Models based on ordinary differential equations (ODEs) can rarely be validated with standard statistical tools. Stochastic differential equations (SDEs) offer the possibility of building models that can be validated statistically and that are capable of predicting not only a realistic trajectory, but also the uncertainty of the prediction. In an SDE, the prediction error is split into two noise terms. This separation ensures that the errors are uncorrelated and provides the possibility to pinpoint model deficiencies. An identifiable model of the glucoregulatory system in a type 1 diabetes mellitus (T1DM) patient is used as the basis for development of a stochastic-differential-equation-based grey-box model (SDE-GB). The parameters are estimated on clinical data from four T1DM patients. The optimal SDE-GB is determined from likelihood-ratio tests. Finally, parameter tracking is used to track the variation in the "time to peak of meal response" parameter. We found that the transformation of the ODE model into an SDE-GB resulted in a significant improvement in the prediction and uncorrelated errors. Tracking of the "peak time of meal absorption" parameter showed that the absorption rate varied according to meal type. This study shows the potential of using SDE-GBs in diabetes modeling. Improved model predictions were obtained due to the separation of the prediction error. SDE-GBs offer a solid framework for using statistical tools for model validation and model development. © 2013 Diabetes Technology Society.
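
The role of the diffusion term in an SDE can be illustrated with a toy mean-reverting model simulated by the Euler-Maruyama scheme. This is a deliberately simplified stand-in for the paper's glucoregulatory SDE-GB; the rate, basal level, and diffusion values below are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
# Euler-Maruyama simulation of a toy mean-reverting "glucose" model:
#   dG = -k*(G - Gb) dt + sigma dW
k, Gb, sigma = 0.1, 5.0, 0.2          # assumed rate, basal level, diffusion
dt, n_steps, n_paths = 0.1, 500, 200
G = np.full(n_paths, 8.0)             # all paths start above the basal level
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)   # Wiener increments
    G = G + (-k * (G - Gb)) * dt + sigma * dW

mean_final, sd_final = G.mean(), G.std()
```

The ensemble spread (`sd_final`) is exactly the prediction uncertainty an ODE model cannot express: an ODE would return only the mean trajectory, while the SDE's diffusion term quantifies how far individual realizations scatter around it.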

  12. Comment on "Advective transport in heterogeneous aquifers: Are proxy models predictive?" by A. Fiori, A. Zarlenga, H. Gotovac, I. Jankovic, E. Volpi, V. Cvetkovic, and G. Dagan

    NASA Astrophysics Data System (ADS)

    Neuman, Shlomo P.

    2016-07-01

    Fiori et al. (2015) examine the predictive capabilities of (among others) two "proxy" non-Fickian transport models, MRMT (Multi-Rate Mass Transfer) and CTRW (Continuous-Time Random Walk). In particular, they compare proxy model predictions of mean breakthrough curves (BTCs) at a sequence of control planes with near-ergodic BTCs generated through two- and three-dimensional simulations of nonreactive, mean-uniform advective transport in single realizations of stationary, randomly heterogeneous porous media. The authors find fitted proxy model parameters to be nonunique and devoid of clear physical meaning. This notwithstanding, they conclude optimistically that "i. Fitting the proxy models to match the BTC at [one control plane] automatically ensures prediction at downstream control planes [and thus] ii. … the measured BTC can be used directly for prediction, with no need to use models underlain by fitting." I show that (a) the authors' findings follow directly from (and thus confirm) theoretical considerations discussed earlier by Neuman and Tartakovsky (2009), which (b) additionally demonstrate that proxy models will lack similar predictive capabilities under more realistic, non-Markovian flow and transport conditions that prevail under flow through nonstationary (e.g., multiscale) media in the presence of boundaries and/or nonuniformly distributed sources, and/or when flow/transport are conditioned on measurements.

  13. Cognitive models of risky choice: parameter stability and predictive accuracy of prospect theory.

    PubMed

    Glöckner, Andreas; Pachur, Thorsten

    2012-04-01

    In the behavioral sciences, a popular approach to describe and predict behavior is cognitive modeling with adjustable parameters (i.e., which can be fitted to data). Modeling with adjustable parameters allows, among other things, measuring differences between people. At the same time, parameter estimation also bears the risk of overfitting. Are individual differences as measured by model parameters stable enough to improve the ability to predict behavior as compared to modeling without adjustable parameters? We examined this issue in cumulative prospect theory (CPT), arguably the most widely used framework to model decisions under risk. Specifically, we examined (a) the temporal stability of CPT's parameters; and (b) how well different implementations of CPT, varying in the number of adjustable parameters, predict individual choice relative to models with no adjustable parameters (such as CPT with fixed parameters, expected value theory, and various heuristics). We presented participants with risky choice problems and fitted CPT to each individual's choices in two separate sessions (which were 1 week apart). All parameters were correlated across time, in particular when using a simple implementation of CPT. CPT allowing for individual variability in parameter values predicted individual choice better than CPT with fixed parameters, expected value theory, and the heuristics. CPT's parameters thus seem to pick up stable individual differences that need to be considered when predicting risky choice. Copyright © 2011 Elsevier B.V. All rights reserved.
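
For concreteness, the standard Tversky-Kahneman (1992) CPT value and probability-weighting functions can be written down directly. The parameter defaults below are the oft-cited median estimates from that paper, not the values fitted in this study.

```python
# Cumulative prospect theory building blocks (Tversky & Kahneman, 1992 form).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# CPT value of a simple gamble: win 100 with p = 0.5, else nothing
cpt = weight(0.5) * value(100.0)
```

Each of alpha, beta, lam, and gamma is an adjustable parameter in the article's sense: fixing them yields a zero-parameter CPT, while fitting them per participant captures the stable individual differences the study reports.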

  14. Predictive vs. Empiric Assessment of Schistosomiasis: Implications for Treatment Projections in Ghana

    PubMed Central

    Kabore, Achille; Biritwum, Nana-Kwadwo; Downs, Philip W.; Soares Magalhaes, Ricardo J.; Zhang, Yaobi; Ottesen, Eric A.

    2013-01-01

    Background Mapping the distribution of schistosomiasis is essential to determine where control programs should operate, but because it is impractical to assess infection prevalence in every potentially endemic community, model-based geostatistics (MBG) is increasingly being used to predict prevalence and determine intervention strategies. Methodology/Principal Findings To assess the accuracy of MBG predictions for Schistosoma haematobium infection in Ghana, school surveys were evaluated at 79 sites to yield empiric prevalence values that could be compared with values derived from recently published MBG predictions. Based on these findings, schools were categorized according to WHO guidelines so that the practical implications of any differences could be determined. Using the mean predicted values alone, 21 of the 25 empirically determined ‘high-risk’ schools requiring yearly praziquantel would have been undertreated, and almost 20% of the remaining schools would have been treated despite an empirically determined absence of infection – translating into 28% of the children in the 79 schools being undertreated and 12% receiving treatment in the absence of any demonstrated need. Conclusions/Significance Using the current predictive map for Ghana as a spatial decision support tool by aggregating prevalence estimates to the district level was clearly not adequate for guiding the national program, but the alternative of assessing each school in potentially endemic areas of Ghana or elsewhere is not at all feasible; modelling must be a tool complementary to empiric assessments. Thus, for practical usefulness, predictive risk mapping should not be thought of as a one-time exercise but must, as in the current study, be an iterative process that incorporates empiric testing and model refining to create updated versions that meet the needs of disease control operational managers. PMID:23505584
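
The predicted-versus-empiric comparison amounts to categorizing each school twice and counting disagreements. The sketch below uses the commonly cited WHO-style thresholds (≥50% high, 10-50% moderate, <10% low); these cut-offs and the example prevalence values are illustrative assumptions and should be checked against current WHO guidelines before any operational use.

```python
def risk_category(prevalence):
    """WHO-style schistosomiasis risk class from school prevalence (assumed
    thresholds: >=50% high, 10-50% moderate, <10% low)."""
    if prevalence >= 0.50:
        return "high"       # yearly praziquantel
    if prevalence >= 0.10:
        return "moderate"
    return "low"

def misclassified(predicted, empiric):
    """Count schools whose predicted category disagrees with the empiric one."""
    return sum(risk_category(p) != risk_category(e)
               for p, e in zip(predicted, empiric))

# Hypothetical (predicted, empiric) prevalence pairs for four schools
pred = [0.55, 0.30, 0.05, 0.12]
emp  = [0.48, 0.33, 0.00, 0.08]
n_wrong = misclassified(pred, emp)
```

Because the treatment decision is categorical, a modest numeric prediction error near a threshold (0.55 vs. 0.48 above) flips the entire intervention for that school, which is exactly the failure mode the study documents.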

  15. Overview of meteorological measurements for aerial spray modeling.

    PubMed

    Rafferty, J E; Biltoft, C A; Bowers, J F

    1996-06-01

    The routine meteorological observations made by the National Weather Service have a spatial resolution on the order of 1,000 km, whereas the resolution needed to conduct or model aerial spray applications is on the order of 1-10 km. Routinely available observations also do not include the detailed information on the turbulence and thermal structure of the boundary layer that is needed to predict the transport, dispersion, and deposition of aerial spray releases. This paper provides an overview of the information needed to develop the meteorological inputs for an aerial spray model such as the FSCBG and discusses the different types of instruments that are available to make the necessary measurements.

  16. Analysis of spatial correlation in predictive models of forest variables that use LiDAR auxiliary information

    Treesearch

    F. Mauro; Vicente J. Monleon; H. Temesgen; L.A. Ruiz

    2017-01-01

    Accounting for spatial correlation of LiDAR model errors can improve the precision of model-based estimators. To estimate spatial correlation, sample designs that provide close observations are needed, but their implementation might be prohibitively expensive. To quantify the gains obtained by accounting for the spatial correlation of model errors, we examined (

  17. Neural network model for growth of Salmonella serotypes in ground chicken subjected to temperature abuse during cold storage for application in HACCP and risk assessment

    USDA-ARS?s Scientific Manuscript database

    With the advent of commercial software applications, it is now easy to develop neural network models for predictive microbiology applications. However, different versions of the model may be required to meet the divergent needs of model users. In the current study, the commercial software applicat...

  18. Simpler score of routine laboratory tests predicts liver fibrosis in patients with chronic hepatitis B.

    PubMed

    Zhou, Kun; Gao, Chun-Fang; Zhao, Yun-Peng; Liu, Hai-Lin; Zheng, Rui-Dan; Xian, Jian-Chun; Xu, Hong-Tao; Mao, Yi-Min; Zeng, Min-De; Lu, Lun-Gen

    2010-09-01

    In recent years, great interest has been devoted to the development of noninvasive predictive models to substitute for liver biopsy in fibrosis assessment and follow-up. Our aim was to provide a simpler model consisting of routine laboratory markers for predicting liver fibrosis in patients chronically infected with hepatitis B virus (HBV) in order to optimize their clinical management. Liver fibrosis was staged in 386 chronic HBV carriers who underwent liver biopsy and routine laboratory testing. Correlations between routine laboratory markers and fibrosis stage were statistically assessed. After logistic regression analysis, a novel predictive model was constructed. This S index was validated in an independent cohort of 146 chronic HBV carriers in comparison with the SLFG model, FibroMeter, Hepascore, the Hui model, the Forns score and APRI using receiver operating characteristic (ROC) curves. The diagnostic value of each marker panel was better than that of single routine laboratory markers. The S index, consisting of gamma-glutamyltransferase (GGT), platelets (PLT) and albumin (ALB) (S index = 1000 × GGT / (PLT × ALB²)), had a higher diagnostic accuracy in predicting the degree of fibrosis than any other mathematical model tested. The areas under the ROC curves (AUROC) were 0.812 and 0.890 for predicting significant fibrosis and cirrhosis in the validation cohort, respectively. The S index, a simpler mathematical model consisting of routine laboratory markers, predicts significant fibrosis and cirrhosis in patients with chronic HBV infection with a high degree of accuracy, potentially decreasing the need for liver biopsy.
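
The S index formula from the abstract is simple enough to compute directly. The units below are assumptions based on usual laboratory conventions (GGT in U/L, platelets in 10^9/L, albumin in g/L) and should be confirmed against the original paper before use.

```python
def s_index(ggt, plt, alb):
    """S index = 1000 * GGT / (PLT * ALB^2), per the formula in the abstract.
    Assumed units: GGT in U/L, platelets in 10^9/L, albumin in g/L."""
    return 1000.0 * ggt / (plt * alb ** 2)

# Example: GGT 100 U/L, platelets 200 x 10^9/L, albumin 40 g/L
s = s_index(100, 200, 40)
```

The squared albumin term makes the score highly sensitive to hypoalbuminemia, consistent with falling albumin as synthetic liver function declines in advanced fibrosis.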

  19. UAV Trajectory Modeling Using Neural Networks

    NASA Technical Reports Server (NTRS)

    Xue, Min

    2017-01-01

    Massive numbers of small unmanned aerial vehicles are envisioned to operate in the near future. While many research problems need to be addressed before dense operations can happen, trajectory modeling remains one of the keys to understanding and developing the policies, regulations, and requirements for safe and efficient unmanned aerial vehicle operations. The fidelity requirement for a small unmanned vehicle trajectory model is high because these vehicles are sensitive to winds due to their small size and low operational altitude. Both vehicle control systems and dynamic models are needed for trajectory modeling, which makes the modeling a great challenge, especially since manufacturers are not willing to share their control systems. This work proposed a neural network approach for modeling a small unmanned vehicle's trajectory without knowing its control system, bypassing exhaustive efforts at aerodynamic parameter identification. As a proof of concept, instead of collecting data from flight tests, this work used trajectory data generated by a mathematical vehicle model for training and testing the neural network. The results showed great promise: the trained neural network can predict 4D trajectories accurately, with prediction errors of less than 2.0 meters in both temporal and spatial dimensions.
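
A minimal sketch of the idea: train a one-hidden-layer network on synthetic trajectory data with plain NumPy, learning next-position prediction from the current state. The state layout, time step, and network size here are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic training data: state [x, y, vx, vy] -> next position after dt
X = rng.normal(size=(256, 4))
dt = 0.1
Y = X[:, :2] + dt * X[:, 2:]          # ground-truth next position

# One-hidden-layer network trained by full-batch gradient descent on MSE
W1 = rng.normal(0, 0.1, size=(4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, size=(16, 2)); b2 = np.zeros(2)
lr = 0.1
losses = []
for _ in range(1000):
    H = np.tanh(X @ W1 + b1)          # forward pass
    P = H @ W2 + b2
    err = P - Y
    losses.append(float((err ** 2).mean()))
    dP = 2 * err / len(X)             # backpropagation of the MSE loss
    dW2 = H.T @ dP; db2 = dP.sum(0)
    dH = dP @ W2.T * (1 - H ** 2)     # tanh derivative
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The appeal matches the abstract's motivation: the network learns the input-output map directly from trajectories, with no access to the control system or aerodynamic parameters that generated them.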

  20. Predicting Survival of De Novo Metastatic Breast Cancer in Asian Women: Systematic Review and Validation Study

    PubMed Central

    Miao, Hui; Hartman, Mikael; Bhoo-Pathy, Nirmala; Lee, Soo-Chin; Taib, Nur Aishah; Tan, Ern-Yu; Chan, Patrick; Moons, Karel G. M.; Wong, Hoong-Seam; Goh, Jeremy; Rahim, Siti Mastura; Yip, Cheng-Har; Verkooijen, Helena M.

    2014-01-01

    Background In Asia, up to 25% of breast cancer patients present with distant metastases at diagnosis. Given the heterogeneous survival probabilities of de novo metastatic breast cancer, individual outcome prediction is challenging. The aim of the study is to identify existing prognostic models for patients with de novo metastatic breast cancer and validate them in Asia. Materials and Methods We performed a systematic review to identify prediction models for metastatic breast cancer. Models were validated in 642 women with de novo metastatic breast cancer registered between 2000 and 2010 in the Singapore Malaysia Hospital Based Breast Cancer Registry. Survival curves for low, intermediate and high-risk groups according to each prognostic score were compared by log-rank test and discrimination of the models was assessed by concordance statistic (C-statistic). Results We identified 16 prediction models, seven of which were for patients with brain metastases only. Performance status, estrogen receptor status, metastatic site(s) and disease-free interval were the most common predictors. We were able to validate nine prediction models. The capacity of the models to discriminate between poor and good survivors varied from poor to fair with C-statistics ranging from 0.50 (95% CI, 0.48–0.53) to 0.63 (95% CI, 0.60–0.66). Conclusion The discriminatory performance of existing prediction models for de novo metastatic breast cancer in Asia is modest. Development of an Asian-specific prediction model is needed to improve prognostication and guide decision making. PMID:24695692
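
The C-statistic used to assess discrimination can be sketched as a pairwise concordance count. This is an illustrative implementation of Harrell's C, not the registry analysis; tied survival times are simply skipped here rather than handled with the full tie-correction rules.

```python
def c_statistic(risk, time, event):
    """Harrell's C: among comparable pairs, the fraction where the patient
    with the higher risk score has the shorter survival (event=1 means the
    follow-up ended in death rather than censoring)."""
    concordant = ties = comparable = 0
    for i in range(len(risk)):
        for j in range(i + 1, len(risk)):
            ti, ei, ri = time[i], event[i], risk[i]
            tj, ej, rj = time[j], event[j], risk[j]
            if ti == tj:
                continue                          # tied times skipped in this sketch
            if ti > tj:                           # make (ti, ei, ri) the shorter follow-up
                ti, ei, ri, tj, ej, rj = tj, ej, rj, ti, ei, ri
            if not ei:                            # shorter time censored: not comparable
                continue
            comparable += 1
            if ri > rj:
                concordant += 1
            elif ri == rj:
                ties += 1
    return (concordant + 0.5 * ties) / comparable

# Hypothetical cohort where risk ordering matches survival perfectly
risk  = [0.9, 0.7, 0.4, 0.2]
time  = [2.0, 5.0, 8.0, 10.0]
event = [1, 1, 0, 1]
c = c_statistic(risk, time, event)
```

A value of 0.5 means the score discriminates no better than chance, which puts the study's reported range of 0.50-0.63 in context: the validated models were at best modestly better than a coin flip.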
