Sample records for "accurately predict future"

  1. Predicting and Supplying Human Resource Requirements for the Future.

    ERIC Educational Resources Information Center

    Blake, Larry J.

    After asserting that public institutions should not provide training for nonexistent jobs, this paper reviews problems associated with the accurate prediction of future manpower needs. The paper reviews the processes currently used to project labor force needs and notes the difficulty of accurately forecasting labor market "surprises,"…

  2. Mental models accurately predict emotion transitions.

    PubMed

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.
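The core measurement in studies 1-3, estimating how often one emotion follows another, reduces to counting transitions in an experience-sampling sequence. A minimal sketch (the emotion labels and the toy sequence are invented for illustration, not taken from the datasets):

```python
def transition_matrix(seq, states):
    """Estimate P(next emotion | current emotion) by counting adjacent pairs."""
    counts = {a: {b: 0 for b in states} for a in states}
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    probs = {}
    for a in states:
        total = sum(counts[a].values())
        probs[a] = {b: (counts[a][b] / total if total else 0.0) for b in states}
    return probs

# Invented toy sequence standing in for an experience-sampling record:
states = ["happy", "calm", "sad"]
observed = ["happy", "happy", "calm", "sad", "calm",
            "happy", "calm", "calm", "sad", "sad"]
P = transition_matrix(observed, states)
```

Participants' rated likelihoods could then be compared entrywise against such an empirical matrix (e.g., by correlation), which is essentially the accuracy measure the studies report.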

  3. Remaining dischargeable time prediction for lithium-ion batteries using unscented Kalman filter

    NASA Astrophysics Data System (ADS)

    Dong, Guangzhong; Wei, Jingwen; Chen, Zonghai; Sun, Han; Yu, Xiaowei

    2017-10-01

    To overcome range anxiety, one important strategy is to accurately predict the range or dischargeable time of the battery system. To accurately predict the remaining dischargeable time (RDT) of a battery, an RDT prediction framework based on accurate battery modeling and state estimation is presented in this paper. Firstly, a simplified linearized equivalent-circuit model is developed to simulate the dynamic characteristics of a battery. Then, an online recursive least-squares method and an unscented Kalman filter are employed to estimate the system matrices and the state of charge (SOC) at every prediction point. Besides, a discrete wavelet transform technique is employed to capture statistical information about the past dynamics of the input current, which is utilized to predict future battery currents. Finally, the RDT can be predicted based on the battery model, the SOC estimation results, and the predicted future battery currents. The performance of the proposed methodology has been verified on a lithium-ion battery cell. Experimental results indicate that the proposed method provides accurate SOC and parameter estimates, and that the predicted RDT can help address range anxiety.
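The final step of the framework, turning an SOC estimate and a forecast current profile into an RDT, reduces to coulomb counting over the forecast. A minimal sketch assuming the Kalman-filter and least-squares stages have already produced the SOC and capacity (the cutoff and cell parameters below are invented):

```python
from itertools import repeat

def remaining_dischargeable_time(soc, capacity_ah, future_current_a,
                                 dt_s=1.0, soc_cutoff=0.05):
    """Integrate the predicted current profile until SOC falls to the cutoff.

    Returns the predicted RDT in seconds (coulomb counting on the forecast).
    """
    t = 0.0
    for i_a in future_current_a:
        if soc <= soc_cutoff:
            break
        soc -= i_a * dt_s / 3600.0 / capacity_ah
        t += dt_s
    return t

# Hypothetical 2 Ah cell at 50% SOC under a constant 2 A predicted draw:
rdt = remaining_dischargeable_time(0.5, 2.0, repeat(2.0))
```

With 0.45 of SOC usable above the cutoff, the 2 A draw empties it in 0.45 h, i.e. about 1620 s.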

  4. Mental models accurately predict emotion transitions

    PubMed Central

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373

  5. Highly accurate prediction of emotions surrounding the attacks of September 11, 2001 over 1-, 2-, and 7-year prediction intervals.

    PubMed

    Doré, Bruce P; Meksin, Robert; Mather, Mara; Hirst, William; Ochsner, Kevin N

    2016-06-01

    In the aftermath of a national tragedy, important decisions are predicated on judgments of the emotional significance of the tragedy in the present and future. Research in affective forecasting has largely focused on ways in which people fail to make accurate predictions about the nature and duration of feelings experienced in the aftermath of an event. Here we ask a related but understudied question: can people forecast how they will feel in the future about a tragic event that has already occurred? We found that people were strikingly accurate when predicting how they would feel about the September 11 attacks over 1-, 2-, and 7-year prediction intervals. Although people slightly under- or overestimated their future feelings at times, they nonetheless showed high accuracy in forecasting (a) the overall intensity of their future negative emotion, and (b) the relative degree of different types of negative emotion (i.e., sadness, fear, or anger). Using a path model, we found that the relationship between forecasted and actual future emotion was partially mediated by current emotion and remembered emotion. These results extend theories of affective forecasting by showing that emotional responses to an event of ongoing national significance can be predicted with high accuracy, and by identifying current and remembered feelings as independent sources of this accuracy. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Highly accurate prediction of emotions surrounding the attacks of September 11, 2001 over 1-, 2-, and 7-year prediction intervals

    PubMed Central

    Doré, B.P.; Meksin, R.; Mather, M.; Hirst, W.; Ochsner, K.N

    2016-01-01

    In the aftermath of a national tragedy, important decisions are predicated on judgments of the emotional significance of the tragedy in the present and future. Research in affective forecasting has largely focused on ways in which people fail to make accurate predictions about the nature and duration of feelings experienced in the aftermath of an event. Here we ask a related but understudied question: can people forecast how they will feel in the future about a tragic event that has already occurred? We found that people were strikingly accurate when predicting how they would feel about the September 11 attacks over 1-, 2-, and 7-year prediction intervals. Although people slightly under- or overestimated their future feelings at times, they nonetheless showed high accuracy in forecasting 1) the overall intensity of their future negative emotion, and 2) the relative degree of different types of negative emotion (i.e., sadness, fear, or anger). Using a path model, we found that the relationship between forecasted and actual future emotion was partially mediated by current emotion and remembered emotion. These results extend theories of affective forecasting by showing that emotional responses to an event of ongoing national significance can be predicted with high accuracy, and by identifying current and remembered feelings as independent sources of this accuracy. PMID:27100309

  7. A Battery Health Monitoring Framework for Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2014-01-01

    Batteries have seen increased use as the primary energy source in electric ground and air vehicles for commercial, military, and space applications. An important aspect of using batteries in such contexts is battery health monitoring. Batteries must be carefully monitored so that battery health can be determined, and end-of-discharge and end-of-usable-life events can be accurately predicted. For planetary rovers, battery health estimation and prediction is critical to mission planning and decision-making. We develop a model-based approach utilizing computationally efficient and accurate electrochemistry models of batteries. An unscented Kalman filter yields state estimates, which are then used to predict the future behavior of the batteries and, specifically, end of discharge. The prediction algorithm accounts for possible future power demands on the rover batteries in order to provide meaningful results and an accurate representation of prediction uncertainty. The framework is demonstrated on a set of lithium-ion batteries powering a rover at NASA.
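The paper's emphasis on representing prediction uncertainty can be illustrated by Monte Carlo sampling over hypothesized future power demands, each sample yielding one end-of-discharge time. This is a simplified stand-in for the authors' method (each sample holds the current constant, and the cell parameters are invented):

```python
import random

def end_of_discharge_time(soc, capacity_ah, current_a,
                          soc_cutoff=0.05, dt_s=1.0):
    """Seconds until SOC reaches the cutoff under a constant current draw."""
    t = 0.0
    while soc > soc_cutoff:
        soc -= current_a * dt_s / 3600.0 / capacity_ah
        t += dt_s
    return t

def eod_distribution(soc, capacity_ah, sample_demand, n=200):
    """Monte Carlo over sampled future demands -> sorted EOD times."""
    return sorted(end_of_discharge_time(soc, capacity_ah, sample_demand())
                  for _ in range(n))

random.seed(0)
times = eod_distribution(0.5, 2.0, lambda: random.uniform(1.0, 3.0))
p5, p95 = times[int(0.05 * len(times))], times[int(0.95 * len(times))]
```

The (p5, p95) interval is one simple way to report the prediction uncertainty that the framework propagates from uncertain future loads.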

  8. Predicting Future Reconviction in Offenders with Intellectual Disabilities: The Predictive Efficacy of VRAG, PCL-SV, and the HCR-20

    ERIC Educational Resources Information Center

    Gray, Nicola S.; Fitzgerald, Suzanne; Taylor, John; MacCulloch, Malcolm J.; Snowden, Robert J.

    2007-01-01

    Accurate predictions of future reconviction, including those for violent crimes, have been shown to be greatly aided by the use of formal risk assessment instruments. However, it is unclear as to whether these instruments would also be predictive in a sample of offenders with intellectual disabilities. In this study, the authors have shown that…

  9. Improving medical decisions for incapacitated persons: does focusing on "accurate predictions" lead to an inaccurate picture?

    PubMed

    Kim, Scott Y H

    2014-04-01

    The Patient Preference Predictor (PPP) proposal places a high priority on the accuracy of predicting patients' preferences and finds the performance of surrogates inadequate. However, the quest to develop a highly accurate, individualized statistical model faces significant obstacles. First, it will be impossible to validate the PPP beyond the limit imposed by the 60%-80% reliability of people's preferences for future medical decisions--a figure no better than the known average accuracy of surrogates. Second, evidence supports the view that a sizable minority of persons may not even have preferences to predict. Third, many, perhaps most, people express their autonomy just as much by entrusting their loved ones to exercise their judgment as by desiring to specifically control future decisions. Surrogate decision making faces none of these issues and, in fact, may be more efficient, accurate, and authoritative than is commonly assumed.

  10. Saccades to future ball location reveal memory-based prediction in a virtual-reality interception task.

    PubMed

    Diaz, Gabriel; Cooper, Joseph; Rothkopf, Constantin; Hayhoe, Mary

    2013-01-16

    Despite general agreement that prediction is a central aspect of perception, there is relatively little evidence concerning the basis on which visual predictions are made. Although both saccadic and pursuit eye movements reveal knowledge of the future position of a moving visual target, in many of these studies targets move along simple trajectories through a fronto-parallel plane. Here, using a naturalistic and racquet-based interception task in a virtual environment, we demonstrate that subjects make accurate predictions of visual target motion, even when targets follow trajectories determined by the complex dynamics of physical interactions and the head and body are unrestrained. Furthermore, we found that, following a change in ball elasticity, subjects were able to accurately adjust their pre-bounce predictions of the ball's post-bounce trajectory. This suggests that prediction is guided by experience-based models of how information in the visual image will change over time.

  11. Saccades to future ball location reveal memory-based prediction in a virtual-reality interception task

    PubMed Central

    Diaz, Gabriel; Cooper, Joseph; Rothkopf, Constantin; Hayhoe, Mary

    2013-01-01

    Despite general agreement that prediction is a central aspect of perception, there is relatively little evidence concerning the basis on which visual predictions are made. Although both saccadic and pursuit eye movements reveal knowledge of the future position of a moving visual target, in many of these studies targets move along simple trajectories through a fronto-parallel plane. Here, using a naturalistic and racquet-based interception task in a virtual environment, we demonstrate that subjects make accurate predictions of visual target motion, even when targets follow trajectories determined by the complex dynamics of physical interactions and the head and body are unrestrained. Furthermore, we found that, following a change in ball elasticity, subjects were able to accurately adjust their pre-bounce predictions of the ball's post-bounce trajectory. This suggests that prediction is guided by experience-based models of how information in the visual image will change over time. PMID:23325347

  12. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random models. It has been shown that these systems exhibit deterministic chaos. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear random or quasi-periodic. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems over longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short-term predictions, given that the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short-term prediction. Detection of normality in deterministically chaotic systems is critical to understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allow for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detecting anomalous behavior, and more accurately predicting future states of the system. Additionally, the ability to detect subtle system state changes is discussed.
The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. Presented is an overview of chaotic systems with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology which utilizes Lyapunov exponents to facilitate longer term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These novel methodologies are then demonstrated on applications such as wind energy, cyber security and classification of social networks.
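The claim that sensitive dependence limits prediction to short horizons can be made concrete with the largest Lyapunov exponent: an initial error ε grows roughly as ε·e^(λt), so the usable prediction horizon scales as log(1/ε)/λ. A sketch using the logistic map, whose exponent at r = 4 is known analytically to be ln 2 (the seed and iteration counts are arbitrary choices):

```python
import math

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def largest_lyapunov(x0=0.2, r=4.0, n=100_000, burn=1_000):
    """Average log of |f'(x)| along the orbit, with f'(x) = r(1 - 2x)."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = logistic(x, r)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = logistic(x, r)
    return total / n

lam = largest_lyapunov()           # theory gives ln 2 ~ 0.693 for r = 4
horizon = math.log(1e6) / lam      # steps for a 1e-6 error to grow to O(1)
```

This is the sense in which a positive exponent caps long-term accuracy while still permitting the accurate short-term predictions the abstract describes.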

  13. Characterizing Ship Navigation Patterns Using Automatic Identification System (AIS) Data in the Baltic Sea

    DTIC Science & Technology

    in the Saint Petersburg area. We use three random forest models that differ in their use of past information to predict a vessel's next port of visit... network where past information is used to more accurately predict the future state. The transitional probabilities change when predictor variables are... added that reach deeper into the past. Our findings suggest that successful prediction of the movement of a vessel depends on having accurate information on its recent history.
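The observation that transition probabilities change as predictors reach deeper into the past can be illustrated with conditional next-port counts of increasing order (the study itself used random forests; the port codes and voyage sequences below are invented):

```python
from collections import Counter, defaultdict

def next_port_counts(voyages, order=1):
    """Count next-port outcomes conditioned on the last `order` ports visited."""
    counts = defaultdict(Counter)
    for seq in voyages:
        for i in range(len(seq) - order):
            counts[tuple(seq[i:i + order])][seq[i + order]] += 1
    return counts

# Invented port calls (HEL = Helsinki, TAL = Tallinn, SPB = Saint Petersburg):
voyages = [
    ["HEL", "TAL", "SPB", "HEL"],
    ["TAL", "SPB", "TAL", "HEL"],
    ["HEL", "SPB", "HEL", "TAL"],
]
m1 = next_port_counts(voyages, order=1)   # one port of history
m2 = next_port_counts(voyages, order=2)   # two ports of history
```

With one port of history, SPB is followed by HEL twice and TAL once; conditioning on two ports redistributes those counts, which is the order-dependence the abstract describes.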

  14. The surprising power of neighborly advice.

    PubMed

    Gilbert, Daniel T; Killingsworth, Matthew A; Eyre, Rebecca N; Wilson, Timothy D

    2009-03-20

    Two experiments revealed that (i) people can more accurately predict their affective reactions to a future event when they know how a neighbor in their social network reacted to the event than when they know about the event itself and (ii) people do not believe this. Undergraduates made more accurate predictions about their affective reactions to a 5-minute speed date (n = 25) and to a peer evaluation (n = 88) when they knew only how another undergraduate had reacted to these events than when they had information about the events themselves. Both participants and independent judges mistakenly believed that predictions based on information about the event would be more accurate than predictions based on information about how another person had reacted to it.

  15. External validation of a simple clinical tool used to predict falls in people with Parkinson disease

    PubMed Central

    Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.

    2015-01-01

    BACKGROUND: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time-consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. METHODS: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. RESULTS: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. CONCLUSION: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual’s risk of an impending fall. PMID:26003412
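The tool scores three dichotomous predictors, and the AUC of such a score is the probability that a randomly chosen faller outscores a randomly chosen non-faller (ties counted half). A sketch with invented patients; the 0.83 figure comes from the study's real cohort, not from this toy data:

```python
def fall_risk_score(fell_last_year, froze_last_month, gait_speed_ms):
    """0-3 score: one point per predictor (gait velocity < 1.1 m/s)."""
    return int(fell_last_year) + int(froze_last_month) + int(gait_speed_ms < 1.1)

def auc(pos_scores, neg_scores):
    """P(random faller outscores random non-faller), ties counted half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented patients: (fell last year, freezing last month, gait speed m/s)
fallers = [fall_risk_score(*x) for x in [(1, 1, 0.9), (1, 0, 1.0), (0, 1, 1.2)]]
non_fallers = [fall_risk_score(*x) for x in [(0, 0, 1.3), (0, 0, 1.0), (1, 0, 1.2)]]
```

This rank-based AUC is equivalent to the Mann-Whitney statistic, which is why a three-level score can still be evaluated against prospective fall outcomes.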

  16. External validation of a simple clinical tool used to predict falls in people with Parkinson disease.

    PubMed

    Duncan, Ryan P; Cavanaugh, James T; Earhart, Gammon M; Ellis, Terry D; Ford, Matthew P; Foreman, K Bo; Leddy, Abigail L; Paul, Serene S; Canning, Colleen G; Thackeray, Anne; Dibble, Leland E

    2015-08-01

    Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Predicting future forestland area: a comparison of econometric approaches.

    Treesearch

    SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig

    2000-01-01

    Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county and time-series observation for the period 1964 to 1992. We estimate models using restricted data sets-namely,...

  18. Hepatic venous pressure gradient after portal vein embolization: An accurate predictor of future liver remnant hypertrophy.

    PubMed

    Mohkam, Kayvan; Rode, Agnès; Darnis, Benjamin; Manichon, Anne-Frédérique; Boussel, Loïc; Ducerf, Christian; Merle, Philippe; Lesurtel, Mickaël; Mabrut, Jean-Yves

    2018-05-09

    The impact of portal hemodynamic variations after portal vein embolization on liver regeneration remains unknown. We studied the correlation between the parameters of hepatic venous pressure measured before and after portal vein embolization and future hypertrophy of the liver remnant after portal vein embolization. Between 2014 and 2017, we reviewed patients who were eligible for major hepatectomy and who had portal vein embolization. Patients had undergone simultaneous measurement of portal venous pressure and hepatic venous pressure gradient before and after portal vein embolization by direct puncture of portal vein and inferior vena cava. We assessed these parameters to predict future liver remnant hypertrophy. Twenty-six patients were included. After portal vein embolization, median portal venous pressure (range) increased from 15 (9-24) to 19 (10-27) mm Hg and hepatic venous pressure gradient increased from 5 (0-12) to 8 (0-14) mm Hg. Median future liver remnant volume (range) was 513 (299-933) mL before portal vein embolization versus 724 (499-1279) mL 3 weeks after portal vein embolization, representing a 35% (7.4-83.6) median hypertrophy. Post-portal vein embolization hepatic venous pressure gradient was the most accurate parameter to predict failure of future liver remnant to reach a 30% hypertrophy (c-statistic: 0.882 [95% CI: 0.727-1.000], P < 0.001). A cut-off value of post-portal vein embolization hepatic venous pressure gradient of 8 mm Hg showed a sensitivity of 91% (95% CI: 57%-99%), specificity of 80% (95% CI: 52%-96%), positive predictive value of 77% (95% CI: 46%-95%) and negative predictive value of 92.3% (95% CI: 64.0%-99.8%). On multivariate analysis, post-portal vein embolization hepatic venous pressure gradient and previous chemotherapy were identified as predictors of impaired future liver remnant hypertrophy. 
Post-portal vein embolization hepatic venous pressure gradient is a simple and reproducible tool which accurately predicts future liver remnant hypertrophy after portal vein embolization and allows early detection of patients who may benefit from more aggressive procedures inducing future liver remnant hypertrophy. (Surgery 2018;143:1-2.). Copyright © 2018 Elsevier Inc. All rights reserved.
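The reported operating characteristics of the 8 mm Hg cut-off follow from a standard 2 × 2 table. The sketch below shows the arithmetic; the cell counts are back-calculated from the reported percentages (n = 26, 11 patients failing to reach 30% hypertrophy) and may not match the study's exact table:

```python
def diagnostics(tp, fp, fn, tn):
    """Standard operating characteristics of a binary cut-off."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Illustrative cell counts consistent with the reported 91%/80%/77%/92.3%:
d = diagnostics(tp=10, fp=3, fn=1, tn=12)
```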

  19. Multi-scale predictions of coniferous forest mortality in the northern hemisphere

    NASA Astrophysics Data System (ADS)

    McDowell, N. G.

    2015-12-01

    Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our incomplete understanding of the fundamental physiological thresholds of vegetation mortality during drought limits our ability to accurately simulate future vegetation distributions and associated climate feedbacks. Here we integrate experimental evidence with models to show potential widespread loss of needleleaf evergreen trees (NET; ~ conifers) within the Southwest USA by 2100, with rising temperature the primary cause of mortality. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April-August mean) beyond which photosynthesis, stomatal and hydraulic conductance, and carbohydrate availability approached zero. Empirical and mechanistic models accurately predicted NET Ψpd, and 91% of predictions (10/11) exceeded mortality thresholds within the 21st century due to temperature rise. Completely independent global models predicted >50% loss of northern hemisphere NET by 2100, consistent with the findings for the Southwest USA. The global models disagreed with the ecosystem process models regarding future mortality in the Southwest USA, however, highlighting potential underestimates of future NET mortality as simulated by the global models and signifying the importance of improving regional predictions. Taken together, the validated regional predictions and the global simulations predict global-scale conifer loss in coming decades under projected global warming.

  20. Do We Need Better Climate Predictions to Adapt to a Changing Climate? (Invited)

    NASA Astrophysics Data System (ADS)

    Dessai, S.; Hulme, M.; Lempert, R.; Pielke, R., Jr.

    2009-12-01

    Based on a series of international scientific assessments, climate change has been presented to society as a major problem that needs urgently to be tackled. The science that underpins these assessments has been predominantly from the realm of the natural sciences, and central to this framing have been ‘projections’ of future climate change (and its impacts on environment and society) under various greenhouse gas emissions scenarios and using a variety of climate model predictions with embedded assumptions. Central to much of the discussion surrounding adaptation to climate change is the claim - explicit or implicit - that decision makers need accurate and increasingly precise assessments of future impacts of climate change in order to adapt successfully. If true, this claim places a high premium on accurate and precise climate predictions at a range of geographical and temporal scales; such predictions therefore become indispensable to, and indeed a prerequisite for, effective adaptation decision-making. But is effective adaptation tied to the ability of the scientific enterprise to predict future climate with accuracy and precision? If so, this may impose a serious and intractable limit on adaptation. This paper proceeds in three sections. It first gathers evidence of claims that climate prediction is necessary for adaptation decision-making. This evidence is drawn from peer-reviewed literature and from published science funding strategies and government policy in a number of different countries. The second part discusses the challenges of climate prediction and why science will consistently be unable to provide accurate and precise predictions of future climate relevant for adaptation (usually at the local/regional level). Section three discusses whether these limits to future foresight represent a limit to adaptation, arguing that effective adaptation need not be limited by a general inability to predict future climate.
Given the deep uncertainties involved in climate prediction (and even more so in the prediction of climate impacts) and given that climate is usually only one factor in decisions aimed at climate adaptation, we conclude that the ‘predict and provide’ approach to science in support of climate change adaptation is largely flawed. We consider other important areas of public policy fraught with uncertainty - e.g. earthquake risk, national security, public health - where such a ‘predict and provide’ approach is not attempted. Instead of relying on an approach which has climate prediction (and consequent risk assessment) at its heart - which because of the associated epistemological limits to prediction will consequently act as an apparent limit to adaptation - we need to view adaptation differently, in a manner that opens up options for decision making under uncertainty. We suggest an approach which examines the robustness of adaptation strategies/policies/activities to the myriad of uncertainties that face us in the future, only one of which is the state of climate.

  1. Risk and the physics of clinical prediction.

    PubMed

    McEvoy, John W; Diamond, George A; Detrano, Robert C; Kaul, Sanjay; Blaha, Michael J; Blumenthal, Roger S; Jones, Steven R

    2014-04-15

    The current paradigm of primary prevention in cardiology uses traditional risk factors to estimate future cardiovascular risk. These risk estimates are based on prediction models derived from prospective cohort studies and are incorporated into guideline-based initiation algorithms for commonly used preventive pharmacologic treatments, such as aspirin and statins. However, risk estimates are more accurate for populations of similar patients than they are for any individual patient. It may be hazardous to presume that the point estimate of risk derived from a population model represents the most accurate estimate for a given patient. In this review, we exploit principles derived from physics as a metaphor for the distinction between predictions regarding populations versus patients. We identify the following: (1) predictions of risk are accurate at the level of populations but do not translate directly to patients, (2) perfect accuracy of individual risk estimation is unobtainable even with the addition of multiple novel risk factors, and (3) direct measurement of subclinical disease (screening) affords far greater certainty regarding the personalized treatment of patients, whereas risk estimates often remain uncertain for patients. In conclusion, shifting our focus from prediction of events to detection of disease could improve personalized decision-making and outcomes. We also discuss innovative future strategies for risk estimation and treatment allocation in preventive cardiology. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Predicting the future trend of popularity by network diffusion.

    PubMed

    Zeng, An; Yeung, Chi Ho

    2016-06-01

    Conventional approaches to predicting the future popularity of products are mainly based on extrapolation of their current popularity, which overlooks the hidden microscopic information under the macroscopic trend. Here, we study diffusion processes on consumer-product and citation networks to exploit the hidden microscopic information, connecting consumers to their potential purchases and publications to their potential citers, to obtain a prediction of future item popularity. Using data obtained from the largest online retailers, including Netflix and Amazon, as well as the American Physical Society citation networks, we found that our method outperforms accurate short-term extrapolation and identifies potentially popular items long before they become prominent.
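The diffusion idea can be sketched as a two-step mass-diffusion walk on a bipartite consumer-product network, a standard formulation consistent with the abstract's description (the network data below are invented):

```python
def item_degree(likes, item):
    return sum(item in basket for basket in likes.values())

def diffusion_scores(likes, user):
    """Two-step mass diffusion on a bipartite consumer-product network.

    Resource starts on `user`'s items, spreads to consumers, then back to
    items; unseen items are ranked by the resource they receive.
    """
    resource = {i: 1.0 for i in likes[user]}
    # items -> users: each item divides its resource among its consumers
    via_users = {u: sum(resource.get(i, 0.0) / item_degree(likes, i)
                        for i in basket)
                 for u, basket in likes.items()}
    # users -> items: each user divides their share among their items
    scores = {}
    for u, basket in likes.items():
        for i in basket:
            scores[i] = scores.get(i, 0.0) + via_users[u] / len(basket)
    return {i: s for i, s in scores.items() if i not in likes[user]}

# Invented consumer-product data:
likes = {"u1": {"A", "B"}, "u2": {"B", "C"}, "u3": {"A", "C", "D"}}
ranked = diffusion_scores(likes, "u1")
```

Items receiving more diffused resource are the ones the method flags as likely to gain popularity before raw counts reveal it.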

  3. Predicting the future trend of popularity by network diffusion

    NASA Astrophysics Data System (ADS)

    Zeng, An; Yeung, Chi Ho

    2016-06-01

    Conventional approaches to predicting the future popularity of products are mainly based on extrapolation of their current popularity, which overlooks the hidden microscopic information beneath the macroscopic trend. Here, we study diffusion processes on consumer-product and citation networks to exploit the hidden microscopic information, connecting consumers to their potential purchases and publications to their potential citers, to obtain a prediction of future item popularity. Using data obtained from the largest online retailers, including Netflix and Amazon, as well as the American Physical Society citation networks, we found that our method outperforms accurate short-term extrapolation and identifies potentially popular items long before they become prominent.

  4. Multi-scale predictions of massive conifer mortality due to chronic temperature rise

    NASA Astrophysics Data System (ADS)

    McDowell, N. G.; Williams, A. P.; Xu, C.; Pockman, W. T.; Dickman, L. T.; Sevanto, S.; Pangle, R.; Limousin, J.; Plaut, J.; Mackay, D. S.; Ogee, J.; Domec, J. C.; Allen, C. D.; Fisher, R. A.; Jiang, X.; Muss, J. D.; Breshears, D. D.; Rauscher, S. A.; Koven, C.

    2016-03-01

    Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our ability to accurately simulate drought-induced forest impacts remains highly uncertain in part owing to our failure to integrate physiological measurements, regional-scale models, and dynamic global vegetation models (DGVMs). Here we show consistent predictions of widespread mortality of needleleaf evergreen trees (NET) within Southwest USA by 2100 using state-of-the-art models evaluated against empirical data sets. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April–August mean) beyond which photosynthesis, hydraulic and stomatal conductance, and carbohydrate availability approached zero. The evaluated regional models accurately predicted NET Ψpd, and 91% of predictions (10 out of 11) exceeded mortality thresholds within the twenty-first century due to temperature rise. The independent DGVMs predicted ≥50% loss of Northern Hemisphere NET by 2100, consistent with the NET findings for Southwest USA. Notably, the global models underestimated future mortality within Southwest USA, highlighting that predictions of future mortality within global models may be underestimates. Taken together, the validated regional predictions and the global simulations predict widespread conifer loss in coming decades under projected global warming.

  5. Multi-scale predictions of massive conifer mortality due to chronic temperature rise

    USGS Publications Warehouse

    McDowell, Nathan G.; Williams, A.P.; Xu, C.; Pockman, W. T.; Dickman, L. T.; Sevanto, Sanna; Pangle, R.; Limousin, J.; Plaut, J.J.; Mackay, D.S.; Ogee, J.; Domec, Jean-Christophe; Allen, Craig D.; Fisher, Rosie A.; Jiang, X.; Muss, J.D.; Breshears, D.D.; Rauscher, Sara A.; Koven, C.

    2016-01-01

    Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our ability to accurately simulate drought-induced forest impacts remains highly uncertain in part owing to our failure to integrate physiological measurements, regional-scale models, and dynamic global vegetation models (DGVMs). Here we show consistent predictions of widespread mortality of needleleaf evergreen trees (NET) within Southwest USA by 2100 using state-of-the-art models evaluated against empirical data sets. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April–August mean) beyond which photosynthesis, hydraulic and stomatal conductance, and carbohydrate availability approached zero. The evaluated regional models accurately predicted NET Ψpd, and 91% of predictions (10 out of 11) exceeded mortality thresholds within the twenty-first century due to temperature rise. The independent DGVMs predicted ≥50% loss of Northern Hemisphere NET by 2100, consistent with the NET findings for Southwest USA. Notably, the global models underestimated future mortality within Southwest USA, highlighting that predictions of future mortality within global models may be underestimates. Taken together, the validated regional predictions and the global simulations predict widespread conifer loss in coming decades under projected global warming.

  6. On the Predictability of Future Impact in Science

    PubMed Central

    Penner, Orion; Pan, Raj K.; Petersen, Alexander M.; Kaski, Kimmo; Fortunato, Santo

    2013-01-01

    Correctly assessing a scientist's past research impact and potential for future impact is key in recruitment decisions and other evaluation processes. While a candidate's future impact is the main concern for these decisions, most measures only quantify the impact of previous work. Recently, it has been argued that linear regression models are capable of predicting a scientist's future impact. By applying that future impact model to 762 careers drawn from three disciplines (physics, biology, and mathematics), we identify a number of subtle, but critical, flaws in current models. Specifically, cumulative non-decreasing measures like the h-index contain intrinsic autocorrelation, resulting in significant overestimation of their “predictive power”. Moreover, the predictive power of these models depends heavily upon scientists' career age, producing the least accurate estimates for young researchers. Our results place in doubt the suitability of such models, and indicate that further investigation is required before they can be used in recruiting decisions. PMID:24165898

  7. Future-Orientated Approaches to Curriculum Development: Fictive Scripting

    ERIC Educational Resources Information Center

    Garraway, James

    2017-01-01

    Though the future cannot be accurately predicted, it is possible to envisage a number of probable developments which can promote thinking about the future and so promote a more informed stance about what should or should not be done. Studies in technology and society have claimed that the use of a type of forecasting using plausible but imaginary…

  8. Helicopter noise prediction - The current status and future direction

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.; Farassat, F.

    1992-01-01

    The paper takes stock of the progress, assesses the current prediction capabilities, and forecasts the direction of future helicopter noise prediction research. The acoustic analogy approach, specifically theories based on the Ffowcs Williams-Hawkings equation, is the most widely used for deterministic noise sources. Thickness and loading noise can be routinely predicted given good blade motion and blade loading inputs. Blade-vortex interaction (BVI) noise can also be predicted well with measured input data, but prediction of airloads with the high spatial and temporal resolution required for BVI is still difficult. Current semiempirical broadband noise predictions are useful and reasonably accurate. New prediction methods based on a Kirchhoff formula and direct computation appear to be very promising, but are currently very demanding computationally.

  9. The Impacts of Climate Variations on Military Operations in the Horn of Africa

    DTIC Science & Technology

    2006-03-01

    variability in a region. Climate forecasts are predictions of the future state of the climate, much as we think of weather forecasts but at longer...arrive at accurate characterizations of the future state of the climate. Many of the civilian organizations that generate reanalysis data also

  10. Development of Tripropellant CFD Design Code

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Cheng, Gary C.; Anderson, Peter G.

    1998-01-01

    A tripropellant (e.g., GO2/H2/RP-1) CFD design code has been developed to predict the local mixing of multiple propellant streams as they are injected into a rocket motor. The code utilizes real fluid properties to account for the mixing and finite-rate combustion processes which occur near an injector faceplate; the analysis thus serves as a multi-phase homogeneous spray combustion model. Proper accounting of the combustion allows accurate gas-side temperature predictions, which are essential for accurate wall heating analyses. The complex secondary flows which are predicted to occur near a faceplate cannot be quantitatively predicted by less accurate methodology. Test cases have been simulated to describe an axisymmetric tripropellant coaxial injector and a 3-dimensional RP-1/LO2 impinger injector system. The analysis has been shown to realistically describe such injector combustion flowfields. The code is also valuable for designing meaningful future experiments by determining the critical location and type of measurements needed.

  11. Probability of criminal acts of violence: a test of jury predictive accuracy.

    PubMed

    Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D

    2013-01-01

    The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.
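
    The bidirectional asymmetry above is a direct consequence of base rates. As a minimal sketch (with hypothetical counts, not the study's data), positive and negative predictive values fall out of a 2x2 table of jury predictions versus observed prison violence:

```python
# Illustrative sketch: how PPV, NPV, and overall accuracy are derived from a
# 2x2 table of predictions vs. outcomes. All counts below are hypothetical.

def predictive_values(tp, fp, fn, tn):
    """Return (PPV, NPV, accuracy, base_rate) from a 2x2 confusion table."""
    total = tp + fp + fn + tn
    ppv = tp / (tp + fp)           # P(violent | predicted violent)
    npv = tn / (tn + fn)           # P(not violent | predicted not violent)
    accuracy = (tp + tn) / total
    base_rate = (tp + fn) / total  # prevalence of assaultive misconduct
    return ppv, npv, accuracy, base_rate

# With a low base rate, affirmative predictions carry high error rates even
# when rejections (and overall accuracy) look respectable.
ppv, npv, acc, base = predictive_values(tp=1, fp=9, fn=10, tn=95)
```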

  12. Prediction of lung function response for populations exposed to a wide range of ozone conditions

    EPA Science Inventory

    Context: A human exposure-response (E-R) model that has previously been demonstrated to accurately predict population mean FEV1 response to ozone exposure has been proposed as the foundation for future risk assessments for ambient ozone. Objective: Fit the origi...

  13. Fontan Surgical Planning: Previous Accomplishments, Current Challenges, and Future Directions.

    PubMed

    Trusty, Phillip M; Slesnick, Timothy C; Wei, Zhenglun Alan; Rossignac, Jarek; Kanter, Kirk R; Fogel, Mark A; Yoganathan, Ajit P

    2018-04-01

    The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic by clinicians. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool to accurately model patient-specific hemodynamics.

  14. The accuracy of new wheelchair users' predictions about their future wheelchair use.

    PubMed

    Hoenig, Helen; Griffiths, Patricia; Ganesh, Shanti; Caves, Kevin; Harris, Frances

    2012-06-01

    This study examined the accuracy of new wheelchair user predictions about their future wheelchair use. This was a prospective cohort study of 84 community-dwelling veterans provided a new manual wheelchair. The association between predicted and actual wheelchair use was strong at 3 mos (ϕ coefficient = 0.56), with 90% of those who anticipated using the wheelchair at 3 mos still using it (i.e., positive predictive value = 0.96) and 60% of those who anticipated not using it indeed no longer using the wheelchair (i.e., negative predictive value = 0.60, overall accuracy = 0.92). Predictive accuracy diminished over time, with overall accuracy declining from 0.92 at 3 mos to 0.66 at 6 mos. At all time points, and for all types of use, patients better predicted use as opposed to disuse, with correspondingly higher positive than negative predictive values. Accuracy of prediction of use in specific indoor and outdoor locations varied according to location. This study demonstrates the importance of better understanding the potential mismatch between the anticipated and actual patterns of wheelchair use. The findings suggest that users can be relied upon to accurately predict their basic wheelchair-related needs in the short-term. Further exploration is needed to identify characteristics that will aid users and their providers in more accurately predicting mobility needs for the long-term.
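
    The study's measure of association, the ϕ (phi) coefficient for a 2x2 table of predicted versus actual use, can be sketched as follows (the counts here are hypothetical, not the study's data):

```python
# Minimal sketch of the phi coefficient for a 2x2 table [[a, b], [c, d]],
# where rows are predicted use/disuse and columns are actual use/disuse.
# The counts are hypothetical illustrations.
import math

def phi_coefficient(a, b, c, d):
    """Phi ranges from -1 to 1; 0 means no association."""
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0

phi = phi_coefficient(a=50, b=4, c=10, d=20)
```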

  15. Type- and Subtype-Specific Influenza Forecast.

    PubMed

    Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey

    2017-03-01

    Prediction of the growth and decline of infectious disease incidence has advanced considerably in recent years. As these forecasts improve, their public health utility should increase, particularly as interventions are developed that make explicit use of forecast information. It is the task of the research community to increase the content and improve the accuracy of these infectious disease predictions. Presently, operational real-time forecasts of total influenza incidence are produced at the municipal and state level in the United States. These forecasts are generated using ensemble simulations depicting local influenza transmission dynamics, which have been optimized prior to forecast with observations of influenza incidence and data assimilation methods. Here, we explore whether forecasts targeted to predict influenza by type and subtype during 2003-2015 in the United States were more or less accurate than forecasts targeted to predict total influenza incidence. We found that forecasts separated by type/subtype generally produced more accurate predictions and, when summed, produced more accurate predictions of total influenza incidence. These findings indicate that monitoring influenza by type and subtype not only provides more detailed observational content but supports more accurate forecasting. More accurate forecasting can help officials better respond to and plan for current and future influenza activity. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Predicting falls in older adults using the four square step test.

    PubMed

    Cleary, Kimberly; Skornyakov, Elena

    2017-10-01

    The Four Square Step Test (FSST) is a performance-based balance tool involving stepping over four single-point canes placed on the floor in a cross configuration. The purpose of this study was to evaluate properties of the FSST in older adults who lived independently. Forty-five community dwelling older adults provided fall history and completed the FSST, Berg Balance Scale (BBS), Timed Up and Go (TUG), and Tinetti in random order. Future falls were recorded for 12 months following testing. The FSST accurately distinguished between non-fallers and multiple fallers, and the 15-second threshold score accurately distinguished multiple fallers from non-multiple fallers based on fall history. The FSST predicted future falls, and performance on the FSST was significantly correlated with performance on the BBS, TUG, and Tinetti. However, the test is not appropriate for older adults who use walkers. Overall, the FSST is a valid yet underutilized measure of balance performance and fall prediction tool that physical therapists should consider using in ambulatory community dwelling older adults.

  17. Predictability of the future development of aggressive behavior of cranial dural arteriovenous fistulas based on decision tree analysis.

    PubMed

    Satomi, Junichiro; Ghaibeh, A Ammar; Moriguchi, Hiroki; Nagahiro, Shinji

    2015-07-01

    The severity of clinical signs and symptoms of cranial dural arteriovenous fistulas (DAVFs) are well correlated with their pattern of venous drainage. Although the presence of cortical venous drainage can be considered a potential predictor of aggressive DAVF behaviors, such as intracranial hemorrhage or progressive neurological deficits due to venous congestion, accurate statistical analyses are currently not available. Using a decision tree data mining method, the authors aimed at clarifying the predictability of the future development of aggressive behaviors of DAVF and at identifying the main causative factors. Of 266 DAVF patients, 89 were eligible for analysis. Under observational management, 51 patients presented with intracranial hemorrhage/infarction during the follow-up period. The authors created a decision tree able to assess the risk for the development of aggressive DAVF behavior. Evaluated by 10-fold cross-validation, the decision tree's accuracy, sensitivity, and specificity were 85.28%, 88.33%, and 80.83%, respectively. The tree shows that the main factor in symptomatic patients was the presence of cortical venous drainage. In its absence, the lesion location determined the risk of a DAVF developing aggressive behavior. Decision tree analysis accurately predicts the future development of aggressive DAVF behavior.

  18. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
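
    The additive Holt-Winters model named above can be sketched from scratch. This is illustrative only, not the tool's implementation; the smoothing parameters (alpha, beta, gamma), the seasonal period, and the sample series are assumptions that would normally be tuned to the historic test volumes:

```python
# From-scratch additive Holt-Winters smoothing: level + trend + seasonal
# components updated once per observation, then extrapolated for the forecast.

def holt_winters_additive(y, period, alpha=0.5, beta=0.3, gamma=0.3, horizon=4):
    """Forecast `horizon` future values of a seasonal series `y`."""
    m = period
    level = sum(y[:m]) / m                          # initial level
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)  # initial trend
    season = [y[i] - level for i in range(m)]       # initial seasonal offsets
    for t, obs in enumerate(y):
        s = season[t % m]
        prev_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (obs - level) + (1 - gamma) * s
    n = len(y)
    return [level + (h + 1) * trend + season[(n + h) % m] for h in range(horizon)]

# Hypothetical weekly test volumes with a trend and a 4-period seasonal swing:
history = [100 + 2 * t + [15, -5, -20, 10][t % 4] for t in range(40)]
forecast = holt_winters_additive(history, period=4)
```

    The additive form suits series whose seasonal swing is roughly constant in size; the multiplicative variant suits swings that scale with the overall level.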

  19. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  20. Tank System Integrated Model: A Cryogenic Tank Performance Prediction Program

    NASA Technical Reports Server (NTRS)

    Bolshinskiy, L. G.; Hedayat, A.; Hastings, L. J.; Sutherlin, S. G.; Schnell, A. R.; Moder, J. P.

    2017-01-01

    Accurate predictions of the thermodynamic state of the cryogenic propellants, pressurization rate, and performance of pressure control techniques in cryogenic tanks are required for development of cryogenic fluid long-duration storage technology and planning for future space exploration missions. This Technical Memorandum (TM) presents the analytical tool, Tank System Integrated Model (TankSIM), which can be used for modeling pressure control and predicting the behavior of cryogenic propellant for long-term storage for future space missions. Utilizing TankSIM, the following processes can be modeled: tank self-pressurization, boiloff, ullage venting, mixing, and condensation on the tank wall. This TM also includes comparisons of TankSIM program predictions with the test data and examples of multiphase mission calculations.

  1. Climate Prediction Sees Future Despite Chaos: Researchers Outside NASA use NCCS Resources for Studies

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The air on this mostly sunny January day is crisp and the wind is blustery. The morning's National Weather Service 6-hour forecast had accurately predicted these conditions for the Baltimore-Washington area, and the 2-3 day extended outlook was almost perfect. The previous week, the National Centers for Environmental Prediction's (NCEP) 6-10 day temperature and precipitation outlook for the general trends for the region was correct as well. However, no forecast could have predicted specific details about this day. It is 28.5°F, with sunshine bright enough for dark sunglasses and wind strong enough to blow off a hat. Such details are impossible to foresee with any accuracy and are outside the scope of routine weather prediction. Equally difficult is accurately forecasting weather beyond about 2 weeks.

  2. Issues and Importance of "Good" Starting Points for Nonlinear Regression for Mathematical Modeling with Maple: Basic Model Fitting to Make Predictions with Oscillating Data

    ERIC Educational Resources Information Center

    Fox, William

    2012-01-01

    The purpose of our modeling effort is to predict future outcomes. We assume the data collected are both accurate and relatively precise. For our oscillating data, we examined several mathematical modeling forms for predictions. We also examined both ignoring the oscillations as an important feature and including the oscillations as an important…

  3. Physics-of-Failure Approach to Prognostics

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2017-01-01

    As more and more electric vehicles progressively enter daily operation, a critical challenge lies in accurately predicting the behavior of the electrical components present in the system. In the case of electric vehicles, computing remaining battery charge is safety-critical. In order to tackle the prediction problem, it is essential to have awareness of the current state and health of the system, especially since it is necessary to perform condition-based predictions. To be able to predict the future state of the system, it is also required to possess knowledge of the current and future operations of the vehicle. This presentation describes our approach to developing a system-level health-monitoring safety indicator for different electronic components, which runs estimation and prediction algorithms to determine state of charge and estimate the remaining useful life of the respective components. Given models of the current and future system behavior, the general approach of model-based prognostics can be employed as a solution to the prediction problem and, further, for decision making.
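
    As a toy illustration of the estimate-then-predict loop described above (not the presented architecture), Coulomb counting tracks state of charge from measured current draw, and a constant-load extrapolation gives a remaining-time estimate:

```python
# Hedged sketch of battery prognostics: estimation (Coulomb counting) followed
# by prediction (constant-load extrapolation). All numbers are hypothetical.

def coulomb_count(soc0, currents_a, dt_s, capacity_ah):
    """Integrate current draw (amps, sampled every dt_s seconds) into SOC."""
    soc = soc0
    capacity_as = capacity_ah * 3600.0  # capacity in amp-seconds
    for i in currents_a:
        soc -= i * dt_s / capacity_as
    return soc

def remaining_time_s(soc, load_a, capacity_ah, soc_floor=0.2):
    """Predict seconds until SOC reaches the cutoff, assuming constant load."""
    usable_as = (soc - soc_floor) * capacity_ah * 3600.0
    return max(usable_as / load_a, 0.0)

# A 2 Ah pack discharged at 10 A for 6 minutes, then extrapolated forward:
soc = coulomb_count(soc0=1.0, currents_a=[10.0] * 360, dt_s=1.0, capacity_ah=2.0)
eta = remaining_time_s(soc, load_a=10.0, capacity_ah=2.0)
```

    A fielded prognostics system would replace the open-loop integration with a model-based state estimator and carry uncertainty through the prediction.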

  4. Can phenological models predict tree phenology accurately in the future? The unrevealed hurdle of endodormancy break.

    PubMed

    Chuine, Isabelle; Bonhomme, Marc; Legave, Jean-Michel; García de Cortázar-Atauri, Iñaki; Charrier, Guillaume; Lacointe, André; Améglio, Thierry

    2016-10-01

    The onset of the growing season of trees has advanced by 2.3 days per decade during the last 40 years in temperate Europe because of global warming. The effect of temperature on plant phenology is, however, not linear, because temperature has a dual effect on bud development. On one hand, low temperatures are necessary to break bud endodormancy; on the other hand, higher temperatures are necessary to promote bud cell growth afterward. Different process-based models have been developed in the last decades to predict the date of budbreak of woody species. They predict that global warming should delay or compromise endodormancy break at the species' equatorward range limits, leading to a delay or even an impossibility to flower or set new leaves. These models are classically parameterized with flowering or budbreak dates only, with no information on the endodormancy break date, because this information is very scarce. Here, we evaluated the efficiency of a set of phenological models to accurately predict the endodormancy break dates of three fruit trees. Our results show that models calibrated solely with budbreak dates usually do not accurately predict the endodormancy break date. Providing the endodormancy break date for model parameterization results in much more accurate prediction of the latter, with, however, a higher error than that on budbreak dates. Most importantly, we show that models not calibrated with endodormancy break dates can generate large discrepancies in forecasted budbreak dates when using climate scenarios, as compared to models calibrated with endodormancy break dates. This discrepancy increases with mean annual temperature and is therefore strongest after 2050 in the southernmost regions. Our results call for massive measurements of endodormancy break dates in forest and fruit trees to yield more robust projections of phenological changes in the near future. © 2016 John Wiley & Sons Ltd.

  5. Regional analysis of drought and heat impacts on forests: current and future science directions.

    PubMed

    Law, Beverly E

    2014-12-01

    Accurate assessments of forest response to current and future climate and human actions are needed at regional scales. Predicting future impacts on forests will require improved analysis of species-level adaptation, resilience, and vulnerability to mortality. Land system models can be enhanced by creating trait-based groupings of species that better represent climate sensitivity, such as risk of hydraulic failure from drought. This emphasizes the need for more coordinated in situ and remote sensing observations to track changes in ecosystem function, and to improve model inputs, spatio-temporal diagnosis, and predictions of future conditions, including implications of actions to mitigate climate change. © 2014 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  6. Predicting the evolution of complex networks via similarity dynamics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Chen, Leiting; Zhong, Linfeng; Xian, Xingping

    2017-01-01

    Almost all real-world networks are subject to constant evolution, and plenty of them have been investigated empirically to uncover the underlying evolution mechanism. However, the evolution prediction of dynamic networks still remains a challenging problem. The crux of the matter is to estimate the future links of dynamic networks. This paper studies the evolution prediction of dynamic networks within the link prediction paradigm. To estimate the likelihood of the existence of links more accurately, an effective and robust similarity index is presented by exploiting network structure adaptively. Moreover, most of the existing link prediction methods do not make a clear distinction between future links and missing links. In order to predict the future links, the networks are regarded as dynamic systems in this paper, and a similarity updating method, the spatial-temporal position drift model, is developed to simulate the evolutionary dynamics of node similarity. The updated similarities are then used as input for estimating the likelihood of future links. Extensive experiments on real-world networks suggest that the proposed similarity index performs better than baseline methods and that the position drift model performs well for evolution prediction in real-world evolving networks.
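
    The paper's adaptive similarity index is not specified in this record; as a stand-in, a standard structural index such as resource allocation (summing the inverse degree of common neighbors) illustrates how candidate future links are scored:

```python
# Sketch of similarity-based link prediction using the standard
# resource-allocation index on a toy graph (not the paper's adaptive index).

def resource_allocation(adj, u, v):
    """Resource-allocation similarity: sum of 1/degree over common neighbors."""
    common = adj[u] & adj[v]
    return sum(1.0 / len(adj[z]) for z in common)

# Toy undirected graph as adjacency sets; "a"-"d" is the only non-adjacent pair.
adj = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c"},
}
# Higher score -> more likely future link.
score = resource_allocation(adj, "a", "d")
```

    In an evolution-prediction setting, these scores would be recomputed as the similarity estimates drift over time, with the top-ranked non-adjacent pairs predicted as future links.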

  7. Accurate predictions of iron redox state in silicate glasses: A multivariate approach using X-ray absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyar, M. Darby; McCanta, Molly; Breves, Elly

    2016-03-01

    Pre-edge features in the K absorption edge of X-ray absorption spectra are commonly used to predict Fe3+ valence state in silicate glasses. However, this study shows that using the entire spectral region from the pre-edge into the extended X-ray absorption fine-structure region provides more accurate results when combined with multivariate analysis techniques. The least absolute shrinkage and selection operator (lasso) regression technique yields %Fe3+ values that are accurate to ±3.6% absolute when the full spectral region is employed. This method can be used across a broad range of glass compositions, is easily automated, and is demonstrated to yield accurate results from different synchrotrons. It will enable future studies involving X-ray mapping of redox gradients on standard thin sections at 1 × 1 μm pixel sizes.
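
    Lasso regression itself can be sketched in a few lines of coordinate descent. The tiny synthetic dataset below stands in for the real problem, which maps many spectral channels to %Fe3+; the penalty `lam` and the data are assumptions for illustration, not the study's pipeline:

```python
# From-scratch lasso via cyclic coordinate descent with soft-thresholding.
# Synthetic data: y depends only on the first feature, so the L1 penalty
# should drive the second weight exactly to zero.

def soft_threshold(rho, lam):
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, sweeps=100):
    """Coordinate-descent lasso; X is a list of rows, y the targets."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * w[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            zj = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft_threshold(rho, lam) / zj
    return w

X = [[1.0, 1.0], [2.0, -1.0], [3.0, 1.0], [4.0, -1.0]]
y = [3.0, 6.0, 9.0, 12.0]  # exactly 3 * first feature
w = lasso_cd(X, y, lam=0.1)
```

    The sparsity induced by the L1 penalty is what makes lasso attractive for full-spectrum inputs: most spectral channels receive zero weight, and the few retained channels carry the predictive signal.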

  8. Accurate predictions of iron redox state in silicate glasses: A multivariate approach using X-ray absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyar, M. Darby; McCanta, Molly; Breves, Elly

    2016-03-01

    Pre-edge features in the K absorption edge of X-ray absorption spectra are commonly used to predict Fe3+ valence state in silicate glasses. However, this study shows that using the entire spectral region from the pre-edge into the extended X-ray absorption fine-structure region provides more accurate results when combined with multivariate analysis techniques. The least absolute shrinkage and selection operator (lasso) regression technique yields %Fe3+ values that are accurate to ±3.6% absolute when the full spectral region is employed. This method can be used across a broad range of glass compositions, is easily automated, and is demonstrated to yield accurate results from different synchrotrons. It will enable future studies involving X-ray mapping of redox gradients on standard thin sections at 1 × 1 μm pixel sizes.

  9. Measured and predicted rotor performance for the SERI advanced wind turbine blades

    NASA Astrophysics Data System (ADS)

    Tangler, J.; Smith, B.; Kelley, N.; Jager, D.

    1992-02-01

    Measured and predicted rotor performance for the Solar Energy Research Institute (SERI) advanced wind turbine blades were compared to assess the accuracy of predictions and to identify the sources of error affecting both predictions and measurements. An awareness of these sources of error contributes to improved prediction and measurement methods that will ultimately benefit future rotor design efforts. Propeller/vane anemometers were found to underestimate the wind speed in turbulent environments such as the San Gorgonio Pass wind farm area. Using sonic or cup anemometers, good agreement was achieved between predicted and measured power output for wind speeds up to 8 m/sec. At higher wind speeds an optimistic predicted power output and the occurrence of peak power at wind speeds lower than measurements resulted from the omission of turbulence and yaw error. In addition, accurate two-dimensional (2-D) airfoil data prior to stall and a post stall airfoil data synthesization method that reflects three-dimensional (3-D) effects were found to be essential for accurate performance prediction.

  10. Water Quality, Cyanobacteria, and Environmental Factors and Their Relations to Microcystin Concentrations for Use in Predictive Models at Ohio Lake Erie and Inland Lake Recreational Sites, 2013-14

    USGS Publications Warehouse

    Francy, Donna S.; Graham, Jennifer L.; Stelzer, Erin A.; Ecker, Christopher D.; Brady, Amie M. G.; Struffolino, Pam; Loftin, Keith A.

    2015-11-06

    The results of this study showed that water-quality and environmental variables are promising for use in site-specific daily or long-term predictive models. In order to develop more accurate models to predict toxin concentrations at freshwater lake sites, data need to be collected more frequently and for consecutive days in future studies.

  11. The use of machine learning for the identification of peripheral artery disease and future mortality risk.

    PubMed

    Ross, Elsie Gyang; Shah, Nigam H; Dalman, Ronald L; Nead, Kevin T; Cooke, John P; Leeper, Nicholas J

    2016-11-01

    A key aspect of the precision medicine effort is the development of informatics tools that can analyze and interpret "big data" sets in an automated and adaptive fashion while providing accurate and actionable clinical information. The aims of this study were to develop machine learning algorithms for the identification of disease and the prognostication of mortality risk and to determine whether such models perform better than classical statistical analyses. Focusing on peripheral artery disease (PAD), patient data were derived from a prospective, observational study of 1755 patients who presented for elective coronary angiography. We employed multiple supervised machine learning algorithms and used diverse clinical, demographic, imaging, and genomic information in a hypothesis-free manner to build models that could identify patients with PAD and predict future mortality. Comparison was made to standard stepwise linear regression models. Our machine-learned models outperformed stepwise logistic regression models both for the identification of patients with PAD (area under the curve, 0.87 vs 0.76, respectively; P = .03) and for the prediction of future mortality (area under the curve, 0.76 vs 0.65, respectively; P = .10). Both machine-learned models were markedly better calibrated than the stepwise logistic regression models, thus providing more accurate disease and mortality risk estimates. Machine learning approaches can produce more accurate disease classification and prediction models. These tools may prove clinically useful for the automated identification of patients with highly morbid diseases for which aggressive risk factor management can improve outcomes. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
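    The comparison in this record hinges on the area under the ROC curve. A minimal, library-free way to compute it is the Mann-Whitney formulation; the patient labels and risk scores below are invented for illustration.

```python
import numpy as np

def roc_auc(y_true, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a random
    positive case is scored above a random negative case (ties count 1/2)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# hypothetical risk scores for four patients (1 = had the event)
labels = [0, 0, 1, 1]
scores = [0.10, 0.40, 0.35, 0.80]
print(roc_auc(labels, scores))   # 0.75
```

An AUC of 0.5 is chance; the study's 0.87 vs 0.76 difference means the machine-learned model ranks diseased above non-diseased patients noticeably more often.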

  12. Can phenological models predict tree phenology accurately under climate change conditions?

    NASA Astrophysics Data System (ADS)

    Chuine, Isabelle; Bonhomme, Marc; Legave, Jean Michel; García de Cortázar-Atauri, Inaki; Charrier, Guillaume; Lacointe, André; Améglio, Thierry

    2014-05-01

    The onset of the growing season of trees has advanced globally by 2.3 days per decade during the last 50 years because of global warming, and this trend is predicted to continue according to climate forecasts. The effect of temperature on plant phenology is, however, not linear, because temperature has a dual effect on bud development. On one hand, low temperatures are necessary to break bud dormancy; on the other hand, higher temperatures are necessary to promote bud cell growth afterwards. Increasing phenological changes in temperate woody species have strong impacts on the distribution and productivity of forest trees, as well as on crop cultivation areas. Accurate predictions of tree phenology are therefore a prerequisite to understand and foresee the impacts of climate change on forests and agrosystems. Different process-based models have been developed in the last two decades to predict the date of budburst or flowering of woody species. There are two main families: (1) one-phase models, which consider only the ecodormancy phase and assume that endodormancy is always broken before adequate climatic conditions for cell growth occur; and (2) two-phase models, which consider both the endodormancy and ecodormancy phases and predict a date of dormancy break that varies from year to year. So far, one-phase models have been able to accurately predict tree budburst and flowering under historical climate. However, because they do not consider what happens prior to ecodormancy, and especially the possible negative effect of winter warming on dormancy break, it seems unlikely that they can provide accurate predictions under future climate conditions. It is indeed well known that a lack of low temperature results in abnormal patterns of bud break and development in temperate fruit trees. Accurate modelling of the dormancy break date has thus become a major issue in phenology modelling. Two-phase phenological models predict that global warming should delay or compromise dormancy break at the species' equatorward range limits, delaying or even preventing flowering or leaf set. These models are classically parameterized with flowering or budburst dates only, with no information on the dormancy break date, because such information is very scarce. We evaluated the ability of a set of process-based phenological models to accurately predict the dormancy break dates of four fruit trees. Our results show that models calibrated solely with flowering or budburst dates do not accurately predict the dormancy break date. Providing dormancy break dates for model parameterization results in much more accurate simulation of the latter, albeit with a higher error than that on flowering or budburst dates. Most importantly, we also show that models not calibrated with dormancy break dates can generate significant differences in forecasted flowering or budburst dates when driven by climate scenarios. Our results highlight the urgent need for large-scale measurements of dormancy break dates in forest and fruit trees to yield more robust projections of phenological changes in the near future.

  13. Understanding reproducibility of human IVF traits to predict next IVF cycle outcome.

    PubMed

    Wu, Bin; Shi, Juanzi; Zhao, Wanqiu; Lu, Suzhen; Silva, Marta; Gelety, Timothy J

    2014-10-01

    Evaluating the failed IVF cycle often provides useful prognostic information. Before undergoing another attempt, patients experiencing an unsuccessful IVF cycle frequently request information about the probability of future success. Here, we introduced the concept of reproducibility and formulae to predict the next IVF cycle outcome. The experimental design was based on a retrospective review of IVF cycle data from 2006 to 2013 in two different IVF centers and statistical analysis. The reproducibility coefficients (r) of IVF traits, including number of oocytes retrieved, oocyte maturity, fertilization, embryo quality, and pregnancy, were estimated using the intraclass correlation coefficient between repeated IVF cycle measurements for the same patient by variance component analysis. Formulae were designed to predict the next IVF cycle outcome. The number of oocytes retrieved and the fertilization rate had the highest reproducibility coefficients (r = 0.81 ~ 0.84), indicating a very close correlation between the first retrieval cycle and subsequent IVF cycles. Oocyte maturity and number of top-quality embryos had intermediate reproducibility (r = 0.38 ~ 0.76), and pregnancy rate had relatively lower reproducibility (r = 0.23 ~ 0.27). Based on these parameters, the next outcome for these IVF traits might be accurately predicted by the designed formulae. The introduction of the concept of reproducibility to our human IVF program allows us to predict future IVF cycle outcomes. The traits of number of oocytes retrieved, oocyte maturity, fertilization, and top-quality embryos had high or intermediate reproducibility, which provides a basis for accurate prediction of future IVF outcomes. Based on this prediction, physicians may counsel their patients or change patients' stimulation plans, and laboratory embryologists may improve their IVF techniques accordingly.
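    The reproducibility coefficient described here is an intraclass correlation (ICC) estimated from variance components. The sketch below is not the authors' formulae; it uses the standard one-way random-effects ANOVA estimator, and the simulated "patients" and "cycles" are illustrative assumptions.

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects intraclass correlation from an
    (n subjects x k repeats) array of measurements."""
    n, k = data.shape
    subj_means = data.mean(axis=1)
    grand_mean = data.mean()
    ms_between = k * ((subj_means - grand_mean) ** 2).sum() / (n - 1)       # between-subject
    ms_within = ((data - subj_means[:, None]) ** 2).sum() / (n * (k - 1))   # within-subject
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# simulate repeated "cycles": a stable patient effect plus cycle-to-cycle noise
rng = np.random.default_rng(0)
n_patients, n_cycles = 200, 2
patient_effect = rng.normal(0.0, 2.0, size=n_patients)          # between-patient SD 2
data = patient_effect[:, None] + rng.normal(0.0, 1.0, size=(n_patients, n_cycles))
print(round(icc_oneway(data), 2))   # close to the true value 4 / (4 + 1) = 0.8
```

A trait with ICC near 0.8, as for oocyte yield here, means a patient's first cycle is strongly informative about the next; an ICC near 0.25, as for pregnancy, means it is only weakly so.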

  14. In Search of Black Swans: Identifying Students at Risk of Failing Licensing Examinations.

    PubMed

    Barber, Cassandra; Hammond, Robert; Gula, Lorne; Tithecott, Gary; Chahine, Saad

    2018-03-01

    To determine which admissions variables and curricular outcomes are predictive of being at risk of failing the Medical Council of Canada Qualifying Examination Part 1 (MCCQE1), how quickly student risk of failure can be predicted, and to what extent predictive modeling is possible and accurate in estimating future student risk. Data from five graduating cohorts (2011-2015) at the Schulich School of Medicine & Dentistry, Western University, were collected and analyzed using hierarchical generalized linear models (HGLMs). Area under the receiver operating characteristic curve (AUC) was used to evaluate the accuracy of the predictive models and to determine whether they could predict future risk, using the 2016 graduating cohort. Four predictive models were developed to predict student risk of failure at admissions, year 1, year 2, and pre-MCCQE1. The HGLM analyses identified gender, MCAT verbal reasoning score, two preclerkship course mean grades, and the year 4 summative objective structured clinical examination score as significant predictors of student risk. The predictive accuracy of the models varied. The pre-MCCQE1 model was the most accurate at predicting a student's risk of failing (AUC 0.66-0.93), while the admissions model was not predictive (AUC 0.25-0.47). Key variables predictive of students at risk were identified. The predictive models developed suggest that, while it is not possible to identify student risk at admission, students can begin to be identified and monitored within the first year. Using such models, programs may be able to identify and monitor students at risk quantitatively and develop tailored intervention strategies.

  15. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  16. Predicting Renal Failure Progression in Chronic Kidney Disease Using Integrated Intelligent Fuzzy Expert System.

    PubMed

    Norouzi, Jamshid; Yadollahpour, Ali; Mirbagheri, Seyed Ahmad; Mazdeh, Mitra Mahdavi; Hosseini, Seyed Ahmad

    2016-01-01

    Chronic kidney disease (CKD) is a covert disease. Accurate prediction of CKD progression over time is necessary for reducing its costs and mortality rates. The present study proposes an adaptive neuro-fuzzy inference system (ANFIS) for predicting the renal failure timeframe of CKD based on real clinical data. This study used 10-year clinical records of newly diagnosed CKD patients. A glomerular filtration rate (GFR) threshold of 15 mL/min/1.73 m² was used as the marker of renal failure. A Takagi-Sugeno type ANFIS model was used to predict GFR values. Variables of age, sex, weight, underlying diseases, diastolic blood pressure, creatinine, calcium, phosphorus, uric acid, and GFR were initially selected for the prediction model. Weight, diastolic blood pressure, diabetes mellitus as an underlying disease, and current GFR(t) showed significant correlation with future GFRs and were selected as the inputs of the model. Comparison of the predicted values with the real data showed that the ANFIS model could accurately estimate GFR variations in all sequential periods (normalized mean absolute error lower than 5%). Despite the high uncertainties of the human body and the dynamic nature of CKD progression, our model can accurately predict GFR variations over long future periods.
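    The accuracy criterion in this record is a normalized mean absolute error. Conventions differ; the sketch below normalizes by the observed range, and both that choice and the GFR values are assumptions for illustration, not the paper's data.

```python
import numpy as np

def nmae(actual, predicted):
    """Mean absolute error normalized by the observed range (one common
    convention; the paper's exact normalization is not specified here)."""
    actual = np.asarray(actual, float)
    predicted = np.asarray(predicted, float)
    return np.abs(actual - predicted).mean() / (actual.max() - actual.min())

# hypothetical declining GFR trajectory vs. model predictions
gfr_actual = [60.0, 52.0, 45.0, 38.0, 30.0, 22.0, 15.0]
gfr_predicted = [58.0, 53.0, 44.0, 39.0, 28.0, 23.0, 16.0]
print(f"NMAE = {nmae(gfr_actual, gfr_predicted):.1%}")   # about 2.9%, under the 5% bound
```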

  17. Future of endemic flora of biodiversity hotspots in India.

    PubMed

    Chitale, Vishwas Sudhir; Behera, Mukund Dev; Roy, Partha Sarthi

    2014-01-01

    India is one of the 12 mega-biodiversity countries of the world, representing 11% of the world's flora in about 2.4% of the global land mass. Approximately 28% of the total Indian flora and 33% of angiosperms occurring in India are endemic. High human population density in India's biodiversity hotspots puts undue pressure on these sensitive eco-regions. In the present study, we predict the future distribution of 637 endemic plant species from three biodiversity hotspots in India: the Himalaya, Western Ghats, and Indo-Burma, based on the A1B scenario for the years 2050 and 2080. We develop individual-variable-based models as well as mixed models in MaxEnt by combining the ten least-correlated bioclimatic variables, two disturbance variables, and one physiography variable as predictors. The projected changes suggest that the endemic flora will be adversely impacted, even under such a moderate climate scenario. The future distribution is predicted to shift in northern and north-eastern directions in the Himalaya and Indo-Burma, and in southern and south-western directions in the Western Ghats, toward the cooler climatic conditions in those regions. In the future distribution of endemic plants, we observe a significant shift and reduction in distribution range compared to the present distribution. The model predicts a 23.99% range reduction and a 7.70% range expansion by 2050, and a 41.34% range reduction and a 24.10% range expansion by 2080. Integrating disturbance and physiography variables along with bioclimatic variables improved prediction accuracy. Mixed models provide the most accurate results for most combinations of climatic and non-climatic variables, as compared to individual-variable-based models. We conclude that (a) regions with cooler climates and higher moisture availability could serve as refugia for endemic plants under future climatic conditions; and (b) mixed models provide more accurate results than single-variable-based models.

  18. Future of Endemic Flora of Biodiversity Hotspots in India

    PubMed Central

    Chitale, Vishwas Sudhir; Behera, Mukund Dev; Roy, Partha Sarthi

    2014-01-01

    India is one of the 12 mega-biodiversity countries of the world, representing 11% of the world's flora in about 2.4% of the global land mass. Approximately 28% of the total Indian flora and 33% of angiosperms occurring in India are endemic. High human population density in India's biodiversity hotspots puts undue pressure on these sensitive eco-regions. In the present study, we predict the future distribution of 637 endemic plant species from three biodiversity hotspots in India: the Himalaya, Western Ghats, and Indo-Burma, based on the A1B scenario for the years 2050 and 2080. We develop individual-variable-based models as well as mixed models in MaxEnt by combining the ten least-correlated bioclimatic variables, two disturbance variables, and one physiography variable as predictors. The projected changes suggest that the endemic flora will be adversely impacted, even under such a moderate climate scenario. The future distribution is predicted to shift in northern and north-eastern directions in the Himalaya and Indo-Burma, and in southern and south-western directions in the Western Ghats, toward the cooler climatic conditions in those regions. In the future distribution of endemic plants, we observe a significant shift and reduction in distribution range compared to the present distribution. The model predicts a 23.99% range reduction and a 7.70% range expansion by 2050, and a 41.34% range reduction and a 24.10% range expansion by 2080. Integrating disturbance and physiography variables along with bioclimatic variables improved prediction accuracy. Mixed models provide the most accurate results for most combinations of climatic and non-climatic variables, as compared to individual-variable-based models. We conclude that (a) regions with cooler climates and higher moisture availability could serve as refugia for endemic plants under future climatic conditions; and (b) mixed models provide more accurate results than single-variable-based models. PMID:25501852

  19. A hybrid intelligent method for three-dimensional short-term prediction of dissolved oxygen content in aquaculture.

    PubMed

    Chen, Yingyi; Yu, Huihui; Cheng, Yanjun; Cheng, Qianqian; Li, Daoliang

    2018-01-01

    A precise predictive model is important for obtaining a clear understanding of the changes in dissolved oxygen content in crab ponds. Highly accurate interval forecasting of dissolved oxygen content is fundamental to reduce risk, and three-dimensional prediction can provide more accurate results and overall guidance. In this study, a hybrid three-dimensional (3D) dissolved oxygen content prediction model based on a radial basis function (RBF) neural network, K-means and subtractive clustering was developed and named the subtractive clustering (SC)-K-means-RBF model. In this modeling process, K-means and subtractive clustering methods were employed to enhance the hyperparameters required in the RBF neural network model. The comparison of the predicted results of different traditional models validated the effectiveness and accuracy of the proposed hybrid SC-K-means-RBF model for three-dimensional prediction of dissolved oxygen content. Consequently, the proposed model can effectively display the three-dimensional distribution of dissolved oxygen content and serve as a guide for feeding and future studies.
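    The core idea, using clustering to place radial basis function centers and then fitting a linear readout, can be sketched generically. This is not the authors' SC-K-means-RBF model: it omits subtractive clustering, uses plain least squares for the output weights, and fits a toy 1-D signal standing in for dissolved-oxygen data.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm; returns cluster centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def rbf_features(X, centers, sigma):
    """Gaussian RBF activations for each sample/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# toy 1-D regression problem
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 2.0 * np.pi, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=300)

centers = kmeans(X, k=12)                       # clustering picks the RBF centers
Phi = rbf_features(X, centers, sigma=0.6)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # linear readout on RBF activations
rmse = float(np.sqrt(np.mean((Phi @ w - y) ** 2)))
print(f"train RMSE: {rmse:.3f}")
```

Placing centers by clustering, rather than tuning them by gradient descent, is what makes this hybrid cheap to train, which is the design choice the paper's enhancement targets.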

  20. Quasi-closed phase forward-backward linear prediction analysis of speech for accurate formant detection and estimation.

    PubMed

    Gowda, Dhananjaya; Airaksinen, Manu; Alku, Paavo

    2017-09-01

    Recently, a quasi-closed phase (QCP) analysis of speech signals for accurate glottal inverse filtering was proposed. However, the QCP analysis which belongs to the family of temporally weighted linear prediction (WLP) methods uses the conventional forward type of sample prediction. This may not be the best choice especially in computing WLP models with a hard-limiting weighting function. A sample selective minimization of the prediction error in WLP reduces the effective number of samples available within a given window frame. To counter this problem, a modified quasi-closed phase forward-backward (QCP-FB) analysis is proposed, wherein each sample is predicted based on its past as well as future samples thereby utilizing the available number of samples more effectively. Formant detection and estimation experiments on synthetic vowels generated using a physical modeling approach as well as natural speech utterances show that the proposed QCP-FB method yields statistically significant improvements over the conventional linear prediction and QCP methods.

  1. Predicting Violent Behavior: What Can Neuroscience Add?

    PubMed

    Poldrack, Russell A; Monahan, John; Imrey, Peter B; Reyna, Valerie; Raichle, Marcus E; Faigman, David; Buckholtz, Joshua W

    2018-02-01

    The ability to accurately predict violence and other forms of serious antisocial behavior would provide important societal benefits, and there is substantial enthusiasm for the potential predictive accuracy of neuroimaging techniques. Here, we review the current status of violence prediction using actuarial and clinical methods, and assess the current state of neuroprediction. We then outline several questions that need to be addressed by future studies of neuroprediction if neuroimaging and other neuroscientific markers are to be successfully translated into public policy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Psychosis prediction and clinical utility in familial high-risk studies: Selective review, synthesis, and implications for early detection and intervention

    PubMed Central

    Shah, Jai L.; Tandon, Neeraj; Keshavan, Matcheri S.

    2016-01-01

    Aim: Accurate prediction of which individuals will go on to develop psychosis would assist early intervention and prevention paradigms. We sought to review investigations of prospective psychosis prediction based on markers and variables examined in longitudinal familial high-risk (FHR) studies. Methods: We performed literature searches in MedLine, PubMed and PsycINFO for articles assessing performance characteristics of predictive clinical tests in FHR studies of psychosis. Studies were included if they reported one or more predictive variables in subjects at FHR for psychosis. We complemented this search strategy with references drawn from articles, reviews, book chapters and monographs. Results: Across generations of familial high-risk projects, predictive studies have investigated behavioral, cognitive, psychometric, clinical, neuroimaging, and other markers. Recent analyses have incorporated multivariate and multi-domain approaches to risk ascertainment, although with still generally modest results. Conclusions: While a broad range of risk factors has been identified, no individual marker or combination of markers can at this time enable accurate prospective prediction of emerging psychosis for individuals at FHR. We outline the complex and multi-level nature of psychotic illness, the myriad of factors influencing its development, and methodological hurdles to accurate and reliable prediction. Prospects and challenges for future generations of FHR studies are discussed in the context of early detection and intervention strategies. PMID:23693118

  3. Practical implications for genetic modeling in the genomics era

    USDA-ARS?s Scientific Manuscript database

    Genetic models convert data into estimated breeding values and other information useful to breeders. The goal is to provide accurate and timely predictions of the future performance for each animal (or embryo). Modeling involves defining traits, editing raw data, removing environmental effects, incl...

  4. Influence of Information Technology on Kinesiology and Physical Education.

    ERIC Educational Resources Information Center

    Haggerty, Terry R.

    1997-01-01

    This paper discusses the difficulty of accurately predicting the future role of information technology, presents an overview of technological advances, and highlights such special interest areas as virtual reality, the information highway, and the influence of computers on traditional ways of thinking. (SM)

  5. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
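    The unscented transform itself is compact: a handful of deterministically chosen sigma points are pushed through the nonlinear map, and their weighted statistics approximate the output distribution. The one-dimensional sketch below is illustrative only; the quadratic map and κ = 2 are assumed choices, not the paper's EOL model.

```python
import numpy as np

def unscented_transform(mean, var, g, kappa=2.0):
    """Propagate a 1-D Gaussian (mean, var) through a nonlinear map g
    using 2n+1 = 3 sigma points."""
    n = 1
    s = np.sqrt((n + kappa) * var)
    points = np.array([mean, mean + s, mean - s])
    weights = np.array([kappa / (n + kappa),
                        0.5 / (n + kappa),
                        0.5 / (n + kappa)])
    y = g(points)
    y_mean = weights @ y
    y_var = weights @ (y - y_mean) ** 2
    return y_mean, y_var

# a hypothetical nonlinear map standing in for the EOL simulation
m, v = unscented_transform(2.0, 0.25, lambda x: x ** 2)
print(m, v)   # mean is exact for quadratics: 2.0**2 + 0.25 = 4.25
```

Only three simulations are needed here instead of thousands of Monte Carlo runs, which is exactly the computational saving the paper exploits for EOL prediction.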

  6. Accuracy and artifact: reexamining the intensity bias in affective forecasting.

    PubMed

    Levine, Linda J; Lench, Heather C; Kaplan, Robin L; Safer, Martin A

    2012-10-01

    Research on affective forecasting shows that people have a robust tendency to overestimate the intensity of future emotion. We hypothesized that (a) people can accurately predict the intensity of their feelings about events and (b) a procedural artifact contributes to people's tendency to overestimate the intensity of their feelings in general. People may misinterpret the forecasting question as asking how they will feel about a focal event, but they are later asked to report their feelings in general without reference to that event. In the current investigation, participants predicted and reported both their feelings in general and their feelings about an election outcome (Study 1) and an exam grade (Study 3). We also assessed how participants interpreted forecasting questions (Studies 2 and 4) and conducted a meta-analysis of affective forecasting research (Study 5). The results showed that participants accurately predicted the intensity of their feelings about events. They overestimated only when asked to predict how they would feel in general and later report their feelings without reference to the focal event. Most participants, however, misinterpreted requests to predict their feelings in general as asking how they would feel when they were thinking about the focal event. Clarifying the meaning of the forecasting question significantly reduced overestimation. These findings reveal that people have more sophisticated self-knowledge than is commonly portrayed in the affective forecasting literature. Overestimation of future emotion is partly due to a procedure in which people predict one thing but are later asked to report another.

  7. Fire Consortia for Advanced Modeling of Meteorology and Smoke-FCAMMS: a National Paradigm for Wildland Fire and Smoke Management

    Treesearch

    A. R. Riebau; D. G. Fox

    2003-01-01

    Fires can be catastrophic, but only when the weather permits. Predicting the weather more than a few hours into the future with accuracy, precision and reliability is an on-going challenge to researchers. Accurate and precise forecasting for more than a few hours into the future has been virtually unrealizable until the latter half of the 20th Century. In the modern...

  8. Building accurate historic and future climate MEPDG input files for Louisiana DOTD : tech summary.

    DOT National Transportation Integrated Search

    2017-02-01

    The new pavement design process (originally MEPDG, then DARWin-ME, and now Pavement ME Design) requires two types of inputs to influence the prediction of pavement distress for a selected set of pavement materials and structure. One input is tra...

  9. Prognostics and Health Monitoring: Application to Electric Vehicles

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2017-01-01

    As more and more autonomous electric vehicles progressively enter daily operation, a critical challenge lies in accurate prediction of the remaining useful life of their systems/subsystems, specifically the electrical powertrain. In the case of electric aircraft, computing remaining flying time is safety-critical, since an aircraft that runs out of power (battery charge) while in the air will eventually lose control, leading to catastrophe. In order to tackle and solve the prediction problem, it is essential to have awareness of the current state and health of the system, especially since it is necessary to perform condition-based predictions. To be able to predict the future state of the system, knowledge of the current and future operations of the vehicle is also required. Our research approach is to develop a system-level health-monitoring safety indicator for the pilot/autopilot of electric vehicles, which runs estimation and prediction algorithms to estimate the remaining useful life of the vehicle, e.g., to determine state of charge in batteries. Given models of the current and future system behavior, a general approach of model-based prognostics can be employed as a solution to the prediction problem and, further, for decision making.

  10. CFD Modeling of Launch Vehicle Aerodynamic Heating

    NASA Technical Reports Server (NTRS)

    Tashakkor, Scott B.; Canabal, Francisco; Mishtawy, Jason E.

    2011-01-01

    The Loci-CHEM 3.2 Computational Fluid Dynamics (CFD) code is being used to predict Ares-I launch vehicle aerodynamic heating. CFD has been used to predict both ascent and stage reentry environments and has been validated against wind tunnel tests and the Ares I-X developmental flight test. Most of the CFD predictions agreed with measurements. In regions where mismatches occurred, the CFD predictions tended to be higher than measured data. These higher predictions usually occurred in complex regions, where the CFD models (mainly turbulence) contain less accurate approximations. In some instances, the errors causing the over-predictions would affect locations downstream even though the physics were still being modeled properly by CHEM. This is easily seen when comparing to the 103-AH data. In areas where predictions were low, higher grid resolution often brought the results closer to the data. Other disagreements are attributed to Ares I-X hardware not being present in the grid, as a result of computational resource limitations. The satisfactory predictions from CHEM provide confidence that future designs and predictions from the CFD code will provide an accurate approximation of the correct values for use in design and other applications.

  11. Genomic signals of selection predict climate-driven population declines in a migratory bird.

    PubMed

    Bay, Rachael A; Harrigan, Ryan J; Underwood, Vinh Le; Gibbs, H Lisle; Smith, Thomas B; Ruegg, Kristen

    2018-01-05

    The ongoing loss of biodiversity caused by rapid climatic shifts requires accurate models for predicting species' responses. Despite evidence that evolutionary adaptation could mitigate climate change impacts, evolution is rarely integrated into predictive models. Integrating population genomics and environmental data, we identified genomic variation associated with climate across the breeding range of the migratory songbird, yellow warbler (Setophaga petechia). Populations requiring the greatest shifts in allele frequencies to keep pace with future climate change have experienced the largest population declines, suggesting that failure to adapt may have already negatively affected populations. Broadly, our study suggests that the integration of genomic adaptation can increase the accuracy of future species distribution models and ultimately guide more effective mitigation efforts. Copyright © 2018, American Association for the Advancement of Science.

  12. Practical implications for genetic modeling in the genomics era for the dairy industry

    USDA-ARS?s Scientific Manuscript database

    Genetic models convert data into estimated breeding values and other information useful to breeders. The goal is to provide accurate and timely predictions of the future performance for each animal (or embryo). Modeling involves defining traits, editing raw data, removing environmental effects, incl...

  13. Computer Aided Evaluation of Higher Education Tutors' Performance

    ERIC Educational Resources Information Center

    Xenos, Michalis; Papadopoulos, Thanos

    2007-01-01

    This article presents a method for computer-aided tutor evaluation: Bayesian Networks are used for organizing the collected data about tutors and for enabling accurate estimations and predictions about future tutor behavior. The model provides indications about each tutor's strengths and weaknesses, which enables the evaluator to exploit strengths…

  14. Accurately Predicting Future Reading Difficulty for Bilingual Latino Children at Risk for Language Impairment

    ERIC Educational Resources Information Center

    Petersen, Douglas B.; Gillam, Ronald B.

    2013-01-01

    Sixty-three bilingual Latino children who were at risk for language impairment were administered reading-related measures in English and Spanish (letter identification, phonological awareness, rapid automatized naming, and sentence repetition) and descriptive measures including English language proficiency (ELP), language ability (LA),…

  15. A hybrid intelligent method for three-dimensional short-term prediction of dissolved oxygen content in aquaculture

    PubMed Central

    Yu, Huihui; Cheng, Yanjun; Cheng, Qianqian; Li, Daoliang

    2018-01-01

    A precise predictive model is important for obtaining a clear understanding of the changes in dissolved oxygen content in crab ponds. Highly accurate interval forecasting of dissolved oxygen content is fundamental to reduce risk, and three-dimensional prediction can provide more accurate results and overall guidance. In this study, a hybrid three-dimensional (3D) dissolved oxygen content prediction model based on a radial basis function (RBF) neural network, K-means and subtractive clustering was developed and named the subtractive clustering (SC)-K-means-RBF model. In this modeling process, K-means and subtractive clustering methods were employed to enhance the hyperparameters required in the RBF neural network model. The comparison of the predicted results of different traditional models validated the effectiveness and accuracy of the proposed hybrid SC-K-means-RBF model for three-dimensional prediction of dissolved oxygen content. Consequently, the proposed model can effectively display the three-dimensional distribution of dissolved oxygen content and serve as a guide for feeding and future studies. PMID:29466394
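As a rough illustration of the clustering stage, subtractive clustering scores each sample with a density "potential" and peels off the densest points as candidate RBF centres. This is a generic sketch, not the paper's SC-K-means-RBF implementation; the radius `ra` and stopping ratio `eps` are illustrative values.

```python
import math

def subtractive_clustering(points, ra=1.0, eps=0.15):
    """Pick RBF centre candidates by subtractive (mountain) clustering.

    Each point's potential is a density estimate over all samples; the
    densest point becomes a centre, its neighbourhood's potential is
    subtracted out, and the process repeats until potentials are small.
    """
    alpha = 4.0 / ra ** 2
    rb = 1.5 * ra                      # suppression radius is wider
    beta = 4.0 / rb ** 2
    d2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    pot = [sum(math.exp(-alpha * d2(p, q)) for q in points) for p in points]
    centres = []
    first = max(pot)
    while True:
        i = max(range(len(points)), key=lambda k: pot[k])
        if pot[i] < eps * first:       # remaining density too low: stop
            break
        c, pc = points[i], pot[i]
        centres.append(c)
        # suppress potential near the new centre
        pot = [p - pc * math.exp(-beta * d2(q, c)) for p, q in zip(pot, points)]
    return centres
```

On two well-separated blobs this returns one centre per blob, which would then seed the Gaussian units of an RBF network.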

  16. The influence of coarse-scale environmental features on current and predicted future distributions of narrow-range endemic crayfish populations

    USGS Publications Warehouse

    Dyer, Joseph J.; Brewer, Shannon K.; Worthington, Thomas A.; Bergey, Elizabeth A.

    2013-01-01

1. A major limitation to effective management of narrow-range crayfish populations is the paucity of information on the spatial distribution of crayfish species and a general understanding of the interacting environmental variables that drive current and future potential distributional patterns. 2. Maximum Entropy Species Distribution Modeling Software (MaxEnt) was used to predict the current and future potential distributions of four endemic crayfish species in the Ouachita Mountains. Current distributions were modelled using climate, geology, soils, land use, landform and flow variables thought to be important to lotic crayfish. Potential changes in distribution were forecast by using models trained on current conditions and projecting onto the landscape predicted under climate-change scenarios. 3. The modelled distribution of the four species closely resembled the perceived distribution of each species but also predicted populations in streams and catchments where they had not previously been collected. Soils, elevation, and winter precipitation and temperature were most strongly related to current distributions and represented 65-87% of the predictive power of the models. Model accuracy was high for all models, and model predictions of new populations were verified through additional field sampling. 4. Current models created using two spatial resolutions (1 and 4.5 km2) showed that fine-resolution data more accurately represented current distributions. For three of the four species, the 1-km2 resolution models resulted in more conservative predictions. However, the modelled distributional extent of Orconectes leptogonopodus was similar regardless of data resolution. Field validations indicated 1-km2 resolution models were more accurate than 4.5-km2 resolution models.
5. Projected future distributions (4.5-km2 resolution models) indicated three of the four endemic species would have truncated ranges with low occurrence probabilities under the low-emission scenario, whereas two of four species would be severely restricted in range under moderate-to-high emissions. Discrepancies in the two emission scenarios probably relate to the exclusion of behavioural adaptations from species-distribution models. 6. These model predictions illustrate possible impacts of climate change on narrow-range endemic crayfish populations. The predictions do not account for biotic interactions, migration, local habitat conditions or species adaptation. However, we identified the constraining landscape features acting on these populations that provide a framework for addressing habitat needs at a fine scale and developing targeted and systematic monitoring programmes.

  17. Future missions studies: Combining Schatten's solar activity prediction model with a chaotic prediction model

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

K. Schatten (1991) recently developed a method for combining his prediction model with our chaotic model. The philosophy behind this combined model and his method of combination is explained. Because the Schatten solar prediction model (KS) uses a dynamo to mimic solar dynamics, accurate prediction is limited to long-term solar behavior (10 to 20 years). The chaotic prediction model (SA) uses the recently developed techniques of nonlinear dynamics to predict solar activity. It can be used to predict activity only up to the horizon. In theory, the chaotic prediction should be several orders of magnitude better than statistical predictions up to that horizon; beyond the horizon, chaotic predictions would theoretically be just as good as statistical predictions. Therefore, chaos theory puts a fundamental limit on predictability.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wosnik, Martin; Bachant, Pete; Neary, Vincent Sinclair

CACTUS, developed by Sandia National Laboratories, is an open-source code for the design and analysis of wind and hydrokinetic turbines. While it has undergone extensive validation for both vertical axis and horizontal axis wind turbines, and it has been demonstrated to accurately predict the performance of horizontal (axial-flow) hydrokinetic turbines, its ability to predict the performance of crossflow hydrokinetic turbines has yet to be tested. The present study addresses this problem by comparing the predicted performance curves derived from CACTUS simulations of the U.S. Department of Energy’s 1:6 scale reference model crossflow turbine to those derived by experimental measurements in a tow tank using the same model turbine at the University of New Hampshire. It shows that CACTUS cannot accurately predict the performance of this crossflow turbine, raising concerns about its application to crossflow hydrokinetic turbines generally. The lack of quality data on NACA 0021 foil aerodynamic (hydrodynamic) characteristics over the wide range of angles of attack (AoA) and Reynolds numbers is identified as the main cause for poor model prediction. A comparison of several different NACA 0021 foil data sources, derived using both physical and numerical modeling experiments, indicates significant discrepancies at the high AoA experienced by foils on crossflow turbines. Users of CACTUS for crossflow hydrokinetic turbines are, therefore, advised to limit its application to higher tip speed ratios (lower AoA), and to carefully verify the reliability and accuracy of their foil data. Accurate empirical data on the aerodynamic characteristics of the foil is the greatest limitation to predicting performance for crossflow turbines with semi-empirical models like CACTUS. Future improvements of CACTUS for crossflow turbine performance prediction will require the development of accurate foil aerodynamic characteristic data sets within the appropriate ranges of Reynolds numbers and AoA.

  19. Reward-related neural activity and structure predict future substance use in dysregulated youth.

    PubMed

    Bertocci, M A; Bebko, G; Versace, A; Iyengar, S; Bonar, L; Forbes, E E; Almeida, J R C; Perlman, S B; Schirda, C; Travis, M J; Gill, M K; Diwadkar, V A; Sunshine, J L; Holland, S K; Kowatch, R A; Birmaher, B; Axelson, D A; Frazier, T W; Arnold, L E; Fristad, M A; Youngstrom, E A; Horwitz, S M; Findling, R L; Phillips, M L

    2017-06-01

Identifying youth who may engage in future substance use could facilitate early identification of substance use disorder vulnerability. We aimed to identify biomarkers that predicted future substance use in psychiatrically unwell youth. LASSO regression for variable selection was used to predict substance use 24.3 months after neuroimaging assessment in 73 behaviorally and emotionally dysregulated youth aged 13.9 (s.d. = 2.0) years, 30 female, from three clinical sites in the Longitudinal Assessment of Manic Symptoms (LAMS) study. Predictor variables included neural activity during a reward task, cortical thickness, and clinical and demographic variables. Future substance use was associated with higher left middle prefrontal cortex activity, lower left ventral anterior insula activity, thicker caudal anterior cingulate cortex, higher depression and lower mania scores, not using antipsychotic medication, more parental stress, and older age. This combination of variables explained 60.4% of the variance in future substance use, and correctly classified 83.6% of cases. These variables explained a large proportion of the variance, were useful classifiers of future substance use, and showed the value of combining multiple domains to provide a comprehensive understanding of substance use development. This may be a step toward identifying neural measures that can identify future substance use disorder risk, and act as targets for therapeutic interventions.
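LASSO performs variable selection by shrinking the coefficients of weakly informative predictors to exactly zero. A minimal coordinate-descent sketch of that mechanism (generic, not the LAMS analysis pipeline; the penalty `lam` is illustrative):

```python
def lasso(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimise 0.5/n * ||y - Xw||^2 + lam * ||w||_1.

    The soft-thresholding step sets the weights of weak predictors to
    exactly zero, which is what makes LASSO a variable-selection tool.
    """
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove every feature's contribution except j's
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # soft-thresholding: small correlations are zeroed out
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w
```

On toy data where only the first feature drives the response, the second coefficient lands at exactly 0.0 rather than some small value, which is the selection behaviour the study exploits.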

  20. Calculating Reuse Distance from Source Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narayanan, Sri Hari Krishna; Hovland, Paul

The efficient use of a system is of paramount importance in high-performance computing. Applications need to be engineered for future systems even before the architecture of such a system is clearly known. Static performance analysis that generates performance bounds is one way to approach the task of understanding application behavior. Performance bounds provide an upper limit on the performance of an application on a given architecture. Predicting cache hierarchy behavior and accesses to main memory is a requirement for accurate performance bounds. This work presents our static reuse distance algorithm to generate reuse distance histograms. We then use these histograms to predict cache miss rates. Experimental results for kernels studied show that the approach is accurate.
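The link from reuse distance histograms to miss rates is direct: for a fully associative LRU cache of C lines, an access misses exactly when its reuse distance is at least C. A minimal dynamic (trace-based) sketch of that idea, as opposed to the static analysis the paper develops:

```python
def reuse_distances(trace):
    """LRU stack (reuse) distance of each access: the number of distinct
    addresses touched since the previous access to the same address."""
    stack = []                         # most recently used at the end
    dists = []
    for addr in trace:
        if addr in stack:
            i = stack.index(addr)
            dists.append(len(stack) - 1 - i)
            stack.pop(i)
        else:
            dists.append(float('inf'))  # first touch: cold miss
        stack.append(addr)
    return dists

def miss_rate(trace, cache_lines):
    """A fully associative LRU cache of C lines misses exactly on the
    accesses whose reuse distance is >= C."""
    d = reuse_distances(trace)
    return sum(1 for x in d if x >= cache_lines) / len(d)
```

For the cyclic trace a, b, c, a, b, c every reuse distance is 2, so a 3-line cache misses only on the cold accesses while a 2-line cache misses on everything.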

  1. Forecasting Construction Cost Index based on visibility graph: A network approach

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong

    2018-03-01

Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes the Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets and prepare bids by forecasting CCI. However, fluctuations and uncertainties in CCI can lead to unreliable estimates. This paper aims at achieving more accurate predictions of CCI through a network approach in which the time series is first converted into a visibility graph and future values are forecast by link prediction. According to the experimental results, the proposed method shows satisfactory performance, with acceptable error measures. Compared with other methods, the proposed method is easier to implement and forecasts CCI with smaller errors. The proposed method thus provides considerably accurate CCI predictions, which can contribute to construction engineering by assisting individuals and organizations in reducing costs and preparing project schedules.
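The conversion step can be sketched directly: in a natural visibility graph, two time points are linked when the straight line between their values clears every intermediate sample. This is a generic construction; the paper's link-prediction forecasting stage built on top of the graph is not shown.

```python
def visibility_graph(series):
    """Natural visibility graph of a time series: nodes are time indices;
    i and j are linked if the straight line from (i, y_i) to (j, y_j)
    passes strictly above every intermediate sample."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            # y_k must lie below the chord between samples i and j
            if all(series[k] < series[j] + (series[i] - series[j]) *
                   (j - k) / (j - i) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

For the series 1, 3, 2, 4 the peak at index 1 "sees" index 3 over the dip at index 2, while index 0 cannot see past the peak.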

  2. Predict the Medicare Functional Classification Level (K-level) using the Amputee Mobility Predictor in people with unilateral transfemoral and transtibial amputation: A pilot study.

    PubMed

    Dillon, Michael P; Major, Matthew J; Kaluf, Brian; Balasanov, Yuri; Fatone, Stefania

    2018-04-01

While Amputee Mobility Predictor scores differ between Medicare Functional Classification Levels (K-level), this does not demonstrate that the Amputee Mobility Predictor can accurately predict K-level. To determine how accurately K-level could be predicted using the Amputee Mobility Predictor in combination with patient characteristics for persons with transtibial and transfemoral amputation. Study design: Prediction. A cumulative odds ordinal logistic regression was built to determine the effect that the Amputee Mobility Predictor, in combination with patient characteristics, had on the odds of being assigned to a particular K-level in 198 people with transtibial or transfemoral amputation. For people assigned to the K2 or K3 level by their clinician, the Amputee Mobility Predictor predicted the clinician-assigned K-level more than 80% of the time. For people assigned to the K1 or K4 level by their clinician, the prediction of clinician-assigned K-level was less accurate. The odds of being in a higher K-level improved with younger age and transfemoral amputation. Ordinal logistic regression can be used to predict the odds of being assigned to a particular K-level using the Amputee Mobility Predictor and patient characteristics. This pilot study highlighted critical method design issues, such as potential predictor variables and sample size requirements for future prospective research. Clinical relevance This pilot study demonstrated that the odds of being assigned a particular K-level could be predicted using the Amputee Mobility Predictor score and patient characteristics. While the model seemed sufficiently accurate to predict clinician assignment to the K2 or K3 level, further work is needed in larger and more representative samples, particularly for people with low (K1) and high (K4) levels of mobility, to be confident in the model's predictive value prior to use in clinical practice.
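A cumulative odds (proportional odds) model expresses P(Y <= j | x) as logistic(theta_j - x*beta), so each ordered category's probability is a difference of adjacent cumulative probabilities. A sketch with hypothetical cutpoints and linear predictor, not the coefficients fitted in this study:

```python
import math

def ordinal_probs(x_beta, cutpoints):
    """Cumulative-odds ordinal model: P(Y <= j | x) = logistic(theta_j - x.beta).
    Returns the probability of each ordered category (e.g. K1..K4) as
    differences of adjacent cumulative probabilities."""
    logistic = lambda t: 1.0 / (1.0 + math.exp(-t))
    cum = [logistic(th - x_beta) for th in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]
```

A larger linear predictor (for instance, a higher Amputee Mobility Predictor score under a positive coefficient) shifts probability mass toward the higher categories, which is the sense in which "the odds of being in a higher K-level improved".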

  3. Validation of Afterbody Aeroheating Predictions for Planetary Probes: Status and Future Work

    NASA Technical Reports Server (NTRS)

Wright, Michael J.; Brown, James L.; Sinha, Krishnendu; Candler, Graham V.; Milos, Frank S.; Prabhu, Dinesh K.

    2005-01-01

    A review of the relevant flight conditions and physical models for planetary probe afterbody aeroheating calculations is given. Readily available sources of afterbody flight data and published attempts to computationally simulate those flights are summarized. A current status of the application of turbulence models to afterbody flows is presented. Finally, recommendations for additional analysis and testing that would reduce our uncertainties in our ability to accurately predict base heating levels are given.

  4. Prediction of future uniform milk prices in Florida federal milk marketing order 6 from milk futures markets.

    PubMed

    De Vries, A; Feleke, S

    2008-12-01

    This study assessed the accuracy of 3 methods that predict the uniform milk price in Federal Milk Marketing Order 6 (Florida). Predictions were made for 1 to 12 mo into the future. Data were from January 2003 to May 2007. The CURRENT method assumed that future uniform milk prices were equal to the last announced uniform milk price. The F+BASIS and F+UTIL methods were based on the milk futures markets because the futures prices reflect the market's expectation of the class III and class IV cash prices that are announced monthly by USDA. The F+BASIS method added an exponentially weighted moving average of the difference between the class III cash price and the historical uniform milk price (also known as basis) to the class III futures price. The F+UTIL method used the class III and class IV futures prices, the most recently announced butter price, and historical utilizations to predict the skim milk prices, butterfat prices, and utilizations in all 4 classes. Predictions of future utilizations were made with a Holt-Winters smoothing method. Federal Milk Marketing Order 6 had high class I utilization (85 +/- 4.8%). Mean and standard deviation of the class III and class IV cash prices were $13.39 +/- 2.40/cwt (1 cwt = 45.36 kg) and $12.06 +/- 1.80/cwt, respectively. The actual uniform price in Tampa, Florida, was $16.62 +/- 2.16/cwt. The basis was $3.23 +/- 1.23/cwt. The F+BASIS and F+UTIL predictions were generally too low during the period considered because the class III cash prices were greater than the corresponding class III futures prices. For the 1- to 6-mo-ahead predictions, the root of the mean squared prediction errors from the F+BASIS method were $1.12, $1.20, $1.55, $1.91, $2.16, and $2.34/cwt, respectively. The root of the mean squared prediction errors ranged from $2.50 to $2.73/cwt for predictions up to 12 mo ahead. Results from the F+UTIL method were similar. 
The accuracies of the F+BASIS and F+UTIL methods for all 12 forecast horizons were not significantly different. Application of the modified Mariano-Diebold tests showed that no method included all the information contained in the other methods. In conclusion, both the F+BASIS and F+UTIL methods tended to predict the future uniform milk prices more accurately than the CURRENT method, but prediction errors could be substantial even a few months into the future. The majority of the prediction error was caused by the inefficiency of the futures markets in predicting the class III cash prices.
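The F+BASIS construction can be sketched as follows: smooth the historical basis (uniform price minus class III cash price) with an exponential weighting, then add it to the class III futures quote. The prices and the smoothing factor `alpha` below are illustrative numbers, not the study's data.

```python
def f_basis_forecast(class3_futures, basis_history, alpha=0.3):
    """F+BASIS-style prediction: class III futures price plus an
    exponentially weighted moving average of the historical basis
    (uniform milk price minus class III cash price)."""
    ewma = basis_history[0]
    for b in basis_history[1:]:
        ewma = alpha * b + (1 - alpha) * ewma   # recent basis weighted more
    return class3_futures + ewma
```

With a basis history of 3.00, 3.20, 3.40 $/cwt and a futures quote of 13.50 $/cwt, the smoothed basis is about 3.16 $/cwt and the forecast uniform price about 16.66 $/cwt.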

  5. Predicting consumer behavior with Web search.

    PubMed

    Goel, Sharad; Hofman, Jake M; Lahaie, Sébastien; Pennock, David M; Watts, Duncan J

    2010-10-12

    Recent work has demonstrated that Web search volume can "predict the present," meaning that it can be used to accurately track outcomes such as unemployment levels, auto and home sales, and disease prevalence in near real time. Here we show that what consumers are searching for online can also predict their collective future behavior days or even weeks in advance. Specifically we use search query volume to forecast the opening weekend box-office revenue for feature films, first-month sales of video games, and the rank of songs on the Billboard Hot 100 chart, finding in all cases that search counts are highly predictive of future outcomes. We also find that search counts generally boost the performance of baseline models fit on other publicly available data, where the boost varies from modest to dramatic, depending on the application in question. Finally, we reexamine previous work on tracking flu trends and show that, perhaps surprisingly, the utility of search data relative to a simple autoregressive model is modest. We conclude that in the absence of other data sources, or where small improvements in predictive performance are material, search queries provide a useful guide to the near future.

  6. Predicting consumer behavior with Web search

    PubMed Central

    Goel, Sharad; Hofman, Jake M.; Lahaie, Sébastien; Pennock, David M.; Watts, Duncan J.

    2010-01-01

    Recent work has demonstrated that Web search volume can “predict the present,” meaning that it can be used to accurately track outcomes such as unemployment levels, auto and home sales, and disease prevalence in near real time. Here we show that what consumers are searching for online can also predict their collective future behavior days or even weeks in advance. Specifically we use search query volume to forecast the opening weekend box-office revenue for feature films, first-month sales of video games, and the rank of songs on the Billboard Hot 100 chart, finding in all cases that search counts are highly predictive of future outcomes. We also find that search counts generally boost the performance of baseline models fit on other publicly available data, where the boost varies from modest to dramatic, depending on the application in question. Finally, we reexamine previous work on tracking flu trends and show that, perhaps surprisingly, the utility of search data relative to a simple autoregressive model is modest. We conclude that in the absence of other data sources, or where small improvements in predictive performance are material, search queries provide a useful guide to the near future. PMID:20876140

  7. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    PubMed Central

    Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span. PMID:27974883

  8. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend.

    PubMed

    Inthachot, Montri; Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span.
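The GA stage described above searches over feature-subset bitmasks. A tiny sketch of that loop, where the `fitness` callback is a stand-in for the ANN's validation accuracy in the paper's pipeline and every parameter (population size, generations, mutation rate) is illustrative:

```python
import random

def ga_select(features, fitness, pop=20, gens=40, seed=1):
    """Tiny genetic algorithm over feature-subset bitmasks: tournament
    selection, one-point crossover, and bit-flip mutation. `fitness`
    scores a 0/1 mask (higher is better)."""
    rng = random.Random(seed)
    n = len(features)
    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]

    def tournament():
        a, b = rng.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop:
            x, y = tournament(), tournament()
            cut = rng.randrange(1, n)        # one-point crossover
            child = x[:cut] + y[cut:]
            if rng.random() < 0.1:           # bit-flip mutation
                i = rng.randrange(n)
                child[i] ^= 1
            nxt.append(child)
        population = nxt
    best = max(population, key=fitness)
    return [f for f, bit in zip(features, best) if bit]
```

Given a fitness that rewards including informative features and penalizes noisy ones, the returned subset concentrates on the informative group, mirroring how the GA culled the indicator variables fed to the ANN.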

  9. Numerical simulation of turbulence flow in a Kaplan turbine -Evaluation on turbine performance prediction accuracy-

    NASA Astrophysics Data System (ADS)

    Ko, P.; Kurosawa, S.

    2014-03-01

The understanding and accurate prediction of the flow behaviour related to cavitation and pressure fluctuation in a Kaplan turbine are important to design work enhancing turbine performance, including the elongation of the operational life span and the improvement of turbine efficiency. In this paper, a high-accuracy prediction method for turbine and cavitation performance based on the entire flow passage of a Kaplan turbine is presented and evaluated. The two-phase flow field is predicted by solving the Reynolds-averaged Navier-Stokes equations with a volume-of-fluid method tracking the free surface, combined with a Reynolds stress model. The growth and collapse of cavitation bubbles are modelled by the modified Rayleigh-Plesset equation. The prediction accuracy is evaluated by comparison with model test results of an Ns 400 Kaplan model turbine. The experimentally measured data, including turbine efficiency, cavitation performance, and pressure fluctuation, are accurately predicted. Furthermore, the cavitation occurrence on the runner blade surface and its influence on the hydraulic loss of the flow passage are discussed. The evaluated prediction method for turbine flow and performance is introduced to facilitate future design and research work on Kaplan-type turbines.

  10. Wetter subtropics in a warmer world: Contrasting past and future hydrological cycles

    NASA Astrophysics Data System (ADS)

    Burls, Natalie J.; Fedorov, Alexey V.

    2017-12-01

    During the warm Miocene and Pliocene Epochs, vast subtropical regions had enough precipitation to support rich vegetation and fauna. Only with global cooling and the onset of glacial cycles some 3 Mya, toward the end of the Pliocene, did the broad patterns of arid and semiarid subtropical regions become fully developed. However, current projections of future global warming caused by CO2 rise generally suggest the intensification of dry conditions over these subtropical regions, rather than the return to a wetter state. What makes future projections different from these past warm climates? Here, we investigate this question by comparing a typical quadrupling-of-CO2 experiment with a simulation driven by sea-surface temperatures closely resembling available reconstructions for the early Pliocene. Based on these two experiments and a suite of other perturbed climate simulations, we argue that this puzzle is explained by weaker atmospheric circulation in response to the different ocean surface temperature patterns of the Pliocene, specifically reduced meridional and zonal temperature gradients. Thus, our results highlight that accurately predicting the response of the hydrological cycle to global warming requires predicting not only how global mean temperature responds to elevated CO2 forcing (climate sensitivity) but also accurately quantifying how meridional sea-surface temperature patterns will change (structural climate sensitivity).

  11. Differential responses of carbon and water vapor fluxes to climate among evergreen needleleaf forests in the USA

    USDA-ARS?s Scientific Manuscript database

    Understanding of differences in carbon and water vapor fluxes of spatially distributed evergreen needle leaf forests (ENFs) is crucial to accurately estimating regional carbon and water budgets and when predicting the responses of ENFs to future climate. We investigated cross-site variability in car...

  12. Machine Learning Techniques for Prediction of Early Childhood Obesity.

    PubMed

    Dugan, T M; Mukhopadhyay, S; Carroll, A; Downs, S

    2015-01-01

    This paper aims to predict childhood obesity after age two, using only data collected prior to the second birthday by a clinical decision support system called CHICA. Analyses of six different machine learning methods: RandomTree, RandomForest, J48, ID3, Naïve Bayes, and Bayes trained on CHICA data show that an accurate, sensitive model can be created. Of the methods analyzed, the ID3 model trained on the CHICA dataset proved the best overall performance with accuracy of 85% and sensitivity of 89%. Additionally, the ID3 model had a positive predictive value of 84% and a negative predictive value of 88%. The structure of the tree also gives insight into the strongest predictors of future obesity in children. Many of the strongest predictors seen in the ID3 modeling of the CHICA dataset have been independently validated in the literature as correlated with obesity, thereby supporting the validity of the model. This study demonstrated that data from a production clinical decision support system can be used to build an accurate machine learning model to predict obesity in children after age two.
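The four figures reported for the ID3 model (accuracy, sensitivity, positive predictive value, negative predictive value) all derive from a binary confusion matrix, as this small sketch shows:

```python
def classifier_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall), PPV (precision), and NPV from
    binary labels and predictions, via the confusion-matrix counts."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        'accuracy':    (tp + tn) / len(y_true),
        'sensitivity': tp / (tp + fn),   # share of true positives found
        'ppv':         tp / (tp + fp),   # positive predictive value
        'npv':         tn / (tn + fn),   # negative predictive value
    }
```

Reporting PPV and NPV alongside accuracy matters here because obesity prevalence is unbalanced, so accuracy alone can flatter a model that rarely predicts the positive class.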

  13. Deep space target location with Hubble Space Telescope (HST) and Hipparcos data

    NASA Technical Reports Server (NTRS)

    Null, George W.

    1988-01-01

Interplanetary spacecraft navigation requires accurate a priori knowledge of target positions. A concept is presented for attaining improved target ephemeris accuracy using two future Earth-orbiting optical observatories, the European Space Agency (ESA) Hipparcos observatory and the NASA Hubble Space Telescope (HST). Assuming nominal observatory performance, the Hipparcos data reduction will provide an accurate global star catalog, and HST will provide a capability for accurate angular measurements of stars and solar system bodies. The target location concept employs HST to observe solar system bodies relative to Hipparcos catalog stars and to determine the orientation (frame tie) of these stars relative to compact extragalactic radio sources. The target location process is described, the major error sources are discussed, the potential target ephemeris error is predicted, and mission applications are identified. Preliminary results indicate that ephemeris accuracy comparable to the errors in individual Hipparcos catalog stars may be possible with a more extensive HST observing program. Possible future ground- and space-based replacements for Hipparcos and HST astrometric capabilities are also discussed.

  14. Information theory of adaptation in neurons, behavior, and mood.

    PubMed

    Sharpee, Tatyana O; Calhoun, Adam J; Chalasani, Sreekanth H

    2014-04-01

The ability to make accurate predictions of future stimuli and of the consequences of one's actions is crucial for survival and appropriate decision-making. These predictions are constantly being made at different levels of the nervous system, as evidenced by adaptation to stimulus parameters in sensory coding and by learning of an up-to-date model of the environment at the behavioral level. This review discusses recent findings that the actions of neurons and animals are selected based on detailed stimulus history in such a way as to maximize information for achieving the task at hand. Information maximization dictates not only how sensory coding should adapt to various statistical aspects of stimuli, but also that the reward function should adapt to match the predictive information from past to future. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. The future of stellar occultations by distant solar system bodies: Perspectives from the Gaia astrometry and the deep sky surveys

    NASA Astrophysics Data System (ADS)

    Camargo, J. I. B.; Desmars, J.; Braga-Ribas, F.; Vieira-Martins, R.; Assafin, M.; Sicardy, B.; Bérard, D.; Benedetti-Rossi, G.

    2018-05-01

    Distant objects in the solar system are crucial for better understanding the history and evolution of its outskirts. The stellar occultation technique allows the determination of their sizes and shapes with kilometric accuracy, a detailed investigation of their immediate vicinities, as well as the detection of tenuous atmospheres. The prediction of such events is a key point in this study, and yet sufficiently accurate predictions are available for only a handful of objects. In this work, we briefly discuss the dramatic impact that both the astrometry from the Gaia space mission and the deep sky surveys - the Large Synoptic Survey Telescope in particular - will have on the prediction of stellar occultations, and how they may influence the future of the study of distant small solar system bodies through this technique.

  16. Prediction of fishing effort distributions using boosted regression trees.

    PubMed

    Soykan, Candan U; Eguchi, Tomoharu; Kohin, Suzanne; Dewar, Heidi

    2014-01-01

    Concerns about bycatch of protected species have become a dominant factor shaping fisheries management. However, efforts to mitigate bycatch are often hindered by a lack of data on the distributions of fishing effort and protected species. One approach to overcoming this problem has been to overlay the distribution of past fishing effort with known locations of protected species, often obtained through satellite telemetry and occurrence data, to identify potential bycatch hotspots. This approach, however, generates static bycatch risk maps, calling into question their ability to forecast into the future, particularly when dealing with spatiotemporally dynamic fisheries and highly migratory bycatch species. In this study, we use boosted regression trees to model the spatiotemporal distribution of fishing effort for two distinct fisheries in the North Pacific Ocean, the albacore (Thunnus alalunga) troll fishery and the California drift gillnet fishery that targets swordfish (Xiphias gladius). Our results suggest that it is possible to accurately predict fishing effort using < 10 readily available predictor variables (cross-validated correlations between model predictions and observed data ≥ 0.6). Although the two fisheries are quite different in their gears and fishing areas, their respective models had high predictive ability, even when input data sets were restricted to a fraction of the full time series. The implications for conservation and management are encouraging: Across a range of target species, fishing methods, and spatial scales, even a relatively short time series of fisheries data may suffice to accurately predict the location of fishing effort into the future. In combination with species distribution modeling of bycatch species, this approach holds promise as a mitigation tool when observer data are limited. 
Even in data-rich regions, modeling fishing effort and bycatch may provide more accurate estimates of bycatch risk than partial observer coverage for fisheries and bycatch species that are heavily influenced by dynamic oceanographic conditions.
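
    Boosted regression trees of the kind used in this study fit an ensemble of small trees, each one to the residuals left by the previous rounds. Below is a minimal illustrative sketch using depth-one trees (stumps) and squared-error loss; the data are made up, not the fisheries dataset:

```python
import statistics

def fit_stump(X, y):
    """Depth-one regression tree: the single-feature threshold split
    that minimizes squared error."""
    best = None
    for j in range(len(X[0])):
        for t in sorted(set(row[j] for row in X)):
            left = [y[i] for i in range(len(X)) if X[i][j] <= t]
            right = [y[i] for i in range(len(X)) if X[i][j] > t]
            if not left or not right:
                continue
            lm, rm = statistics.fmean(left), statistics.fmean(right)
            err = (sum((v - lm) ** 2 for v in left)
                   + sum((v - rm) ** 2 for v in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda row: lm if row[j] <= t else rm

def boost(X, y, n_rounds=50, learning_rate=0.1):
    """Gradient boosting for squared error: each stump fits the current
    residuals; the prediction is the damped sum of all stumps."""
    f0 = statistics.fmean(y)
    residuals = [v - f0 for v in y]
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(X, residuals)
        stumps.append(stump)
        residuals = [r - learning_rate * stump(row)
                     for r, row in zip(residuals, X)]
    return lambda row: f0 + learning_rate * sum(s(row) for s in stumps)

# Made-up toy data standing in for (predictor, fishing effort) pairs
effort_model = boost([[0.0], [1.0], [2.0], [3.0]], [0.0, 0.0, 1.0, 1.0])
```

    Production implementations add regularization, deeper trees, and stochastic subsampling, but the residual-fitting loop above is the core of the method.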

  17. Pattern recognition and functional neuroimaging help to discriminate healthy adolescents at risk for mood disorders from low risk adolescents.

    PubMed

    Mourão-Miranda, Janaina; Oliveira, Leticia; Ladouceur, Cecile D; Marquand, Andre; Brammer, Michael; Birmaher, Boris; Axelson, David; Phillips, Mary L

    2012-01-01

    There are no known biological measures that accurately predict future development of psychiatric disorders in individual at-risk adolescents. We investigated whether machine learning and fMRI could help to: 1. differentiate healthy adolescents genetically at-risk for bipolar disorder and other Axis I psychiatric disorders from healthy adolescents at low risk of developing these disorders; 2. identify those healthy genetically at-risk adolescents who were most likely to develop future Axis I disorders. 16 healthy offspring genetically at risk for bipolar disorder and other Axis I disorders by virtue of having a parent with bipolar disorder and 16 healthy, age- and gender-matched low-risk offspring of healthy parents with no history of psychiatric disorders (12-17 year-olds) performed two emotional face gender-labeling tasks (happy/neutral; fearful/neutral) during fMRI. We used Gaussian Process Classifiers (GPC), a machine learning approach that assigns a predictive probability of group membership to an individual person, to differentiate groups and to identify those at-risk adolescents most likely to develop future Axis I disorders. Using GPC, activity to neutral faces presented during the happy experiment accurately and significantly differentiated groups, achieving 75% accuracy (sensitivity = 75%, specificity = 75%). Furthermore, predictive probabilities were significantly higher for those at-risk adolescents who subsequently developed an Axis I disorder than for those at-risk adolescents remaining healthy at follow-up. We show that a combination of two promising techniques, machine learning and neuroimaging, not only discriminates healthy low-risk from healthy adolescents genetically at-risk for Axis I disorders, but may ultimately help to predict which at-risk adolescents subsequently develop these disorders.

  18. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  19. SU-E-J-191: Motion Prediction Using Extreme Learning Machine in Image Guided Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, J; Cao, R; Pei, X

    Purpose: Real-time motion tracking is a critical issue in image guided radiotherapy due to the time latency caused by image processing and system response. It is therefore necessary to predict the future position of the respiratory motion and the tumor location quickly and accurately. Methods: The prediction of respiratory position was performed with the positioning and tracking module in the ARTS-IGRT system, developed by FDS Team (www.fds.org.cn). An approach based on the extreme learning machine (ELM) was adopted to predict the future respiratory position, as well as the tumor's location, by training on past trajectories. For the training process, a feed-forward neural network with a single hidden layer was used. First, the number of hidden nodes was determined for the single-layered feed-forward network (SLFN). Then the input weights and hidden layer biases of the SLFN were randomly assigned to calculate the hidden neuron output matrix. Finally, the predicted movement was obtained by applying the output weights and compared with the actual movement. Breathing movement acquired from external infrared markers was used to test the prediction accuracy, and implanted marker movement for prostate cancer was used to test the implementation of the tumor motion prediction. Results: The agreement between the predicted and the actual motion was tested. Five volunteers with different breathing patterns were tested. The average prediction time was 0.281 s, and the standard deviation of prediction accuracy was 0.002 for the respiratory motion and 0.001 for the tumor motion. Conclusion: The extreme learning machine method can provide an accurate and fast prediction of the respiratory motion and the tumor location and can therefore meet the requirements of real-time tumor tracking in image guided radiotherapy.
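
    The training procedure described above (random input weights and hidden biases, hidden neuron output matrix, output weights from a one-shot least-squares solve) can be sketched as follows. The window size, hidden-layer width, and the synthetic sine "breathing" trace are illustrative assumptions, not details from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=64):
    """Extreme learning machine: input weights and hidden biases are random
    and fixed; only the output weights are fitted, by least squares."""
    n_features = X.shape[1]
    W = 0.1 * rng.normal(size=(n_features, n_hidden))  # random input weights
    b = 0.1 * rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                             # hidden neuron output matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)       # output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy usage: predict the next sample of a sine "breathing" trace from a
# sliding window of the previous 10 samples.
trace = np.sin(np.arange(400) * 0.1)
window = 10
X = np.stack([trace[i:i + window] for i in range(len(trace) - window)])
y = trace[window:]
model = elm_train(X[:300], y[:300])
pred = elm_predict(model, X[300:])
```

    Because only the output layer is fitted, training reduces to one linear solve, which is what makes the method fast enough for the real-time setting described here.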

  20. An unexpected way forward: towards a more accurate and rigorous protein-protein binding affinity scoring function by eliminating terms from an already simple scoring function.

    PubMed

    Swanson, Jon; Audie, Joseph

    2018-01-01

    A fundamental and unsolved problem in biophysical chemistry is the development of a computationally simple, physically intuitive, and generally applicable method for accurately predicting and physically explaining protein-protein binding affinities from protein-protein interaction (PPI) complex coordinates. Here, we propose that the simplification of a previously described six-term PPI scoring function to a four-term function results in a simple expression of all physically and statistically meaningful terms that can be used to accurately predict and explain binding affinities for a well-defined subset of PPIs that are characterized by (1) crystallographic coordinates, (2) rigid-body association, (3) normal interface size, hydrophobicity, and hydrophilicity, and (4) high quality experimental binding affinity measurements. We further propose that the four-term scoring function could be regarded as a core expression for future development into a more general PPI scoring function. Our work has clear implications for PPI modeling and structure-based drug design.

  1. Health-based risk adjustment: is inpatient and outpatient diagnostic information sufficient?

    PubMed

    Lamers, L M

    Adequate risk adjustment is critical to the success of market-oriented health care reforms in many countries. Currently used risk adjusters based on demographic and diagnostic cost groups (DCGs) do not reflect expected costs accurately. This study examines the simultaneous predictive accuracy of inpatient and outpatient morbidity measures and prior costs. DCGs, pharmacy cost groups (PCGs), and prior year's costs improve the predictive accuracy of the demographic model substantially. DCGs and PCGs seem complementary in their ability to predict future costs. However, this study shows that the combination of DCGs and PCGs still leaves room for cream skimming.

  2. Behavior-Based Budget Management Using Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Troy Hiltbrand

    Historically, the mechanisms used for forecasting have relied primarily on two common factors as a basis for future predictions: time and money. While time and money are very important aspects of determining future budgetary spend patterns, organizations are complex systems of unique individuals with a myriad of associated behaviors, and all of these behaviors have bearing on how budget is utilized. When looking at forecasted budgets, it becomes a guessing game how budget managers will behave under a given set of conditions. This becomes relatively messy when human nature is introduced, as different managers will react very differently under similar circumstances. While one manager becomes ultra-conservative during periods of financial austerity, another might be unfazed and continue to spend as they have in the past. Both might revert into a state of budgetary protectionism, masking what is truly happening at the budget-holder level, in order to keep as much budget and influence as possible while sacrificing the greater good of the organization. To predict future outcomes more accurately, models should consider not only time and money but also the behavioral patterns that have been observed across the organization. The field of predictive analytics is poised to provide the tools and methodologies organizations need to do just this: capture and leverage behaviors of the past to predict the future.

  3. The applicability of a computer model for predicting head injury incurred during actual motor vehicle collisions.

    PubMed

    Moran, Stephan G; Key, Jason S; McGwin, Gerald; Keeley, Jason W; Davidson, James S; Rue, Loring W

    2004-07-01

    Head injury is a significant cause of both morbidity and mortality. Motor vehicle collisions (MVCs) are the most common source of head injury in the United States. No studies have conclusively determined the applicability of computer models for accurate prediction of head injuries sustained in actual MVCs. This study sought to determine the applicability of such models for predicting head injuries sustained by MVC occupants. The Crash Injury Research and Engineering Network (CIREN) database was queried for restrained drivers who sustained a head injury. These collisions were modeled using occupant dynamic modeling (MADYMO) software, and head injury scores were generated. The computer-generated head injury scores then were evaluated with respect to the actual head injuries sustained by the occupants to determine the applicability of MADYMO computer modeling for predicting head injury. Five occupants meeting the selection criteria for the study were selected from the CIREN database. The head injury scores generated by MADYMO were lower than expected given the actual injuries sustained. In only one case did the computer analysis predict a head injury of a severity similar to that actually sustained by the occupant. Although computer modeling accurately simulates experimental crash tests, it may not be applicable for predicting head injury in actual MVCs. Many complicating factors surrounding actual MVCs make accurate computer modeling difficult. Future modeling efforts should consider variables such as age of the occupant and should account for a wider variety of crash scenarios.

  4. Space Environment (Natural and Induced)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; George, Kerry A.; Cucinotta, Francis A.

    2007-01-01

    Considerable effort and improvement have been made in the study of ionizing radiation exposure occurring in various regions of space. Satellites and spacecraft equipped with innovative instruments are continually refining particle data and providing more accurate information on the ionizing radiation environment. The major problem in accurate spectral definition of ionizing radiation appears to be the detailed energy spectra, especially at high energies, which are an important parameter for accurate radiation risk assessment. The magnitude of the risks posed by exposure to radiation in future space missions depends on the accuracy of predictive forecasts of SPE event size, the GCR environment, geomagnetic fields, and the atmospheric radiation environment. Although heavy ion fragmentations and interactions are adequately resolved through laboratory study and model development, improvements in fragmentation cross sections for the light nuclei produced from HZE nuclei, and their laboratory validation, are still required to achieve the principal goal of planetary GCR simulation at a critical exposure site. A more accurate prediction procedure for the ionizing radiation environment can be developed with a better understanding of solar and space physics, fulfillment of the required measurements of nuclear/atomic processes, and their validation and verification with spaceflight and heavy-ion accelerator experiments. Continued advancements in solar and space physics, combined with physical measurements, will strengthen confidence in the future manned exploration of the solar system. Advancements in radiobiology will provide meaningful radiation hazard assessments for short- and long-term effects, from which appropriate and effective mitigation measures can be put in place to ensure that humans can safely live and work in space, anywhere, anytime.

  5. Predicting suicide attempts with the SAD PERSONS scale: a longitudinal analysis.

    PubMed

    Bolton, James M; Spiwak, Rae; Sareen, Jitender

    2012-06-01

    The SAD PERSONS scale is a widely used risk assessment tool for suicidal behavior despite a paucity of supporting data. The objective of this study was to examine the ability of the scale in predicting suicide attempts. Participants consisted of consecutive referrals (N=4,019) over 2 years (January 1, 2009 to December 31, 2010) to psychiatric services in the emergency departments of the 2 largest tertiary care hospitals in the province of Manitoba, Canada. SAD PERSONS and Modified SAD PERSONS (MSPS) scale scores were recorded for individuals at their index and all subsequent presentations. The 2 main outcome measures in the study included current suicide attempts (at index presentation) and future suicide attempts (within the next 6 months). The ability of the scales to predict suicide attempts was evaluated with logistic regression, sensitivity and specificity analyses, and receiver operating characteristic curves. 566 people presented with suicide attempts (14.1% of the sample). Both SAD PERSONS and MSPS showed poor predictive ability for future suicide attempts. Compared to low risk scores, high risk baseline scores had low sensitivity (19.6% and 40.0%, respectively) and low positive predictive value (5.3% and 7.4%, respectively). SAD PERSONS did not predict suicide attempts better than chance (area under the curve =0.572; 95% confidence interval [CI], 0.51-0.64; P value nonsignificant). Stepwise regression identified 5 original scale items that accounted for the greatest proportion of future suicide attempt variance. High risk scores using this model had high sensitivity (93.5%) and were associated with a 5-fold higher likelihood of future suicide attempt presentation (odds ratio =5.58; 95% CI, 2.24-13.86; P<.001). In their current form, SAD PERSONS and MSPS do not accurately predict future suicide attempts. © Copyright 2012 Physicians Postgraduate Press, Inc.
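
    The screening metrics reported in this abstract (sensitivity, specificity, area under the ROC curve) can be computed directly from risk scores and observed outcomes. The following is an illustrative sketch of those definitions, not code or data from the study:

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via its rank interpretation: the probability
    that a randomly chosen positive case outscores a random negative case
    (ties count as half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def sensitivity_specificity(scores, labels, threshold):
    """Sensitivity and specificity of a score cut-off (label 1 = event)."""
    tp = sum(1 for s, l in zip(scores, labels) if l == 1 and s >= threshold)
    fn = sum(1 for s, l in zip(scores, labels) if l == 1 and s < threshold)
    tn = sum(1 for s, l in zip(scores, labels) if l == 0 and s < threshold)
    fp = sum(1 for s, l in zip(scores, labels) if l == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)
```

    An AUC of 0.5 corresponds to chance-level discrimination, which is why the reported value of 0.572 with a confidence interval spanning 0.5 means the scale did not predict attempts better than chance.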

  6. Modeling methodology for the accurate and prompt prediction of symptomatic events in chronic diseases.

    PubMed

    Pagán, Josué; Risco-Martín, José L; Moya, José M; Ayala, José L

    2016-08-01

    Prediction of symptomatic crises in chronic diseases makes it possible to take decisions before the symptoms occur, such as taking drugs to avoid the symptoms or activating medical alarms. The prediction horizon is in this case an important parameter, as it must accommodate the pharmacokinetics of medications or the response time of medical services. This paper presents a study of the prediction limits of a chronic disease with symptomatic crises: migraine. For that purpose, this work develops a methodology to build predictive migraine models and to improve these predictions beyond the limits of the initial models. The maximum prediction horizon is analyzed, and its dependency on the selected features is studied. A strategy for model selection is proposed to tackle the trade-off between conservative but robust predictive models and less accurate predictions with longer horizons. The obtained results show a prediction horizon close to 40 min, which is in the time range of the drug pharmacokinetics. Experiments have been performed in a realistic scenario where input data were acquired in an ambulatory clinical study through the deployment of a non-intrusive Wireless Body Sensor Network. Our results provide an effective methodology for the selection of the prediction horizon in the development of prediction algorithms for diseases experiencing symptomatic crises. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Understanding physical (in-) activity, overweight, and obesity in childhood: Effects of congruence between physical self-concept and motor competence.

    PubMed

    Utesch, T; Dreiskämper, D; Naul, R; Geukes, K

    2018-04-12

    Both the physical self-concept and actual motor competence are important for healthy future physical activity levels and, consequently, for decreasing overweight and obesity in childhood. However, children scoring high on motor competence do not necessarily report high levels of physical self-concept, and vice versa, resulting in a respective (in-) accuracy also referred to as (non-) veridicality. This study examines whether the accuracy of children's physical self-concept is a meaningful predictive factor for their future physical activity. Motor competence, physical self-concept and physical activity were assessed in 3rd grade and one year later in 4th grade. Children's weight status was categorized based on WHO recommendations. Polynomial regressions with response surface analyses were conducted with a quasi-DIF approach examining moderating effects of weight status. Analyses revealed that children with higher motor competence levels and higher self-perceptions show greater physical activity. Importantly, children who perceive their motor competence more accurately (compared to less accurately) show more future physical activity. This effect is strong for underweight and overweight/obese children, but weak for normal-weight children. This study indicates that an accurate self-perception of motor competence fosters future physical activity beyond single main effects. Hence, the promotion of actual motor competence should be linked with the development of accurate self-knowledge.

  8. Recent and projected future climatic suitability of North America for the Asian tiger mosquito Aedes albopictus.

    PubMed

    Ogden, Nicholas H; Milka, Radojević; Caminade, Cyril; Gachon, Philippe

    2014-12-02

    Since the 1980s, populations of the Asian tiger mosquito Aedes albopictus have become established in the south-eastern, eastern and central United States, extending to approximately 40°N. Ae. albopictus is a vector of a wide range of human pathogens including dengue and chikungunya viruses, which are currently emerging in the Caribbean and Central America and posing a threat to North America. The risk of Ae. albopictus expanding its geographic range in North America under current and future climate was assessed using three climatic indicators of Ae. albopictus survival: overwintering conditions (OW), OW combined with annual air temperature (OWAT), and a linear index of precipitation and air temperature suitability expressed through a sigmoidal function (SIG). The capacity of these indicators to predict Ae. albopictus occurrence was evaluated using surveillance data from the United States. Projected future climatic suitability for Ae. albopictus was obtained using output of nine Regional Climate Model experiments (RCMs). OW and OWAT showed >90% specificity and sensitivity in predicting observed Ae. albopictus occurrence and also predicted moderate to high risk of Ae. albopictus invasion in Pacific coastal areas of the United States and Canada under current climate. SIG also predicted observed Ae. albopictus occurrence well (ROC area under the curve was 0.92) but predicted wider current climatic suitability in the north-central and north-eastern United States and south-eastern Canada. RCM output projected modest (circa 500 km) future northward range expansion of Ae. albopictus by the 2050s when using OW and OWAT indicators, but greater (600-1000 km) range expansion, particularly in eastern and central Canada, when using the SIG indicator. Variation in future possible distributions of Ae. albopictus was greater amongst the climatic indicators used than amongst the RCM experiments. Current Ae. 
albopictus distributions were well predicted by simple climatic indicators and northward range expansion was predicted for the future with climate change. However, current and future predicted geographic distributions of Ae. albopictus varied amongst the climatic indicators used. Further field studies are needed to assess which climatic indicator is the most accurate in predicting regions suitable for Ae. albopictus survival in North America.
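
    The SIG indicator described above combines temperature and precipitation suitability through a sigmoidal function. A generic sketch of such an index is shown below; the functional form, midpoints, and steepness parameters are purely illustrative assumptions, not values from the study:

```python
import math

def sigmoid_suitability(x, midpoint, steepness):
    """Logistic response mapping an environmental variable to a 0-1
    suitability score, centred at `midpoint` with slope `steepness`."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))

def combined_index(temp, precip,
                   temp_mid=10.0, temp_k=0.5,
                   precip_mid=500.0, precip_k=0.01):
    """Hypothetical combined index: product of sigmoidal responses to annual
    mean temperature (°C) and precipitation (mm). All parameter values here
    are illustrative, not those used for Ae. albopictus."""
    return (sigmoid_suitability(temp, temp_mid, temp_k)
            * sigmoid_suitability(precip, precip_mid, precip_k))
```

    Indices of this shape grade suitability smoothly between 0 and 1 rather than applying a hard threshold, which is one reason a SIG-style indicator can map broader suitable areas than binary overwintering criteria.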

  9. The Integration of Geographical Information System and Remotely Sensed Data to Track and Predict the Migration Path of the Africanized Honey Bee

    NASA Technical Reports Server (NTRS)

    Ward, Charles; Bravo, Jessica; De Luna, Rosalia; Lopez, Gerardo; Pichardo, Itza; Trejo, Danny; Vargas, Gabriel

    1997-01-01

    One of the research groups at the Pan American Center for Earth and Environmental Studies (PACES) is researching the northward migration path of Africanized honey bees, often referred to in the popular press as "killer bees". The goal of the Killer Bee Research Group (KBRG) is to set up a database in the form of a geographical information system, which will be used to track and predict the bees' future migration path. Included in this paper is background information on geographical information systems, the SPANS Explorer software package that was used to implement the database, and Advanced Very High Resolution Radiometer data, along with how each of these is being incorporated in the research. With an accurate means of predicting future migration paths, the negative effects of the Africanized honey bees may be reduced.

  10. Artificial neural networks in gynaecological diseases: current and potential future applications.

    PubMed

    Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios

    2010-10-01

    Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.

  11. Quantum Electrodynamical Shifts in Multivalent Heavy Ions.

    PubMed

    Tupitsyn, I I; Kozlov, M G; Safronova, M S; Shabaev, V M; Dzuba, V A

    2016-12-16

    The quantum electrodynamics (QED) corrections are directly incorporated into the most accurate treatment of the correlation corrections for ions with complex electronic structure of interest to metrology and tests of fundamental physics. We compared the performance of four different QED potentials for various systems to assess the accuracy of QED calculations and to make predictions of highly charged ion properties urgently needed for planning future experiments. We find that all four potentials give consistent and reliable results for the ions of interest. For the strongly bound electrons, the nonlocal potentials are more accurate than the local potential.

  12. Combining Satellite Measurements and Numerical Flood Prediction Models to Save Lives and Property from Flooding

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Garambois, P. A.; Biancamaria, S.

    2017-12-01

    Floods are considered among the major natural threats to human societies across all continents. The consequences of floods in highly populated areas are dramatic, with losses of human lives and substantial property damage. This risk is projected to increase with the effects of climate change, particularly sea-level rise, increasing storm frequencies and intensities, and growing population and economic assets in urban watersheds. Despite advances in computational resources and modeling techniques, significant gaps exist in predicting complex processes and accurately representing the initial state of the system. Improving flood prediction models and data assimilation chains with satellite data has become an absolute priority for producing accurate flood forecasts with sufficient lead times. The overarching goal of this work is to assess the benefits of Surface Water and Ocean Topography (SWOT) satellite data from a flood prediction perspective. The near-real-time methodology is based on combining satellite data from a simulator that mimics future SWOT data, numerical models, high resolution elevation data, and real-time local measurements in the New York/New Jersey area.

  13. A retrospective view of the quality of the fauna component of the Olympic Dam Project Environmental Impact Statement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, J.L.

    1994-06-01

    The merits of the fauna section of the Olympic Dam Project Environmental Impact Statement (EIS) are discussed. The values of different survey methods and monitoring organisms used in this document are evaluated following 10 years of fauna monitoring and research subsequent to the preparation of the EIS. The pilot fauna reconnaissance was found to be of little value, although the associated literature review formed an integral part of the EIS. Over 95% of all amphibian, reptile and bird species recorded at Olympic Dam were confirmed or predicted to occur in the EIS. Mammal predictions were less accurate because of the sparse populations and irruptive nature of several arid-zone species. Prediction and monitoring of rare species were demonstrably difficult. The Olympic Dam Project EIS was found in general to be an accurate and useful document. However, it is suggested that more emphasis be placed on establishing monitoring programmes for future EISs, particularly for invertebrates. 35 refs., 1 fig., 3 tabs.

  14. Identifying Future Drinkers: Behavioral Analysis of Monkeys Initiating Drinking to Intoxication is Predictive of Future Drinking Classification.

    PubMed

    Baker, Erich J; Walter, Nicole A R; Salo, Alex; Rivas Perea, Pablo; Moore, Sharon; Gonzales, Steven; Grant, Kathleen A

    2017-03-01

    The Monkey Alcohol Tissue Research Resource (MATRR) is a repository and analytics platform for detailed data derived from well-documented nonhuman primate (NHP) alcohol self-administration studies. This macaque model has demonstrated categorical drinking norms reflective of human drinking populations, resulting in consumption pattern classifications of very heavy drinking (VHD), heavy drinking (HD), binge drinking (BD), and low drinking (LD) individuals. Here, we expand on previous findings that suggest ethanol drinking patterns during initial drinking to intoxication can reliably predict future drinking category assignment. The classification strategy uses a machine-learning approach to examine an extensive set of daily drinking attributes during 90 sessions of induction across 7 cohorts of 5 to 8 monkeys for a total of 50 animals. A Random Forest classifier is employed to accurately predict categorical drinking after 12 months of self-administration. Predictive outcome accuracy is approximately 78% when classes are aggregated into 2 groups, "LD and BD" and "HD and VHD." A subsequent 2-step classification model distinguishes individual LD and BD categories with 90% accuracy and between HD and VHD categories with 95% accuracy. Average 4-category classification accuracy is 74%, and provides putative distinguishing behavioral characteristics between groupings. We demonstrate that data derived from the induction phase of this ethanol self-administration protocol have significant predictive power for future ethanol consumption patterns. Importantly, numerous predictive factors are longitudinal, measuring the change of drinking patterns through 3 stages of induction. Factors during induction that predict future heavy drinkers include being younger at the time of first intoxication and developing a shorter latency to first ethanol drink. 
Overall, this analysis identifies predictive characteristics in future very heavy drinkers that optimize intoxication, such as having increasingly fewer bouts with more drinks. This analysis also identifies characteristic avoidance of intoxicating topographies in future low drinkers, such as increasing number of bouts and waiting longer before the first ethanol drink. Copyright © 2017 The Authors Alcoholism: Clinical & Experimental Research published by Wiley Periodicals, Inc. on behalf of Research Society on Alcoholism.
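
    The two-group classification idea above can be sketched in a few lines. This is a minimal stand-in for the paper's Random Forest: bagged decision stumps voting on synthetic induction-phase features (the feature names, values, and separation are illustrative assumptions, not MATRR data).

```python
import random

random.seed(0)

def make_animal(heavy):
    # Two illustrative induction-phase features: latency to first ethanol
    # drink (min) and age at first intoxication (years). Future heavy drinkers
    # are simulated with lower values of both, mimicking the reported pattern.
    latency = random.gauss(5.0 if heavy else 12.0, 2.0)
    age = random.gauss(5.0 if heavy else 6.5, 0.6)
    return (latency, age), int(heavy)   # label 1 = "HD and VHD", 0 = "LD and BD"

data = [make_animal(i % 2 == 0) for i in range(200)]
train, test = data[:150], data[150:]

def fit_stump(rows):
    # Exhaustively pick the single-feature threshold rule with best accuracy.
    best = (0.0, 0, 0.0, 1)             # (accuracy, feature, threshold, polarity)
    for f in (0, 1):
        for (x, _y) in rows:
            t = x[f]
            for pol in (1, -1):
                acc = sum((1 if pol * (r[0][f] - t) <= 0 else 0) == r[1]
                          for r in rows) / len(rows)
                if acc > best[0]:
                    best = (acc, f, t, pol)
    return best[1:]

def stump_predict(stump, x):
    f, t, pol = stump
    return 1 if pol * (x[f] - t) <= 0 else 0

def fit_forest(rows, n_trees=25):
    # Bagging: each stump sees a bootstrap resample of the training animals.
    return [fit_stump([random.choice(rows) for _ in rows]) for _ in range(n_trees)]

def forest_predict(forest, x):
    # Majority vote across the ensemble.
    return int(sum(stump_predict(s, x) for s in forest) * 2 >= len(forest))

forest = fit_forest(train)
accuracy = sum(forest_predict(forest, x) == y for x, y in test) / len(test)
print(f"held-out two-group accuracy: {accuracy:.2f}")
```

    A real replication would use the full set of daily drinking attributes and deeper trees, but the bootstrap-plus-vote structure is the core of the approach.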

  15. Sex-specific lean body mass predictive equations are accurate in the obese paediatric population

    PubMed Central

    Jackson, Lanier B.; Henshaw, Melissa H.; Carter, Janet; Chowdhury, Shahryar M.

    2015-01-01

    Background The clinical assessment of lean body mass (LBM) is challenging in obese children. A sex-specific predictive equation for LBM derived from anthropometric data was recently validated in children. Aim The purpose of this study was to independently validate these predictive equations in the obese paediatric population. Subjects and methods Obese subjects aged 4–21 were analysed retrospectively. Predicted LBM (LBMp) was calculated using equations previously developed in children. Measured LBM (LBMm) was derived from dual-energy x-ray absorptiometry. Agreement was expressed as [(LBMm-LBMp)/LBMm] with 95% limits of agreement. Results Of 310 enrolled patients, 195 (63%) were females. The mean age was 11.8 ± 3.4 years and mean BMI Z-score was 2.3 ± 0.4. The average difference between LBMm and LBMp was −0.6% (−17.0%, 15.8%). Pearson’s correlation revealed a strong linear relationship between LBMm and LBMp (r=0.97, p<0.01). Conclusion This study validates the use of these clinically-derived sex-specific LBM predictive equations in the obese paediatric population. Future studies should use these equations to improve the ability to accurately classify LBM in obese children. PMID:26287383
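
    The agreement statistic reported above, the relative difference [(LBMm-LBMp)/LBMm] with 95% limits of agreement, can be computed as follows. The data here are synthetic and the 5% error level is an assumption for illustration only.

```python
import random
import statistics

random.seed(1)

# Hypothetical measured LBM (kg) and predicted LBM with small multiplicative error.
lbm_measured = [random.uniform(20.0, 60.0) for _ in range(300)]
lbm_predicted = [m * (1 + random.gauss(0, 0.05)) for m in lbm_measured]

# Relative difference per subject, then mean bias with 95% limits of agreement
# (mean ± 1.96 SD, Bland-Altman style).
rel_diff = [(m - p) / m for m, p in zip(lbm_measured, lbm_predicted)]
bias = statistics.mean(rel_diff)
sd = statistics.stdev(rel_diff)
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias: {100*bias:+.1f}%  limits of agreement: ({100*lower:+.1f}%, {100*upper:+.1f}%)")
```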

  16. Forecasting municipal solid waste generation using artificial intelligence modelling approaches.

    PubMed

    Abbasi, Maryam; El Hanandeh, Ali

    2016-10-01

    Municipal solid waste (MSW) management is a major concern to local governments to protect human health, the environment and to preserve natural resources. The design and operation of an effective MSW management system requires accurate estimation of future waste generation quantities. The main objective of this study was to develop a model for accurate forecasting of MSW generation that helps waste related organizations to better design and operate effective MSW management systems. Four intelligent system algorithms including support vector machine (SVM), adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN) and k-nearest neighbours (kNN) were tested for their ability to predict monthly waste generation in the Logan City Council region in Queensland, Australia. Results showed artificial intelligence models have good prediction performance and could be successfully applied to establish municipal solid waste forecasting models. Using machine learning algorithms can reliably predict monthly MSW generation by training with waste generation time series. In addition, results suggest that ANFIS system produced the most accurate forecasts of the peaks while kNN was successful in predicting the monthly averages of waste quantities. Based on the results, the total annual MSW generated in Logan City will reach 9.4×10(7)kg by 2020 while the peak monthly waste will reach 9.37×10(6)kg. Copyright © 2016 Elsevier Ltd. All rights reserved.
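
    The kNN forecaster mentioned above can be sketched with the standard lag-pattern formulation: predict next month's generation from the k historical months whose preceding 12-month pattern is most similar to the current one. The seasonal series below is synthetic, not Logan City data.

```python
import math
import random

random.seed(2)

# Synthetic monthly MSW series (tonnes): trend + annual seasonality + noise.
series = [8000 + 10 * t + 600 * math.sin(2 * math.pi * t / 12) + random.gauss(0, 50)
          for t in range(120)]

LAGS, K = 12, 5

def knn_forecast(history, lags=LAGS, k=K):
    query = history[-lags:]
    # Candidate patterns: every historical window of `lags` months,
    # paired with the value that followed it.
    candidates = [(history[i:i + lags], history[i + lags])
                  for i in range(len(history) - lags)]
    # Rank windows by squared distance to the current pattern; average the
    # follow-on values of the k nearest.
    candidates.sort(key=lambda c: sum((a - b) ** 2 for a, b in zip(c[0], query)))
    return sum(v for _, v in candidates[:k]) / k

# One-step-ahead forecasts over the final year.
errors = [abs(knn_forecast(series[:t]) - series[t]) for t in range(108, 120)]
mae = sum(errors) / len(errors)
print(f"MAE over final year: {mae:.0f} tonnes")
```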

  17. Status of Cycle 23 Forecasts

    NASA Technical Reports Server (NTRS)

    Hathaway, D. H.

    2000-01-01

    A number of techniques for predicting solar activity on a solar cycle time scale are identified, described, and tested with historical data. Some techniques, e.g., regression and curve-fitting, work well as solar activity approaches maximum and provide a month-by-month description of future activity, while others, e.g., geomagnetic precursors, work well near solar minimum but provide an estimate only of the amplitude of the cycle. A synthesis of different techniques is shown to provide a more accurate and useful forecast of solar cycle activity levels. A combination of two uncorrelated geomagnetic precursor techniques provides the most accurate prediction for the amplitude of a solar activity cycle at a time well before activity minimum. This precursor method gave a smoothed sunspot number maximum of 154±21 for cycle 23. A mathematical function dependent upon the time of cycle initiation and the cycle amplitude then describes the level of solar activity for the complete cycle. As the time of cycle maximum approaches, a better estimate of the cycle activity is obtained by including the fit between recent activity levels and this function. This Combined Solar Cycle Activity Forecast now gives a smoothed sunspot maximum of 140±20 for cycle 23. The success of the geomagnetic precursors in predicting future solar activity suggests that solar magnetic phenomena at latitudes above the sunspot activity belts are linked to solar activity, which occurs many years later in the lower latitudes.
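
    The curve-fitting step described above can be sketched as follows: a smooth cycle-shape function, parameterized by start time and amplitude, is fit to early-cycle activity and then used to project the rest of the cycle. The shape function and all numbers below are illustrative assumptions, not Hathaway's published fit.

```python
import math

def cycle_shape(t_months, amplitude, t0=0.0, b=60.0, c=0.8):
    # Rises from cycle start at t0, peaks after a few years, then decays slowly.
    if t_months <= t0:
        return 0.0
    x = (t_months - t0) / b
    return amplitude * x ** 3 / (math.exp(x ** 2) - c)

# Synthetic "observed" smoothed sunspot numbers for the first 30 months.
observed = [cycle_shape(t, 300.0) for t in range(30)]

# Fit the amplitude by one-dimensional least squares over a coarse grid.
best_amp, best_sse = None, float("inf")
for amp in range(100, 501, 5):
    sse = sum((cycle_shape(t, amp) - y) ** 2 for t, y in zip(range(30), observed))
    if sse < best_sse:
        best_amp, best_sse = amp, sse

# Project the whole cycle from the fitted function and report its maximum.
projected_max = max(cycle_shape(t, best_amp) for t in range(200))
print(f"fitted amplitude: {best_amp}; projected smoothed sunspot maximum: {projected_max:.0f}")
```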

  18. Exploring Temporal Frameworks for Constructing Longitudinal Instance-Specific Models from Clinical Data

    ERIC Educational Resources Information Center

    Watt, Emily

    2012-01-01

    The prevalence of the EMR in biomedical research is growing, with the EMR regarded as a source of contextually rich, longitudinal data for computation and statistical/trend analysis. However, models trained with data abstracted from the EMR often (1) do not capture all features needed to accurately predict the patient's future state and to…

  19. Computational optimization and biological evolution.

    PubMed

    Goryanin, Igor

    2010-10-01

    Modelling and optimization principles have become key concepts in many biological areas, especially in biochemistry. Definitions of objective function, fitness and co-evolution, although they differ between biology and mathematics, are similar in a general sense. Although successful in fitting models to experimental data and in making some biochemical predictions, optimization and evolutionary computations should be developed further to make more accurate real-life predictions and to deal not only with one organism in isolation, but also with communities of symbiotic and competing organisms. One of the future goals will be to explain and predict evolution not only for organisms in shake flasks or fermenters, but for real competitive multispecies environments.

  20. Transcriptomics in cancer diagnostics: developments in technology, clinical research and commercialization.

    PubMed

    Sager, Monica; Yeat, Nai Chien; Pajaro-Van der Stadt, Stefan; Lin, Charlotte; Ren, Qiuyin; Lin, Jimmy

    2015-01-01

    Transcriptomic technologies are evolving to diagnose cancer earlier and more accurately to provide greater predictive and prognostic utility to oncologists and patients. Digital techniques such as RNA sequencing are replacing still-imaging techniques to provide more detailed analysis of the transcriptome and aberrant expression that causes oncogenesis, while companion diagnostics are developing to determine the likely effectiveness of targeted treatments. This article examines recent advancements in molecular profiling research and technology as applied to cancer diagnosis, clinical applications and predictions for the future of personalized medicine in oncology.

  1. Development of a HEC-RAS temperature model for the North Santiam River, northwestern Oregon

    USGS Publications Warehouse

    Stonewall, Adam J.; Buccola, Norman L.

    2015-01-01

    Much of the error in temperature predictions resulted from the model’s inability to accurately simulate the full range of diurnal fluctuations during the warmest months. Future iterations of the model could be improved by the collection and inclusion of additional streamflow and temperature data, especially near the mouth of the South Santiam River. Presently, the model is able to predict hourly and daily water temperatures under a wide variety of conditions with a typical error of 0.8 and 0.7 °C, respectively.

  2. Emerging Tools for Synthetic Genome Design

    PubMed Central

    Lee, Bo-Rahm; Cho, Suhyung; Song, Yoseb; Kim, Sun Chang; Cho, Byung-Kwan

    2013-01-01

    Synthetic biology is an emerging discipline for designing and synthesizing predictable, measurable, controllable, and transformable biological systems. These newly designed biological systems have great potential for the development of cheaper drugs, green fuels, biodegradable plastics, and targeted cancer therapies over the coming years. Fortunately, our ability to quickly and accurately engineer biological systems that behave predictably has been dramatically expanded by significant advances in DNA-sequencing, DNA-synthesis, and DNA-editing technologies. Here, we review emerging technologies and methodologies in the field of building designed biological systems, and we discuss their future perspectives. PMID:23708771

  3. Towards Bridging the Gaps in Holistic Transition Prediction via Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan M.; Li, Fei; Duan, Lian; Chang, Chau-Lyan; Carpenter, Mark H.; Streett, Craig L.; Malik, Mujeeb R.

    2013-01-01

    The economic and environmental benefits of laminar flow technology via reduced fuel burn of subsonic and supersonic aircraft cannot be realized without minimizing the uncertainty in drag prediction in general and transition prediction in particular. Transition research under NASA's Aeronautical Sciences Project seeks to develop a validated set of variable fidelity prediction tools with known strengths and limitations, so as to enable "sufficiently" accurate transition prediction and practical transition control for future vehicle concepts. This paper provides a summary of selected research activities targeting the current gaps in high-fidelity transition prediction, specifically those related to the receptivity and laminar breakdown phases of crossflow induced transition in a subsonic swept-wing boundary layer. The results of direct numerical simulations are used to obtain an enhanced understanding of the laminar breakdown region as well as to validate reduced order prediction methods.

  4. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. 
We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
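
    The Bayesian workflow described above can be sketched with a stdlib toy: random-walk Metropolis sampling of a posterior over one uncertain parameter (a stand-in for basal friction), followed by propagation of the posterior samples through a predictive model to get a distribution over the QoI. The forward model, predictive model, and all numbers are illustrative assumptions, not the paper's ice-sheet models.

```python
import math
import random
import statistics

random.seed(3)

TRUE_FRICTION = 2.0
# Synthetic "velocity" observations; the toy forward model is v = 1.5 * friction.
obs = [1.5 * TRUE_FRICTION + random.gauss(0, 0.1) for _ in range(20)]

def log_posterior(friction):
    if friction <= 0:
        return -math.inf              # flat prior restricted to positive friction
    # Gaussian likelihood around the forward-model prediction.
    return -sum((y - 1.5 * friction) ** 2 for y in obs) / (2 * 0.1 ** 2)

# Random-walk Metropolis sampling of the posterior.
samples, x, lp = [], 1.0, log_posterior(1.0)
for _ in range(5000):
    prop = x + random.gauss(0, 0.05)
    lp_prop = log_posterior(prop)
    if math.log(random.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    samples.append(x)
posterior = samples[1000:]            # discard burn-in

# Propagate parameter uncertainty through a toy predictive model for the QoI:
# future grounded-ice volume (arbitrary units) shrinks as basal friction drops.
qoi = [100.0 - 20.0 / f for f in posterior]
print(f"posterior mean friction: {statistics.mean(posterior):.2f}")
print(f"predicted ice volume: {statistics.mean(qoi):.1f} ± {statistics.stdev(qoi):.1f}")
```

    The paper's contribution is doing this with expensive forward models and an online approximation of the QoI density; the sample-then-propagate structure is the same.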

  5. A Multiobjective Interval Programming Model for Wind-Hydrothermal Power System Dispatching Using 2-Step Optimization Algorithm

    PubMed Central

    Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop various reasonable plans to schedule power generation efficiently. But future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear; therefore, achieving an accurate solution to such a complex problem is very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve multiobjective dispatching. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for feasible, preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663

  6. A multiobjective interval programming model for wind-hydrothermal power system dispatching using 2-step optimization algorithm.

    PubMed

    Ren, Kun; Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop various reasonable plans to schedule power generation efficiently. But future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear; therefore, achieving an accurate solution to such a complex problem is very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve multiobjective dispatching. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for feasible, preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision.
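
    The 2-step idea can be sketched as follows: uncertain future data are held as intervals, a preliminary solution is taken from the interval midpoints, and simulated annealing then refines the dispatch against the objective. The cost function, intervals, and penalty are illustrative assumptions, not the paper's model.

```python
import math
import random

random.seed(4)

load_interval = (90.0, 110.0)          # forecast power load, MW
wind_interval = (20.0, 40.0)           # forecast wind output, MW
load = sum(load_interval) / 2          # step 1: midpoint representative values
wind = sum(wind_interval) / 2

def cost(hydro, thermal):
    # Quadratic generation costs plus a heavy penalty on supply-demand mismatch.
    mismatch = hydro + thermal + wind - load
    return 0.02 * hydro ** 2 + 0.05 * thermal ** 2 + 100.0 * mismatch ** 2

# Step 2: simulated annealing from the preliminary (equal-split) solution.
state = [(load - wind) / 2, (load - wind) / 2]
best, best_cost = list(state), cost(*state)
temp = 10.0
for _ in range(20000):
    cand = [state[0] + random.gauss(0, 0.5), state[1] + random.gauss(0, 0.5)]
    delta = cost(*cand) - cost(*state)
    # Accept improvements always, worsenings with Boltzmann probability.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        state = cand
        if cost(*state) < best_cost:
            best, best_cost = list(state), cost(*state)
    temp *= 0.9995                     # geometric cooling schedule

print(f"hydro={best[0]:.1f} MW  thermal={best[1]:.1f} MW  cost={best_cost:.1f}")
```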

  7. Computational modeling of human oral bioavailability: what will be next?

    PubMed

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability has opened new avenues for developing promising tools for oral bioavailability prediction.

  8. Player's success prediction in rugby union: From youth performance to senior level placing.

    PubMed

    Fontana, Federico Y; Colosio, Alessandro L; Da Lozzo, Giorgio; Pogliaghi, Silvia

    2017-04-01

    The study questioned if and to what extent specific anthropometric and functional characteristics measured in youth draft camps can accurately predict subsequent career progression in rugby union. Original research. Anthropometric and functional characteristics of 531 male players (U16) were retrospectively analysed in relation to senior level team representation at age 21-24. Players were classified as International (Int: National team and international clubs) or National (Nat: 1st, 2nd and other divisions and dropout). Multivariate analysis of variance (one-way MANOVA) tested differences between Int and Nat along a combination of anthropometric (body mass, height, body fat, fat-free mass) and functional variables (SJ, CMJ, t15m, t30m, VO2max). A discriminant function (DF) was determined to predict group assignment based on the linear combination of variables that best discriminates the groups. Correct level assignment was expressed as % hit rate. A combination of anthropometric and functional characteristics reflects future level assignment (Int vs. Nat). Players' success can be accurately predicted (hit rate = 81% and 77% for Int and Nat, respectively) by a DF that combines anthropometric and functional variables as measured at ∼15 years of age, with percent body fat and speed being the most influential predictors of group stratification. Within a group of 15-year-olds with exceptional physical characteristics, future players' success can be predicted using a linear combination of anthropometric and functional variables, among which a lower percent body fat and higher speed over a 15-m sprint provide the most important predictors of the highest career success. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
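
    The discriminant-function step can be sketched with a two-class Fisher linear discriminant on two synthetic predictors (percent body fat and 15-m sprint time). The data are simulated to mimic the reported pattern (leaner, faster players progress further) and the group means, spreads, and hit rate are illustrative assumptions, not the study's data.

```python
import random

random.seed(5)

def player(intl):
    body_fat = random.gauss(12.0 if intl else 16.0, 2.0)   # percent body fat
    t15 = random.gauss(2.45 if intl else 2.60, 0.08)       # 15-m sprint time (s)
    return (body_fat, t15), int(intl)                      # 1 = International

data = [player(i % 2 == 0) for i in range(400)]
train, test = data[:300], data[300:]

def mean(rows):
    return [sum(x[i] for x, _ in rows) / len(rows) for i in (0, 1)]

intl = [(x, y) for x, y in train if y == 1]
natl = [(x, y) for x, y in train if y == 0]
m1, m0 = mean(intl), mean(natl)

# Pooled within-class covariance (2x2).
S = [[0.0, 0.0], [0.0, 0.0]]
for rows, m in ((intl, m1), (natl, m0)):
    for x, _ in rows:
        d = [x[0] - m[0], x[1] - m[1]]
        for i in (0, 1):
            for j in (0, 1):
                S[i][j] += d[i] * d[j]
S = [[v / (len(train) - 2) for v in row] for row in S]

# Fisher direction w = S^-1 (m1 - m0), via the closed-form 2x2 inverse,
# with the decision threshold at the midpoint of the projected class means.
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
diff = [m1[0] - m0[0], m1[1] - m0[1]]
w = [(S[1][1] * diff[0] - S[0][1] * diff[1]) / det,
     (-S[1][0] * diff[0] + S[0][0] * diff[1]) / det]
threshold = sum(w[i] * (m1[i] + m0[i]) / 2 for i in (0, 1))

def classify(x):
    return int(w[0] * x[0] + w[1] * x[1] > threshold)

hit_rate = sum(classify(x) == y for x, y in test) / len(test)
print(f"hit rate: {hit_rate:.0%}")
```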

  9. Modeling and Prediction of Fan Noise

    NASA Technical Reports Server (NTRS)

    Envia, Ed

    2008-01-01

    Fan noise is a significant contributor to the total noise signature of a modern high bypass ratio aircraft engine and with the advent of ultra high bypass ratio engines like the geared turbofan, it is likely to remain so in the future. As such, accurate modeling and prediction of the basic characteristics of fan noise are necessary ingredients in designing quieter aircraft engines in order to ensure compliance with ever more stringent aviation noise regulations. In this paper, results from a comprehensive study aimed at establishing the utility of current tools for modeling and predicting fan noise will be summarized. It should be emphasized that these tools exemplify present state of the practice and embody what is currently used at NASA and Industry for predicting fan noise. The ability of these tools to model and predict fan noise is assessed against a set of benchmark fan noise databases obtained for a range of representative fan cycles and operating conditions. Detailed comparisons between the predicted and measured narrowband spectral and directivity characteristics of fan noise will be presented in the full paper. General conclusions regarding the utility of current tools and recommendations for future improvements will also be given.

  10. ROI on yield data analysis systems through a business process management strategy

    NASA Astrophysics Data System (ADS)

    Rehani, Manu; Strader, Nathan; Hanson, Jeff

    2005-05-01

    The overriding motivation for yield engineering is profitability. This is achieved through application of yield management. The first application is to continually reduce waste in the form of yield loss. New products, new technologies and the dynamic state of the process and equipment keep introducing new ways to cause yield loss. In response, the yield management efforts have to continually come up with new solutions to minimize it. The second application of yield engineering is to aid in accurate product pricing. This is achieved through predicting future results of the yield engineering effort. The more accurate the yield prediction, the more accurate the wafer start volume, the more accurate the wafer pricing. Another aspect of yield prediction pertains to gauging the impact of a yield problem and predicting how long that will last. The ability to predict such impacts again feeds into wafer start calculations and wafer pricing. The question then is that if the stakes on yield management are so high why is it that most yield management efforts are run like science and engineering projects and less like manufacturing? In the eighties manufacturing put the theory of constraints into practice and put a premium on stability and predictability in manufacturing activities, why can't the same be done for yield management activities? This line of introspection led us to define and implement a business process to manage the yield engineering activities. We analyzed the best known methods (BKM) and deployed a workflow tool to make them the standard operating procedure (SOP) for yield management. We present a case study in deploying a Business Process Management solution for Semiconductor Yield Engineering in a high-mix ASIC environment. We will present a description of the situation prior to deployment, a window into the development process and a valuation of the benefits.

  11. Prediction of Exposure Level of Energetic Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, M. H. Y.; Blattnig, S.

    2016-12-01

    The potential for exposure to large solar particle events (SPEs) with fluxes that extend to high energies is a major concern during interplanetary transfer and extravehicular activities (EVAs) on the lunar and Martian surfaces. Prediction of sporadic occurrence of SPEs is not accurate for near or long-term scales, while the expected frequency of such events is strongly influenced by solar cycle activity. In the development of NASA's operational strategies real-time estimation of exposure to SPEs has been considered so that adequate responses can be applied in a timely manner to reduce exposures to well below the exposure limits. Previously, the organ doses of large historical SPEs had been calculated by using the complete energy spectra of each event and then developing a prediction model for blood-forming organ (BFO) dose based solely on an assumed value of integrated fluence above 30 MeV (Φ30) for an otherwise unspecified future SPE. While BFO dose is determined primarily by solar protons with high energies, it was reasoned that more accurate BFO dose prediction models could be developed using integrated fluence above 60 MeV (Φ60) and above 100 MeV (Φ100) as predictors instead of Φ30. In the current study, re-analysis of major SPEs (in which the proton spectra of the ground level enhancement [GLE] events since 1956 are correctly described by Band functions) has been used in evaluation of exposure levels. More accurate prediction models for BFO dose and NASA effective dose are then developed using integrated fluence above 200 MeV (Φ200), which by far have the most weight in the calculation of doses for deep-seated organs from exposure to extreme SPEs (GLEs or sub-GLEs). The unconditional probability of a BFO dose exceeding a pre-specified BFO dose limit is simultaneously calculated by taking into account the distribution of the predictor (Φ30, Φ60, Φ100, or Φ200) as estimated from historical SPEs. 
These results can be applied to the development of approaches to improve radiation protection of astronauts and the optimization of mission planning for future space missions.
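
    The unconditional exceedance calculation described above can be sketched by Monte Carlo: sample the predictor fluence (here Φ200) from a distribution standing in for the historical SPE record, map it to BFO dose through a regression-style model, and count how often a dose limit is exceeded. The lognormal parameters, power law, and limit below are all illustrative assumptions, not the study's fitted values.

```python
import math
import random

random.seed(6)

def sample_phi200():
    # Hypothetical lognormal spread of event Φ200 fluences (protons/cm^2),
    # loosely standing in for a fit to historical SPEs.
    return math.exp(random.gauss(math.log(1e7), 1.2))

def bfo_dose_mGy(phi200):
    # Hypothetical power-law regression from Φ200 to blood-forming-organ dose.
    return 2.0e-6 * phi200 ** 0.9

LIMIT_mGy = 50.0
N = 100_000
exceed = sum(bfo_dose_mGy(sample_phi200()) > LIMIT_mGy for _ in range(N))
p_exceed = exceed / N
print(f"P(BFO dose > {LIMIT_mGy:.0f} mGy) ≈ {p_exceed:.4f}")
```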

  12. Universality of quantum gravity corrections.

    PubMed

    Das, Saurya; Vagenas, Elias C

    2008-11-28

    We show that the existence of a minimum measurable length and the related generalized uncertainty principle (GUP), predicted by theories of quantum gravity, influence all quantum Hamiltonians. Thus, they predict quantum gravity corrections to various quantum phenomena. We compute such corrections to the Lamb shift, the Landau levels, and the tunneling current in a scanning tunneling microscope. We show that these corrections can be interpreted in two ways: (a) either that they are exceedingly small, beyond the reach of current experiments, or (b) that they predict upper bounds on the quantum gravity parameter in the GUP, compatible with experiments at the electroweak scale. Thus, more accurate measurements in the future should either be able to test these predictions, or further tighten the above bounds and predict an intermediate length scale between the electroweak and the Planck scale.
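
    For reference, the generalized uncertainty principle at issue is often written (in one common convention; the paper's exact form may differ) as:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta\,(\Delta p)^2\right],
\qquad
\beta = \frac{\beta_0}{(M_{\mathrm{Pl}}\,c)^2},
```

    which implies a minimum measurable length $\Delta x_{\min} \sim \hbar\sqrt{\beta}$; the experimental comparisons above constrain the dimensionless parameter $\beta_0$.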

  13. Failure Pressure and Leak Rate of Steam Generator Tubes With Stress Corrosion Cracks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majumdar, S.; Kasza, K.; Park, J.Y.

    2002-07-01

    This paper illustrates the use of an 'equivalent rectangular crack' approach to predict leak rates through laboratory-generated stress corrosion cracks. A comparison between predicted and observed test data on rupture and leak rate from laboratory-generated stress corrosion cracks is provided. Specimen flaws were sized by post-test fractography in addition to a pre-test advanced eddy current technique. The test failure pressures and leak rates are shown to be closer to those predicted on the basis of fractography than on NDE. However, the predictions based on NDE results are encouraging, particularly because they have the potential to determine a more detailed geometry of ligamentous cracks from which more accurate predictions of failure pressure and leak rate can be made in the future. (authors)

  14. Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool

    NASA Technical Reports Server (NTRS)

    Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.

    2016-01-01

    Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.

  15. Influence of Photoperiod on Hormones, Behavior, and Immune Function

    PubMed Central

    Walton, James C.; Weil, Zachary M.; Nelson, Randy J.

    2011-01-01

    Photoperiodism is the ability of plants and animals to measure environmental day length to ascertain time of year. Central to the evolution of photoperiodism in animals is the adaptive distribution of energetically challenging activities across the year to optimize reproductive fitness while balancing the energetic tradeoffs necessary for seasonally-appropriate survival strategies. The ability to accurately predict future events requires endogenous mechanisms to permit physiological anticipation of annual conditions. Day length provides a virtually noise free environmental signal to monitor and accurately predict time of the year. In mammals, melatonin provides the hormonal signal transducing day length. Duration of pineal melatonin is inversely related to day length and its secretion drives enduring changes in many physiological systems, including the HPA, HPG, and brain-gut axes, the autonomic nervous system, and the immune system. Thus, melatonin is the fulcrum mediating redistribution of energetic investment among physiological processes to maximize fitness and survival. PMID:21156187

  16. Selecting At-Risk Readers in First Grade for Early Intervention: A Two-Year Longitudinal Study of Decision Rules and Procedures

    ERIC Educational Resources Information Center

    Compton, Donald L.; Fuchs, Douglas; Fuchs, Lynn S.; Bryant, Joan D.

    2006-01-01

    Response to intervention (RTI) models for identifying learning disabilities rely on the accurate identification of children who, without Tier 2 tutoring, would develop reading disability (RD). This study examined 2 questions concerning the use of 1st-grade data to predict future RD: (1) Does adding initial word identification fluency (WIF) and 5…

  17. Development and Testing of a Coupled Ocean-atmosphere Mesoscale Ensemble Prediction System

    DTIC Science & Technology

    2011-06-28

    wind, temperature, and moisture variables, while the oceanographic ET is derived from ocean current, temperature, and salinity variables. Estimates of...uncertainty in the model. Rigorously accurate ensemble methods for describing the distribution of future states given past information include particle

  18. A crucial step toward realism: responses to climate change from an evolving metacommunity perspective.

    PubMed

    Urban, Mark C; De Meester, Luc; Vellend, Mark; Stoks, Robby; Vanoverbeke, Joost

    2012-02-01

    We need to understand joint ecological and evolutionary responses to climate change to predict future threats to biological diversity. The 'evolving metacommunity' framework emphasizes that interactions between ecological and evolutionary mechanisms at both local and regional scales will drive community dynamics during climate change. Theory suggests that ecological and evolutionary dynamics often interact to produce outcomes different from those predicted based on either mechanism alone. We highlight two of these dynamics: (i) species interactions prevent adaptation of nonresident species to new niches and (ii) resident species adapt to changing climates and thereby prevent colonization by nonresident species. The rate of environmental change, level of genetic variation, source-sink structure, and dispersal rates mediate between these potential outcomes. Future models should evaluate multiple species, species interactions other than competition, and multiple traits. Future experiments should manipulate factors such as genetic variation and dispersal to determine their joint effects on responses to climate change. Currently, we know much more about how climates will change across the globe than about how species will respond to these changes despite the profound effects these changes will have on global biological diversity. Integrating evolving metacommunity perspectives into climate change biology should produce more accurate predictions about future changes to species distributions and extinction threats.

  19. A crucial step toward realism: responses to climate change from an evolving metacommunity perspective

    PubMed Central

    Urban, Mark C; De Meester, Luc; Vellend, Mark; Stoks, Robby; Vanoverbeke, Joost

    2012-01-01

    We need to understand joint ecological and evolutionary responses to climate change to predict future threats to biological diversity. The ‘evolving metacommunity’ framework emphasizes that interactions between ecological and evolutionary mechanisms at both local and regional scales will drive community dynamics during climate change. Theory suggests that ecological and evolutionary dynamics often interact to produce outcomes different from those predicted based on either mechanism alone. We highlight two of these dynamics: (i) species interactions prevent adaptation of nonresident species to new niches and (ii) resident species adapt to changing climates and thereby prevent colonization by nonresident species. The rate of environmental change, level of genetic variation, source-sink structure, and dispersal rates mediate between these potential outcomes. Future models should evaluate multiple species, species interactions other than competition, and multiple traits. Future experiments should manipulate factors such as genetic variation and dispersal to determine their joint effects on responses to climate change. Currently, we know much more about how climates will change across the globe than about how species will respond to these changes despite the profound effects these changes will have on global biological diversity. Integrating evolving metacommunity perspectives into climate change biology should produce more accurate predictions about future changes to species distributions and extinction threats. PMID:25568038

  20. [How exactly can we predict the prognosis of COPD].

    PubMed

    Atiş, Sibel; Kanik, Arzu; Ozgür, Eylem Sercan; Eker, Suzan; Tümkaya, Münir; Ozge, Cengiz

    2009-01-01

    Predictive models play a pivotal role in the provision of accurate and useful probabilistic assessments of clinical outcomes in chronic diseases. This study aimed to develop a dedicated prognostic index for quantifying progression risk in chronic obstructive pulmonary disease (COPD). Data were collected prospectively from 75 COPD patients over a three-year period. A predictive model of COPD progression risk was developed using Bayesian logistic regression analysis with the Markov chain Monte Carlo method. One-year cycles were used for disease progression in this model. Primary end points for progression were impairment in basal dyspnea index (BDI) score, FEV(1) decline, and exacerbation frequency in the last three years. The time-varying covariates age, smoking, body mass index (BMI), severity of disease according to GOLD, PaO2, PaCO(2), IC, RV/TLC, and DLCO were used in the study. The mean age was 57.1 ± 8.1 years. BDI was strongly correlated with exacerbation frequency (p= 0.001) but not with FEV(1) decline. BMI was found to be a predictive factor for impairment in BDI (p= 0.03). The following independent risk factors were significant in predicting exacerbation frequency: GOLD staging (OR for GOLD I vs. II and III = 2.3 and 4.0), hypoxemia (OR for mild vs. moderate and severe = 2.1 and 5.1) and hyperinflation (OR= 1.6). PaO2 (p= 0.026), IC (p= 0.02) and RV/TLC (p= 0.03) were found to be predictive factors for FEV(1) decline. The model estimated BDI, lung function and exacerbation frequency at the last time point by testing the initial three years of data with 95% reliability (p< 0.001). Accordingly, this model was judged 95% reliable for assessing the future status of COPD patients. Using Bayesian predictive models, it was possible to develop a risk-stratification index that accurately predicted progression of COPD. This model can support decision-making about the future of COPD patients with high reliability based on clinical data collected at baseline.
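
    The odds ratios reported in this record can be translated into predicted probabilities with the standard odds-to-probability conversion. The sketch below is illustrative only: the baseline odds value and the function name are assumptions, not figures from the study's fitted model.

```python
# Illustrative only: translating an odds ratio (OR) from a logistic model
# into a predicted probability. The baseline odds are an assumed value,
# not a figure from the study.

def prob_from_odds(odds):
    # probability p = odds / (1 + odds)
    return odds / (1.0 + odds)

baseline_odds = 0.5  # assumed odds of frequent exacerbation at GOLD stage I
for label, odds_ratio in [("GOLD II", 2.3), ("GOLD III", 4.0)]:
    p = prob_from_odds(baseline_odds * odds_ratio)
    print(label, round(p, 2))
```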

  1. Modification of the Feline-Ality™ Assessment and the Ability to Predict Adopted Cats’ Behaviors in Their New Homes

    PubMed Central

    Weiss, Emily; Gramann, Shannon; Drain, Natasha; Dolan, Emily; Slater, Margaret

    2015-01-01

    Simple Summary While millions of cats enter animal shelters every year, only 11.5% of pet cats are obtained from a shelter in the United States. Previous research has indicated that unrealistic expectations set by adopters can increase the chances of an adopted cat returning to the shelter. The ASPCA®’s Meet Your Match® Feline-ality™ adoption program was designed to provide adopters with accurate information about an adult cat’s future behavior in the home. This research explored the predictive ability of the modified Feline-ality™ assessment when performed one day after the cat entered the shelter. Our modified version was predictive of feline behavior post adoption. Abstract It is estimated that 2.5 million cats enter animal shelters in the United States every year and as few as 20% leave the shelter alive. Of those adopted, the greatest risk to the post-adoption human-animal bond is unrealistic expectations set by the adopter. The ASPCA®’s Meet Your Match® Feline-ality™ adoption program was developed to provide adopters with an accurate assessment of an adult cat’s future behavior in the home. However, the original Feline-ality™ required a three-day hold time to collect cat behaviors on a data card, which was challenging for some shelters. This research involved creating a survey to determine in-home feline behavior post adoption and explored the predictive ability of the in-shelter assessment without the data card. Our results show that the original Feline-ality™ assessment and our modified version were predictive of feline behavior post adoption. Our modified version also decreased hold time for cats to one day. Shelters interested in increasing cat adoptions, decreasing length of stay and improving the adoption experience can now implement the modified version for future feline adoption success. PMID:26479138

  2. Evidence base and future research directions in the management of low back pain.

    PubMed

    Abbott, Allan

    2016-03-18

    Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care which targets treatment to patient subgroups based on key characteristics is reliant upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care.

  3. Disease prevention versus data privacy: using landcover maps to inform spatial epidemic models.

    PubMed

    Tildesley, Michael J; Ryan, Sadie J

    2012-01-01

    The availability of epidemiological data in the early stages of an outbreak of an infectious disease is vital for modelers to make accurate predictions regarding the likely spread of disease and preferred intervention strategies. However, in some countries, the necessary demographic data are only available at an aggregate scale. We investigated the ability of models of livestock infectious diseases to predict epidemic spread and obtain optimal control policies in the event of imperfect, aggregated data. Taking a geographic information approach, we used land cover data to predict UK farm locations and investigated the influence of using these synthetic location data sets upon epidemiological predictions in the event of an outbreak of foot-and-mouth disease. When broadly classified land cover data were used to create synthetic farm locations, model predictions deviated significantly from those simulated on true data. However, when more resolved subclass land use data were used, moderate to highly accurate predictions of epidemic size, duration and optimal vaccination and ring culling strategies were obtained. This suggests that a geographic information approach may be useful where individual farm-level data are not available, to allow predictive analyses to be carried out regarding the likely spread of disease. This method can also be used for contingency planning in collaboration with policy makers to determine preferred control strategies in the event of a future outbreak of infectious disease in livestock.

  4. Disease Prevention versus Data Privacy: Using Landcover Maps to Inform Spatial Epidemic Models

    PubMed Central

    Tildesley, Michael J.; Ryan, Sadie J.

    2012-01-01

    The availability of epidemiological data in the early stages of an outbreak of an infectious disease is vital for modelers to make accurate predictions regarding the likely spread of disease and preferred intervention strategies. However, in some countries, the necessary demographic data are only available at an aggregate scale. We investigated the ability of models of livestock infectious diseases to predict epidemic spread and obtain optimal control policies in the event of imperfect, aggregated data. Taking a geographic information approach, we used land cover data to predict UK farm locations and investigated the influence of using these synthetic location data sets upon epidemiological predictions in the event of an outbreak of foot-and-mouth disease. When broadly classified land cover data were used to create synthetic farm locations, model predictions deviated significantly from those simulated on true data. However, when more resolved subclass land use data were used, moderate to highly accurate predictions of epidemic size, duration and optimal vaccination and ring culling strategies were obtained. This suggests that a geographic information approach may be useful where individual farm-level data are not available, to allow predictive analyses to be carried out regarding the likely spread of disease. This method can also be used for contingency planning in collaboration with policy makers to determine preferred control strategies in the event of a future outbreak of infectious disease in livestock. PMID:23133352

  5. Dispersal and extrapolation on the accuracy of temporal predictions from distribution models for the Darwin's frog.

    PubMed

    Uribe-Rivera, David E; Soto-Azat, Claudio; Valenzuela-Sánchez, Andrés; Bizama, Gustavo; Simonetti, Javier A; Pliscoff, Patricio

    2017-07-01

    Climate change is a major threat to biodiversity; the development of models that reliably predict its effects on species distributions is a priority for conservation biogeography. Two of the main issues for accurate temporal predictions from Species Distribution Models (SDM) are model extrapolation and unrealistic dispersal scenarios. We assessed the consequences of these issues on the accuracy of climate-driven SDM predictions for the dispersal-limited Darwin's frog Rhinoderma darwinii in South America. We calibrated models using historical data (1950-1975) and projected them across 40 yr to predict distribution under current climatic conditions, assessing predictive accuracy through the area under the ROC curve (AUC) and True Skill Statistics (TSS), contrasting binary model predictions against a temporally independent validation data set (i.e., current presences/absences). To assess the effects of incorporating dispersal processes we compared the predictive accuracy of dispersal-constrained models with SDMs lacking dispersal limits; and to assess the effects of model extrapolation, we compared predictive accuracy between extrapolated and non-extrapolated areas. The incorporation of dispersal processes enhanced predictive accuracy, mainly due to a decrease in the false presence rate of model predictions, which is consistent with discrimination of suitable but inaccessible habitat. This also had consequences for range size changes over time, which is the most widely used proxy for extinction risk from climate change. The area of current climatic conditions that was absent in the baseline conditions (i.e., extrapolated areas) represents 39% of the study area, leading to a significant decrease in the predictive accuracy of model predictions for those areas. 
Our results highlight (1) incorporating dispersal processes can improve predictive accuracy of temporal transference of SDMs and reduce uncertainties of extinction risk assessments from global change; (2) as geographical areas subjected to novel climates are expected to arise, they must be reported as they show less accurate predictions under future climate scenarios. Consequently, environmental extrapolation and dispersal processes should be explicitly incorporated to report and reduce uncertainties in temporal predictions of SDMs, respectively. Doing so, we expect to improve the reliability of the information we provide for conservation decision makers under future climate change scenarios. © 2017 by the Ecological Society of America.

  6. Visual prediction and perceptual expertise

    PubMed Central

    Cheung, Olivia S.; Bar, Moshe

    2012-01-01

    Making accurate predictions about what may happen in the environment requires analogies between perceptual input and associations in memory. These elements of predictions are based on cortical representations, but little is known about how these processes can be enhanced by experience and training. On the other hand, studies on perceptual expertise have revealed that the acquisition of expertise leads to strengthened associative processing among features or objects, suggesting that predictions and expertise may be tightly connected. Here we review the behavioral and neural findings regarding the mechanisms involving prediction and expert processing, and highlight important possible overlaps between them. Future investigation should examine the relations among perception, memory and prediction skills as a function of expertise. The knowledge gained by this line of research will have implications for visual cognition research, and will advance our understanding of how the human brain can improve its ability to predict by learning from experience. PMID:22123523

  7. Analysis of the predictive qualities of betting odds and FIFA World Ranking: evidence from the 2006, 2010 and 2014 Football World Cups.

    PubMed

    Wunderlich, Fabian; Memmert, Daniel

    2016-12-01

    The present study investigates a new framework for deriving more detailed model-based predictions from ranking systems. These predictions were compared with predictions from the betting market, using data from the World Cups of 2006, 2010, and 2014. The results revealed that the FIFA World Ranking has substantially improved its predictive quality relative to the betting market since its mode of calculation was changed in 2006. While both predictors were useful for obtaining accurate predictions in general, the world ranking significantly outperformed the betting market for the 2014 World Cup and when the data from the 2010 and 2014 World Cups were pooled. Our new framework can be extended in future research to more detailed prediction tasks (i.e., predicting the final score of a match or the tournament progress of a team).

  8. Next Day Building Load Predictions based on Limited Input Features Using an On-Line Laterally Primed Adaptive Resonance Theory Artificial Neural Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Christian Birk; Robinson, Matt; Yasaei, Yasser

    Optimal integration of thermal energy storage within commercial building applications requires accurate load predictions. Several methods exist that provide an estimate of a building's future needs, including component-based models and data-driven algorithms. This work implemented a previously untested algorithm for this application, called a Laterally Primed Adaptive Resonance Theory (LAPART) artificial neural network (ANN). The LAPART algorithm provided accurate results over a two-month period in which minimal historical data and a small number of input types were available. These results are significant, because common practice has often overlooked the implementation of an ANN; ANNs have often been perceived as too complex and as requiring large amounts of data to provide accurate results. The LAPART neural network was implemented in an on-line learning manner, where on-line learning refers to the continuous updating of training data as time passes. For this experiment, training began with a single day of data and grew to two months. This approach provides a platform for immediate implementation that requires minimal time and effort. The results from the LAPART algorithm were compared with statistical regression and a component-based model, based on the predictions' linear relationship with the measured data, mean squared error, mean bias error, and the cost savings achieved by the respective prediction techniques. The results show that the LAPART algorithm provided a reliable and cost-effective means to predict the building load for the next day.

  9. Superensemble forecasts of dengue outbreaks

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2016-01-01

    In recent years, a number of systems capable of predicting future infectious disease incidence have been developed. As more of these systems are operationalized, it is important that the forecasts generated by these different approaches be formally reconciled so that individual forecast error and bias are reduced. Here we present a first example of such multi-system, or superensemble, forecast. We develop three distinct systems for predicting dengue, which are applied retrospectively to forecast outbreak characteristics in San Juan, Puerto Rico. We then use Bayesian averaging methods to combine the predictions from these systems and create superensemble forecasts. We demonstrate that on average, the superensemble approach produces more accurate forecasts than those made from any of the individual forecasting systems. PMID:27733698
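
    The combination step can be sketched in a few lines of Python. This is not the authors' Bayesian averaging code; it is a hedged stand-in that weights each system by its retrospective accuracy, and all names and numbers below are invented for illustration.

```python
# Hedged sketch of a superensemble combination (not the authors' code):
# each system's forecast is weighted by its historical accuracy, a simple
# stand-in for the Bayesian averaging described in the abstract.

def superensemble(forecasts, past_mse):
    """Weight each forecast by the inverse of its past mean squared error."""
    weights = [1.0 / e for e in past_mse]   # more accurate -> larger weight
    total = sum(weights)
    return sum(w / total * f for w, f in zip(weights, forecasts))

# Three hypothetical systems predict weekly dengue incidence; the first
# has been the most accurate in retrospective forecasts, so it dominates.
print(round(superensemble([120.0, 150.0, 90.0], past_mse=[4.0, 16.0, 16.0]), 6))
```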

  10. Enhanced Neural Responses to Imagined Primary Rewards Predict Reduced Monetary Temporal Discounting.

    PubMed

    Hakimi, Shabnam; Hare, Todd A

    2015-09-23

    The pervasive tendency to discount the value of future rewards varies considerably across individuals and has important implications for health and well-being. Here, we used fMRI with human participants to examine whether an individual's neural representation of an imagined primary reward predicts the degree to which the value of delayed monetary payments is discounted. Because future rewards can never be experienced at the time of choice, imagining or simulating the benefits of a future reward may play a critical role in decisions between alternatives with either immediate or delayed benefits. We found that enhanced ventromedial prefrontal cortex response during imagined primary reward receipt was correlated with reduced discounting in a separate monetary intertemporal choice task. Furthermore, activity in enhanced ventromedial prefrontal cortex during reward imagination predicted temporal discounting behavior both between- and within-individual decision makers with 62% and 73% mean balanced accuracy, respectively. These results suggest that the quality of reward imagination may impact the degree to which future outcomes are discounted. Significance statement: We report a novel test of the hypothesis that an important factor influencing the discount rate for future rewards is the quality with which they are imagined or estimated in the present. Previous work has shown that temporal discounting is linked to individual characteristics ranging from general intelligence to the propensity for addiction. We demonstrate that individual differences in a neurobiological measure of primary reward imagination are significantly correlated with discounting rates for future monetary payments. Moreover, our neurobiological measure of imagination can be used to accurately predict choice behavior both between and within individuals. 
These results suggest that improving reward imagination may be a useful therapeutic target for individuals whose high discount rates promote detrimental behaviors. Copyright © 2015 the authors.

  11. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  12. Decaying relevance of clinical data towards future decisions in data-driven inpatient clinical order sets.

    PubMed

    Chen, Jonathan H; Alagappan, Muthuraman; Goldstein, Mary K; Asch, Steven M; Altman, Russ B

    2017-06-01

    Determine how varying longitudinal historical training data can impact prediction of future clinical decisions. Estimate the "decay rate" of clinical data source relevance. We trained a clinical order recommender system, analogous to Netflix or Amazon's "Customers who bought A also bought B..." product recommenders, based on a tertiary academic hospital's structured electronic health record data. We used this system to predict future (2013) admission orders based on different subsets of historical training data (2009 through 2012), relative to existing human-authored order sets. Predicting future (2013) inpatient orders is more accurate with models trained on just one month of recent (2012) data than with 12 months of older (2009) data (ROC AUC 0.91 vs. 0.88, precision 27% vs. 22%, recall 52% vs. 43%, all P<10 -10 ). Algorithmically learned models from even the older (2009) data was still more effective than existing human-authored order sets (ROC AUC 0.81, precision 16% recall 35%). Training with more longitudinal data (2009-2012) was no better than using only the most recent (2012) data, unless applying a decaying weighting scheme with a "half-life" of data relevance about 4 months. Clinical practice patterns (automatically) learned from electronic health record data can vary substantially across years. Gold standards for clinical decision support are elusive moving targets, reinforcing the need for automated methods that can adapt to evolving information. Prioritizing small amounts of recent data is more effective than using larger amounts of older data towards future clinical predictions. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
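
    The decaying-relevance weighting this record describes can be sketched in a few lines. Only the roughly 4-month half-life comes from the abstract; the function name and example ages below are illustrative assumptions.

```python
# Sketch of half-life weighting for training data: examples are
# down-weighted by age, halving in relevance every HALF_LIFE_MONTHS
# (the paper estimates a half-life of about 4 months for clinical
# order data). Names and ages here are illustrative.

HALF_LIFE_MONTHS = 4.0

def relevance_weight(age_months):
    """Weight of a training example that is `age_months` old."""
    return 0.5 ** (age_months / HALF_LIFE_MONTHS)

print(relevance_weight(0))     # 1.0   (brand-new data, full weight)
print(relevance_weight(4))     # 0.5   (one half-life old)
print(relevance_weight(12))    # 0.125 (three half-lives old)
```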

  13. Geometry and mass model of ionizing radiation experiments on the LDEF satellite

    NASA Technical Reports Server (NTRS)

    Colborn, B. L.; Armstrong, T. W.

    1992-01-01

    Extensive measurements related to ionizing radiation environments and effects were made on the LDEF satellite during its mission lifetime of almost 6 years. These data, together with the opportunity they provide for evaluating predictive models and analysis methods, should allow more accurate assessments of the space radiation environment and related effects for future missions in low Earth orbit. The LDEF radiation dosimetry data is influenced to varying degrees by material shielding effects due to the dosimeter itself, nearby components and experiments, and the spacecraft structure. A geometry and mass model is generated of LDEF, incorporating sufficient detail that it can be applied in determining the influence of material shielding on ionizing radiation measurements and predictions. This model can be used as an aid in data interpretation by unfolding shielding effects from the LDEF radiation dosimeter responses. Use of the LDEF geometry/mass model, in conjunction with predictions and comparisons with LDEF dosimetry data currently underway, will also allow more definitive evaluations of current radiation models for future mission applications.

  14. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center

    PubMed Central

    Dou, Chao

    2016-01-01

    The storage volume of an internet data center is a classic time series, and predicting it accurately is valuable to the business. However, the storage volume series from a data center is always “dirty”: it contains noise, missing data, and outliers, so the main trend of the series must be extracted before future values can be predicted. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the “dirty” data; cubic spline interpolation and averaging are then used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and contributes greatly to predicting future volume values. PMID:28090205
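
    The clean-then-smooth pipeline this record describes can be illustrated with a toy version. This is not the authors' code: the published method uses a Kalman filter and cubic spline interpolation, whereas the sketch below substitutes a crude running-window outlier test and a moving average, with invented thresholds.

```python
# Illustrative sketch only, not the authors' code: reject outliers in a
# "dirty" series against a short running window, then smooth with a
# centered moving average. The published method uses a Kalman filter and
# cubic spline interpolation instead; the thresholds here are invented.

def extract_trend(series, window=3, z=2.0):
    cleaned = []
    for x in series:
        if cleaned:
            recent = cleaned[-window:]
            mean = sum(recent) / len(recent)
            spread = max(recent) - min(recent) + 1.0
            if abs(x - mean) > z * spread:   # crude outlier test
                x = mean                     # replace spike with local estimate
        cleaned.append(x)
    # centered moving average (window of up to 3) as the main trend
    n = len(cleaned)
    return [sum(cleaned[max(0, i - 1):i + 2]) / len(cleaned[max(0, i - 1):i + 2])
            for i in range(n)]

noisy = [10, 11, 12, 90, 13, 14, 15]   # 90 is an outlier spike
print(extract_trend(noisy))            # the spike is absorbed into the trend
```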

  15. Deadline rush: a time management phenomenon and its mathematical description.

    PubMed

    König, Cornelius J; Kleinmann, Martin

    2005-01-01

    A typical time management phenomenon is the rush before a deadline. Behavioral decision making research can be used to predict how behavior changes before a deadline. People are likely not to work on a project with a deadline in the far future because they generally discount future outcomes. Only when the deadline is close are people likely to work. On the basis of recent intertemporal choice experiments, the authors argue that a hyperbolic function should provide a more accurate description of the deadline rush than an exponential function predicted by an economic model of discounted utility. To show this, the fit of the hyperbolic and the exponential function were compared with data sets that describe when students study for exams. As predicted, the hyperbolic function fit the data significantly better than the exponential function. The implication for time management decisions is that they are most likely to be inconsistent over time (i.e., people make a plan how to use their time but do not follow it).
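
    The two candidate descriptions of the deadline rush can be compared directly. Only the functional forms come from the abstract; the discount rate k and the example delays below are invented for illustration.

```python
import math

# Sketch of the two discount functions compared in the paper:
# hyperbolic V = A / (1 + k*D) versus exponential V = A * exp(-k*D),
# where D is the delay in days. The rate k = 0.1 is an invented value.

def hyperbolic(value, days, k=0.1):
    return value / (1.0 + k * days)

def exponential(value, days, k=0.1):
    return value * math.exp(-k * days)

# The two curves nearly agree close to the deadline, but the hyperbolic
# function retains far more value at long delays, matching the observed
# pattern of little work far out and a sharp rush as the deadline nears.
for d in (1, 30, 90):
    print(d, round(hyperbolic(100, d), 2), round(exponential(100, d), 2))
```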

  16. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center.

    PubMed

    Miao, Beibei; Dou, Chao; Jin, Xuebo

    2016-01-01

    The storage volume of an internet data center is a classic time series, and predicting it accurately is valuable to the business. However, the storage volume series from a data center is always "dirty": it contains noise, missing data, and outliers, so the main trend of the series must be extracted before future values can be predicted. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the "dirty" data; cubic spline interpolation and averaging are then used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and contributes greatly to predicting future volume values.

  17. Auralization Architectures for NASA's Next Generation Aircraft Noise Prediction Program

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.; Aumann, Aric R.

    2013-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The assessment of human response to noise from future aircraft can only be afforded through laboratory testing using simulated flyover noise. Recent work by the authors demonstrated the ability to auralize predicted flyover noise for a state-of-the-art reference aircraft and a future hybrid wing body aircraft concept. This auralization used source noise predictions from NASA's Aircraft NOise Prediction Program (ANOPP) as input. The results from this process demonstrated that auralization based upon system noise predictions is consistent with, and complementary to, system noise predictions alone. To further develop and validate the auralization process, improvements to the interfaces between the synthesis capability and the system noise tools are required. This paper describes the key elements required for accurate noise synthesis and introduces auralization architectures for use with the next-generation ANOPP (ANOPP2). The architectures are built around a new auralization library and its associated Application Programming Interface (API) that utilize ANOPP2 APIs to access data required for auralization. The architectures are designed to make the process of auralizing flyover noise a common element of system noise prediction.

  18. Prediction of sickness absenteeism, disability pension and sickness presenteeism among employees with back pain.

    PubMed

    Bergström, Gunnar; Hagberg, Jan; Busch, Hillevi; Jensen, Irene; Björklund, Christina

    2014-06-01

    The primary aim of this study was to evaluate the predictive ability of the Örebro Musculoskeletal Pain Screening Questionnaire (ÖMPSQ) concerning long-term sick leave, sickness presenteeism and disability pension during a follow-up period of 2 years. The study group consisted of 195 employees visiting the occupational health service (OHS) due to back pain. Using receiver operating characteristic (ROC) curves, the area under the curve (AUC) varied from 0.67 to 0.93, which was from less accurate for sickness presenteeism to highly accurate for the prediction of disability pension. For registered sick leave during 6 months following the baseline the AUC from the ROC analyses was moderately accurate (0.81) and a cut off score of 90 rendered a high sensitivity of 0.89 but a low specificity of 0.46 whereas a cut off score of 105 improves the specificity substantially but at the cost of some sensitivity. The predictive ability appears to decrease with time. Several workplace factors beyond those included in the ÖMPSQ were considered but only social support at the workplace was significantly related to future long-term sick leave besides the total score of the ÖMPSQ. The results of this study extend and confirm the findings of earlier research on the ÖMPSQ. Assessment of psychosocial risk factors among employees seeking help for back pain at the OHS could be helpful in the prevention of work disabling problems.
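
    The sensitivity/specificity trade-off between the two cutoff scores can be illustrated with a toy calculation. The data below are invented, not from the study; only the cutoffs of 90 and 105 come from the abstract.

```python
# Hedged illustration (data invented, not from the study): compute the
# sensitivity and specificity of an ÖMPSQ total-score cutoff for
# predicting long-term sick leave, mirroring the cutoffs of 90 and 105
# discussed in the abstract.

def sens_spec(scores, outcomes, cutoff):
    tp = sum(1 for s, o in zip(scores, outcomes) if s >= cutoff and o)
    fn = sum(1 for s, o in zip(scores, outcomes) if s < cutoff and o)
    tn = sum(1 for s, o in zip(scores, outcomes) if s < cutoff and not o)
    fp = sum(1 for s, o in zip(scores, outcomes) if s >= cutoff and not o)
    return tp / (tp + fn), tn / (tn + fp)

scores   = [70, 85, 95, 100, 110, 120, 60, 105, 92, 130]
outcomes = [0,  0,  1,  1,   1,   1,   0,  1,   0,  1]   # 1 = long-term sick leave

print(sens_spec(scores, outcomes, cutoff=90))    # high sensitivity, lower specificity
print(sens_spec(scores, outcomes, cutoff=105))   # specificity improves, at a cost in sensitivity
```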

  19. Measurement of Physical Activity and Energy Expenditure in Wheelchair Users: Methods, Considerations and Future Directions.

    PubMed

    Nightingale, Tom E; Rouse, Peter C; Thompson, Dylan; Bilzon, James L J

    2017-12-01

    Accurately measuring physical activity and energy expenditure in persons with chronic physical disabilities who use wheelchairs is a considerable and ongoing challenge. Quantifying various free-living lifestyle behaviours in this group is at present restricted by our understanding of appropriate measurement tools and analytical techniques. This review provides a detailed evaluation of the currently available measurement tools used to predict physical activity and energy expenditure in persons who use wheelchairs. It also outlines numerous considerations specific to this population and suggests suitable future directions for the field. Of the three existing self-report methods utilised in this population, the 3-day Physical Activity Recall Assessment for People with Spinal Cord Injury (PARA-SCI) telephone interview demonstrates the best reliability and validity. However, the complexity of interview administration and potential for recall bias are notable limitations. Objective measurement tools, which overcome such considerations, have been validated using controlled laboratory protocols. These have consistently demonstrated the arm or wrist as the most suitable anatomical location to wear accelerometers. Yet, more complex data analysis methodologies may be necessary to further improve energy expenditure prediction for more intricate movements or behaviours. Multi-sensor devices that incorporate physiological signals and acceleration have recently been adapted for persons who use wheelchairs. Population-specific algorithms offer considerable improvements in energy expenditure prediction accuracy. This review highlights the progress in the field and aims to encourage the wider scientific community to develop innovative solutions to accurately quantify physical activity in this population.

  20. Global Quantitative Modeling of Chromatin Factor Interactions

    PubMed Central

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles: we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896
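
    The core statistical idea, a maximum-entropy pairwise model fit by regularized moment matching, can be sketched on toy data. This is not the paper's modENCODE pipeline; the factor count, data, and hyperparameters below are invented for illustration:

```python
import itertools
import numpy as np

# Toy maximum-entropy pairwise ("Ising-style") model: 4 binary chromatin
# factors, exhaustive 16-state enumeration, L1-regularized gradient ascent.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(500, 4)).astype(float)   # bins x factors

n = data.shape[1]
states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

def log_probs(h, J):
    e = states @ h + 0.5 * np.einsum("si,ij,sj->s", states, J, states)
    return e - np.logaddexp.reduce(e)                    # log softmax

emp_mean = data.mean(axis=0)
emp_corr = data.T @ data / len(data)
h, J, lam, lr = np.zeros(n), np.zeros((n, n)), 0.01, 0.1
for _ in range(3000):
    p = np.exp(log_probs(h, J))
    h += lr * (emp_mean - p @ states)                    # match marginals
    gJ = emp_corr - states.T @ (states * p[:, None]) - lam * np.sign(J)
    np.fill_diagonal(gJ, 0.0)                            # no self-coupling
    J += lr * (gJ + gJ.T) / 2                            # keep J symmetric
```

    After fitting, the model's marginal and pairwise statistics match the data, and the learned couplings J are the analogue of the paper's chromatin factor interactions.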

  1. Branch classification: A new mechanism for improving branch predictor performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, P.Y.; Hao, E.; Patt, Y.

    There is wide agreement that one of the most significant impediments to the performance of current and future pipelined superscalar processors is the presence of conditional branches in the instruction stream. Speculative execution is one solution to the branch problem, but speculative work is discarded if a branch is mispredicted. To be effective, speculative execution therefore requires a very accurate branch predictor; 95% accuracy is not good enough. This paper proposes branch classification, a methodology for building more accurate branch predictors. Branch classification allows an individual branch instruction to be associated with the branch predictor best suited to predict its direction. Using this approach, a hybrid branch predictor can be constructed such that each component branch predictor predicts those branches for which it is best suited. To demonstrate the usefulness of branch classification, an example classification scheme is given and a new hybrid predictor is built based on this scheme which achieves a higher prediction accuracy than any branch predictor previously reported in the literature.
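
    The flavor of a hybrid predictor with a per-branch chooser can be sketched briefly. This is not the paper's classification scheme; the two components (a 2-bit saturating counter and a static always-taken predictor) and all structure below are illustrative only:

```python
# Hybrid branch predictor sketch: a 2-bit saturating counter per branch PC,
# a static always-taken component, and a per-branch chooser counter that
# selects whichever component has been more accurate for that branch.
class TwoBit:
    def __init__(self): self.state = 2            # start weakly taken
    def predict(self): return self.state >= 2
    def update(self, taken):
        self.state = min(3, self.state + 1) if taken else max(0, self.state - 1)

class Hybrid:
    def __init__(self):
        self.counters, self.chooser = {}, {}      # keyed by branch PC
    def predict_and_update(self, pc, taken):
        c = self.counters.setdefault(pc, TwoBit())
        choose = self.chooser.setdefault(pc, 2)   # >=2 -> trust the counter
        p_counter, p_taken = c.predict(), True
        pred = p_counter if choose >= 2 else p_taken
        # Train the chooser toward whichever component was right.
        if p_counter == taken != p_taken:
            self.chooser[pc] = min(3, choose + 1)
        elif p_taken == taken != p_counter:
            self.chooser[pc] = max(0, choose - 1)
        c.update(taken)
        return pred
```

    On strongly biased branches either component converges quickly; the chooser matters for branches where the components disagree, which is the situation branch classification aims to exploit.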

  2. Augmenting the SCaN Link Budget Tool with Validated Atmospheric Propagation

    NASA Technical Reports Server (NTRS)

    Steinkerchner, Leo; Welch, Bryan

    2017-01-01

    In any Earth-Space or Space-Earth communications link, atmospheric effects cause significant signal attenuation. In order to develop a communications system that is cost effective while meeting appropriate performance requirements, it is important to accurately predict these effects for the given link parameters. This project aimed to develop a Matlab(TradeMark) (The MathWorks, Inc.) program that could augment the existing Space Communications and Navigation (SCaN) Link Budget Tool with accurate predictions of atmospheric attenuation of both optical and radio-frequency signals, according to the SCaN Optical Link Assessment Model Version 5 and the International Telecommunications Union, Radiocommunications Sector (ITU-R) atmospheric propagation loss model, respectively. When compared to data collected from the Advanced Communications Technology Satellite (ACTS), the radio-frequency model predicted attenuation to within 1.3 dB of loss for 95% of measurements. Ultimately, this tool will be integrated into the SCaN Center for Engineering, Networks, Integration, and Communications (SCENIC) user interface in order to support analysis of existing SCaN systems and planning capabilities for future NASA missions.
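
    The radio-frequency side of a link budget reduces to summing gains and subtracting losses in decibels. A minimal sketch, using the standard free-space path loss formula and illustrative numbers that are not SCaN or ITU-R values:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_ghz: float) -> float:
    """Standard free-space path loss: 92.45 + 20*log10(d_km) + 20*log10(f_GHz)."""
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)

def received_power_dbw(eirp_dbw, rx_gain_dbi, distance_km, freq_ghz, atmos_loss_db):
    # Link budget: EIRP + receive gain - path loss - atmospheric attenuation.
    return (eirp_dbw + rx_gain_dbi
            - free_space_path_loss_db(distance_km, freq_ghz) - atmos_loss_db)

# Illustrative GEO downlink at 20 GHz with a 3 dB atmospheric margin:
print(round(received_power_dbw(55.0, 45.0, 38000.0, 20.0, 3.0), 1))
```

    The atmospheric term is exactly where a validated propagation model plugs in: every extra decibel of predicted attenuation comes straight off the received power.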

  3. Life prediction and constitutive behavior

    NASA Technical Reports Server (NTRS)

    Halford, G. R.

    1983-01-01

    One of the primary drivers that prompted the initiation of the hot section technology (HOST) program was the recognized need for improved cyclic durability of costly hot section components. All too frequently, fatigue in one form or another was directly responsible for the less-than-desired durability, and prospects for the future were not going to improve unless a significant effort was mounted to increase our knowledge and understanding of the elements governing cyclic crack initiation and propagation lifetime. Certainly one of the important factors is the ability to perform accurate structural stress-strain analyses on a routine basis to determine the magnitudes of the localized stresses and strains, since it is these localized conditions that govern the initiation and crack growth processes. Developing the ability to more accurately predict crack initiation lifetimes and cyclic crack growth rates for the complex loading conditions found in turbine engine hot sections is, of course, the ultimate goal of the life prediction research efforts. It has been found convenient to divide the research efforts into those dealing with nominally isotropic and anisotropic alloys, the latter for application to directionally solidified and single-crystal turbine blades.

  4. Phylogeny predicts future habitat shifts due to climate change.

    PubMed

    Kuntner, Matjaž; Năpăruş, Magdalena; Li, Daiqin; Coddington, Jonathan A

    2014-01-01

    Taxa may respond differently to climatic changes, depending on phylogenetic or ecological effects, but studies that discern among these alternatives are scarce. Here, we use two species pairs from globally distributed spider clades, each pair representing two lifestyles (generalist, specialist), to test the relative importance of phylogeny versus ecology in predicted responses to climate change. We used a recent phylogenetic hypothesis for nephilid spiders to select four species from two genera (Nephilingis and Nephilengys) that match the above criteria, are fully allopatric, but together occupy all subtropical-tropical regions. Based on their records, we modeled each species' niche space and predicted their ecological shifts 20, 40, 60, and 80 years into the future using customized GIS tools and projected climatic changes. Phylogeny better predicts the species' current ecological preferences than do lifestyles. By 2080 all species face dramatic reductions in suitable habitat (54.8-77.1%) and adapt by moving towards higher altitudes and latitudes, although at different tempos. Phylogeny and lifestyle explain simulated habitat shifts in altitude, but phylogeny is the sole best predictor of latitudinal shifts. Models incorporating phylogenetic relatedness are an important additional tool for accurately predicting biotic responses to global change.

  5. Can Rheumatoid Arthritis Be Prevented?

    PubMed Central

    Deane, Kevin

    2013-01-01

    The discovery of elevations of rheumatoid arthritis (RA)-related biomarkers prior to the onset of clinically apparent RA raises hopes that individuals who are at risk for future RA can be identified in a preclinical phase of disease, defined as abnormalities of RA-related immune activity prior to the clinically apparent onset of joint disease. Additionally, there is a growing understanding of the immunologic processes occurring in preclinical RA, as well as of risk factors that may be mechanistically related to RA development. Furthermore, there are data supporting that treatment of early RA can lead to drug-free remission. Taken as a whole, these findings suggest that it may be possible to use biomarkers and other factors to accurately identify the likelihood and timing of onset of future RA, and to intervene with immunomodulatory therapies and/or risk factor modification to prevent the future onset of RA in at-risk individuals. Importantly, several clinical prevention trials for RA have already been tried, and one is underway. However, while our growing understanding of the mechanisms and natural history of RA development may be leading us toward the implementation of prevention strategies for RA, there are still several challenges to be met. These include developing sufficiently accurate methods of predicting those at high risk for future RA, so that clinical trials can be designed around accurate rates of arthritis development and subjects can be adequately informed of their risk of disease; identifying the appropriate interventions and biologic targets for optimal prevention; and addressing the psychosocial and economic aspects that are crucial to developing broadly applicable prevention measures for RA. These issues notwithstanding, prevention of RA may be within reach in the near future. PMID:24315049

  6. Live Fast, Die Young: Experimental Evidence of Population Extinction Risk due to Climate Change.

    PubMed

    Bestion, Elvire; Teyssier, Aimeric; Richard, Murielle; Clobert, Jean; Cote, Julien

    2015-10-01

    Evidence has accumulated in recent decades on the drastic impact of climate change on biodiversity. Warming temperatures have induced changes in species physiology and phenology and have decreased body size. Such modifications can impact population dynamics and could lead to changes in life cycle and demography. More specifically, conceptual frameworks predict that global warming will severely threaten tropical ectotherms while temperate ectotherms should resist or even benefit from higher temperatures. However, experimental studies measuring the impacts of future warming trends on temperate ectotherms' life cycle and population persistence are lacking. Here we investigate the impacts of future climates on a model vertebrate ectotherm species using a large-scale warming experiment. We manipulated climatic conditions in 18 seminatural populations over two years to obtain a present climate treatment and a warm climate treatment matching IPCC predictions for future climate. Warmer temperatures caused faster body growth, earlier reproductive onset, and increased voltinism, leading to a highly accelerated life cycle but also to a decrease in adult survival. A matrix population model predicts that warm climate populations in our experiment should go extinct in around 20 y. Comparing our experimental climatic conditions to conditions encountered by populations across Europe, we suggest that warming climates should threaten a significant number of populations at the southern range of the distribution. Our findings stress the importance of experimental approaches on the entire life cycle to more accurately predict population and species persistence in future climates.
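
    The extinction projection rests on a standard matrix population model: if the dominant eigenvalue of the projection matrix falls below 1, the population declines geometrically. A sketch with an invented two-stage (juvenile/adult) matrix, not the paper's fitted values:

```python
import numpy as np

# Hypothetical warm-treatment projection matrix: high fecundity, reduced
# adult survival (illustrative values only, not the study's estimates).
A = np.array([[0.0, 1.0],    # fecundity of adults
              [0.4, 0.3]])   # juvenile->adult transition, adult survival
lam = max(abs(np.linalg.eigvals(A)))        # dominant eigenvalue

n = np.array([100.0, 100.0])                # initial juveniles, adults
years = 0
while n.sum() > 1.0:                        # quasi-extinction threshold
    n = A @ n
    years += 1
print(f"lambda = {lam:.2f}, quasi-extinction after ~{years} y")
```

    With lambda below 1 the projected population shrinks by a fixed fraction per year, so time to quasi-extinction scales with the log of the starting population, which is how a decades-scale forecast falls out of a two-year experiment.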

  7. Live Fast, Die Young: Experimental Evidence of Population Extinction Risk due to Climate Change

    PubMed Central

    Bestion, Elvire; Teyssier, Aimeric; Richard, Murielle; Clobert, Jean; Cote, Julien

    2015-01-01

    Evidence has accumulated in recent decades on the drastic impact of climate change on biodiversity. Warming temperatures have induced changes in species physiology and phenology and have decreased body size. Such modifications can impact population dynamics and could lead to changes in life cycle and demography. More specifically, conceptual frameworks predict that global warming will severely threaten tropical ectotherms while temperate ectotherms should resist or even benefit from higher temperatures. However, experimental studies measuring the impacts of future warming trends on temperate ectotherms' life cycle and population persistence are lacking. Here we investigate the impacts of future climates on a model vertebrate ectotherm species using a large-scale warming experiment. We manipulated climatic conditions in 18 seminatural populations over two years to obtain a present climate treatment and a warm climate treatment matching IPCC predictions for future climate. Warmer temperatures caused faster body growth, earlier reproductive onset, and increased voltinism, leading to a highly accelerated life cycle but also to a decrease in adult survival. A matrix population model predicts that warm climate populations in our experiment should go extinct in around 20 y. Comparing our experimental climatic conditions to conditions encountered by populations across Europe, we suggest that warming climates should threaten a significant number of populations at the southern range of the distribution. Our findings stress the importance of experimental approaches on the entire life cycle to more accurately predict population and species persistence in future climates. PMID:26501958

  8. The Recalibrated Sunspot Number: Impact on Solar Cycle Predictions

    NASA Astrophysics Data System (ADS)

    Clette, F.; Lefevre, L.

    2017-12-01

    Recently and for the first time since their creation, the sunspot number and group number series were entirely revisited, and a first fully recalibrated version was officially released in July 2015 by the World Data Center SILSO (Brussels). Those reference long-term series are widely used as input data or as a calibration reference by various solar cycle prediction methods. Therefore, past predictions may now need to be redone using the new sunspot series, and methods already used for predicting cycle 24 will require adaptations before attempting predictions of the next cycles. In order to clarify the nature of the applied changes, we describe the different corrections applied to the sunspot and group number series, which affect extended time periods and can reach up to 40%. While some changes simply involve constant scale factors, other corrections vary with time or follow the solar cycle modulation. Depending on the prediction method and on the selected time interval, this can lead to different responses and biases. Moreover, together with the new series, standard error estimates are also progressively being added to the new sunspot numbers, which may help derive more accurate uncertainties for predicted activity indices. We conclude on the new round of recalibration now undertaken in the framework of a broad multi-team collaboration articulated around upcoming ISSI workshops, and we outline the corrections that can still be expected in the future as part of a permanent upgrading and quality-control process. From now on, sunspot-based predictive models should thus be made more adaptable, and regular updates of predictions should become common practice in order to track periodic upgrades of the sunspot number series, just as is done with other modern solar observational series.
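
    One of the constant scale factors mentioned above is well documented: version 2 of the sunspot number dropped the conventional 0.6 Zurich scale factor, so a first-order conversion of old values is a simple division. This is a rough approximation only; the full corrections are time-dependent and can reach 40%:

```python
def sn_v1_to_v2_approx(sn_v1: float) -> float:
    """First-order version-1 -> version-2 sunspot number conversion.

    Version 2 removed the conventional 0.6 Zurich scale factor, so values
    are roughly 1/0.6 times higher. Real corrections vary with time, so
    this is only a zeroth-order rescaling for illustration.
    """
    return sn_v1 / 0.6
```

    Any prediction method calibrated against the version 1 series must at minimum absorb this rescaling, which is why past predictions need to be redone rather than simply compared against the new series.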

  9. Noninvasive Intracranial Pressure Monitoring Using Advanced Machine Learning Techniques

    DTIC Science & Technology

    2013-11-01

    drainage requiring removal to prevent infection or to allow computed tomography scan. 3. If clinicians had the ability to predict near-future ICP... drainage of cerebrospinal fluid from the ventricles; however, ICP readings are only accurate when the external drainage system is clamped. ICP clamping... craniotomy for hemorrhage evacuation or a craniectomy for treatment of cerebral edema. Overall in-hospital mortality was 19.4%. Eight hundred and ninety

  10. Aeropropulsion 1987. Session 3: Internal Fluid Mechanics Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Internal fluid mechanics research at Lewis is directed toward an improved understanding of the important flow physics affecting aerospace propulsion systems, and toward applying this improved understanding to formulate accurate predictive codes. To this end, research is conducted involving detailed experimentation and analysis. The presentations in this session summarize ongoing work and indicate future emphasis in three major research thrusts: namely, inlets, ducts, and nozzles; turbomachinery; and chemically reacting flows.

  11. Evaluation of Industry Standard Turbulence Models on an Axisymmetric Supersonic Compression Corner

    NASA Technical Reports Server (NTRS)

    DeBonis, James R.

    2015-01-01

    Reynolds-averaged Navier-Stokes computations of a shock-wave/boundary-layer interaction (SWBLI) created by a Mach 2.85 flow over an axisymmetric 30-degree compression corner were carried out. The objectives were to evaluate four turbulence models commonly used in industry for SWBLIs, and to evaluate the suitability of this test case for use in further turbulence model benchmarking. The Spalart-Allmaras model, Menter's Baseline and Shear Stress Transport (SST) models, and a low-Reynolds-number k-epsilon model were evaluated. Results indicate that the models do not accurately predict the separation location: the SST model predicts separation onset too early, and the other models predict it too late. Overall, the Spalart-Allmaras model did the best job of matching the experimental data. However, there is significant room for improvement, most notably in the prediction of the turbulent shear stress. Density data showed that the simulations did not accurately predict the thermal boundary layer upstream of the SWBLI. The effects of turbulent Prandtl number and wall temperature were studied in an attempt to improve this prediction and to understand their influence on the interaction. The data showed that both parameters can significantly affect the separation size and location, but varying them did not improve agreement with the experiment. This case proved challenging to compute and should provide a good test for future turbulence modeling work.

  12. UAV Trajectory Modeling Using Neural Networks

    NASA Technical Reports Server (NTRS)

    Xue, Min

    2017-01-01

    A large number of small Unmanned Aerial Vehicles (sUAVs) is projected to operate in the near future. Potential sUAV applications include, but are not limited to, search and rescue, inspection and surveillance, aerial photography and video, precision agriculture, and parcel delivery. sUAVs are expected to operate in the uncontrolled Class G airspace, at or below 500 feet above ground level (AGL), where many static and dynamic constraints exist, such as ground properties and terrain, restricted areas, variable winds, manned helicopters, and conflict avoidance among sUAVs. How to enable safe, efficient, and massive sUAV operations in this low-altitude airspace remains a great challenge. NASA's Unmanned aircraft system Traffic Management (UTM) research initiative works on establishing infrastructure and developing policies, requirements, and rules to enable safe and efficient sUAV operations. To achieve this goal, it is important to gain insight into future UTM traffic operations through simulations, in which an accurate trajectory model plays an extremely important role. As in current aviation development, trajectory modeling should also serve as the foundation for any advanced concepts and tools in UTM. Accurate models of sUAV dynamics and control systems are very important given the requirement of meter-level precision in UTM operations. The vehicle dynamics are relatively easy to derive and model; however, the control systems are usually kept by manufacturers as intellectual property and remain unknown. That brings a challenge to trajectory modeling for sUAVs: how can a vehicle's trajectory be modeled with an unknown control system? This work proposes to use a neural network to model a vehicle's trajectory. The neural network is first trained to learn the vehicle's responses under numerous conditions. Once fully trained, given the current vehicle states, winds, and desired future trajectory, the neural network can predict the vehicle's states at the next time step. A complete 4-D trajectory is then generated step by step using the trained neural network. Experiments in this work show that the neural network can approximate the sUAV's model and predict the trajectory accurately.
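
    The train-then-roll-out idea can be sketched compactly. Here a linear least-squares map stands in for the trained neural network, and the dynamics matrix is invented; the point is the step-by-step rollout, not the model class:

```python
import numpy as np

# Learn a one-step state transition from logged flight data, then roll a
# trajectory out by feeding each prediction back in as the next input.
rng = np.random.default_rng(1)
A_true = np.array([[1.0, 0.1],     # hidden dynamics: position, velocity
                   [0.0, 0.95]])

X = rng.normal(size=(200, 2))      # logged states
Y = X @ A_true.T                   # logged next states (noise-free here)
A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # "training" step

def rollout(x0, steps):
    traj = [x0]
    for _ in range(steps):         # predict the next state repeatedly
        traj.append(traj[-1] @ A_hat)
    return np.array(traj)

traj = rollout(np.array([0.0, 1.0]), 10)        # an 11-point 1 Hz trajectory
```

    A trained neural network replaces the linear map when the control system is nonlinear and unknown; the rollout loop, where prediction errors can compound step by step, stays the same.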

  13. A new powerful parameterization tool for managing groundwater resources and predicting land subsidence in Las Vegas Valley

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Nunes, V. D.; Burbey, T. J.; Borggaard, J.

    2012-12-01

    More than 1.5 m of subsidence has been observed in Las Vegas Valley since 1935 as a result of groundwater pumping that commenced in 1905 (Bell, 2002). The compaction of the aquifer system has led to several large subsidence bowls and deleterious earth fissures. The highly heterogeneous aquifer system with its variably thick interbeds makes predicting the magnitude and location of subsidence extremely difficult. Several numerical groundwater flow models of the Las Vegas basin have been previously developed; however, none of them have been able to accurately simulate the observed subsidence patterns or magnitudes because of inadequate parameterization. To better manage groundwater resources and predict future subsidence, we have updated and developed a more accurate groundwater management model for Las Vegas Valley by developing a new adjoint parameter estimation package (APE) that is used in conjunction with UCODE along with MODFLOW and the SUB (subsidence) and HFB (horizontal flow barrier) packages. The APE package is used with UCODE to automatically identify suitable parameter zonations and inversely calculate parameter values from hydraulic head and subsidence measurements, which are highly sensitive to both elastic (Ske) and inelastic (Skv) storage coefficients. With the advent of InSAR (Interferometric Synthetic Aperture Radar), distributed spatial and temporal subsidence measurements can be obtained, which greatly enhance the accuracy of parameter estimation. This automation process can remove user bias and provide a far more accurate and robust parameter zonation distribution. The outcome of this work is the most accurate and powerful tool to date for managing groundwater resources in Las Vegas Valley.

  14. Annual temperature variation as a time machine to understand the effects of long-term climate change on a poleward range shift.

    PubMed

    Crickenberger, Sam; Wethey, David S

    2018-05-10

    Range shifts due to annual variation in temperature are more tractable than range shifts linked to decadal- to century-long temperature changes due to climate change, providing natural experiments to determine the mechanisms responsible for driving long-term distributional shifts. In this study we couple physiologically grounded mechanistic models with biogeographic surveys in 2 years with high levels of annual temperature variation to disentangle the drivers of a historical range shift driven by climate change. The distribution of the barnacle Semibalanus balanoides has shifted 350 km poleward in the past half century along the east coast of the United States. Recruits were present throughout the historical range following the 2015 reproductive season, when temperatures were similar to those in the past century, and absent following the 2016 reproductive season, when temperatures were warmer than they have been since 1870, the earliest date for temperature records. Our dispersal-dependent mechanistic models of reproductive success were highly accurate and predicted the patterns of reproductive success documented in field surveys throughout the historical range in 2015 and 2016. Our mechanistic models of reproductive success not only predicted recruitment dynamics near the range edge but also predicted interior range fragmentation in a number of years between 1870 and 2016. All recruits monitored within the historical range following the 2015 colonization died before 2016, suggesting juvenile survival was likely the primary driver of the historical range retraction. However, if 2016 is indicative of future temperatures, mechanisms of range limitation will shift and reproductive failure will lead to further range retraction in the future. Mechanistic models are necessary for accurately predicting the effects of climate change on ranges of species. © 2018 John Wiley & Sons Ltd.

  15. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    PubMed

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry and provides numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  16. Rising sea levels will reduce extreme temperature variations in tide-dominated reef habitats.

    PubMed

    Lowe, Ryan Joseph; Pivan, Xavier; Falter, James; Symonds, Graham; Gruber, Renee

    2016-08-01

    Temperatures within shallow reefs often differ substantially from those in the surrounding ocean; therefore, predicting future patterns of thermal stress and bleaching at the scale of reefs depends on accurately predicting reef heat budgets. We present a new framework for quantifying how tidal and solar heating cycles interact with reef morphology to control diurnal temperature extremes within shallow, tidally forced reefs. Using data from northwestern Australia, we construct a heat budget model to investigate how frequency differences between the dominant lunar semidiurnal tide and the diurnal solar cycle drive ~15-day modulations in diurnal temperature extremes. The model is extended to show how reefs with tidal amplitudes comparable to their depth, relative to mean sea level, tend to experience the largest temperature extremes globally. As a consequence, we reveal how even a modest sea level rise can substantially reduce temperature extremes within tide-dominated reefs, thereby partially offsetting the local effects of future ocean warming.
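
    The ~15-day modulation is a beat phenomenon: the lunar semidiurnal tide (M2, period 12.4206 h) slowly drifts in and out of phase with the 12 h half-period of the daily solar heating cycle, so low tide coincides with midday only every couple of weeks. The beat period follows directly from the two frequencies:

```python
# Beat period between the lunar semidiurnal tide and the semidiurnal
# component of the solar heating cycle.
M2_PERIOD_H = 12.4206   # M2 tidal constituent, hours
S2_PERIOD_H = 12.0      # half of the 24 h solar day, hours

beat_hours = 1.0 / abs(1.0 / S2_PERIOD_H - 1.0 / M2_PERIOD_H)
print(round(beat_hours / 24.0, 1))   # beat period in days
```

    The result is about 14.8 days, matching the ~15-day modulation of diurnal temperature extremes described in the abstract.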

  17. The history and future of nursing labor research in a cost-control environment.

    PubMed

    Brewer, C S

    1998-04-01

    For the first time in nursing's history, the downsizing of hospitals, the increased use of managed care, reduced use of registered nurses and other factors may result in significant unemployment in nursing, with resulting downward adjustments in wages. Understanding the labor supply response of nurses to changes in the wage is critical to accurately predicting how nurses will respond to changes in market demand as it influences wages, and to determining rational policy responses to the labor market. In this article, three generations of nursing labor research are summarized and critiqued. Methodological issues are discussed and specific directions for future studies are suggested.

  18. Evidence base and future research directions in the management of low back pain

    PubMed Central

    Abbott, Allan

    2016-01-01

    Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care, which targets treatment to patient subgroups based on key characteristics, is reliant upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models, or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care. PMID:27004162

  19. Big data integration shows Australian bush-fire frequency is increasing significantly.

    PubMed

    Dutta, Ritaban; Das, Aruneema; Aryal, Jagannath

    2016-02-01

    Increasing Australian bush-fire frequencies over the last decade indicate a major climatic change in the coming future. Understanding of such climatic change for Australian bush-fire is limited, and there is an urgent need for scientific research capable of contributing to Australian society. The frequency of bush-fire carries information on the spatial, temporal and climatic aspects of bush-fire events and provides contextual information for modelling various climate data to accurately predict future bush-fire hot spots. In this study, we develop an ensemble method based on a two-layered machine learning model to establish the relationship between fire incidence and climatic data. In a 336-week data trial, we demonstrate that the model provides highly accurate bush-fire incidence hot-spot estimation (91% global accuracy) from the weekly climatic surfaces. Our analysis also indicates that Australian weekly bush-fire frequencies increased by 40% over the last 5 years, particularly during summer months, implicating a serious climatic shift.

  20. Big data integration shows Australian bush-fire frequency is increasing significantly

    PubMed Central

    Dutta, Ritaban; Das, Aruneema; Aryal, Jagannath

    2016-01-01

    Increasing Australian bush-fire frequencies over the last decade has indicated a major climatic change in coming future. Understanding such climatic change for Australian bush-fire is limited and there is an urgent need of scientific research, which is capable enough to contribute to Australian society. Frequency of bush-fire carries information on spatial, temporal and climatic aspects of bush-fire events and provides contextual information to model various climate data for accurately predicting future bush-fire hot spots. In this study, we develop an ensemble method based on a two-layered machine learning model to establish relationship between fire incidence and climatic data. In a 336 week data trial, we demonstrate that the model provides highly accurate bush-fire incidence hot-spot estimation (91% global accuracy) from the weekly climatic surfaces. Our analysis also indicates that Australian weekly bush-fire frequencies increased by 40% over the last 5 years, particularly during summer months, implicating a serious climatic shift. PMID:26998312

  1. Validating Inertial Confinement Fusion (ICF) predictive capability using perturbed capsules

    NASA Astrophysics Data System (ADS)

    Schmitt, Mark; Magelssen, Glenn; Tregillis, Ian; Hsu, Scott; Bradley, Paul; Dodd, Evan; Cobble, James; Flippo, Kirk; Offerman, Dustin; Obrey, Kimberly; Wang, Yi-Ming; Watt, Robert; Wilke, Mark; Wysocki, Frederick; Batha, Steven

    2009-11-01

    Achieving ignition on NIF is a monumental step on the path toward utilizing fusion as a controlled energy source. Obtaining robust ignition requires accurate ICF models to predict the degradation of ignition caused by heterogeneities in capsule construction and irradiation. LANL has embarked on a project to induce controlled defects in capsules to validate our ability to predict their effects on fusion burn. These efforts include the validation of feature-driven hydrodynamics and mix in a convergent geometry. This capability is needed to determine the performance of capsules imploded under less-than-optimum conditions on future IFE facilities. LANL's recently initiated Defect Implosion Experiments (DIME) conducted at Rochester's Omega facility are providing input for these efforts. Recent simulation and experimental results will be shown.

  2. First-principles elastic constants of α- and θ-Al2O3

    NASA Astrophysics Data System (ADS)

    Shang, Shunli; Wang, Yi; Liu, Zi-Kui

    2007-03-01

    Using an efficient strain-stress method, the first-principles elastic constants cij's of α-Al2O3 and θ-Al2O3 have been predicted within the local density approximation and the generalized gradient approximation. The results indicate that more accurate calculations of cij's are obtained with the local density approximation. The predicted cij's of θ-Al2O3 provide helpful guidance for future measurements, especially the predicted negative c15. The present results make stress estimation possible in thermally grown oxides containing α- and θ-Al2O3, which in turn provides helpful insights for preventing the failure of thermal barrier coatings on components in gas-turbine engines.
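
    At its core, the strain-stress method is a linear least-squares problem: apply a set of small strains, compute the resulting stresses from first principles, and solve σ = Cε for the 6×6 elastic-constant matrix in Voigt notation. A minimal numerical sketch, with synthetic data standing in for the DFT stresses (the function name and test matrix are illustrative):

```python
import numpy as np

def fit_elastic_constants(strains, stresses):
    """Fit the 6x6 elastic-constant matrix C (Voigt notation) by least
    squares on sigma = C @ epsilon, i.e. stresses ~= strains @ C.T."""
    strains = np.asarray(strains, dtype=float)
    stresses = np.asarray(stresses, dtype=float)
    Ct, *_ = np.linalg.lstsq(strains, stresses, rcond=None)
    return Ct.T

# Synthetic check: recover a known symmetric C from noiseless data.
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6))
C_true = (A + A.T) / 2
eps = rng.normal(scale=1e-2, size=(12, 6))  # 12 small applied strain sets
sig = eps @ C_true.T                        # stand-in for DFT stresses
C_fit = fit_elastic_constants(eps, sig)
```

With at least six independent strain sets the fit is exact for noiseless data; redundant strains help average out numerical noise in the computed stresses.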

  3. Decaying Relevance of Clinical Data Towards Future Decisions in Data-Driven Inpatient Clinical Order Sets

    PubMed Central

    Chen, Jonathan H; Alagappan, Muthuraman; Goldstein, Mary K; Asch, Steven M; Altman, Russ B

    2017-01-01

    Objective Determine how varying longitudinal historical training data can impact prediction of future clinical decisions. Estimate the “decay rate” of clinical data source relevance. Materials and Methods We trained a clinical order recommender system, analogous to Netflix or Amazon’s “Customers who bought A also bought B…” product recommenders, based on a tertiary academic hospital’s structured electronic health record data. We used this system to predict future (2013) admission orders based on different subsets of historical training data (2009 through 2012), relative to existing human-authored order sets. Results Predicting future (2013) inpatient orders is more accurate with models trained on just one month of recent (2012) data than with 12 months of older (2009) data (ROC AUC 0.91 vs. 0.88, precision 27% vs. 22%, recall 52% vs. 43%, all P<10−10). Algorithmically learned models from even the older (2009) data were still more effective than existing human-authored order sets (ROC AUC 0.81, precision 16%, recall 35%). Training with more longitudinal data (2009–2012) was no better than using only the most recent (2012) data, unless applying a decaying weighting scheme with a “half-life” of data relevance of about 4 months. Discussion Clinical practice patterns (automatically) learned from electronic health record data can vary substantially across years. Gold standards for clinical decision support are elusive moving targets, reinforcing the need for automated methods that can adapt to evolving information. Conclusions and Relevance Prioritizing small amounts of recent data is more effective than using larger amounts of older data towards future clinical predictions. PMID:28495350
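
    The decaying weighting scheme described above can be sketched as exponential down-weighting of training examples by age, using the roughly four-month half-life the study reports (an illustrative reconstruction, not the authors' code):

```python
def relevance_weight(age_months, half_life_months=4.0):
    """Down-weight a training example by its age: relevance halves
    every `half_life_months` (illustrative reconstruction)."""
    return 0.5 ** (age_months / half_life_months)

# Recent practice patterns dominate: a 1-month-old order pattern carries
# far more weight than a year-old one.
w_recent = relevance_weight(1)   # ~0.84
w_old = relevance_weight(12)     # 0.125
```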

  4. Changes in Cleanup Strategies and Long-Term Monitoring Costs for DOE FUSRAP Sites-17241

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo, Darina; Carpenter, Cliff; Roberts, Rebecca

    2017-03-05

    LM is preparing for the transfer of 11 new FUSRAP sites from USACE within the next 10 years, many of which will have substantially greater LTSM requirements than the current Completed sites. LM is analyzing estimates of the level of effort required to monitor the new sites in order to make more customized and accurate predictions of their future life cycle costs and environmental liabilities.

  5. The impact of demographic change on the estimated future burden of infectious diseases: examples from hepatitis B and seasonal influenza in the Netherlands

    PubMed Central

    2012-01-01

    Background For accurate estimation of the future burden of communicable diseases, the dynamics of the population at risk – namely population growth and population ageing – need to be taken into account. Accurate burden estimates are necessary for informing policy-makers regarding the planning of vaccination and other control, intervention, and prevention measures. Our aim was to qualitatively explore the impact of population ageing on the estimated future burden of seasonal influenza and hepatitis B virus (HBV) infection in the Netherlands, in the period 2000–2030. Methods Population-level disease burden was quantified using the disability-adjusted life years (DALY) measure applied to all health outcomes following acute infection. We used national notification data, pre-defined disease progression models, and a simple model of demographic dynamics to investigate the impact of population ageing on the burden of seasonal influenza and HBV. Scenario analyses were conducted to explore the potential impact of intervention-associated changes in incidence rates. Results Including population dynamics resulted in an increasing burden over the study period for influenza, whereas a relatively stable future burden was predicted for HBV. For influenza, the increase in DALYs was localised within YLL for the oldest age-groups (55 and older), and for HBV the effect of longer life expectancy in the future was offset by a reduction in incidence in the age-groups most at risk of infection. For both infections, the predicted disease burden was greater than if a static demography was assumed: 1.0 (in 2000) to 2.3-fold (in 2030) higher DALYs for influenza; 1.3 (in 2000) to 1.5-fold (in 2030) higher for HBV. Conclusions There are clear but diverging effects of an ageing population on the estimated disease burden of influenza and HBV in the Netherlands. Replacing static assumptions with a dynamic demographic approach appears essential for deriving realistic burden estimates for informing health policy. PMID:23217094
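
    The DALY bookkeeping behind such estimates is simple arithmetic: years of life lost to premature mortality (YLL) plus years lived with disability (YLD). A minimal sketch with purely illustrative numbers; the study's actual calculation applies this per age-group and per health outcome under a dynamic population model:

```python
def dalys(deaths, years_lost_per_death, cases, disability_weight, duration_years):
    """DALY = YLL + YLD, without age weighting or time discounting."""
    yll = deaths * years_lost_per_death               # mortality component
    yld = cases * disability_weight * duration_years  # morbidity component
    return yll + yld

# Illustrative: 10 deaths losing 20 years each, plus 100 cases at
# disability weight 0.2 lasting half a year.
burden = dalys(10, 20.0, 100, 0.2, 0.5)  # 200 + 10 = 210 DALYs
```

An ageing population raises the YLL term through more deaths in older age-groups even when per-head incidence is unchanged, which is why the dynamic and static demographies diverge.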

  6. Precise positioning with sparse radio tracking: How LRO-LOLA and GRAIL enable future lunar exploration

    NASA Astrophysics Data System (ADS)

    Mazarico, E.; Goossens, S. J.; Barker, M. K.; Neumann, G. A.; Zuber, M. T.; Smith, D. E.

    2017-12-01

    Two recent NASA missions to the Moon, the Lunar Reconnaissance Orbiter (LRO) and the Gravity Recovery and Interior Laboratory (GRAIL), have obtained highly accurate information about the lunar shape and gravity field. These global geodetic datasets resolve long-standing issues with mission planning; the tidal lock of the Moon long prevented collection of accurate gravity measurements over the farside and degraded the precise positioning of topographic data. We describe key datasets and results from the LRO and GRAIL missions that are directly relevant to future lunar missions. SmallSat and CubeSat missions especially would benefit from these recent improvements, as they are typically more resource-constrained. Even with limited radio tracking data, accurate knowledge of topography and gravity enables precise orbit determination (OD) (e.g., limiting the scope of geolocation and co-registration tasks) and long-term predictions of altitude (e.g., dramatically reducing uncertainties in impact time). With one S-band tracking pass per day, LRO OD now routinely achieves total position knowledge better than 10 meters and radial position knowledge around 0.5 meter. Other tracking data, such as Laser Ranging from Earth-based SLR stations, can further support OD. We also show how altimetry can be used to substantially improve orbit reconstruction with the accurate topographic maps now available from Lunar Orbiter Laser Altimeter (LOLA) data. We present new results with SELENE extended mission and LRO orbits processed with direct altimetry measurements. With even a simple laser altimeter onboard, high-quality OD can be achieved for future missions because of the datasets acquired by LRO and GRAIL, without the need for regular radio contact. Onboard processing of altimetric ranges would bring high-quality real-time position knowledge to support autonomous operation. We also describe why optical ranging transponders are ideal payloads for future lunar missions, as they can address both communication and navigation needs with few resources.

  7. Regulating Cortical Neurodynamics for Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Liljenström, Hans

    2002-09-01

    Behaving systems, biological as well as artificial, need to respond quickly and accurately to changes in the environment. The response is dependent on stored memories, and novel situations should be learnt for the guidance of future behavior. A highly nonlinear system dynamics is required in order to cope with a complex and changing environment, and this dynamics should be regulated to match the demands of the current situation, and to predict future behavior. In many cases the dynamics should be regulated to minimize processing time. We use computer simulations of cortical structures in order to investigate how the neurodynamics of these systems can be regulated for optimal performance in an unknown and changing environment. In particular, we study how cortical oscillations can serve to amplify weak signals and sustain an input pattern for more accurate information processing, and how chaotic-like behavior could increase the sensitivity in initial, exploratory states. We mimic regulating mechanisms based on neuromodulators, intrinsic noise levels, and various synchronizing effects. We find optimal noise levels where system performance is maximized, and neuromodulatory strategies for an efficient pattern recognition, where the anticipatory state of the system plays an important role.

  8. Operational planning using Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe

    2016-05-01

    The US Navy faces several limitations when planning operations with regard to forecasting environmental conditions. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or long-term statistical climate products. However, newly available data in the form of weather forecast ensembles provide dynamical and statistical extended-range predictions that can be more accurate if ensemble members are combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons in the future. We evaluated thirty years of forecasts using machine learning to select predictions for an all-encompassing and superior forecast that can be used to inform the Navy's decision planning process.
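
    One simple way to combine ensemble members, offered here as an illustrative stand-in for COMPASS's machine-learning fusion, is to weight each model by its historical (hindcast) skill, e.g. inverse mean-squared error:

```python
import numpy as np

def fuse_ensemble(forecasts, hindcast_errors):
    """Combine multi-model forecasts with inverse-MSE skill weights.

    forecasts: (n_members,) current member forecasts
    hindcast_errors: (n_members, n_past) past forecast errors per member
    """
    mse = np.mean(np.asarray(hindcast_errors, dtype=float) ** 2, axis=1)
    inv = 1.0 / mse
    weights = inv / inv.sum()  # historically skillful members weigh more
    return float(np.dot(weights, forecasts))

# A historically accurate member dominates a historically poor one.
fused = fuse_ensemble([1.0, 3.0], [[0.1, 0.1], [10.0, 10.0]])
```

With equal hindcast skill the fusion reduces to a plain ensemble mean; real multi-model fusion would also account for member correlation and seasonality.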

  9. Bringing modeling to the masses: A web based system to predict potential species distributions

    USGS Publications Warehouse

    Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul

    2010-01-01

    Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.

  10. Parental perception of child’s weight status and subsequent BMIz change: the KOALA birth cohort study

    PubMed Central

    2014-01-01

    Background Parents often fail to correctly perceive their children’s weight status, but no studies have examined the association between parental weight status perception and longitudinal BMIz change (BMI standardized to a reference population) at various ages. We investigated whether parents are able to accurately perceive their child’s weight status at age 5. We also investigated predictors of accurate weight status perception. Finally, we investigated the predictive value of accurate weight status perception in explaining children’s longitudinal weight development up to the age of 9, in children who were overweight at the age of 5. Methods We used longitudinal data from the KOALA Birth Cohort Study. At the child’s age of 5 years, parents filled out a questionnaire regarding child and parent characteristics and their perception of their child’s weight status. We calculated the children’s actual weight status from parental reports of weight and height at ages 2, 5, 6, 7, 8, and 9 years. Regression analyses were used to identify factors predicting which parents accurately perceived their child’s weight status. Finally, regression analyses were used to predict subsequent longitudinal BMIz change in overweight children. Results Eighty-five percent of the parents of overweight children underestimated their child’s weight status at age 5. The child’s BMIz at age 2 and 5 were significant positive predictors of accurate weight status perception (vs. underestimation) in normal weight and overweight children. Accurate weight status perception was a predictor of higher future BMI in overweight children, corrected for actual BMI at baseline. Conclusions Children of parents who accurately perceived their child’s weight status had a higher BMI over time, probably making it easier for parents to correctly perceive their child’s overweight. Parental awareness of the child’s overweight as such may not be sufficient for subsequent weight management by the parents, implying that parents who recognize their child’s overweight may not be able or willing to adequately manage the overweight. PMID:24678601

  11. Examination of Solar Cycle Statistical Model and New Prediction of Solar Cycle 23

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.

    2000-01-01

    Sunspot numbers in the current solar cycle 23 were estimated by using a statistical model with the accumulating cycle sunspot data, based on the odd-even behavior of historical sunspot cycles 1 through 22. Now that cycle 23 has progressed and the solar minimum occurrence has been accurately defined, the statistical model is validated by comparing the previous prediction with the newly measured sunspot numbers, and an improved short-range sunspot projection is made accordingly. The current cycle is expected to have a moderate level of activity. Errors of this model are shown to be self-correcting as cycle observations become available.

  12. Space debris tracking at San Fernando laser station

    NASA Astrophysics Data System (ADS)

    Catalán, M.; Quijano, M.; Pazos, A.; Martín Davila, J.; Cortina, L. M.

    2016-12-01

    For years to come, space debris will be a major issue for society. It has a negative impact on active artificial satellites, with implications for future missions. Tracking space debris as accurately as possible is the first step towards controlling this problem, yet it presents a challenge for science. The main limitation is the relatively low accuracy of the methods used to date for tracking these objects. Clearly, improving the predicted orbit accuracy is crucial (avoiding unnecessary anti-collision maneuvers). A new field of research was recently instituted at our satellite laser ranging station: tracking decommissioned artificial satellites equipped with retroreflectors. To this end, we work in conjunction with international space agencies, which are paying increasing attention to this problem. We thus proposed to share the observing schedule of our satellite laser ranging station to obtain data that would make orbital element predictions far more accurate (meter accuracy), whilst maintaining our tracking routines for active satellites. This manuscript reports on the actions carried out so far.

  13. Bayesian Framework Approach for Prognostic Studies in Electrolytic Capacitor under Thermal Overstress Conditions

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Goebel, Kai; Biswas, Gautam

    2012-01-01

    Electrolytic capacitors are used in several applications ranging from power supplies for safety-critical avionics equipment to power drivers for electro-mechanical actuators. Past experience shows that capacitors tend to degrade and fail faster when subjected to high electrical or thermal stress conditions during operation. This makes them good candidates for prognostics and health management. Model-based prognostics captures system knowledge in the form of physics-based models of components in order to obtain accurate predictions of end of life based on their current state of health and their anticipated future use and operational conditions. The focus of this paper is on deriving first-principles degradation models for thermal stress conditions and implementing a Bayesian framework for making remaining useful life predictions. Data collected from simultaneous experiments are used to validate the models. Our overall goal is to derive accurate models of capacitor degradation and use them to predict remaining useful life in DC-DC converters.
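
    The prognostics loop can be sketched as a toy particle filter: propagate a degradation model forward, update the state estimate with measurements, then extrapolate each particle to a failure threshold. The linear health-decay model and all numbers below are toy assumptions, not the paper's physics-based capacitor model:

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(particles, dt=1.0):
    """One step of a toy degradation model: health decays at a noisy rate."""
    health, rate = particles[:, 0], particles[:, 1]
    rate = rate + rng.normal(0.0, 0.001, size=rate.shape)  # process noise
    return np.column_stack([health - rate * dt, rate])

def update(particles, measurement, noise_std=0.05):
    """Reweight particles by measurement likelihood, then resample."""
    w = np.exp(-0.5 * ((particles[:, 0] - measurement) / noise_std) ** 2)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

def predict_rul(particles, threshold, max_steps=1000):
    """Median number of steps until particles cross the failure threshold."""
    p = particles.copy()
    steps = np.full(len(p), max_steps, dtype=float)
    alive = p[:, 0] > threshold
    for t in range(1, max_steps + 1):
        p = propagate(p)
        crossed = alive & (p[:, 0] <= threshold)
        steps[crossed] = t
        alive &= ~crossed
        if not alive.any():
            break
    return float(np.median(steps))

# Current belief: health ~1.0, decay rate ~0.01 per step.
particles = np.column_stack([np.full(500, 1.0), np.full(500, 0.01)])
rul = predict_rul(particles, threshold=0.9)  # roughly 10 steps to failure
```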

  14. Limitations of diagnostic precision and predictive utility in the individual case: a challenge for forensic practice.

    PubMed

    Cooke, David J; Michie, Christine

    2010-08-01

    Knowledge of group tendencies may not assist accurate predictions in the individual case. This has importance for forensic decision making and for the assessment tools routinely applied in forensic evaluations. In this article, we applied Monte Carlo methods to examine diagnostic agreement with different levels of inter-rater agreement given the distributional characteristics of PCL-R scores. Diagnostic agreement and score agreement were substantially less than expected. In addition, we examined the confidence intervals associated with individual predictions of violent recidivism. On the basis of empirical findings, statistical theory, and logic, we conclude that predictions of future offending cannot be achieved in the individual case with any degree of confidence. We discuss the problems identified in relation to the PCL-R in terms of the broader relevance to all instruments used in forensic decision making.
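
    The Monte Carlo logic is straightforward to reproduce: simulate two raters whose scores share a common "true" component in proportion to the inter-rater reliability, then count how often they agree on the categorical diagnosis. The PCL-R-like mean, SD, and cutoff below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def diagnostic_agreement(n, mean=22.0, sd=7.0, icc=0.8, cutoff=30.0):
    """Share of simulated cases where two raters agree on the diagnosis.

    Each rater's score = shared true component (variance icc) plus
    independent rater error (variance 1 - icc), scaled to the score scale;
    the diagnosis is score >= cutoff.
    """
    true = rng.normal(0.0, np.sqrt(icc), n)
    r1 = mean + sd * (true + rng.normal(0.0, np.sqrt(1.0 - icc), n))
    r2 = mean + sd * (true + rng.normal(0.0, np.sqrt(1.0 - icc), n))
    return float(np.mean((r1 >= cutoff) == (r2 >= cutoff)))

# Even respectable reliability leaves substantial categorical disagreement.
agree_high = diagnostic_agreement(20000, icc=0.95)
agree_mod = diagnostic_agreement(20000, icc=0.5)
```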

  15. Modeling ultrasound propagation through material of increasing geometrical complexity.

    PubMed

    Odabaee, Maryam; Odabaee, Mostafa; Pelekanos, Matthew; Leinenga, Gerhard; Götz, Jürgen

    2018-06-01

    Ultrasound is increasingly being recognized as a neuromodulatory and therapeutic tool, inducing a broad range of bio-effects in the tissue of experimental animals and humans. To achieve these effects in a predictable manner in the human brain, the thick cancellous skull presents a problem, causing attenuation. In order to overcome this challenge, as a first step, the acoustic properties of a set of simple bone-modeling resin samples that displayed an increasing geometrical complexity (increasing step sizes) were analyzed. Using two Non-Destructive Testing (NDT) transducers, we found that Wiener deconvolution predicted the Ultrasound Acoustic Response (UAR) and attenuation caused by the samples. However, whereas the UAR of samples with step sizes larger than the wavelength could be accurately estimated, the prediction was not accurate when the sample had a smaller step size. Furthermore, a Finite Element Analysis (FEA) performed in ANSYS determined that the scattering and refraction of sound waves was significantly higher in complex samples with smaller step sizes compared to simple samples with a larger step size. Together, this reveals an interaction of frequency and geometrical complexity in predicting the UAR and attenuation. These findings could in future be applied to poro-visco-elastic materials that better model the human skull. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
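
    Wiener deconvolution itself reduces to one line in the frequency domain: multiply by the conjugate transfer function and divide by its power plus a noise-to-signal term. A minimal sketch with a scalar SNR (a real application would estimate the noise spectrum from the recordings):

```python
import numpy as np

def wiener_deconvolve(observed, kernel, snr=100.0):
    """Estimate the input signal given its (circular) convolution with kernel.

    Frequency-domain filter G = H* / (|H|^2 + 1/snr); large snr approaches
    plain inverse filtering, small snr suppresses noise-dominated bands.
    """
    n = len(observed)
    H = np.fft.fft(kernel, n)
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(np.fft.fft(observed) * G))

# Demo: blur an impulse with a short kernel, then recover it.
x = np.zeros(32); x[5] = 1.0
kernel = np.array([0.5, 0.3, 0.2])
observed = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel, 32)))
recovered = wiener_deconvolve(observed, kernel, snr=1e6)
```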

  16. Smoothing and Predicting Celestial Pole Offsets using a Kalman Filter and Smoother

    NASA Astrophysics Data System (ADS)

    Nastula, J.; Chin, T. M.; Gross, R. S.; Winska, M.; Winska, J.

    2017-12-01

    Since the early days of interplanetary spaceflight, accounting for changes in the Earth's rotation is recognized to be critical for accurate navigation. In the 1960s, tracking anomalies during the Ranger VII and VIII lunar missions were traced to errors in the Earth orientation parameters. As a result, Earth orientation calibration methods were improved to support the Mariner IV and V planetary missions. Today, accurate Earth orientation parameters are used to track and navigate every interplanetary spaceflight mission. The interplanetary spacecraft tracking and navigation teams at JPL require the UT1 and polar motion parameters, and these Earth orientation parameters are estimated by the use of a Kalman filter to combine past measurements of these parameters and predict their future evolution. A model was then used to provide the nutation/precession components of the Earth's orientation separately. As a result, variations caused by the free core nutation were not taken into account. But for the highest accuracy, these variations must be considered. So JPL recently developed an approach based upon the use of a Kalman filter and smoother to provide smoothed and predicted celestial pole offsets (CPOs) to the interplanetary spacecraft tracking and navigation teams. The approach used at JPL to do this and an evaluation of the accuracy of the predicted CPOs will be given here.
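
    The filter-plus-smoother machinery can be illustrated in its simplest form: a scalar random-walk state, a forward Kalman filter, and a Rauch-Tung-Striebel backward pass. The process and measurement variances below are illustrative, not JPL's:

```python
import numpy as np

def kalman_smooth(y, q=1e-4, r=1e-2):
    """Kalman filter + RTS smoother for a 1-D random-walk state.

    q: process variance (how fast the state drifts), r: measurement variance.
    """
    n = len(y)
    xf = np.zeros(n); pf = np.zeros(n)   # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)   # one-step predictions
    x, p = y[0], 1.0                     # weakly informative prior
    for t in range(n):
        xp[t], pp[t] = x, p + q          # predict: random walk
        k = pp[t] / (pp[t] + r)          # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])   # measurement update
        p = (1.0 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf.copy()                       # backward (RTS) pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs

# Demo: a constant offset observed through noise is recovered smoothly.
y = 1.0 + np.random.default_rng(3).normal(0.0, 0.1, 200)
xs = kalman_smooth(y)
```

Predictions beyond the data simply carry the last filtered state forward with growing variance, which is what makes the predicted celestial pole offsets degrade gracefully between updates.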

  17. Imagine All the People: How the Brain Creates and Uses Personality Models to Predict Behavior

    PubMed Central

    Hassabis, Demis; Spreng, R. Nathan; Rusu, Andrei A.; Robbins, Clifford A.; Mar, Raymond A.; Schacter, Daniel L.

    2014-01-01

    The behaviors of other people are often central to envisioning the future. The ability to accurately predict the thoughts and actions of others is essential for successful social interactions, with far-reaching consequences. Despite its importance, little is known about how the brain represents people in order to predict behavior. In this functional magnetic resonance imaging study, participants learned the unique personality of 4 protagonists and imagined how each would behave in different scenarios. The protagonists' personalities were composed of 2 traits: Agreeableness and Extraversion. Which protagonist was being imagined was accurately inferred based solely on activity patterns in the medial prefrontal cortex using multivariate pattern classification, providing novel evidence that brain activity can reveal whom someone is thinking about. Lateral temporal and posterior cingulate cortex discriminated between different degrees of agreeableness and extraversion, respectively. Functional connectivity analysis confirmed that regions associated with trait-processing and individual identities were functionally coupled. Activity during the imagination task, and revealed by functional connectivity, was consistent with the default network. Our results suggest that distinct regions code for personality traits, and that the brain combines these traits to represent individuals. The brain then uses this “personality model” to predict the behavior of others in novel situations. PMID:23463340

  18. Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings.

    PubMed

    Adde, Lars; Helbostad, Jorunn; Jensenius, Alexander R; Langaas, Mette; Støen, Ragnhild

    2013-08-01

    This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used. Recordings were analyzed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and area under curve were estimated for the first and second recording, or a mean of both. FMs were classified based on the Prechtl approach of general movement assessment. CP status was reported at 2 years. Nine children developed CP, and all of their recordings showed absent FMs. The mean variability of the centroid of motion (CSD) from two recordings was more accurate than using only one recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies.

  19. Using Formal Grammars to Predict I/O Behaviors in HPC: The Omnisc'IO Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorier, Matthieu; Ibrahim, Shadi; Antoniu, Gabriel

    2016-08-01

    The increasing gap between the computation performance of post-petascale machines and the performance of their I/O subsystem has motivated many I/O optimizations including prefetching, caching, and scheduling. In order to further improve these techniques, modeling and predicting spatial and temporal I/O patterns of HPC applications as they run has become crucial. In this paper we present Omnisc'IO, an approach that builds a grammar-based model of the I/O behavior of HPC applications and uses it to predict when future I/O operations will occur, and where and how much data will be accessed. To infer grammars, Omnisc'IO is based on StarSequitur, a novel algorithm extending Nevill-Manning's Sequitur algorithm. Omnisc'IO is transparently integrated into the POSIX and MPI I/O stacks and does not require any modification in applications or higher-level I/O libraries. It works without any prior knowledge of the application and converges to accurate predictions of any N future I/O operations within a couple of iterations. Its implementation is efficient in both computation time and memory footprint.
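
    The core idea, learning repetitive structure in the I/O stream online and predicting the next operation from the current context, can be illustrated with a far simpler stand-in than Sequitur/StarSequitur: an order-2 context-count model over a symbolic trace (illustrative only; this is not the Omnisc'IO algorithm):

```python
from collections import Counter, defaultdict

class ContextPredictor:
    """Predict the next symbol from the last two observed symbols."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # (s[t-2], s[t-1]) -> next counts
        self.history = []

    def observe(self, symbol):
        if len(self.history) >= 2:
            self.counts[tuple(self.history[-2:])][symbol] += 1
        self.history.append(symbol)

    def predict(self):
        ctx = tuple(self.history[-2:])
        if ctx in self.counts:
            return self.counts[ctx].most_common(1)[0][0]
        return None  # context never seen before

# A periodic I/O trace becomes predictable after one repetition.
pred = ContextPredictor()
for op in ["open", "write", "write", "close"] * 5:
    pred.observe(op)
```

After the trace above, the last two symbols are ('write', 'close'), and the model predicts the loop restarting with 'open'. Grammar inference generalizes this fixed-order context to nested, variable-length repetitions.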

  20. Prognostics Applied to Electric Propulsion UAV

    NASA Technical Reports Server (NTRS)

    Goebel, Kai; Saha, Bhaskar

    2013-01-01

    Health management plays an important role in UAV operations. If critical components malfunction, safe operation of the UAV may be compromised. A technology with particular promise in this arena is equipment prognostics. This technology provides a state assessment of the health of components of interest and, if a degraded state has been found, it estimates how long it will take before the equipment reaches a failure threshold, conditional on assumptions about future operating and environmental conditions. This chapter explores the technical underpinnings of how to perform prognostics and shows an implementation on the propulsion system of an electric UAV. A particle filter is shown as the method of choice in performing state assessment and predicting future degradation. The method is then applied to the batteries that power the propeller motors. An accurate run-time battery life prediction algorithm is of critical importance to ensure safe operation of the vehicle and to maximize in-air time. Current reliability-based techniques turn out to be insufficient to manage the use of such batteries, whose loads vary frequently in uncertain environments.

  1. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  2. Future Weather Forecasting in the Year 2020-Investing in Technology Today: Improving Weather and Environmental Predictions

    NASA Technical Reports Server (NTRS)

    Anthes, Richard; Schoeberl, Mark

    2000-01-01

    Fast-forward twenty years to the nightly simultaneous TV/webcast. Accurate 8-14 day regional forecasts will be available, as will a whole host of linked products including economic impact, travel, energy usage, etc. On-demand, personalized street-level forecasts will be downloaded into your PDA. Your home system will automatically update the products of interest to you (e.g., severe storm forecasts, hurricane predictions, etc.). Short- and long-range climate forecasts will be used by your "Quicken 2020" to suggest changes in your "futures" investment portfolio. Through a lively and informative multi-media presentation, leading Space-Earth Science researchers and technologists will share their vision for the year 2020, offering a possible futuristic forecast enabled through the application of new technologies under development today. Copies of the 'broadcast' will be available on Beta Tape for your own future use. If sufficient interest exists, the program may also be made available for broadcasters wishing to do stand-ups with roll-ins from the San Francisco meeting for their viewers back home.

  3. Using a Magnetic Flux Transport Model to Predict the Solar Cycle

    NASA Technical Reports Server (NTRS)

    Lyatskaya, S.; Hathaway, D.; Winebarger, A.

    2007-01-01

    We present the results of an investigation into the use of a magnetic flux transport model to predict the amplitude of future solar cycles. Recently Dikpati, de Toma, & Gilman (2006) showed how their dynamo model could be used to accurately predict the amplitudes of the last eight solar cycles and offered a prediction for the next solar cycle - a large amplitude cycle. Cameron & Schussler (2007) found that they could reproduce this predictive skill with a simple 1-dimensional surface flux transport model - provided they used the same parameters and data as Dikpati, de Toma, & Gilman. However, when they tried incorporating the data in what they argued was a more realistic manner, they found that the predictive skill dropped dramatically. We have written our own code for examining this problem and have incorporated updated and corrected data for the source terms - the emergence of magnetic flux in active regions. We present both the model itself and our results from it - in particular our tests of its effectiveness at predicting solar cycles.

  4. A Unified Framework for Activity Recognition-Based Behavior Analysis and Action Prediction in Smart Homes

    PubMed Central

    Fatima, Iram; Fahim, Muhammad; Lee, Young-Koo; Lee, Sungyoung

    2013-01-01

    In recent years, activity recognition in smart homes has become an active research area due to its applicability in many applications, such as assistive living and healthcare. Besides activity recognition, the information collected from smart homes has great potential for other application domains like lifestyle analysis, security and surveillance, and interaction monitoring. Therefore, discovery of users' common behaviors and prediction of future actions from past behaviors become an important step towards allowing an environment to provide personalized services. In this paper, we develop a unified framework for activity recognition-based behavior analysis and action prediction. For this purpose, we first propose a kernel fusion method for accurate activity recognition and then identify the significant sequential behaviors of inhabitants from the recognized activities of their daily routines. These behavior patterns are further utilized to predict future actions from past activities. To evaluate the proposed framework, we performed experiments on two real datasets. The results show a remarkable improvement of 13.82% in the average accuracy of recognized activities, along with the extraction of significant behavioral patterns and precise activity predictions with a 6.76% increase in F-measure. All of this collectively helps in understanding the users' actions to gain knowledge about their habits and preferences. PMID:23435057

  5. Impact of Predicting Health Care Utilization Via Web Search Behavior: A Data-Driven Analysis.

    PubMed

    Agarwal, Vibhu; Zhang, Liangliang; Zhu, Josh; Fang, Shiyuan; Cheng, Tim; Hong, Chloe; Shah, Nigam H

    2016-09-21

    By recent estimates, the steady rise in health care costs has deprived more than 45 million Americans of health care services and has encouraged health care providers to better understand the key drivers of health care utilization from a population health management perspective. Prior studies suggest the feasibility of mining population-level patterns of health care resource utilization from observational analysis of Internet search logs; however, the utility of the endeavor to the various stakeholders in a health ecosystem remains unclear. The aim was to carry out a closed-loop evaluation of the utility of health care use predictions using the conversion rates of advertisements that were displayed to the predicted future utilizers as a surrogate. The statistical models to predict the probability of a user's future visit to a medical facility were built using effective predictors of health care resource utilization, extracted from a deidentified dataset of geotagged mobile Internet search logs representing searches made by users of the Baidu search engine between March 2015 and May 2015. We inferred presence within the geofence of a medical facility from location and duration information from users' search logs and putatively assigned medical facility visit labels to qualifying search logs. We constructed a matrix of general, semantic, and location-based features from search logs of users that had 42 or more search days preceding a medical facility visit as well as from search logs of users that had no medical visits and trained statistical learners for predicting future medical visits. We then carried out a closed-loop evaluation of the utility of health care use predictions using the show conversion rates of advertisements displayed to the predicted future utilizers. 
In the context of behaviorally targeted advertising, wherein health care providers are interested in minimizing their cost per conversion, the association between show conversion rate and predicted utilization score served as a surrogate measure of the model's utility. We obtained the highest area under the curve (0.796) in medical visit prediction with our random forests model and daywise features. Ablating feature categories one at a time showed that the model performance worsened the most when location features were dropped. An online evaluation in which advertisements were served to users who had a high predicted probability of a future medical visit showed a 3.96% increase in the show conversion rate. Results from our experiments done in a research setting suggest that it is possible to accurately predict future patient visits from geotagged mobile search logs. Results from the offline and online experiments on the utility of health utilization predictions suggest that such prediction can have utility for health care providers.
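
The study's headline metric is the area under the ROC curve (AUC = 0.796). As a minimal reminder of what that number measures, the snippet below computes AUC via its rank-statistic (Mann-Whitney) interpretation: the probability that a randomly chosen positive (a future visitor) receives a higher predicted utilization score than a randomly chosen negative. The labels and scores are toy values, not data from the study:

```python
def roc_auc(labels, scores):
    """AUC as the Mann-Whitney statistic: the probability that a randomly
    chosen positive gets a higher score than a randomly chosen negative,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# toy example: predicted utilization scores vs. actual visit labels
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.1, 0.7]
auc = roc_auc(labels, scores)  # -> 0.9375 on this toy data
```

This rank-based view makes clear why AUC is threshold-free: it depends only on the ordering of scores, not on any particular cutoff for declaring a "predicted utilizer".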

  6. Impact of Predicting Health Care Utilization Via Web Search Behavior: A Data-Driven Analysis

    PubMed Central

    Zhang, Liangliang; Zhu, Josh; Fang, Shiyuan; Cheng, Tim; Hong, Chloe; Shah, Nigam H

    2016-01-01

    Background By recent estimates, the steady rise in health care costs has deprived more than 45 million Americans of health care services and has encouraged health care providers to better understand the key drivers of health care utilization from a population health management perspective. Prior studies suggest the feasibility of mining population-level patterns of health care resource utilization from observational analysis of Internet search logs; however, the utility of the endeavor to the various stakeholders in a health ecosystem remains unclear. Objective The aim was to carry out a closed-loop evaluation of the utility of health care use predictions using the conversion rates of advertisements that were displayed to the predicted future utilizers as a surrogate. The statistical models to predict the probability of a user’s future visit to a medical facility were built using effective predictors of health care resource utilization, extracted from a deidentified dataset of geotagged mobile Internet search logs representing searches made by users of the Baidu search engine between March 2015 and May 2015. Methods We inferred presence within the geofence of a medical facility from location and duration information from users’ search logs and putatively assigned medical facility visit labels to qualifying search logs. We constructed a matrix of general, semantic, and location-based features from search logs of users that had 42 or more search days preceding a medical facility visit as well as from search logs of users that had no medical visits and trained statistical learners for predicting future medical visits. We then carried out a closed-loop evaluation of the utility of health care use predictions using the show conversion rates of advertisements displayed to the predicted future utilizers. 
In the context of behaviorally targeted advertising, wherein health care providers are interested in minimizing their cost per conversion, the association between show conversion rate and predicted utilization score served as a surrogate measure of the model’s utility. Results We obtained the highest area under the curve (0.796) in medical visit prediction with our random forests model and daywise features. Ablating feature categories one at a time showed that the model performance worsened the most when location features were dropped. An online evaluation in which advertisements were served to users who had a high predicted probability of a future medical visit showed a 3.96% increase in the show conversion rate. Conclusions Results from our experiments done in a research setting suggest that it is possible to accurately predict future patient visits from geotagged mobile search logs. Results from the offline and online experiments on the utility of health utilization predictions suggest that such prediction can have utility for health care providers. PMID:27655225

  7. WIND Validation Cases: Computational Study of Thermally-perfect Gases

    NASA Technical Reports Server (NTRS)

    DalBello, Teryn; Georgiadis, Nick (Technical Monitor)

    2002-01-01

    The ability of the WIND Navier-Stokes code to predict the physics of multi-species gases is investigated in support of future high-speed, high-temperature propulsion applications relevant to NASA's Space Transportation efforts. Three benchmark cases are investigated to evaluate the capability of the WIND chemistry model to accurately predict the aerodynamics of multi-species, chemically non-reacting (frozen) gases. Case 1 represents turbulent mixing of sonic hydrogen and supersonic vitiated air. Case 2 consists of heated and unheated round supersonic jets exiting to ambient. Case 3 represents 2-D flow through a converging-diverging Mach 2 nozzle. For Case 1, the WIND results agree fairly well with experimental results and show that significant mixing occurs downstream of the hydrogen injection point. For Case 2, the results show that the Wilke and Sutherland viscosity laws give similar results, and that the available SST turbulence model does not predict round supersonic nozzle flows accurately. For Case 3, results show that experimental, frozen, and 1-D gas results agree fairly well, and that frozen, homogeneous, multi-species gas calculations can be approximated by running in perfect gas mode while specifying the mixture gas constant and ratio of specific heats.

  8. An improvement in rollover detection of articulated vehicles using the grey system theory

    NASA Astrophysics Data System (ADS)

    Chou, Tao; Chu, Tzyy-Wen

    2014-05-01

    A Rollover Index combined with the grey system theory, called a Grey Rollover Index (GRI), is proposed to assess the rollover threat for articulated vehicles with a tractor-semitrailer combination. This index can predict future trends of vehicle dynamics based on current vehicle motion; thus, it is suitable for vehicle-rollover detection. Two difficulties are encountered when applying the GRI for rollover detection. The first is effectively predicting the rollover threat of the vehicles, and the second is establishing a precise definition of the actual rollover timing of a vehicle. The following methods are used to resolve these problems. First, a nonlinear mathematical model is constructed to accurately describe the vehicle dynamics of articulated vehicles. This model is combined with the GRI to predict rollover propensity. Finally, TruckSim™ software is used to determine the actual rollover timing and to supply accurate timing information to the GRI-based rollover detection system. The index is verified in simulations of the common manoeuvres that cause rollover accidents, reducing the occurrence of false signals and effectively increasing the efficiency of the rollover detection system.
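
Grey system theory, as used by the GRI, typically rests on the GM(1,1) grey prediction model, which forecasts the near-term trend of a short data window. The following is a generic GM(1,1) sketch, not the authors' implementation; the geometric test series and all numbers are illustrative:

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey prediction: fit the model on sequence x0 and
    forecast `steps` values ahead."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # background values: means of consecutive AGO terms
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # least squares for x0(k) + a*z(k) = b via 2x2 normal equations
    m = n - 1
    szz = sum(v * v for v in z)
    sz = sum(z)
    szy = sum(v * w for v, w in zip(z, y))
    sy = sum(y)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # time-response function for the AGO series, then difference back
    def x1_hat(k):  # k is a 0-based index
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# toy geometric series; GM(1,1) recovers the growth trend closely
series = [2 * 1.1 ** k for k in range(6)]
preds = gm11_forecast(series, steps=1)
```

Because it needs only a handful of recent samples, this kind of model can extrapolate a roll-motion signal a short horizon ahead, which is the property the GRI exploits for early rollover detection.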

  9. Simulating polarized light scattering in terrestrial snow based on bicontinuous random medium and Monte Carlo ray tracing

    NASA Astrophysics Data System (ADS)

    Xiong, Chuan; Shi, Jiancheng

    2014-01-01

    To date, light scattering models of snow account for very little of the real snow microstructure. The idealized spherical or other single-shape particle assumptions in previous snow light scattering models introduce errors into light scattering calculations for snow and, in turn, into remote sensing inversion algorithms. This paper builds a snow polarized reflectance model based on a bicontinuous medium, so that the real snow microstructure is taken into account. The specific surface area of a bicontinuous medium can be derived analytically. A polarized Monte Carlo ray tracing technique is applied to the computer-generated bicontinuous medium. With appropriate algorithms, the snow surface albedo, bidirectional reflectance distribution function (BRDF), and polarized BRDF can be simulated. Validation of model-predicted spectral albedo and bidirectional reflectance factor (BRF) against experimental data shows good agreement. The relationship between snow surface albedo and snow specific surface area (SSA) is predicted, and it can be used to improve future SSA inversion algorithms. The model-predicted polarized reflectance is validated and shown to be accurate, so it can be further applied in polarized remote sensing.

  10. Job Forecasting. Hearings before the Subcommittee on Investigations and Oversight of the Committee on Science and Technology, U.S. House of Representatives, Ninety-Eighth Congress, First Session (April 6-7, 1983).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Science and Technology.

    This is a report of congressional hearings that focus on an examination of job forecasting methods to learn how accurately future jobs can be predicted and the kinds of skills and training American workers will need to fill them. Testimony includes statements and prepared statements of the majority leader of the House of Representatives and…

  11. Satellite markers: a simple method for ground truth car pose on stereo video

    NASA Astrophysics Data System (ADS)

    Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Pierini, Marco

    2018-04-01

    Predicting the future location of other cars is a must in the context of advanced safety systems. Remote estimation of car pose, and particularly its heading angle, is key to predicting its future location. Stereo vision systems allow the 3D information of a scene to be recovered. Ground truth in this specific context is referential information about the depth, shape, and orientation of the objects present in the traffic scene. Creating 3D ground truth is a measurement and data fusion task usually associated with the combination of different kinds of sensors. The novelty of this paper is a method to generate ground truth car pose from video data alone. When applied to stereo video, the method also provides the extrinsic camera parameters for each camera at frame level, which are key to quantifying the performance of a moving stereo vision system subject to undesired vibrations and/or leaning. We developed a video post-processing technique which employs a common camera calibration tool for 3D ground truth generation. In our case study, we focus on accurate estimation of the heading angle of a moving car under realistic imagery. As outcomes, our satellite marker method provides accurate car pose at frame level, along with the instantaneous spatial orientation of each camera at frame level.

  12. Photovoltaic Engineering Testbed Designed for Calibrating Photovoltaic Devices in Space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2002-01-01

    Accurate prediction of the performance of solar arrays in space requires that the cells be tested in comparison with a space-flown standard. Recognizing that improvements in future solar cell technology will require an ever-increasing fidelity of standards, the Photovoltaics and Space Environment Branch at the NASA Glenn Research Center, in collaboration with the Ohio Aerospace Institute, designed a prototype facility to allow routine calibration, measurement, and qualification of solar cells on the International Space Station, and then the return of the cells to Earth for laboratory use. For solar cell testing, the Photovoltaic Engineering Testbed (PET) site provides a true air-mass-zero (AM0) solar spectrum. This allows solar cells to be accurately calibrated using the full spectrum of the Sun.

  13. Solar Cycle Predictions

    NASA Technical Reports Server (NTRS)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan the launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as the reduced propellant load is consumed more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions will be discussed, along with how those predictions could be made more accurate in the future.

  14. Project for Solar-Terrestrial Environment Prediction (PSTEP): Towards Predicting Next Solar Cycle

    NASA Astrophysics Data System (ADS)

    Imada, S.; Iijima, H.; Hotta, H.; Shiota, D.; Kanou, O.; Fujiyama, M.; Kusano, K.

    2016-10-01

    It is believed that longer-term variations in solar activity can affect the Earth's climate. Predicting the next solar cycle is therefore crucial for forecasting the "solar-terrestrial environment", and building prediction schemes for the activity level of the next solar cycle is key to long-term space weather study. Although prediction three years ahead is nearly achievable, prediction of the next solar cycle remains very limited so far. As part of PSTEP (Project for Solar-Terrestrial Environment Prediction), we are developing a five-year prediction scheme that combines a Surface Flux Transport (SFT) model with the most accurate available measurements of solar magnetic fields. We estimate the meridional flow, differential rotation, and turbulent diffusivity from recent modern observations (Hinode and the Solar Dynamics Observatory). These parameters are used in the SFT model to predict the polar magnetic field strength at solar minimum. In this presentation, we explain the outline of our strategy for predicting the next solar cycle and report the present status and future perspective of our project.

  15. A comparison of five methods to predict genomic breeding values of dairy bulls from genome-wide SNP markers

    PubMed Central

    2009-01-01

    Background Genomic selection (GS) uses molecular breeding values (MBV) derived from dense markers across the entire genome for selection of young animals. The accuracy of MBV prediction is important for a successful application of GS. Recently, several methods have been proposed to estimate MBV. Initial simulation studies have shown that these methods can accurately predict MBV. In this study we compared the accuracies and possible bias of five different regression methods in an empirical application in dairy cattle. Methods Genotypes of 7,372 SNP and highly accurate EBV of 1,945 dairy bulls were used to predict MBV for protein percentage (PPT) and a profit index (Australian Selection Index, ASI). Marker effects were estimated by least squares regression (FR-LS), Bayesian regression (Bayes-R), random regression best linear unbiased prediction (RR-BLUP), partial least squares regression (PLSR) and nonparametric support vector regression (SVR) in a training set of 1,239 bulls. Accuracy and bias of MBV prediction were calculated from cross-validation of the training set and tested against a test team of 706 young bulls. Results For both traits, FR-LS using a subset of SNP was significantly less accurate than all other methods which used all SNP. Accuracies obtained by Bayes-R, RR-BLUP, PLSR and SVR were very similar for ASI (0.39-0.45) and for PPT (0.55-0.61). Overall, SVR gave the highest accuracy. All methods resulted in biased MBV predictions for ASI, for PPT only RR-BLUP and SVR predictions were unbiased. A significant decrease in accuracy of prediction of ASI was seen in young test cohorts of bulls compared to the accuracy derived from cross-validation of the training set. This reduction was not apparent for PPT. Combining MBV predictions with pedigree based predictions gave 1.05 - 1.34 times higher accuracies compared to predictions based on pedigree alone. 
Some methods have markedly different computational requirements, with PLSR and RR-BLUP requiring the least computing time. Conclusions The four methods that use information from all SNP, namely RR-BLUP, Bayes-R, PLSR and SVR, generate similar accuracies of MBV prediction for genomic selection, and their use in the selection of immediate future generations in dairy cattle will be comparable. The use of FR-LS in genomic selection is not recommended. PMID:20043835
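
Of the compared methods, RR-BLUP is the simplest to state: all marker effects receive the same ridge shrinkage. A toy sketch with simulated genotypes follows; the data dimensions, effect sizes, and penalty value are arbitrary assumptions for illustration, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# toy data: 200 "bulls" genotyped at 50 SNP markers coded 0/1/2
n, p = 200, 50
Z = rng.integers(0, 3, size=(n, p)).astype(float)
true_effects = rng.normal(0.0, 0.3, size=p)          # simulated marker effects
y = Z @ true_effects + rng.normal(0.0, 1.0, size=n)  # phenotypes with noise

def rr_blup(Z, y, lam=10.0):
    """Ridge-regression BLUP: every marker effect is shrunk equally,
    solving (Zc'Zc + lam*I) g = Zc'yc on centered data."""
    Zc = Z - Z.mean(axis=0)
    yc = y - y.mean()
    g = np.linalg.solve(Zc.T @ Zc + lam * np.eye(Z.shape[1]), Zc.T @ yc)
    return g

g_hat = rr_blup(Z, y)
mbv = (Z - Z.mean(axis=0)) @ g_hat              # molecular breeding values
acc = np.corrcoef(mbv, Z @ true_effects)[0, 1]  # accuracy vs. true merit
```

The uniform-shrinkage assumption is what distinguishes RR-BLUP from Bayes-R, which instead allows marker effects to come from a mixture of distributions so that a few markers can carry large effects.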

  16. Representing winter wheat in the Community Land Model (version 4.5)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Yaqiong; Williams, Ian N.; Bagley, Justin E.

    Winter wheat is a staple crop for global food security, and is the dominant vegetation cover for a significant fraction of Earth's croplands. As such, it plays an important role in carbon cycling and land–atmosphere interactions in these key regions. Accurate simulation of winter wheat growth is not only crucial for future yield prediction under a changing climate, but also for accurately predicting the energy and water cycles for winter wheat dominated regions. We modified the winter wheat model in the Community Land Model (CLM) to better simulate winter wheat leaf area index, latent heat flux, net ecosystem exchange of CO2, and grain yield. These modifications included schemes to represent vernalization as well as frost tolerance and damage. We calibrated three key parameters (minimum planting temperature, maximum crop growth days, and initial value of leaf carbon allocation coefficient) and modified the grain carbon allocation algorithm for simulations at the US Southern Great Plains ARM site (US-ARM), and validated the model performance at eight additional sites across North America. We found that the new winter wheat model improved the prediction of monthly variation in leaf area index and reduced the root mean square error (RMSE) of latent heat flux and net ecosystem exchange by 41 and 35 %, respectively, during the spring growing season. The model accurately simulated the interannual variation in yield at the US-ARM site, but underestimated yield by 35 % at sites and in regions (northwestern and southeastern US) with historically greater yields.

  17. Representing winter wheat in the Community Land Model (version 4.5)

    NASA Astrophysics Data System (ADS)

    Lu, Yaqiong; Williams, Ian N.; Bagley, Justin E.; Torn, Margaret S.; Kueppers, Lara M.

    2017-05-01

    Winter wheat is a staple crop for global food security, and is the dominant vegetation cover for a significant fraction of Earth's croplands. As such, it plays an important role in carbon cycling and land-atmosphere interactions in these key regions. Accurate simulation of winter wheat growth is not only crucial for future yield prediction under a changing climate, but also for accurately predicting the energy and water cycles for winter wheat dominated regions. We modified the winter wheat model in the Community Land Model (CLM) to better simulate winter wheat leaf area index, latent heat flux, net ecosystem exchange of CO2, and grain yield. These modifications included schemes to represent vernalization as well as frost tolerance and damage. We calibrated three key parameters (minimum planting temperature, maximum crop growth days, and initial value of leaf carbon allocation coefficient) and modified the grain carbon allocation algorithm for simulations at the US Southern Great Plains ARM site (US-ARM), and validated the model performance at eight additional sites across North America. We found that the new winter wheat model improved the prediction of monthly variation in leaf area index and reduced the root mean square error (RMSE) of latent heat flux and net ecosystem exchange by 41 and 35 %, respectively, during the spring growing season. The model accurately simulated the interannual variation in yield at the US-ARM site, but underestimated yield by 35 % at sites and in regions (northwestern and southeastern US) with historically greater yields.

  18. Ensemble perception of color in autistic adults.

    PubMed

    Maule, John; Stanworth, Kirstie; Pellicano, Elizabeth; Franklin, Anna

    2017-05-01

    Dominant accounts of visual processing in autism posit that autistic individuals have an enhanced access to details of scenes [e.g., weak central coherence] which is reflected in a general bias toward local processing. Furthermore, the attenuated priors account of autism predicts that the updating and use of summary representations is reduced in autism. Ensemble perception describes the extraction of global summary statistics of a visual feature from a heterogeneous set (e.g., of faces, sizes, colors), often in the absence of local item representation. The present study investigated ensemble perception in autistic adults using a rapidly presented (500 msec) ensemble of four, eight, or sixteen elements representing four different colors. We predicted that autistic individuals would be less accurate when averaging the ensembles, but more accurate in recognizing individual ensemble colors. The results were consistent with the predictions. Averaging was impaired in autism, but only when ensembles contained four elements. Ensembles of eight or sixteen elements were averaged equally accurately across groups. The autistic group also showed a corresponding advantage in rejecting colors that were not originally seen in the ensemble. The results demonstrate the local processing bias in autism, but also suggest that the global perceptual averaging mechanism may be compromised under some conditions. The theoretical implications of the findings and future avenues for research on summary statistics in autism are discussed. Autism Res 2017, 10: 839-851. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  19. Representing winter wheat in the Community Land Model (version 4.5)

    DOE PAGES

    Lu, Yaqiong; Williams, Ian N.; Bagley, Justin E.; ...

    2017-05-05

    Winter wheat is a staple crop for global food security, and is the dominant vegetation cover for a significant fraction of Earth's croplands. As such, it plays an important role in carbon cycling and land–atmosphere interactions in these key regions. Accurate simulation of winter wheat growth is not only crucial for future yield prediction under a changing climate, but also for accurately predicting the energy and water cycles for winter wheat dominated regions. We modified the winter wheat model in the Community Land Model (CLM) to better simulate winter wheat leaf area index, latent heat flux, net ecosystem exchange of CO2, and grain yield. These modifications included schemes to represent vernalization as well as frost tolerance and damage. We calibrated three key parameters (minimum planting temperature, maximum crop growth days, and initial value of leaf carbon allocation coefficient) and modified the grain carbon allocation algorithm for simulations at the US Southern Great Plains ARM site (US-ARM), and validated the model performance at eight additional sites across North America. We found that the new winter wheat model improved the prediction of monthly variation in leaf area index and reduced the root mean square error (RMSE) of latent heat flux and net ecosystem exchange by 41 and 35 %, respectively, during the spring growing season. The model accurately simulated the interannual variation in yield at the US-ARM site, but underestimated yield by 35 % at sites and in regions (northwestern and southeastern US) with historically greater yields.

  20. Ensemble perception of color in autistic adults

    PubMed Central

    Stanworth, Kirstie; Pellicano, Elizabeth; Franklin, Anna

    2016-01-01

    Dominant accounts of visual processing in autism posit that autistic individuals have an enhanced access to details of scenes [e.g., weak central coherence] which is reflected in a general bias toward local processing. Furthermore, the attenuated priors account of autism predicts that the updating and use of summary representations is reduced in autism. Ensemble perception describes the extraction of global summary statistics of a visual feature from a heterogeneous set (e.g., of faces, sizes, colors), often in the absence of local item representation. The present study investigated ensemble perception in autistic adults using a rapidly presented (500 msec) ensemble of four, eight, or sixteen elements representing four different colors. We predicted that autistic individuals would be less accurate when averaging the ensembles, but more accurate in recognizing individual ensemble colors. The results were consistent with the predictions. Averaging was impaired in autism, but only when ensembles contained four elements. Ensembles of eight or sixteen elements were averaged equally accurately across groups. The autistic group also showed a corresponding advantage in rejecting colors that were not originally seen in the ensemble. The results demonstrate the local processing bias in autism, but also suggest that the global perceptual averaging mechanism may be compromised under some conditions. The theoretical implications of the findings and future avenues for research on summary statistics in autism are discussed. Autism Res 2017, 10: 839–851. © 2016 The Authors Autism Research published by Wiley Periodicals, Inc. on behalf of International Society for Autism Research PMID:27874263

  1. Experimental evaluation of a recursive model identification technique for type 1 diabetes.

    PubMed

    Finan, Daniel A; Doyle, Francis J; Palerm, Cesar C; Bevier, Wendy C; Zisser, Howard C; Jovanovic, Lois; Seborg, Dale E

    2009-09-01

    A model-based controller for an artificial beta cell requires an accurate model of the glucose-insulin dynamics in type 1 diabetes subjects. To ensure the robustness of the controller for changing conditions (e.g., changes in insulin sensitivity due to illnesses, changes in exercise habits, or changes in stress levels), the model should be able to adapt to the new conditions by means of a recursive parameter estimation technique. Such an adaptive strategy will ensure that the most accurate model is used for the current conditions, and thus the most accurate model predictions are used in model-based control calculations. In a retrospective analysis, empirical dynamic autoregressive exogenous input (ARX) models were identified from glucose-insulin data for nine type 1 diabetes subjects in ambulatory conditions. Data sets consisted of continuous (5-minute) glucose concentration measurements obtained from a continuous glucose monitor, basal insulin infusion rates and times and amounts of insulin boluses obtained from the subjects' insulin pumps, and subject-reported estimates of the times and carbohydrate content of meals. Two identification techniques were investigated: nonrecursive, or batch methods, and recursive methods. Batch models were identified from a set of training data, whereas recursively identified models were updated at each sampling instant. Both types of models were used to make predictions of new test data. For the purpose of comparison, model predictions were compared to zero-order hold (ZOH) predictions, which were made by simply holding the current glucose value constant for p steps into the future, where p is the prediction horizon. Thus, the ZOH predictions are model free and provide a base case for the prediction metrics used to quantify the accuracy of the model predictions. In theory, recursive identification techniques are needed only when there are changing conditions in the subject that require model adaptation. 
Thus, the identification and validation techniques were performed with both "normal" data and data collected during conditions of reduced insulin sensitivity. The latter were achieved by having the subjects self-administer a medication, prednisone, for 3 consecutive days. The recursive models were allowed to adapt to this condition of reduced insulin sensitivity, while the batch models were only identified from normal data. Data from nine type 1 diabetes subjects in ambulatory conditions were analyzed; six of these subjects also participated in the prednisone portion of the study. For normal test data, the batch ARX models produced 30-, 45-, and 60-minute-ahead predictions that had average root mean square error (RMSE) values of 26, 34, and 40 mg/dl, respectively. For test data characterized by reduced insulin sensitivity, the batch ARX models produced 30-, 60-, and 90-minute-ahead predictions with average RMSE values of 27, 46, and 59 mg/dl, respectively; the recursive ARX models demonstrated similar performance with corresponding values of 27, 45, and 61 mg/dl, respectively. The identified ARX models (batch and recursive) produced more accurate predictions than the model-free ZOH predictions, but only marginally. For test data characterized by reduced insulin sensitivity, RMSE values for the predictions of the batch ARX models were 9, 5, and 5% more accurate than the ZOH predictions for prediction horizons of 30, 60, and 90 minutes, respectively. In terms of RMSE values, the 30-, 60-, and 90-minute predictions of the recursive models were more accurate than the ZOH predictions, by 10, 5, and 2%, respectively. In this experimental study, the recursively identified ARX models resulted in predictions of test data that were similar, but not superior, to the batch models. Even for the test data characteristic of reduced insulin sensitivity, the batch and recursive models demonstrated similar prediction accuracy. 
The predictions of the identified ARX models were only marginally more accurate than the model-free ZOH predictions. Given the simplicity of the ARX models and the computational ease with which they are identified, however, even modest improvements may justify the use of these models in a model-based controller for an artificial beta cell. 2009 Diabetes Technology Society.
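
    The batch identification and zero-order-hold baseline described in this record can be sketched in a few lines. This is a minimal illustration under assumed model orders; the helper names (`fit_arx`, `zoh_predict`, `rmse`) are ours, not the authors' implementation:

```python
import numpy as np

def zoh_predict(glucose, p):
    """Zero-order-hold baseline: the forecast for time t+p is just the
    value at time t, so predictions align with glucose[p:]."""
    return glucose[:-p]

def fit_arx(glucose, insulin, na=2, nb=2):
    """Least-squares fit of a simple ARX model:
    g[t] = a1*g[t-1] + ... + a_na*g[t-na] + b1*u[t-1] + ... + b_nb*u[t-nb]."""
    n = max(na, nb)
    rows, targets = [], []
    for t in range(n, len(glucose)):
        rows.append(np.concatenate([glucose[t - na:t][::-1],
                                    insulin[t - nb:t][::-1]]))
        targets.append(glucose[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

def rmse(pred, actual):
    """Root mean square error between two equal-length sequences."""
    pred, actual = np.asarray(pred, float), np.asarray(actual, float)
    return float(np.sqrt(np.mean((pred - actual) ** 2)))
```

    Holding the current value constant (`zoh_predict`) gives the model-free reference against which the RMSE of the ARX predictions can be compared, as in the study.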

  2. Three-dimensional computational aerodynamics in the 1980's

    NASA Technical Reports Server (NTRS)

    Lomax, H.

    1978-01-01

    The future requirements for constructing codes that can be used to compute three-dimensional flows about aerodynamic shapes should be assessed in light of the constraints imposed by future computer architectures and the reality of usable algorithms that can provide practical three-dimensional simulations. On the hardware side, vector processing is inevitable in order to meet the CPU speeds required. To cope with three-dimensional geometries, massive data bases with fetch/store conflicts and transposition problems are inevitable. On the software side, codes must be prepared that: (1) can be adapted to complex geometries, (2) can (at the very least) predict the location of laminar and turbulent boundary layer separation, and (3) will converge rapidly to sufficiently accurate solutions.

  3. Past speculations of the future: a review of the methods used for forecasting emerging health technologies

    PubMed Central

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2016-01-01

    Objectives Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3–20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Design Systematic search of the literature to identify studies reporting methods of forecasting in healthcare. Participants No human participants were involved in this study. Data sources The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Main outcome measure Studies reporting methods used to predict future health technologies within a 3–20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Results 15 studies met our inclusion criteria. Our results showed that the majority of studies (13/15) consulted experts either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. Conclusions The methodological fundamentals of formal 3–20-year prediction are consistent but vary in details. Further research needs to be conducted to ascertain if the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. PMID:26966060

  4. Predictive Monitoring for Improved Management of Glucose Levels

    PubMed Central

    Reifman, Jaques; Rajaraman, Srinivasan; Gribok, Andrei; Ward, W. Kenneth

    2007-01-01

    Background Recent developments and expected near-future improvements in continuous glucose monitoring (CGM) devices provide opportunities to couple them with mathematical forecasting models to produce predictive monitoring systems for early, proactive glycemia management of diabetes mellitus patients before glucose levels drift to undesirable levels. This article assesses the feasibility of data-driven models to serve as the forecasting engine of predictive monitoring systems. Methods We investigated the capabilities of data-driven autoregressive (AR) models to (1) capture the correlations in glucose time-series data, (2) make accurate predictions as a function of prediction horizon, and (3) be made portable from individual to individual without any need for model tuning. The investigation is performed by employing CGM data from nine type 1 diabetic subjects collected over a continuous 5-day period. Results With CGM data serving as the gold standard, AR model-based predictions of glucose levels assessed over nine subjects with Clarke error grid analysis indicated that, for a 30-minute prediction horizon, individually tuned models yield 97.6 to 100.0% of data in the clinically acceptable zones A and B, whereas cross-subject, portable models yield 95.8 to 99.7% of data in zones A and B. Conclusions This study shows that, for a 30-minute prediction horizon, data-driven AR models provide sufficiently-accurate and clinically-acceptable estimates of glucose levels for timely, proactive therapy and should be considered as the modeling engine for predictive monitoring of patients with type 1 diabetes mellitus. It also suggests that AR models can be made portable from individual to individual with minor performance penalties, while greatly reducing the burden associated with model tuning and data collection for model development. PMID:19885110
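
    A univariate AR forecaster of the kind evaluated in this record can be sketched as follows; the model order, least-squares fitting route, and function names are illustrative assumptions rather than the authors' code:

```python
import numpy as np

def fit_ar(y, k=3):
    """Least-squares fit of AR(k): y[t] = c1*y[t-1] + ... + ck*y[t-k]."""
    rows = np.array([y[t - k:t][::-1] for t in range(k, len(y))])
    coef, *_ = np.linalg.lstsq(rows, y[k:], rcond=None)
    return coef

def predict_ahead(y, coef, p):
    """Roll the fitted model forward p steps past the end of y,
    feeding each prediction back in as the newest sample."""
    hist = list(y[-len(coef):])
    preds = []
    for _ in range(p):
        nxt = float(np.dot(coef, hist[::-1][:len(coef)]))
        hist.append(nxt)
        preds.append(nxt)
    return preds
```

    A "portable" cross-subject model in this framing would simply reuse `coef` fitted on one subject's data to forecast another's.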

  5. Machine Learning and Neurosurgical Outcome Prediction: A Systematic Review.

    PubMed

    Senders, Joeky T; Staples, Patrick C; Karhade, Aditya V; Zaki, Mark M; Gormley, William B; Broekman, Marike L D; Smith, Timothy R; Arnaout, Omar

    2018-01-01

    Accurate measurement of surgical outcomes is highly desirable to optimize surgical decision-making. An important element of surgical decision-making is the identification of the patient cohort that will benefit from surgery before the intervention. Machine learning (ML) enables computers to learn from previous data to make accurate predictions on new data. In this systematic review, we evaluate the potential of ML for neurosurgical outcome prediction. A systematic search in the PubMed and Embase databases was performed to identify all potential relevant studies up to January 1, 2017. Thirty studies were identified that evaluated ML algorithms used as prediction models for survival, recurrence, symptom improvement, and adverse events in patients undergoing surgery for epilepsy, brain tumor, spinal lesions, neurovascular disease, movement disorders, traumatic brain injury, and hydrocephalus. Depending on the specific prediction task evaluated and the type of input features included, ML models predicted outcomes after neurosurgery with a median accuracy and area under the receiver operating characteristic curve of 94.5% and 0.83, respectively. Compared with logistic regression, ML models performed significantly better and showed a median absolute improvement in accuracy and area under the receiver operating characteristic curve of 15% and 0.06, respectively. Some studies also demonstrated a better performance in ML models compared with established prognostic indices and clinical experts. In the research setting, ML has been studied extensively, demonstrating an excellent performance in outcome prediction for a wide range of neurosurgical conditions. However, future studies should investigate how ML can be implemented as a practical tool supporting neurosurgical care. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Understanding the evolution and propagation of coronal mass ejections and associated plasma sheaths in interplanetary space

    NASA Astrophysics Data System (ADS)

    Hess, Phillip

    A Coronal Mass Ejection (CME) is an eruption of magnetized plasma from the corona of the Sun. Understanding the physical process of CMEs is a fundamental challenge in solar physics, and is also of increasing importance for our technological society. CMEs are known as the main driver of space weather, which has adverse effects on satellites, power grids, communication and navigation systems, and astronauts. Understanding and predicting CMEs is still in an early stage of research. In this dissertation, improved observational methods and advanced theoretical analysis are used to study CMEs. Unlike many studies in the past that treat CMEs as a single object, this study divides a CME into two separate components: the ejecta from the corona and the sheath region, the ambient plasma compressed by the shock/wave running ahead of the ejecta; both structures are geo-effective but evolve differently. Stereoscopic observations from multiple spacecraft, including STEREO and SOHO, are combined to provide a three-dimensional geometric reconstruction of the structures studied. True distances and velocities of CMEs are accurately determined, free of projection effects, and with continuous tracking from the low corona to 1 AU. To understand the kinematic evolution of CMEs, an advanced drag-based model (DBM) is proposed, with several improvements to the original DBM. First, the new model varies the drag parameter with distance; the variation is constrained by the necessary conservation of physical parameters. Second, the deviation of the CME nose from the Sun-Earth line is taken into account. Third, a geometric correction of the shape of the ejecta front is applied, based on the assumption that the true front is a flattened, croissant-shaped flux-rope front. These improvements of the DBM provide a framework for using measurement data to make accurate predictions of the arrival times of CME ejecta and sheaths. 
Using a set of seven events to test the model, it is found that the evolution of the ejecta front can be accurately predicted, with a slightly poorer performance on the sheath front. To improve the sheath prediction, the standoff distance between the ejecta and the sheath front is used to model the evolution. The predicted arrivals of the sheath and ejecta fronts at Earth are determined to within an average of 3.5 hours and 1.5 hours of the observed arrivals, respectively. These prediction errors show a significant improvement over predictions made by other researchers. The results of this dissertation demonstrate that accurate space weather prediction is possible, and also reveal what observations will be needed in the future for realistic operational space weather prediction.
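
    For context, the original drag-based model in the literature evolves the CME speed v toward the ambient solar-wind speed w via dv/dt = -γ|v − w|(v − w). The sketch below integrates that constant-γ form with forward Euler; the starting distance, speeds, and γ value are illustrative assumptions, and the dissertation's variant additionally lets γ vary with distance:

```python
import numpy as np

AU_KM = 1.496e8       # 1 astronomical unit in km
RSUN_KM = 6.957e5     # solar radius in km

def propagate_dbm(r0_km, v0, w=400.0, gamma=2e-8, dt=600.0):
    """Forward-Euler integration of the constant-gamma drag-based model
    dv/dt = -gamma * |v - w| * (v - w), with r in km, speeds in km/s,
    gamma in km^-1, and dt in seconds.  Returns (transit time in hours,
    arrival speed in km/s) when the front reaches 1 AU."""
    r, v, t = float(r0_km), float(v0), 0.0
    while r < AU_KM:
        a = -gamma * abs(v - w) * (v - w)   # drag toward the wind speed w
        v += a * dt
        r += v * dt
        t += dt
    return t / 3600.0, v
```

    A fast CME launched at 1000 km/s from 20 solar radii decelerates toward the wind speed and arrives at Earth a few days later in this toy setup.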

  7. Experimental Evaluation of Acoustic Engine Liner Models Developed with COMSOL Multiphysics

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Jones, Michael G.; Bertolucci, Brandon

    2017-01-01

    Accurate modeling tools are needed to design new engine liners capable of reducing aircraft noise. The purpose of this study is to determine if a commercially-available finite element package, COMSOL Multiphysics, can be used to accurately model a range of different acoustic engine liner designs, and in the process, collect and document a benchmark dataset that can be used in both current and future code evaluation activities. To achieve these goals, a variety of liner samples, ranging from conventional perforate-over-honeycomb to extended-reaction designs, were installed in one wall of the grazing flow impedance tube at the NASA Langley Research Center. The liners were exposed to high sound pressure levels and grazing flow, and the effect of the liner on the sound field in the flow duct was measured. These measurements were then compared with predictions. While this report only includes comparisons for a subset of the configurations, the full database of all measurements and predictions is available in electronic format upon request. The results demonstrate that both conventional perforate-over-honeycomb and extended-reaction liners can be accurately modeled using COMSOL. Therefore, this modeling tool can be used with confidence to supplement the current suite of acoustic propagation codes, and ultimately develop new acoustic engine liners designed to reduce aircraft noise.

  8. Should coastal planners have concern over where land ice is melting?

    PubMed Central

    Larour, Eric; Ivins, Erik R.; Adhikari, Surendra

    2017-01-01

    There is a general consensus among Earth scientists that melting of land ice greatly contributes to sea-level rise (SLR) and that future warming will exacerbate the risks posed to human civilization. As land ice is lost to the oceans, both the Earth’s gravitational and rotational potentials are perturbed, resulting in strong spatial patterns in SLR, termed sea-level fingerprints. We lack robust forecasting models for future ice changes, which diminishes our ability to use these fingerprints to accurately predict local sea-level (LSL) changes. We exploit an advanced mathematical property of adjoint systems and determine the exact gradient of sea-level fingerprints with respect to local variations in the ice thickness of all of the world’s ice drainage systems. By exhaustively mapping these fingerprint gradients, we form a new diagnosis tool, henceforth referred to as gradient fingerprint mapping (GFM), that readily allows for improved assessments of future coastal inundation or emergence. We demonstrate that for Antarctica and Greenland, changes in the predictions of inundation at major port cities depend on the location of the drainage system. For example, in London, GFM shows LSL that is significantly affected by changes on the western part of the Greenland Ice Sheet (GrIS), whereas in New York, LSL change predictions are greatly sensitive to changes in the northeastern portions of the GrIS. We apply GFM to 293 major port cities to allow coastal planners to readily calculate LSL change as more reliable predictions of cryospheric mass changes become available. PMID:29152565

  9. Forecasting irregular variations of UT1-UTC and LOD data caused by ENSO

    NASA Astrophysics Data System (ADS)

    Niedzielski, T.; Kosek, W.

    2008-04-01

    The research focuses on prediction of LOD and UT1-UTC time series up to one year into the future, with particular emphasis on prediction improvement during El Niño and La Niña events. The polynomial-harmonic least-squares model is applied to fit a deterministic function to LOD data. The stochastic residuals, computed as the difference between the LOD data and the polynomial-harmonic model, reveal extreme values driven by El Niño and La Niña. These peaks are modeled by stochastic bivariate autoregressive prediction. This approach exploits the auto- and cross-correlations between LOD and the axial component of the atmospheric angular momentum. This technique allows one to derive more accurate predictions than purely univariate forecasts, particularly during El Niño/La Niña events.
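
    The deterministic step of this approach amounts to ordinary least squares on polynomial and harmonic basis functions; the degree and periods below are illustrative assumptions, not the tidal/seasonal terms actually used:

```python
import numpy as np

def poly_harmonic_fit(t, y, degree=2, periods=(365.25, 182.625)):
    """Fit y(t) = polynomial trend + sum of sin/cos terms at fixed periods
    by linear least squares; returns the fitted deterministic signal."""
    cols = [t ** d for d in range(degree + 1)]
    for P in periods:
        cols.append(np.sin(2 * np.pi * t / P))
        cols.append(np.cos(2 * np.pi * t / P))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef
```

    The residuals `y - poly_harmonic_fit(t, y)` are what the bivariate autoregressive step would then model.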

  10. Sliding contact fracture of dental ceramics: Principles and validation

    PubMed Central

    Ren, Linlin; Zhang, Yu

    2014-01-01

    Ceramic prostheses are subject to sliding contact under normal and tangential loads. Accurate prediction of the onset of fracture at two contacting surfaces holds the key to greater long-term performance of these prostheses. In this study, building on stress analysis of Hertzian contact and considering fracture criteria for linear elastic materials, a constitutive fracture mechanics relation was developed to incorporate the critical fracture load with the contact geometry, coefficient of friction and material fracture toughness. Critical loads necessary to cause fracture under a sliding indenter were calculated from the constitutive equation, and compared with the loads predicted from elastic stress analysis in conjunction with measured critical load for frictionless normal contact—a semi-empirical approach. The major predictions of the models were calibrated with experimentally determined critical loads of current and future dental ceramics after contact with a rigid spherical slider. Experimental results conform with the trends predicted by the models. PMID:24632538

  11. Constructing high-accuracy intermolecular potential energy surface with multi-dimension Morse/Long-Range model

    NASA Astrophysics Data System (ADS)

    Zhai, Yu; Li, Hui; Le Roy, Robert J.

    2018-04-01

    Spectroscopically accurate Potential Energy Surfaces (PESs) are fundamental for explaining and making predictions of the infrared and microwave spectra of van der Waals (vdW) complexes, and the model used for the potential energy function is critically important for providing accurate, robust and portable analytical PESs. The Morse/Long-Range (MLR) model has proved to be one of the most general, flexible and accurate one-dimensional (1D) model potentials, as it has physically meaningful parameters, is flexible, smooth and differentiable everywhere, to all orders and extrapolates sensibly at both long and short ranges. The Multi-Dimensional Morse/Long-Range (mdMLR) potential energy model described herein is based on that 1D MLR model, and has proved to be effective and accurate in the potentiology of various types of vdW complexes. In this paper, we review the current status of development of the mdMLR model and its application to vdW complexes. The future of the mdMLR model is also discussed. This review can serve as a tutorial for the construction of an mdMLR PES.

  12. Water Level Prediction of Lake Cascade Mahakam Using Adaptive Neural Network Backpropagation (ANNBP)

    NASA Astrophysics Data System (ADS)

    Mislan; Gaffar, A. F. O.; Haviluddin; Puspitasari, N.

    2018-04-01

    Natural-hazard information on flood events is indispensable for prevention and mitigation. One cause of flooding is rising water in the areas around a lake; forecasting the lake's water level is therefore required to anticipate floods. The purpose of this paper is to implement a computational intelligence method, namely Adaptive Neural Network Backpropagation (ANNBP), to forecast the water level of Lake Cascade Mahakam. Based on the experiments, the performance of ANNBP indicated that the lake water level predictions were accurate, as measured by mean square error (MSE) and mean absolute percentage error (MAPE). In other words, the computational intelligence method can produce good accuracy. A hybrid and optimized computational intelligence approach is the focus of future work.
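
    The two accuracy metrics named in this record have standard definitions, sketched here with hypothetical function names:

```python
import numpy as np

def mse(actual, predicted):
    """Mean square error."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((a - p) ** 2))

def mape(actual, predicted):
    """Mean absolute percentage error; actual values must be nonzero."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((a - p) / a)) * 100.0)
```

    MSE penalizes large errors quadratically, while MAPE reports error relative to the observed water level, which makes it comparable across gauges.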

  13. Forecasting vegetation greenness with satellite and climate data

    USGS Publications Warehouse

    Ji, Lei; Peters, Albert J.

    2004-01-01

    A new and unique vegetation greenness forecast (VGF) model was designed to predict future vegetation conditions up to three months ahead through the use of current and historical climate data and satellite imagery. The VGF model is implemented through a seasonality-adjusted autoregressive distributed-lag function, based on our finding that the normalized difference vegetation index is highly correlated with lagged precipitation and temperature. Accurate forecasts were obtained from the VGF model in Nebraska grassland and cropland. The regression R2 values range from 0.97 to 0.80 for 2- to 12-week forecasts, with higher R2 associated with shorter prediction horizons. An important application would be to produce real-time forecasts of greenness images.
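
    Seasonality adjustment aside, an autoregressive distributed-lag fit of this kind reduces to linear least squares on lagged predictors. The lag choices and coefficient structure below are illustrative assumptions, not the published VGF specification:

```python
import numpy as np

def fit_adl(ndvi, precip, temp, p_lag=4, t_lag=2):
    """Least-squares fit of a simple autoregressive distributed-lag model:
    ndvi[t] = a*ndvi[t-1] + b*precip[t-p_lag] + c*temp[t-t_lag] + d."""
    n = len(ndvi)
    start = max(1, p_lag, t_lag)
    A = np.column_stack([
        ndvi[start - 1 : n - 1],            # one-step autoregressive term
        precip[start - p_lag : n - p_lag],  # lagged precipitation
        temp[start - t_lag : n - t_lag],    # lagged temperature
        np.ones(n - start),                 # intercept
    ])
    coef, *_ = np.linalg.lstsq(A, ndvi[start:], rcond=None)
    return coef
```

    With the coefficients in hand, the same lagged inputs can be rolled forward to produce the multi-week greenness forecasts the record describes.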

  14. Modeling behavioral thermoregulation in a climate change sentinel.

    PubMed

    Moyer-Horner, Lucas; Mathewson, Paul D; Jones, Gavin M; Kearney, Michael R; Porter, Warren P

    2015-12-01

    When possible, many species will shift in elevation or latitude in response to rising temperatures. However, before such shifts occur, individuals will first tolerate environmental change and then modify their behavior to maintain heat balance. Behavioral thermoregulation allows animals a range of climatic tolerances and makes predicting geographic responses under future warming scenarios challenging. Because behavioral modification may reduce an individual's fecundity by, for example, limiting foraging time and thus caloric intake, we must consider the range of behavioral options available for thermoregulation to accurately predict climate change impacts on individual species. To date, few studies have identified mechanistic links between an organism's daily activities and the need to thermoregulate. We used a biophysical model, Niche Mapper, to mechanistically model microclimate conditions and thermoregulatory behavior for a temperature-sensitive mammal, the American pika (Ochotona princeps). Niche Mapper accurately simulated microclimate conditions, as well as empirical metabolic chamber data for a range of fur properties, animal sizes, and environmental parameters. Niche Mapper predicted pikas would be behaviorally constrained because of the need to thermoregulate during the hottest times of the day. We also showed that pikas at low elevations could receive energetic benefits by being smaller in size and maintaining summer pelage during longer stretches of the active season under a future warming scenario. We observed pika behavior for 288 h in Glacier National Park, Montana, and thermally characterized their rocky, montane environment. We found that pikas were most active when temperatures were cooler, and at sites characterized by high elevations and north-facing slopes. Pikas became significantly less active across a suite of behaviors in the field when temperatures surpassed 20°C, which supported a metabolic threshold predicted by Niche Mapper. 
In general, mechanistic predictions and empirical observations were congruent. This research is unique in providing both an empirical and mechanistic description of the effects of temperature on a mammalian sentinel of climate change, the American pika. Our results suggest that previously underinvestigated characteristics, specifically fur properties and body size, may play critical roles in pika populations' response to climate change. We also demonstrate the potential importance of considering behavioral thermoregulation and microclimate variability when predicting animal responses to climate change.

  15. NASA and CFD - Making investments for the future

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.; Richardson, P. F.

    1992-01-01

    From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.

  16. Self-perception of competencies in adolescents with autism spectrum disorders.

    PubMed

    Furlano, Rosaria; Kelley, Elizabeth A; Hall, Layla; Wilson, Daryl E

    2015-12-01

    Research has demonstrated that, despite difficulties in multiple domains, children with autism spectrum disorders (ASD) show a lack of awareness of these difficulties. A misunderstanding of poor competencies may make it difficult for individuals to adjust their behaviour in accordance with feedback and may lead to greater impairments over time. This study examined self-perceptions of adolescents with ASD (n = 19) and typically developing (TD) mental-age-matched controls (n = 22) using actual performance on objective academic tasks as the basis for ratings. Before completing the tasks, participants were asked how well they thought they would do (pre-task prediction). After completing each task, they were asked how well they thought they did (immediate post-performance) and how well they would do in the future (hypothetical future post-performance). Adolescents with ASD had more positively biased self-perceptions of competence than TD controls. The ASD group tended to overestimate their performance on all ratings of self-perceptions (pre-task prediction, immediate, and hypothetical future post-performance). In contrast, while the TD group was quite accurate at estimating their performance immediately before and after performing the task, they showed some tendency to overestimate their future performance. Future investigation is needed to systematically examine possible mechanisms that may be contributing to these biased self-perceptions. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.

  17. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods that are variable importance (VI), averaged variable importance (AVI), knowledge informed AVI (KIAVI), Boruta and regularized RF (RRF) were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. 
Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and caution should be taken when applying filter FS methods in selecting predictive models. PMID:26890307
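
    Of the feature-selection ideas listed, variable importance is often computed by permutation: shuffle one predictor column and measure the drop in accuracy. A generic sketch follows, with `model_fn` standing in for any fitted classifier (it and the parameters are assumptions, not the authors' RF pipeline):

```python
import numpy as np

def permutation_importance(model_fn, X, y, n_repeats=5, seed=0):
    """Importance of each feature as the drop in accuracy when that column
    is shuffled; a larger drop means a more important predictor.
    model_fn(X) must return predicted class labels for the rows of X."""
    rng = np.random.default_rng(seed)
    base = np.mean(model_fn(X) == y)
    importance = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        accs = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])           # break the feature-target link
            accs.append(np.mean(model_fn(Xp) == y))
        importance[j] = base - np.mean(accs)
    return importance
```

    Ranking features by this score and refitting on the top-ranked subset is one simple way to realize the AVI-style selection the study recommends.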

  18. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta, and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency, and caution should be taken when applying filter FS methods in selecting predictive models.
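To illustrate the averaged variable importance (AVI) idea described in this record, the sketch below averages random forest importance scores over repeated fits with different seeds. This is a hypothetical illustration, not the authors' program: the dataset is synthetic and all names are invented, assuming scikit-learn is available.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic data: only feature 0 carries signal (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = (X[:, 0] > 0).astype(int)

def averaged_variable_importance(X, y, n_repeats=5):
    """AVI: mean of RF importance scores over repeated fits with different seeds."""
    importances = []
    for seed in range(n_repeats):
        rf = RandomForestClassifier(n_estimators=100, random_state=seed)
        rf.fit(X, y)
        importances.append(rf.feature_importances_)
    return np.mean(importances, axis=0)

avi = averaged_variable_importance(X, y)
print(avi.argmax())  # the informative feature ranks first
```

Averaging over seeds damps the run-to-run variability of single-fit importance rankings, which is the motivation for AVI over plain VI.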

  19. Coupling centennial-scale shoreline change to sea-level rise and coastal morphology in the Gulf of Mexico using a Bayesian network

    USGS Publications Warehouse

    Plant, Nathaniel G.

    2016-01-01

    Predictions of coastal evolution driven by episodic and persistent processes associated with storms and relative sea-level rise (SLR) are required to test our understanding, evaluate our predictive capability, and to provide guidance for coastal management decisions. Previous work demonstrated that the spatial variability of long-term shoreline change can be predicted using observed SLR rates, tide range, wave height, coastal slope, and a characterization of the geomorphic setting. The shoreline is not sufficient to indicate which processes are important in causing shoreline change, such as overwash that depends on coastal dune elevations. Predicting dune height is intrinsically important to assess future storm vulnerability. Here, we enhance shoreline-change predictions by including dune height as a variable in a statistical modeling approach. Dune height can also be used as an input variable, but it does not improve the shoreline-change prediction skill. Dune-height input does help to reduce prediction uncertainty. That is, by including dune height, the prediction is more precise but not more accurate. Comparing hindcast evaluations, better predictive skill was found when predicting dune height (0.8) compared with shoreline change (0.6). The skill depends on the level of detail of the model and we identify an optimized model that has high skill and minimal overfitting. The predictive model can be implemented with a range of forecast scenarios, and we illustrate the impacts of a higher future sea level. This scenario shows that the shoreline change becomes increasingly erosional and more uncertain. Predicted dune heights are lower and the dune height uncertainty decreases.

  20. Prospective and retrospective episodic metamemory in posttraumatic stress disorder.

    PubMed

    Sacher, Mathilde; Tudorache, Andrei-Cristian; Clarys, David; Boudjarane, Mohamed; Landré, Lionel; El-Hage, Wissam

    2018-03-14

    Posttraumatic stress disorder (PTSD) has been consistently associated with episodic memory deficits. To some extent, these deficits could be related to an impairment of metamemory in individuals with PTSD. This research consequently aims at investigating prospective (feeling-of-knowing, FOK) and retrospective (confidence) metamemory judgments for episodic information in PTSD. Twenty participants with PTSD and without depression were compared to 30 healthy comparison participants on metamemory judgments during an episodic memory task. The concordance between metamemory judgments and recognition performance was then assessed by gamma correlations. The results confirmed that PTSD is associated with episodic memory impairment. Regarding metamemory, gamma correlations indicated that participants with PTSD failed to accurately predict their future memory performance as compared to the comparison group (mean FOK gamma correlations: .23 vs. .42, respectively). Furthermore, participants with PTSD made less accurate confidence judgments than comparison participants (mean confidence gamma correlations: .62 vs. .74, respectively). Our results demonstrate an alteration of both prospective and retrospective metamemory processes in PTSD, which could be of particular relevance to future therapeutic interventions focusing on metacognitive strategies.
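The gamma correlations reported in this record are Goodman-Kruskal gammas between metamemory judgments and later recognition outcomes. A minimal stdlib-only sketch of the statistic, using made-up ratings rather than the study's data:

```python
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    """Gamma = (C - D) / (C + D) over concordant/discordant pairs, ignoring ties."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        dx, dy = x1 - x2, y1 - y2
        if dx == 0 or dy == 0:   # tied pair: excluded from gamma
            continue
        if dx * dy > 0:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical FOK ratings vs. later recognition scores:
print(goodman_kruskal_gamma([1, 2, 3, 4], [1, 3, 2, 4]))  # → 0.666...
```

A gamma near 1 means judgments track performance closely (as in the healthy comparison group); values near 0 indicate poor metamemory accuracy.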

  1. Potential Improvements in Space Weather Forecasting using New Products Developed for the Upcoming DSCOVR Solar Wind Mission

    NASA Astrophysics Data System (ADS)

    Cash, M. D.; Biesecker, D. A.; Reinard, A. A.

    2013-05-01

    The Deep Space Climate Observatory (DSCOVR) mission, which is scheduled for launch in late 2014, will provide real-time solar wind thermal plasma and magnetic measurements to ensure continuous monitoring for space weather forecasting. DSCOVR will be located at the L1 Lagrangian point and will include a Faraday cup to measure the proton and alpha components of the solar wind and a triaxial fluxgate magnetometer to measure the magnetic field in three dimensions. The real-time data provided by DSCOVR will be used to generate space weather applications and products that have been demonstrated to be highly accurate and provide actionable information for customers. We present several future space weather products currently under evaluation for development. New potential space weather products for use with DSCOVR real-time data include: automated shock detection, more accurate L1 to Earth delay time, automatic solar wind regime identification, and prediction of rotations in solar wind Bz within magnetic clouds. Additional ideas from the community on future space weather products are encouraged.

  2. Accuracy of risk scales for predicting repeat self-harm and suicide: a multicentre, population-level cohort study using routine clinical data.

    PubMed

    Steeg, Sarah; Quinlivan, Leah; Nowland, Rebecca; Carroll, Robert; Casey, Deborah; Clements, Caroline; Cooper, Jayne; Davies, Linda; Knipe, Duleeka; Ness, Jennifer; O'Connor, Rory C; Hawton, Keith; Gunnell, David; Kapur, Nav

    2018-04-25

    Risk scales are used widely in the management of patients presenting to hospital following self-harm. However, there is evidence that their diagnostic accuracy in predicting repeat self-harm is limited. Their predictive accuracy in population settings, and in identifying those at highest risk of suicide is not known. We compared the predictive accuracy of the Manchester Self-Harm Rule (MSHR), ReACT Self-Harm Rule (ReACT), SAD PERSONS Scale (SPS) and Modified SAD PERSONS Scale (MSPS) in an unselected sample of patients attending hospital following self-harm. Data on 4000 episodes of self-harm presenting to Emergency Departments (ED) between 2010 and 2012 were obtained from four established monitoring systems in England. Episodes were assigned a risk category for each scale and followed up for 6 months. The episode-based repeat rate was 28% (1133/4000) and the incidence of suicide was 0.5% (18/3962). The MSHR and ReACT performed with high sensitivity (98% and 94% respectively) and low specificity (15% and 23%). The SPS and the MSPS performed with relatively low sensitivity (24-29% and 9-12% respectively) and high specificity (76-77% and 90%). The area under the curve was 71% for both MSHR and ReACT, 51% for SPS and 49% for MSPS. Differences in predictive accuracy by subgroup were small. The scales were less accurate at predicting suicide than repeat self-harm. The scales failed to accurately predict repeat self-harm and suicide. The findings support existing clinical guidance not to use risk classification scales alone to determine treatment or predict future risk.
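The sensitivity and specificity figures compared in this record come from standard 2x2 contingency counts of scale classification against observed repeat self-harm. A quick sketch with invented counts (not the study's data):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table of predictions vs. outcomes."""
    sensitivity = tp / (tp + fn)   # proportion of repeat episodes flagged high risk
    specificity = tn / (tn + fp)   # proportion of non-repeats flagged low risk
    return sensitivity, specificity

# Hypothetical counts for a high-sensitivity, low-specificity scale:
sens, spec = diagnostic_metrics(tp=90, fn=10, tn=30, fp=70)
print(sens, spec)  # → 0.9 0.3
```

The trade-off in the abstract (MSHR/ReACT high sensitivity but low specificity, SPS/MSPS the reverse) is exactly this pair of ratios moving in opposite directions as the classification threshold changes.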

  3. Spatio-temporal environmental variation mediates geographical differences in phenotypic responses to ocean acidification

    PubMed Central

    Villanueva, Paola A.; Lopez, Jorge; Torres, Rodrigo; Navarro, Jorge M.; Bacigalupe, Leonardo D.

    2017-01-01

    Phenotypic plasticity is expected to play a major adaptive role in the response of species to ocean acidification (OA), by providing broader tolerances to changes in pCO2 conditions. However, tolerances and sensitivities to future OA may differ among populations within a species because of their particular environmental context and genetic backgrounds. Here, using the climatic variability hypothesis (CVH), we explored this conceptual framework in populations of the sea urchin Loxechinus albus across natural fluctuating pCO2/pH environments. Although elevated pCO2 affected the morphology, physiology, development and survival of sea urchin larvae, the magnitude of these effects differed among populations. These differences were consistent with the predictions of the CVH showing greater tolerance to OA in populations experiencing greater local variation in seawater pCO2/pH. Considering geographical differences in plasticity, tolerances and sensitivities to increased pCO2 will provide more accurate predictions for species responses to future OA. PMID:28179409

  4. Spatio-temporal environmental variation mediates geographical differences in phenotypic responses to ocean acidification.

    PubMed

    Gaitán-Espitia, Juan Diego; Villanueva, Paola A; Lopez, Jorge; Torres, Rodrigo; Navarro, Jorge M; Bacigalupe, Leonardo D

    2017-02-01

    Phenotypic plasticity is expected to play a major adaptive role in the response of species to ocean acidification (OA), by providing broader tolerances to changes in pCO2 conditions. However, tolerances and sensitivities to future OA may differ among populations within a species because of their particular environmental context and genetic backgrounds. Here, using the climatic variability hypothesis (CVH), we explored this conceptual framework in populations of the sea urchin Loxechinus albus across natural fluctuating pCO2/pH environments. Although elevated pCO2 affected the morphology, physiology, development and survival of sea urchin larvae, the magnitude of these effects differed among populations. These differences were consistent with the predictions of the CVH showing greater tolerance to OA in populations experiencing greater local variation in seawater pCO2/pH. Considering geographical differences in plasticity, tolerances and sensitivities to increased pCO2 will provide more accurate predictions for species responses to future OA. © 2017 The Author(s).

  5. Rising sea levels will reduce extreme temperature variations in tide-dominated reef habitats

    PubMed Central

    Lowe, Ryan Joseph; Pivan, Xavier; Falter, James; Symonds, Graham; Gruber, Renee

    2016-01-01

    Temperatures within shallow reefs often differ substantially from those in the surrounding ocean; therefore, predicting future patterns of thermal stresses and bleaching at the scale of reefs depends on accurately predicting reef heat budgets. We present a new framework for quantifying how tidal and solar heating cycles interact with reef morphology to control diurnal temperature extremes within shallow, tidally forced reefs. Using data from northwestern Australia, we construct a heat budget model to investigate how frequency differences between the dominant lunar semidiurnal tide and diurnal solar cycle drive ~15-day modulations in diurnal temperature extremes. The model is extended to show how reefs with tidal amplitudes comparable to their depth, relative to mean sea level, tend to experience the largest temperature extremes globally. As a consequence, we reveal how even a modest sea level rise can substantially reduce temperature extremes within tide-dominated reefs, thereby partially offsetting the local effects of future ocean warming. PMID:27540589
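The ~15-day modulation described in this record arises from the beat between the lunar semidiurnal (M2) tide and the daily solar heating cycle: their relative phase realigns roughly every 14.8 days. A back-of-envelope check, using standard tidal constants rather than values from the paper:

```python
# Beat period between the half-day solar cycle (12 h) and the M2 tide (12.4206 h).
P_SOLAR_HALF_DAY = 12.0   # hours
P_M2_TIDE = 12.4206       # hours (lunar semidiurnal constituent)

beat_hours = 1.0 / abs(1.0 / P_SOLAR_HALF_DAY - 1.0 / P_M2_TIDE)
beat_days = beat_hours / 24.0
print(round(beat_days, 2))  # → 14.77
```

This is the same arithmetic behind the familiar ~14.8-day spring-neap cycle; here it sets how often low tide coincides with peak solar heating, which drives the modulation in reef temperature extremes.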

  6. Algorithm aversion: people erroneously avoid algorithms after seeing them err.

    PubMed

    Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade

    2015-02-01

    Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.

  7. Dawn Orbit Determination Team: Trajectory and Gravity Prediction Performance During Vesta Science Phases

    NASA Technical Reports Server (NTRS)

    Kennedy, Brian; Abrahamson, Matt; Ardito, Alessandro; Han, Dongsuk; Haw, Robert; Mastrodemos, Nicholas; Nandi, Sumita; Park, Ryan; Rush, Brian; Vaughan, Andrew

    2013-01-01

    The Dawn spacecraft was launched on September 27th, 2007. Its mission is to consecutively rendezvous with and observe the two largest bodies in the asteroid belt, Vesta and Ceres. It has already completed over a year's worth of direct observations of Vesta (spanning from early 2011 through late 2012) and is currently on a cruise trajectory to Ceres, where it will begin scientific observations in mid-2015. Achieving this data collection required careful planning and execution from all spacecraft teams. Dawn's Orbit Determination (OD) team was tasked with accurately predicting the trajectory of the Dawn spacecraft during the Vesta science phases, and also determining the parameters of Vesta to support future science orbit design. The future orbits included the upcoming science phase orbits as well as the transfer orbits between science phases. In all, five science phases were executed at Vesta, and this paper will describe some of the OD team contributions to the planning and execution of those phases.

  8. A Deep Learning Framework for Robust and Accurate Prediction of ncRNA-Protein Interactions Using Evolutionary Information.

    PubMed

    Yi, Hai-Cheng; You, Zhu-Hong; Huang, De-Shuang; Li, Xiao; Jiang, Tong-Hai; Li, Li-Ping

    2018-06-01

    The interactions between non-coding RNAs (ncRNAs) and proteins play an important role in many biological processes, and their biological functions are primarily achieved by binding with a variety of proteins. High-throughput biological techniques are used to identify protein molecules bound with specific ncRNA, but they are usually expensive and time consuming. Deep learning provides a powerful solution to computationally predict RNA-protein interactions. In this work, we propose the RPI-SAN model by using the deep-learning stacked auto-encoder network to mine the hidden high-level features from RNA and protein sequences and feed them into a random forest (RF) model to predict ncRNA binding proteins. Stacked assembling is further used to improve the accuracy of the proposed method. Four benchmark datasets, including RPI2241, RPI488, RPI1807, and NPInter v2.0, were employed for the unbiased evaluation of five established prediction tools: RPI-Pred, IPMiner, RPISeq-RF, lncPro, and RPI-SAN. The experimental results show that our RPI-SAN model achieves much better performance than other methods, with accuracies of 90.77%, 89.7%, 96.1%, and 99.33%, respectively. It is anticipated that RPI-SAN can be used as an effective computational tool for future biomedical research and can accurately predict potential ncRNA-protein interaction pairs, which provides reliable guidance for biological research. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  9. Predicting survival time in noncurative patients with advanced cancer: a prospective study in China.

    PubMed

    Cui, Jing; Zhou, Lingjun; Wee, B; Shen, Fengping; Ma, Xiuqiang; Zhao, Jijun

    2014-05-01

    Accurate prediction of prognosis for cancer patients is important for good clinical decision making in therapeutic and care strategies. The application of prognostic tools and indicators could improve prediction accuracy. This study aimed to develop a new prognostic scale to predict survival time of advanced cancer patients in China. We prospectively collected items that we anticipated might influence survival time of advanced cancer patients. Participants were recruited from 12 hospitals in Shanghai, China. We collected data including demographic information, clinical symptoms and signs, and biochemical test results. Log-rank tests, Cox regression, and linear regression were performed to develop a prognostic scale. Three hundred twenty patients with advanced cancer were recruited. Fourteen prognostic factors were included in the prognostic scale: Karnofsky Performance Scale (KPS) score, pain, ascites, hydrothorax, edema, delirium, cachexia, white blood cell (WBC) count, hemoglobin, sodium, total bilirubin, direct bilirubin, aspartate aminotransferase (AST), and alkaline phosphatase (ALP) values. The score was calculated by summing the partial scores, ranging from 0 to 30. When using the cutoff points of 7-day, 30-day, 90-day, and 180-day survival time, the scores were calculated as 12, 10, 8, and 6, respectively. We propose a new prognostic scale including KPS, pain, ascites, hydrothorax, edema, delirium, cachexia, WBC count, hemoglobin, sodium, total bilirubin, direct bilirubin, AST, and ALP values, which may help guide physicians in predicting the likely survival time of cancer patients more accurately. More studies are needed to validate this scale in the future.
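The scale in this record maps a summed score (0-30) to survival-time bands via the cutoffs 12, 10, 8, and 6. The sketch below implements that mapping; the direction (higher score, shorter expected survival) is our reading of the abstract, and the band labels are ours, not the authors':

```python
def survival_band(total_score: int) -> str:
    """Map a summed prognostic score (0-30) to a predicted survival band."""
    if total_score >= 12:
        return "<=7 days"
    if total_score >= 10:
        return "<=30 days"
    if total_score >= 8:
        return "<=90 days"
    if total_score >= 6:
        return "<=180 days"
    return ">180 days"

print(survival_band(13), survival_band(9), survival_band(3))
```

In practice the total score would be the sum of the fourteen partial scores (KPS, pain, ascites, and so on) listed in the abstract.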

  10. Computational Prediction of miRNA Genes from Small RNA Sequencing Data

    PubMed Central

    Kang, Wenjing; Friedländer, Marc R.

    2015-01-01

    Next-generation sequencing now for the first time allows researchers to gauge the depth and variation of entire transcriptomes. However, now that rare transcripts present in cells at single copies can be detected, more advanced computational tools are needed to accurately annotate and profile them. microRNAs (miRNAs) are 22-nucleotide small RNAs (sRNAs) that post-transcriptionally reduce the output of protein coding genes. They have established roles in numerous biological processes, including cancers and other diseases. During miRNA biogenesis, the sRNAs are sequentially cleaved from precursor molecules that have a characteristic hairpin RNA structure. The vast majority of new miRNA genes that are discovered are mined from small RNA sequencing (sRNA-seq), which can detect more than a billion RNAs in a single run. However, given that many of the detected RNAs are degradation products from all types of transcripts, the accurate identification of miRNAs remains a non-trivial computational problem. Here, we review the tools available to predict animal miRNAs from sRNA sequencing data. We present tools for generalist and specialist use cases, including prediction from massively pooled data or in species without reference genome. We also present wet-lab methods used to validate predicted miRNAs, and approaches to computationally benchmark prediction accuracy. For each tool, we reference validation experiments and benchmarking efforts. Last, we discuss the future of the field. PMID:25674563

  11. Ligand and structure-based methodologies for the prediction of the activity of G protein-coupled receptor ligands

    NASA Astrophysics Data System (ADS)

    Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.

    2009-11-01

    Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.

  12. Statistical Approaches for Spatiotemporal Prediction of Low Flows

    NASA Astrophysics Data System (ADS)

    Fangmann, A.; Haberlandt, U.

    2017-12-01

    An adequate assessment of regional climate change impacts on streamflow requires the integration of various sources of information and modeling approaches. This study proposes simple statistical tools for inclusion into model ensembles, which are fast and straightforward in their application, yet able to yield accurate streamflow predictions in time and space. Target variables for all approaches are annual low flow indices derived from a data set of 51 records of average daily discharge for northwestern Germany. The models require input of climatic data in the form of meteorological drought indices, derived from observed daily climatic variables, averaged over the streamflow gauges' catchment areas. Four different modeling approaches are analyzed. All are based on multiple linear regression models that estimate low flows as a function of a set of meteorological indices and/or physiographic and climatic catchment descriptors. For the first method, individual regression models are fitted at each station, predicting annual low flow values from a set of annual meteorological indices, which are subsequently regionalized using a set of catchment characteristics. The second method combines temporal and spatial prediction within a single panel data regression model, allowing estimation of annual low flow values from input of both annual meteorological indices and catchment descriptors. The third and fourth methods represent non-stationary low flow frequency analyses and require fitting of regional distribution functions. Method three involves spatiotemporal prediction of an index value; method four estimates L-moments that adapt the regional frequency distribution to the at-site conditions. The results show that method two outperforms successive prediction in time and space. Method three also shows high performance in the near-future period, but since it relies on a stationary distribution, its application for prediction of far-future changes may be problematic. Spatiotemporal prediction of L-moments appeared highly uncertain for higher-order moments, resulting in unrealistic future low flow values. All in all, the results promote the inclusion of simple statistical methods in climate change impact assessment.
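The per-station approach in this record fits a multiple linear regression from annual meteorological indices to annual low-flow values. A generic least-squares sketch with synthetic data; the indices, coefficients, and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 40
drought_index = rng.normal(size=n_years)   # hypothetical meteorological index
precip_index = rng.normal(size=n_years)    # hypothetical second index

# Synthetic "annual low flow" with known coefficients plus small noise.
low_flow = (0.5 + 2.0 * drought_index - 1.5 * precip_index
            + rng.normal(scale=0.01, size=n_years))

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n_years), drought_index, precip_index])
coeffs, *_ = np.linalg.lstsq(X, low_flow, rcond=None)
print(coeffs)  # close to [0.5, 2.0, -1.5]
```

The regionalization step described in the abstract would then model these fitted coefficients as functions of catchment descriptors, so predictions can be made at ungauged sites.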

  13. Numerical Modeling of Propellant Boil-Off in a Cryogenic Storage Tank

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.; Sass, J. P.; Fesmire, J. E.

    2007-01-01

    A numerical model to predict boil-off of stored propellant in large spherical cryogenic tanks has been developed. Accurate prediction of tank boil-off rates for different thermal insulation systems was the goal of this collaboration effort. The Generalized Fluid System Simulation Program, integrating flow analysis and conjugate heat transfer for solving complex fluid system problems, was used to create the model. Calculation of tank boil-off rate requires simultaneous simulation of heat transfer processes among liquid propellant, vapor ullage space, and tank structure. The reference tank for the boil-off model was the 850,000 gallon liquid hydrogen tank at Launch Complex 39B (LC-39B) at Kennedy Space Center, which is under study for future infrastructure improvements to support the Constellation program. The methodology employed in the numerical model was validated using a sub-scale model and tank. Experimental test data from a 1/15th scale version of the LC-39B tank using both liquid hydrogen and liquid nitrogen were used to anchor the analytical predictions of the sub-scale model. Favorable correlations between sub-scale model and experimental test data have provided confidence in full-scale tank boil-off predictions. These methods are now being used in the preliminary design for other cases including future launch vehicles.
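At its simplest, steady-state boil-off is the heat leak divided by the latent heat of vaporization. The sketch below uses textbook liquid-hydrogen properties and a made-up heat leak, not values from the study, which resolves the much richer coupled heat-transfer problem:

```python
# Steady-state boil-off estimate: mass rate = heat leak / latent heat.
HEAT_LEAK_W = 100.0          # hypothetical steady heat leak into the tank [W]
LH2_LATENT_HEAT = 446_000.0  # latent heat of vaporization of LH2 [J/kg] (approx.)
LH2_DENSITY = 70.8           # liquid hydrogen density [kg/m^3] (approx.)

mass_rate = HEAT_LEAK_W / LH2_LATENT_HEAT   # kg/s boiled off
kg_per_day = mass_rate * 86_400
m3_per_day = kg_per_day / LH2_DENSITY
print(round(kg_per_day, 1), round(m3_per_day, 2))  # → 19.4 0.27
```

The full model adds what this one-liner ignores: ullage-liquid stratification, interfacial mass transfer, and conduction through the tank structure.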

  14. Biomarker Surrogates Do Not Accurately Predict Sputum Eosinophils and Neutrophils in Asthma

    PubMed Central

    Hastie, Annette T.; Moore, Wendy C.; Li, Huashi; Rector, Brian M.; Ortega, Victor E.; Pascual, Rodolfo M.; Peters, Stephen P.; Meyers, Deborah A.; Bleecker, Eugene R.

    2013-01-01

    Background Sputum eosinophils (Eos) are a strong predictor of airway inflammation and exacerbations, and aid asthma management, whereas sputum neutrophils (Neu) indicate a different severe asthma phenotype, potentially less responsive to TH2-targeted therapy. Variables such as blood Eos, total IgE, fractional exhaled nitric oxide (FeNO), or FEV1% predicted may predict airway Eos, while age, FEV1% predicted, or blood Neu may predict sputum Neu. Availability and ease of measurement are useful characteristics, but accuracy in predicting airway Eos and Neu, individually or combined, is not established. Objectives To determine whether blood Eos, FeNO, and IgE accurately predict sputum Eos, and whether age, FEV1% predicted, and blood Neu accurately predict sputum Neu. Methods Subjects in the Wake Forest Severe Asthma Research Program (N=328) were characterized by blood and sputum cells, healthcare utilization, lung function, FeNO, and IgE. Multiple analytical techniques were utilized. Results Despite significant association with sputum Eos, blood Eos, FeNO, and total IgE did not accurately predict sputum Eos, and combinations of these variables failed to improve prediction. Age, FEV1% predicted, and blood Neu were similarly unsatisfactory for prediction of sputum Neu. Factor analysis and stepwise selection found FeNO, IgE, and FEV1% predicted, but not blood Eos, correctly predicted 69% of sputum Eos

  15. Predictive Temperature Equations for Three Sites at the Grand Canyon

    NASA Astrophysics Data System (ADS)

    McLaughlin, Katrina Marie Neitzel

    Climate data collected at a number of automated weather stations were used to create a series of predictive equations spanning December 2009 to May 2010 in order to better predict temperatures along hiking trails within the Grand Canyon. The central focus of this project is how atmospheric variables interact and can be combined to predict the weather in the Grand Canyon at the Indian Gardens, Phantom Ranch, and Bright Angel sites. Through the use of statistical analysis software and data regression, predictive equations were determined. The predictive equations are simple or multivariable best fits that reflect the curvilinear nature of the data. Using data analysis software, curves resulting from the predictive equations were plotted along with the observed data. Each equation's reduced chi-squared was determined to aid the visual examination of the predictive equations' ability to reproduce the observed data. From this information an equation or pair of equations was determined to be the best of the predictive equations. Although a best predictive equation for each month and season was determined for each site, future work may refine the equations to yield more accurate predictions.
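The reduced chi-squared used above to compare candidate predictive equations is the weighted sum of squared residuals divided by the degrees of freedom. A minimal sketch with made-up observations, not the Grand Canyon data:

```python
def reduced_chi_squared(observed, predicted, sigma, n_params):
    """chi^2 / (N - n_params), with per-point measurement uncertainty sigma."""
    residuals = [(o - p) / s for o, p, s in zip(observed, predicted, sigma)]
    chi2 = sum(r * r for r in residuals)
    dof = len(observed) - n_params
    return chi2 / dof

# Hypothetical temperatures: 3 observations, a 2-parameter fit, 0.1-degree uncertainty.
val = reduced_chi_squared([1.1, 1.9, 3.2], [1.0, 2.0, 3.0], [0.1] * 3, n_params=2)
print(val)
```

Values near 1 indicate the model fits the data to within the stated measurement uncertainty; much larger values flag a poor fit (or underestimated uncertainties).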

  16. Towards more accurate and reliable predictions for nuclear applications

    NASA Astrophysics Data System (ADS)

    Goriely, Stephane; Hilaire, Stephane; Dubray, Noel; Lemaître, Jean-François

    2017-09-01

    The need for nuclear data far from the valley of stability, for applications such as nuclear astrophysics or future nuclear facilities, challenges the robustness as well as the predictive power of present nuclear models. Most nuclear data evaluation and prediction is still performed on the basis of phenomenological nuclear models. Over the last decades, important progress has been achieved in fundamental nuclear physics, making it now feasible to use more reliable, but also more complex, microscopic or semi-microscopic models in the evaluation and prediction of nuclear data for practical applications. Nowadays, mean-field models can be tuned to the same level of accuracy as the phenomenological models, renormalized on experimental data if needed, and can therefore replace the phenomenological inputs in the evaluation of nuclear data. The latest achievements in determining nuclear masses within the non-relativistic HFB approach, including the related uncertainties in the model predictions, are discussed. Similarly, recent efforts to determine fission observables within the mean-field approach are described and compared with more traditional existing models.

  17. Optimal temperature for malaria transmission is dramatically lower than previously predicted

    USGS Publications Warehouse

    Mordecai, Erin A.; Paaijmans, Krijn P.; Johnson, Leah R.; Balzer, Christian; Ben-Horin, Tal; de Moor, Emily; McNally, Amy; Pawar, Samraat; Ryan, Sadie J.; Smith, Thomas C.; Lafferty, Kevin D.

    2013-01-01

    The ecology of mosquito vectors and malaria parasites affect the incidence, seasonal transmission and geographical range of malaria. Most malaria models to date assume constant or linear responses of mosquito and parasite life-history traits to temperature, predicting optimal transmission at 31 °C. These models are at odds with field observations of transmission dating back nearly a century. We build a model with more realistic ecological assumptions about the thermal physiology of insects. Our model, which includes empirically derived nonlinear thermal responses, predicts optimal malaria transmission at 25 °C (6 °C lower than previous models). Moreover, the model predicts that transmission decreases dramatically at temperatures > 28 °C, altering predictions about how climate change will affect malaria. A large data set on malaria transmission risk in Africa validates both the 25 °C optimum and the decline above 28 °C. Using these more accurate nonlinear thermal-response models will aid in understanding the effects of current and future temperature regimes on disease transmission.
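Nonlinear thermal responses of the kind this record describes are often modeled with a Briere function, which peaks asymmetrically between a lower and upper thermal limit. The sketch below uses arbitrary parameters, not the paper's fitted traits, and the full transmission model combines many such trait curves rather than one:

```python
import math

def briere(T, c=1.0, T0=10.0, Tm=35.0):
    """Briere thermal response: c*T*(T-T0)*sqrt(Tm-T) inside (T0, Tm), else 0."""
    if T <= T0 or T >= Tm:
        return 0.0
    return c * T * (T - T0) * math.sqrt(Tm - T)

# Locate the thermal optimum by grid search over the valid temperature range.
temps = [10.0 + 0.01 * i for i in range(int((35.0 - 10.0) / 0.01))]
T_opt = max(temps, key=briere)
print(round(T_opt, 1))  # → 29.2
```

Note how the optimum (about 29 here) sits well above the midpoint of the thermal limits and the curve falls off steeply above it; that asymmetric decline is what pulls the combined transmission optimum down once all traits are multiplied together, as in the paper's 25 degrees C result.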

  18. Optimal temperature for malaria transmission is dramatically lower than previously predicted

    USGS Publications Warehouse

    Mordecai, Erin A.; Paaijmans, Krijn P.; Johnson, Leah R.; Balzer, Christian; Ben-Horin, Tal; de Moor, Emily; McNally, Amy; Pawar, Samraat; Ryan, Sadie J.; Smith, Thomas C.; Lafferty, Kevin D.

    2013-01-01

    The ecology of mosquito vectors and malaria parasites affect the incidence, seasonal transmission and geographical range of malaria. Most malaria models to date assume constant or linear responses of mosquito and parasite life-history traits to temperature, predicting optimal transmission at 31 °C. These models are at odds with field observations of transmission dating back nearly a century. We build a model with more realistic ecological assumptions about the thermal physiology of insects. Our model, which includes empirically derived nonlinear thermal responses, predicts optimal malaria transmission at 25 °C (6 °C lower than previous models). Moreover, the model predicts that transmission decreases dramatically at temperatures > 28 °C, altering predictions about how climate change will affect malaria. A large data set on malaria transmission risk in Africa validates both the 25 °C optimum and the decline above 28 °C. Using these more accurate nonlinear thermal-response models will aid in understanding the effects of current and future temperature regimes on disease transmission.

  19. TankSIM: A Cryogenic Tank Performance Prediction Program

    NASA Technical Reports Server (NTRS)

    Bolshinskiy, L. G.; Hedayat, A.; Hastings, L. J.; Moder, J. P.; Schnell, A. R.; Sutherlin, S. G.

    2015-01-01

    Accurate prediction of the thermodynamic state of the cryogenic propellants in launch vehicle tanks is necessary for mission planning and successful execution. Cryogenic propellant storage and transfer in space environments requires that tank pressure be controlled. The pressure rise rate is determined by the complex interaction of external heat leak, fluid temperature stratification, and interfacial heat and mass transfer. If the required storage duration of a space mission is longer than the period in which the tank pressure reaches its allowable maximum, an appropriate pressure control method must be applied. Therefore, predictions of the pressurization rate and performance of pressure control techniques in cryogenic tanks are required for development of cryogenic fluid long-duration storage technology and planning of future space exploration missions. This paper describes an analytical tool, Tank System Integrated Model (TankSIM), which can be used for modeling pressure control and predicting the behavior of cryogenic propellant for long-term storage for future space missions. It is written in the FORTRAN 90 language and can be compiled with any Visual FORTRAN compiler. A thermodynamic vent system (TVS) is used to achieve tank pressure control. Utilizing TankSIM, the following processes can be modeled: tank self-pressurization, boiloff, ullage venting, and mixing. Details of the TankSIM program and comparisons of its predictions with test data for liquid hydrogen and liquid methane will be presented in the final paper.
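
    TankSIM resolves stratification and interfacial heat and mass transfer, none of which is attempted here; the sketch below reduces the self-pressurization problem to its bare skeleton, a closed constant-volume ideal-gas ullage absorbing a fixed heat leak, with every number a hypothetical placeholder. It shows only how a pressure-rise rate translates into an allowable unvented storage duration.

```python
# Closed constant-volume ideal-gas ullage absorbing a constant heat leak:
# dU = Q dt with U = P*V/(gamma - 1), so dP/dt = (gamma - 1) * Q / V.
GAMMA = 1.4           # ratio of specific heats (illustrative)
V_ULLAGE = 2.0        # m^3, hypothetical ullage volume
Q_LEAK = 5.0          # W, hypothetical heat leak into the ullage
P0 = 101_325.0        # Pa, initial tank pressure
P_MAX = 180_000.0     # Pa, hypothetical allowable maximum before venting

dPdt = (GAMMA - 1.0) * Q_LEAK / V_ULLAGE          # Pa/s
t_vent_days = (P_MAX - P0) / dPdt / 86_400.0      # unvented storage duration, days
```

    If the mission duration exceeds `t_vent_days`, a pressure control method (such as the TVS mentioned above) is required, which is the decision the real program informs with far more detailed physics.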

  20. Improving the accuracy of protein stability predictions with multistate design using a variety of backbone ensembles.

    PubMed

    Davey, James A; Chica, Roberto A

    2014-05-01

    Multistate computational protein design (MSD) with backbone ensembles approximating conformational flexibility can predict higher quality sequences than single-state design with a single fixed backbone. However, it is currently unclear what characteristics of backbone ensembles are required for the accurate prediction of protein sequence stability. In this study, we aimed to improve the accuracy of protein stability predictions made with MSD by using a variety of backbone ensembles to recapitulate the experimentally measured stability of 85 Streptococcal protein G domain β1 sequences. Ensembles tested here include an NMR ensemble as well as those generated by molecular dynamics (MD) simulations, by Backrub motions, and by PertMin, a new method that we developed involving the perturbation of atomic coordinates followed by energy minimization. MSD with the PertMin ensembles resulted in the most accurate predictions by providing the highest number of stable sequences in the top 25, and by correctly binning sequences as stable or unstable with the highest success rate (≈90%) and the lowest number of false positives. The performance of PertMin ensembles is due to the fact that their members closely resemble the input crystal structure and have low potential energy. Conversely, the NMR ensemble as well as those generated by MD simulations at 500 or 1000 K reduced prediction accuracy due to their low structural similarity to the crystal structure. The ensembles tested herein thus represent on- or off-target models of the native protein fold and could be used in future studies to design for desired properties other than stability. Copyright © 2013 Wiley Periodicals, Inc.
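
    The PertMin recipe, perturb atomic coordinates and then energy-minimize, can be illustrated on a toy one-dimensional Lennard-Jones pair; the hand-rolled gradient descent stands in for a real force-field minimizer, and the analytic LJ minimum stands in for the input crystal structure. This is a sketch of the idea, not the authors' implementation.

```python
import numpy as np

def lj_grad(r):
    """Gradient of the Lennard-Jones pair energy E(r) = 4*(r**-12 - r**-6)."""
    return -48.0 * r**-13 + 24.0 * r**-7

def minimize_r(r, lr=0.005, steps=2000):
    """Plain gradient descent; stands in for a real force-field minimizer."""
    for _ in range(steps):
        r -= lr * lj_grad(r)
    return r

rng = np.random.default_rng(0)
r_crystal = 2.0 ** (1.0 / 6.0)   # analytic LJ minimum, stands in for the crystal structure

# PertMin-style ensemble: small random perturbation, then energy minimization.
ensemble = [minimize_r(r_crystal + rng.normal(scale=0.03)) for _ in range(10)]
spread = max(ensemble) - min(ensemble)
```

    The minimized members collapse tightly back onto the starting structure, mirroring the paper's observation that PertMin ensemble members closely resemble the input crystal structure and have low potential energy.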

  1. Predicting watershed sediment yields after wildland fire with the InVEST sediment retention model at large geographic extent in the western USA: accuracy and uncertainties

    NASA Astrophysics Data System (ADS)

    Sankey, J. B.; Kreitler, J.; McVay, J.; Hawbaker, T. J.; Vaillant, N.; Lowe, S. E.

    2014-12-01

    Wildland fire is a primary threat to watersheds: it can impact water supply through increased sedimentation and water quality decline, and can change the timing and amount of runoff, leading to increased risk from flood and sediment hazards. It is of great societal importance in the western USA and throughout the world to improve understanding of how changing fire frequency, extent, and location, in conjunction with fuel treatments, will affect watersheds and the ecosystem services they supply to communities. In this work we assess the utility of the InVEST Sediment Retention Model to accurately characterize the vulnerability of burned watersheds to erosion and sedimentation. The InVEST tools are GIS-based implementations of common process models, engineered for high-end computing to allow faster simulation of larger landscapes and incorporation into decision-making. The InVEST Sediment Retention Model is based on common soil erosion models (e.g., RUSLE, the Revised Universal Soil Loss Equation); it determines which areas of the landscape contribute the greatest sediment loads to a hydrological network and, conversely, evaluates the ecosystem service of sediment retention on a watershed basis. We evaluate the accuracy and uncertainties of InVEST predictions of increased sedimentation after fire, using measured post-fire sedimentation rates available for many watersheds in different rainfall regimes throughout the western USA from an existing, large USGS database of post-fire sediment yield [synthesized in Moody J, Martin D (2009) Synthesis of sediment yields after wildland fire in different rainfall regimes in the western United States. International Journal of Wildland Fire 18: 96-115]. The ultimate goal of this work is to calibrate and implement the model to accurately predict variability in post-fire sediment yield as a function of future landscape heterogeneity predicted by wildfire simulations, and of future landscape fuel treatment scenarios, within watersheds.
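
    Since the InVEST Sediment Retention Model builds on RUSLE, the core calculation is worth spelling out: average annual soil loss is the product A = R · K · LS · C · P. The factor values below are hypothetical, chosen only to illustrate how fire-driven loss of cover (the C factor) multiplies predicted sediment yield.

```python
# RUSLE: A = R * K * LS * C * P (average annual soil loss). Factor values are
# hypothetical, chosen to show the effect of fire on the cover factor C.
R = 300.0    # rainfall-runoff erosivity
K = 0.03     # soil erodibility
LS = 1.2     # slope length-steepness factor (dimensionless)
P = 1.0      # support-practice factor (dimensionless)

C_prefire = 0.01    # dense vegetation cover
C_postfire = 0.20   # cover largely removed by fire

A_prefire = R * K * LS * C_prefire * P     # t/ha/yr
A_postfire = R * K * LS * C_postfire * P   # t/ha/yr
increase = A_postfire / A_prefire          # 20-fold in this illustration
```

    Because the factors multiply, the post-fire yield scales directly with the ratio of C factors, which is why post-fire calibration of cover terms dominates the model's sensitivity.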

  2. Shifts in frog size and phenology: Testing predictions of climate change on a widespread anuran using data from prior to rapid climate warming.

    PubMed

    Sheridan, Jennifer A; Caruso, Nicholas M; Apodaca, Joseph J; Rissler, Leslie J

    2018-01-01

    Changes in body size and breeding phenology have been identified as two major ecological consequences of climate change, yet it remains unclear whether climate acts directly or indirectly on these variables. To better understand the relationship between climate and ecological changes, it is necessary to determine environmental predictors of both size and phenology using data from prior to the onset of rapid climate warming, and then to examine spatially explicit changes in climate, size, and phenology, not just general spatial and temporal trends. We used 100 years of natural history collection data for the wood frog (Lithobates sylvaticus), a species with a range >9 million km², together with spatially explicit environmental data, to determine the best predictors of size and phenology prior to rapid climate warming (1901-1960). We then tested how closely size and phenology changes predicted by those environmental variables reflected actual changes from 1961 to 2000. Size, phenology, and climate all changed as expected (smaller, earlier, and warmer, respectively) at broad spatial scales across the entire study range. However, while spatially explicit changes in climate variables accurately predicted changes in phenology, they did not accurately predict size changes during recent climate change (1961-2000), contrary to expectations from numerous recent studies. Our results suggest that changes in climate are directly linked to observed phenological shifts. However, the mechanisms driving observed body size changes are yet to be determined, given the less straightforward relationship between size and the climate factors examined in this study. We recommend that caution be used in "space-for-time" studies where measures of a species' traits at lower latitudes or elevations are considered representative of those under future projected climate conditions. Future studies should aim to determine the mechanisms driving trends in phenology and body size, as well as the impact of climate on population density, which may influence body size.

  3. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification and can produce probabilistic flood forecasts via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and has been developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of newly available sources of information and on improved methods for assessing predictive performance.
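
    A full BFS factors uncertainty into input and hydrologic processors; the sketch below keeps only the core move, turning a deterministic forecast into a predictive distribution by learning the forecast-error distribution from past residuals. The data are synthetic, and the Gaussian error model is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic history of deterministic forecasts and observed river stages.
forecasts = rng.uniform(2.0, 6.0, size=200)              # m
observed = forecasts + rng.normal(0.1, 0.3, size=200)    # bias 0.1 m, noise 0.3 m

# Simplest possible "hydrologic uncertainty processor": learn the
# forecast-error distribution from past residuals.
residuals = observed - forecasts
mu, sigma = residuals.mean(), residuals.std(ddof=1)

# Predictive distribution for a new deterministic forecast of 4.0 m.
new_forecast = 4.0
pred_mean = new_forecast + mu
pred_90 = (pred_mean - 1.645 * sigma, pred_mean + 1.645 * sigma)  # ~90% interval
```

    The output is a distribution rather than a point value, which is the practical difference between Bayesian and purely deterministic flood forecasts.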

  4. Path integral Monte Carlo simulations of dense carbon-hydrogen plasmas

    NASA Astrophysics Data System (ADS)

    Zhang, Shuai; Militzer, Burkhard; Benedict, Lorin X.; Soubiran, François; Sterne, Philip A.; Driver, Kevin P.

    2018-03-01

    Carbon-hydrogen plasmas and hydrocarbon materials are of broad interest to laser shock experimentalists, high energy density physicists, and astrophysicists. Accurate equations of state (EOSs) of hydrocarbons are valuable for studies ranging from inertial confinement fusion to planetary science. By combining path integral Monte Carlo (PIMC) results at high temperatures and density functional theory molecular dynamics results at lower temperatures, we compute EOSs for hydrocarbons from simulations performed at 1473 separate (ρ, T)-points distributed over a range of compositions. These methods accurately treat electronic excitation effects with neither adjustable parameters nor experimental input. PIMC is also an accurate simulation method that is capable of treating many-body interactions and nuclear quantum effects at finite temperatures. These methods therefore provide a benchmark-quality EOS that surpasses semi-empirical and Thomas-Fermi-based methods in the warm dense matter regime. By comparing our first-principles EOS to the LEOS 5112 model for CH, we validate the specific heat assumptions in this model but suggest that its Grüneisen parameter is too large at low temperatures. Based on our first-principles EOSs, we predict the principal Hugoniot curve of polystyrene to be 2%-5% softer at maximum shock compression than that predicted by orbital-free density functional theory and SESAME 7593. By investigating the atomic structure and chemical bonding of hydrocarbons, we show a drastic decrease in the lifetime of chemical bonds in the pressure interval from 0.4 to 4 megabar. We find the assumption of linear mixing to be valid for describing the EOS and the shock Hugoniot curve of hydrocarbons in the regime of partially ionized atomic liquids. We make predictions of the shock compression of glow-discharge polymers and investigate the effects of oxygen content and C:H ratio on their Hugoniot curves. Our full suite of first-principles simulation results may be used to benchmark future theoretical investigations pertaining to hydrocarbon EOSs and should be helpful in guiding the design of future experiments on hydrocarbons in the gigabar regime.
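
    The linear-mixing assumption validated in this record has a simple operational form: at a common pressure and temperature, specific volumes add in proportion to mass fractions. The endmember values below are illustrative numbers, not the paper's EOS tables.

```python
# Additive-volume ("linear") mixing at a common (P, T): illustrative numbers,
# not the paper's EOS tables.
v_C = 0.30                            # cm^3/g, carbon specific volume at some (P, T)
v_H = 2.10                            # cm^3/g, hydrogen specific volume at the same (P, T)
x_C, x_H = 12.0 / 13.0, 1.0 / 13.0    # CH mass fractions (12:1 by mass)

v_CH = x_C * v_C + x_H * v_H          # mixed specific volume, cm^3/g
rho_CH = 1.0 / v_CH                   # mixed density, g/cm^3
```

    When the rule holds, any hydrocarbon composition's EOS can be interpolated from the pure carbon and pure hydrogen tables, which is what makes the validation practically useful.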

  5. Climate change and future fire regimes: Examples from California

    USGS Publications Warehouse

    Keeley, Jon E.; Syphard, Alexandra D.

    2016-01-01

    Climate and weather have long been noted as playing key roles in wildfire activity, and global warming is expected to exacerbate fire impacts on natural and urban ecosystems. Predicting future fire regimes requires an understanding of how temperature and precipitation interact to control fire activity. Inevitably this requires historical analyses that relate annual burning to climate variation. Fuel structure plays a critical role in determining which climatic parameters are most influential on fire activity, and here, by focusing on the diversity of ecosystems in California, we illustrate some principles that need to be recognized in predicting future fire regimes. Spatial scale of analysis is important, in that analyses of large heterogeneous landscapes may not capture the true relationships between climate and fire. Within climatically homogeneous subregions, montane forested landscapes show strong relationships between area burned and annual fluctuations in temperature and precipitation; however, these relationships are strongly seasonally dependent; e.g., winter temperatures have little or no effect, but spring and summer temperatures are critical. Climate models that predict future seasonal temperature changes are needed to improve fire regime projections. Climate does not appear to be a major determinant of fire activity on all landscapes. Lower elevations and lower latitudes show little or no increase in fire activity with hotter and drier conditions. On these landscapes climate is not usually limiting to fires; rather, these vegetation types are ignition-limited. Moreover, because they are closely juxtaposed with human habitations, their fire regimes are more strongly controlled by other direct anthropogenic impacts. Predicting future fire regimes is not rocket science; it is far more complicated than that. Climate change is not relevant to some landscapes, but where climate is relevant, the relationship will change due to direct climate effects on vegetation trajectories, through feedbacks of fire effects on vegetation distribution, and through policy changes in how we manage ecosystems.

  6. Unified Performance and Power Modeling of Scientific Workloads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.

    2013-11-17

    It is expected that scientific applications executing on future large-scale HPC systems must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of the limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.
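
    A unified model of the kind described can be reduced to a sketch in which runtime is decomposed into computation, point-to-point, and collective phases, each with its own average power draw; energy then follows from the time-power products. All numbers are hypothetical placeholders, not measured Nek-Bone values.

```python
# Phase-decomposed performance/power model; all numbers are hypothetical
# placeholders, not measured Nek-Bone values.
t_comp, t_p2p, t_coll = 8.0, 1.5, 0.5          # s spent in each phase
p_comp, p_p2p, p_coll = 180.0, 120.0, 110.0    # W drawn during each phase

runtime = t_comp + t_p2p + t_coll                            # total time, s
energy = t_comp * p_comp + t_p2p * p_p2p + t_coll * p_coll   # total energy, J
avg_power = energy / runtime                                 # W
```

    Modeling the phases separately is what lets performance and power be reasoned about in concert: an optimization that shortens one phase changes both terms, and the model exposes the trade-off.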

  7. Failure to use corollary discharge to remap visual target locations is associated with psychotic symptom severity in schizophrenia

    PubMed Central

    Rösler, Lara; Rolfs, Martin; van der Stigchel, Stefan; Neggers, Sebastiaan F. W.; Cahn, Wiepke; Kahn, René S.

    2015-01-01

    Corollary discharge (CD) refers to “copies” of motor signals sent to sensory areas, allowing prediction of future sensory states. CD is a putative mechanism supporting the distinction between self-generated and externally generated sensations. Accordingly, many authors have suggested that disturbed CD engenders psychotic symptoms of schizophrenia, which are characterized by agency distortions. CD also supports perceived visual stability across saccadic eye movements and is used to predict the postsaccadic retinal coordinates of visual stimuli, a process called remapping. We tested whether schizophrenia patients (SZP) show remapping disturbances as evidenced by systematic transsaccadic mislocalizations of visual targets. SZP and healthy controls (HC) performed a task in which a saccadic target disappeared upon saccade initiation and, after a brief delay, reappeared at a horizontally displaced position. HC judged the direction of this displacement accurately, despite spatial errors in saccade landing site, indicating that their comparison of the actual to predicted postsaccadic target location relied on accurate CD. SZP performed worse and relied more on saccade landing site as a proxy for the presaccadic target, consistent with disturbed CD. This remapping failure was strongest in patients with more severe psychotic symptoms, consistent with the theoretical link between disturbed CD and phenomenological experiences in schizophrenia. PMID:26108951

  8. Detecting Presymptomatic Infection Is Necessary to Forecast Major Epidemics in the Earliest Stages of Infectious Disease Outbreaks

    PubMed Central

    Thompson, Robin N.; Gilligan, Christopher A.; Cunniffe, Nik J.

    2016-01-01

    We assess how presymptomatic infection affects predictability of infectious disease epidemics. We focus on whether or not a major outbreak (i.e. an epidemic that will go on to infect a large number of individuals) can be predicted reliably soon after initial cases of disease have appeared within a population. For emerging epidemics, significant time and effort is spent recording symptomatic cases. Scientific attention has often focused on improving statistical methodologies to estimate disease transmission parameters from these data. Here we show that, even if symptomatic cases are recorded perfectly, and disease spread parameters are estimated exactly, it is impossible to estimate the probability of a major outbreak without ambiguity. Our results therefore provide an upper bound on the accuracy of forecasts of major outbreaks that are constructed using data on symptomatic cases alone. Accurate prediction of whether or not an epidemic will occur requires records of symptomatic individuals to be supplemented with data concerning the true infection status of apparently uninfected individuals. To forecast likely future behavior in the earliest stages of an emerging outbreak, it is therefore vital to develop and deploy accurate diagnostic tests that can determine whether asymptomatic individuals are actually uninfected, or instead are infected but just do not yet show detectable symptoms. PMID:27046030
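
    The ambiguity the authors describe can be made concrete with the standard branching-process result (a textbook formulation, not necessarily the paper's exact computation): for Poisson-distributed secondary cases with mean R0, the extinction probability q solves q = exp(R0(q - 1)), and n concurrently infected individuals give a major-outbreak probability of 1 - q^n. If presymptomatic infections make n unobservable, the estimate is ambiguous even when R0 is known exactly.

```python
import math

def extinction_prob(R0, iters=200):
    """Solve q = exp(R0*(q - 1)) by fixed-point iteration (Poisson offspring)."""
    q = 0.5
    for _ in range(iters):
        q = math.exp(R0 * (q - 1.0))
    return q

R0 = 2.0
q = extinction_prob(R0)

# P(major outbreak) = 1 - q**n for n currently infected individuals.
p_major_if_2 = 1.0 - q**2   # only the 2 recorded symptomatic cases exist
p_major_if_5 = 1.0 - q**5   # plus 3 undetected presymptomatic infections
```

    With R0 = 2, two observed symptomatic cases give P(major) ≈ 0.96, while three additional hidden presymptomatic infections raise it to ≈ 0.9997: the same recorded data, but materially different forecasts.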

  9. Past speculations of the future: a review of the methods used for forecasting emerging health technologies.

    PubMed

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2016-03-10

    Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3-20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. We systematically searched MEDLINE, EMBASE, PsycINFO and grey literature sources for studies reporting forecasting methods in healthcare; no human participants were involved. We included articles published in English that reported their methods together with a list of individual healthcare technologies identified as likely to emerge within a 3-20-year timeframe. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. 15 studies met our inclusion criteria. The majority of studies (13/15) consulted experts, either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. The methodological fundamentals of formal 3-20-year prediction are consistent but vary in detail. Further research is needed to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. Accurate prediction of protein-protein interactions by integrating potential evolutionary information embedded in PSSM profile and discriminative vector machine classifier.

    PubMed

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Li, Li-Ping; Huang, De-Shuang; Yan, Gui-Ying; Nie, Ru; Huang, Yu-An

    2017-04-04

    Identification of protein-protein interactions (PPIs) is of critical importance for deciphering the underlying mechanisms of almost all cellular processes and provides great insight into the study of human disease. Although much effort has been devoted to identifying PPIs in various organisms, existing high-throughput biological techniques are time-consuming, expensive, and suffer from high false-positive and false-negative rates. It is therefore urgent to develop in silico methods to predict PPIs efficiently and accurately in this post-genomic era. In this article, we report a novel computational model combining our newly developed discriminative vector machine classifier (DVM) and an improved Weber local descriptor (IWLD) for the prediction of PPIs. Two components, differential excitation and orientation, are exploited to build evolutionary features for each protein sequence. The main characteristic of the proposed method lies in introducing an effective feature descriptor, IWLD, which can capture highly discriminative evolutionary information from position-specific scoring matrices (PSSMs) of protein data, and in employing the powerful and robust DVM classifier. When applying the proposed method to the Yeast and H. pylori data sets, we obtained excellent prediction accuracies as high as 96.52% and 91.80%, respectively, significantly better than previous methods. Extensive experiments were then performed for predicting cross-species PPIs, and the predictive results were also promising. To further validate the performance of the proposed method, we compared it with the state-of-the-art support vector machine (SVM) classifier on a Human data set. The experimental results indicate that our method is highly effective for PPI prediction and can serve as a supplementary tool for future proteomics research.
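
    As a rough illustration of the differential-excitation component only (the orientation component and the DVM classifier are omitted), one common Weber-descriptor formulation takes the arctangent of the summed relative differences between a center value and its neighbors; here it is applied to a toy PSSM row fragment. This is a hedged sketch of the general idea, not the paper's IWLD.

```python
import math

def differential_excitation(center, neighbors):
    """Weber-style differential excitation: arctangent of the summed
    relative differences between a center value and its neighbors."""
    if center == 0:
        return 0.0
    return math.atan(sum((n - center) / center for n in neighbors))

# Hypothetical PSSM row fragment (log-odds scores shifted to be positive).
row = [3.0, 5.0, 4.0, 6.0, 2.0]
xi = differential_excitation(row[2], [row[1], row[3]])
```

    The arctangent compresses large relative differences, so the feature responds to local contrast in the profile rather than to absolute score magnitude.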

  11. Multidimensional severity assessment in bronchiectasis: an analysis of seven European cohorts

    PubMed Central

    McDonnell, M J; Aliberti, S; Goeminne, P C; Dimakou, K; Zucchetti, S C; Davidson, J; Ward, C; Laffey, J G; Finch, S; Pesci, A; Dupont, L J; Fardon, T C; Skrbic, D; Obradovic, D; Cowman, S; Loebinger, M R; Rutherford, R M; De Soyza, A; Chalmers, J D

    2016-01-01

    Introduction: Bronchiectasis is a multidimensional disease associated with substantial morbidity and mortality. Two disease-specific clinical prediction tools have been developed, the Bronchiectasis Severity Index (BSI) and the FACED score, both of which stratify patients into severity risk categories to predict the probability of mortality. Methods: We aimed to compare the predictive utility of the BSI and FACED in assessing clinically relevant disease outcomes across seven European cohorts independent of their original validation studies. Results: The combined cohorts totalled 1612 patients. Pooled analysis showed that both scores had good discriminatory predictive value for mortality (pooled area under the curve (AUC) 0.76, 95% CI 0.74 to 0.78 for both scores), with the BSI demonstrating higher sensitivity (65% vs 28%) but lower specificity (70% vs 93%) compared with the FACED score. Calibration analysis suggested that the BSI performed consistently well across all cohorts, while FACED consistently overestimated mortality in ‘severe’ patients (pooled OR 0.33 (0.23 to 0.48), p<0.0001). The BSI accurately predicted hospitalisations (pooled AUC 0.82, 95% CI 0.78 to 0.84), exacerbations, quality of life (QoL) and respiratory symptoms across all risk categories. FACED had poor discrimination for hospital admissions (pooled AUC 0.65, 95% CI 0.63 to 0.67) with low sensitivity at 16%, and did not consistently predict future risk of exacerbations, QoL or respiratory symptoms. No association was observed between FACED and 6 min walk distance (6MWD) or lung function decline. Conclusion: The BSI accurately predicts mortality, hospital admissions, exacerbations, QoL, respiratory symptoms, 6MWD and lung function decline in bronchiectasis, providing a clinically relevant evaluation of disease severity. PMID:27516225
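
    For readers less familiar with the reported metrics, sensitivity and specificity follow directly from a 2×2 confusion matrix; the counts below are invented for illustration, chosen only to reproduce the BSI's pooled 65%/70% figures.

```python
# Illustrative 2x2 confusion matrix for a "high-risk" classification;
# counts are invented to reproduce the pooled BSI figures quoted above.
tp, fn = 65, 35   # deaths correctly flagged high-risk / missed
tn, fp = 70, 30   # survivors correctly not flagged / incorrectly flagged

sensitivity = tp / (tp + fn)   # fraction of deaths flagged high-risk
specificity = tn / (tn + fp)   # fraction of survivors not flagged
```

    The BSI/FACED contrast above is the usual trade-off: FACED's higher specificity comes at the cost of missing most deaths (28% sensitivity), which matters when the score is used to target intensified treatment.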

  12. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high-volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former and reducing the latter is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high-resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for throughput, as new lots must wait until the previous lot is measured. One solution is to use a less dense overlay sampling scheme and computationally up-sample the measured data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper discusses a hybrid system, shown in Fig. 1, that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals when determining the fingerprint, and in better on-product overlay performance.
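
    The hybrid idea can be sketched in one dimension: fit a low-order global model to sparse overlay measurements, up-sample it to a dense grid, and then re-inject the measured values at their sites so that localized errors survive. The fingerprint, sampling scheme, and bump-shaped local error below are all synthetic stand-ins, not a real wafer model.

```python
import numpy as np

def true_overlay(x):
    """Synthetic wafer fingerprint: smooth global term plus a localized error."""
    return 3.0 * x**2 - 1.0 + 2.0 * np.exp(-((x - 0.4) / 0.05) ** 2)

x_dense = np.linspace(-1.0, 1.0, 201)   # dense grid, as CPE would require
idx = np.arange(0, 201, 10)             # sparse measurement sites on that grid
x_meas = x_dense[idx]
y_meas = true_overlay(x_meas)           # "measured" overlay, nm

# Global model: low-order polynomial fit, computationally up-sampled.
coef = np.polyfit(x_meas, y_meas, 2)
global_dense = np.polyval(coef, x_dense)

# Hybrid: keep the up-sampled model but re-inject measured values at their sites.
hybrid_dense = global_dense.copy()
hybrid_dense[idx] = y_meas

err_global = np.abs(global_dense[idx] - y_meas).max()  # global model misses the local bump
err_hybrid = np.abs(hybrid_dense[idx] - y_meas).max()  # zero at sites by construction
```

    The global fit alone smooths away the localized error, while the hybrid preserves it at measured sites; a production system would additionally spread the measured residuals into the neighborhood rather than pinning them at single points.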

  13. Using radiance predicted by the P3 approximation in a spherical geometry to predict tissue optical properties

    NASA Astrophysics Data System (ADS)

    Dickey, Dwayne J.; Moore, Ronald B.; Tulip, John

    2001-01-01

    For photodynamic therapy of solid tumors, such as prostatic carcinoma, to be achieved, an accurate model to predict tissue parameters and light dose must be found. Presently, most analytical light dosimetry models are fluence based and are not clinically viable for tissue characterization. Other methods of predicting optical properties, such as Monte Carlo simulation, are accurate but far too time consuming for clinical application. However, radiance predicted by the P3 approximation, an analytical solution to the transport equation, may be a viable and accurate alternative. The P3 approximation accurately predicts optical parameters in intralipid/methylene blue based phantoms in a spherical geometry. The optical parameters furnished by the radiance, when introduced into the fluence predicted by both the P3 approximation and Grosjean theory, correlate well with experimental data. The P3 approximation also predicts the optical properties of prostate tissue, agreeing with documented optical parameters. The P3 approximation could be the clinical tool necessary to facilitate PDT of solid tumors because of the limited number of invasive measurements required and the speed with which accurate calculations can be performed.

  14. Sensory prediction on a whiskered robot: a tactile analogy to “optical flow”

    PubMed Central

    Schroeder, Christopher L.; Hartmann, Mitra J. Z.

    2012-01-01

    When an animal moves an array of sensors (e.g., the hand, the eye) through the environment, spatial and temporal gradients of sensory data are related by the velocity of the moving sensory array. In vision, the relationship between spatial and temporal brightness gradients is quantified in the “optical flow” equation. In the present work, we suggest an analog to optical flow for the rodent vibrissal (whisker) array, in which the perceptual intensity that “flows” over the array is bending moment. Changes in bending moment are directly related to radial object distance, defined as the distance between the base of a whisker and the point of contact with the object. Using both simulations and a 1×5 array (row) of artificial whiskers, we demonstrate that local object curvature can be estimated based on differences in radial distance across the array. We then develop two algorithms, both based on tactile flow, to predict the future contact points that will be obtained as the whisker array translates along the object. The translation of the robotic whisker array represents the rat's head velocity. The first algorithm uses a calculation of the local object slope, while the second uses a calculation of the local object curvature. Both algorithms successfully predict future contact points for simple surfaces. The algorithm based on curvature was found to more accurately predict future contact points as surfaces became more irregular. We quantify the inter-related effects of whisker spacing and the object's spatial frequencies, and examine the issues that arise in the presence of real-world noise, friction, and slip. PMID:23097641

  15. Sensory prediction on a whiskered robot: a tactile analogy to "optical flow".

    PubMed

    Schroeder, Christopher L; Hartmann, Mitra J Z

    2012-01-01

    When an animal moves an array of sensors (e.g., the hand, the eye) through the environment, spatial and temporal gradients of sensory data are related by the velocity of the moving sensory array. In vision, the relationship between spatial and temporal brightness gradients is quantified in the "optical flow" equation. In the present work, we suggest an analog to optical flow for the rodent vibrissal (whisker) array, in which the perceptual intensity that "flows" over the array is bending moment. Changes in bending moment are directly related to radial object distance, defined as the distance between the base of a whisker and the point of contact with the object. Using both simulations and a 1×5 array (row) of artificial whiskers, we demonstrate that local object curvature can be estimated based on differences in radial distance across the array. We then develop two algorithms, both based on tactile flow, to predict the future contact points that will be obtained as the whisker array translates along the object. The translation of the robotic whisker array represents the rat's head velocity. The first algorithm uses a calculation of the local object slope, while the second uses a calculation of the local object curvature. Both algorithms successfully predict future contact points for simple surfaces. The algorithm based on curvature was found to more accurately predict future contact points as surfaces became more irregular. We quantify the inter-related effects of whisker spacing and the object's spatial frequencies, and examine the issues that arise in the presence of real-world noise, friction, and slip.

  16. Testing the prospective evaluation of a new healthcare system

    PubMed Central

    Planitz, Birgit; Sanderson, Penelope; Freeman, Clinton; Xiao, Tania; Botea, Adi; Orihuela, Cristina Beltran

    2012-01-01

    Research into health ICT adoption suggests that the failure to understand the clinical workplace has been a major contributing factor to the failure of many computer-based clinical systems. We suggest that clinicians and administrators need methods for envisioning future use when adopting new ICT. This paper presents and evaluates a six-stage “prospective evaluation” model that clinicians can use when assessing the impact of a new electronic patient information system on a Specialist Outpatients Department (SOPD). The prospective evaluation model encompasses normative, descriptive, formative and projective approaches. We show that this combination helped health informaticians to make reasonably accurate predictions for technology adoption at the SOPD. We suggest some refinements, however, to improve the scope and accuracy of predictions. PMID:23304347

  17. Next-generation prognostic assessment for diffuse large B-cell lymphoma

    PubMed Central

Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients with additional benefit in salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts. PMID:26289217

  18. Next-generation prognostic assessment for diffuse large B-cell lymphoma.

    PubMed

    Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients with additional benefit in salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts.

  19. New technologies in predicting, preventing and controlling emerging infectious diseases.

    PubMed

    Christaki, Eirini

    2015-01-01

    Surveillance of emerging infectious diseases is vital for the early identification of public health threats. Emergence of novel infections is linked to human factors such as population density, travel and trade and ecological factors like climate change and agricultural practices. A wealth of new technologies is becoming increasingly available for the rapid molecular identification of pathogens but also for the more accurate monitoring of infectious disease activity. Web-based surveillance tools and epidemic intelligence methods, used by all major public health institutions, are intended to facilitate risk assessment and timely outbreak detection. In this review, we present new methods for regional and global infectious disease surveillance and advances in epidemic modeling aimed to predict and prevent future infectious diseases threats.

  20. New technologies in predicting, preventing and controlling emerging infectious diseases

    PubMed Central

    Christaki, Eirini

    2015-01-01

    Surveillance of emerging infectious diseases is vital for the early identification of public health threats. Emergence of novel infections is linked to human factors such as population density, travel and trade and ecological factors like climate change and agricultural practices. A wealth of new technologies is becoming increasingly available for the rapid molecular identification of pathogens but also for the more accurate monitoring of infectious disease activity. Web-based surveillance tools and epidemic intelligence methods, used by all major public health institutions, are intended to facilitate risk assessment and timely outbreak detection. In this review, we present new methods for regional and global infectious disease surveillance and advances in epidemic modeling aimed to predict and prevent future infectious diseases threats. PMID:26068569

  1. Nuclear fuel in a reactor accident.

    PubMed

    Burns, Peter C; Ewing, Rodney C; Navrotsky, Alexandra

    2012-03-09

    Nuclear accidents that lead to melting of a reactor core create heterogeneous materials containing hundreds of radionuclides, many with short half-lives. The long-lived fission products and transuranium elements within damaged fuel remain a concern for millennia. Currently, accurate fundamental models for the prediction of release rates of radionuclides from fuel, especially in contact with water, after an accident remain limited. Relatively little is known about fuel corrosion and radionuclide release under the extreme chemical, radiation, and thermal conditions during and subsequent to a nuclear accident. We review the current understanding of nuclear fuel interactions with the environment, including studies over the relatively narrow range of geochemical, hydrological, and radiation environments relevant to geological repository performance, and discuss priorities for research needed to develop future predictive models.

  2. A temperature match based optimization method for daily load prediction considering DLC effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Z.

This paper presents a unique optimization method for short term load forecasting. The new method is based on the optimal template temperature match between the future and past temperatures. The optimal error reduction technique is a new concept introduced in this paper. Two case studies show that for hourly load forecasting, this method can yield results as good as the rather complicated Box-Jenkins Transfer Function method, and better than the Box-Jenkins method; for peak load prediction, this method is comparable in accuracy to the neural network method with back propagation, and can produce more accurate results than the multi-linear regression method. The DLC effect on system load is also considered in this method.
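The core idea, matching a future day's forecast temperature profile against past days and reusing the load curve of the closest match, can be sketched as follows. This is a minimal illustration only; the paper's optimal error reduction step and DLC adjustment are omitted, and all names are hypothetical.

```python
def match_load_forecast(past_temps, past_loads, forecast_temp):
    """Return the load curve of the historical day whose hourly
    temperature profile is closest (in least-squares sense) to the
    forecast temperature profile."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    best = min(range(len(past_temps)),
               key=lambda i: sq_dist(past_temps[i], forecast_temp))
    return past_loads[best]
```

A production method would refine the matched template (e.g., correcting for the residual temperature mismatch), which is where the paper's error-reduction technique comes in.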

  3. Predictive local receptive fields based respiratory motion tracking for motion-adaptive radiotherapy.

    PubMed

    Yubo Wang; Tatinati, Sivanagaraja; Liyu Huang; Kim Jeong Hong; Shafiq, Ghufran; Veluvolu, Kalyana C; Khong, Andy W H

    2017-07-01

Extracranial robotic radiotherapy employs external markers and a correlation model to trace the tumor motion caused by respiration. The real-time tracking of tumor motion, however, requires a prediction model to compensate for the latencies induced by the software (image data acquisition and processing) and hardware (mechanical and kinematic) limitations of the treatment system. A new prediction algorithm based on local receptive fields extreme learning machines (pLRF-ELM) is proposed for respiratory motion prediction. All the existing respiratory motion prediction methods model the non-stationary respiratory motion traces directly to predict the future values. Unlike these existing methods, the pLRF-ELM performs prediction by modeling the higher-level features obtained by mapping the raw respiratory motion into the random feature space of ELM instead of directly modeling the raw respiratory motion. The developed method is evaluated using the dataset acquired from 31 patients for two horizons in line with the latencies of treatment systems like CyberKnife. Results showed that pLRF-ELM is superior to existing prediction methods. Results further highlight that the abstracted higher-level features are suitable to approximate the nonlinear and non-stationary characteristics of respiratory motion for accurate prediction.
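A plain ELM (without the local receptive fields that distinguish pLRF-ELM) illustrates the underlying idea: map lagged samples through a fixed random hidden layer and solve only the output weights by linear least squares. This is a hedged sketch on a synthetic sinusoidal "breathing" trace; all names are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Basic extreme learning machine: random fixed hidden layer with
    tanh activation; output weights solved by linear least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                       # random feature space
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Lag-embed a synthetic periodic trace and predict one step ahead
t = np.arange(400)
trace = np.sin(2 * np.pi * t / 40)
lags = 5
X = np.stack([trace[i:i + lags] for i in range(len(trace) - lags)])
y = trace[lags:]
model = elm_fit(X[:300], y[:300])
pred = elm_predict(model, X[300:])
```

pLRF-ELM additionally convolves the input with random local receptive fields before the nonlinearity, so the regression operates on abstracted local features rather than raw lagged samples.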

  4. Kinetic Modeling of Radiative Turbulence in Relativistic Astrophysical Plasmas: Particle Acceleration and High-Energy Flares

    NASA Astrophysics Data System (ADS)

    Wise, John

    In the near future, next-generation telescopes, covering most of the electromagnetic spectrum, will provide a view into the very earliest stages of galaxy formation. To accurately interpret these future observations, accurate and high-resolution simulations of the first stars and galaxies are vital. This proposal is centered on the formation of the first galaxies in the Universe and their observational signatures in preparation for these future observatories. This proposal has two overall goals: 1. To simulate the formation and evolution of a statistically significant sample of galaxies during the first billion years of the Universe, including all relevant astrophysics while resolving individual molecular clouds, in various cosmological environments. These simulations will utilize a sophisticated physical model of star and black hole formation and feedback, including radiation transport and magnetic fields, which will lead to the most realistic and resolved predictions for the early universe; 2. To predict the observational features of the first galaxies throughout the electromagnetic spectrum, allowing for optimal extraction of galaxy and dark matter halo properties from their photometry, imaging, and spectra; The proposed research plan addresses a timely and relevant issue to theoretically prepare for the interpretation of future observations of the first galaxies in the Universe. A suite of adaptive mesh refinement simulations will be used to follow the formation and evolution of thousands of galaxies observable with the James Webb Space Telescope (JWST) that will be launched during the second year of this project. The simulations will have also tracked the formation and death of over 100,000 massive metal-free stars. Currently, there is a gap of two orders of magnitude in stellar mass between the smallest observed z > 6 galaxy and the largest simulated galaxy from "first principles", capturing its entire star formation history. 
This project will eliminate this gap between simulations and observations of the first galaxies, providing predictions for next-generation observations coming online throughout the next decade. The proposed activities present the graduate students involved in the project with opportunities to gain expertise in numerical algorithms, high performance computing, and software engineering. With this experience, the students will be in a powerful position to face the challenging job market. The computational tools produced by this project will be made freely available and incorporated into their respective frameworks to preserve their sustainability.

  5. Motor system contribution to action prediction: Temporal accuracy depends on motor experience.

    PubMed

    Stapel, Janny C; Hunnius, Sabine; Meyer, Marlene; Bekkering, Harold

    2016-03-01

    Predicting others' actions is essential for well-coordinated social interactions. In two experiments including an infant population, this study addresses to what extent motor experience of an observer determines prediction accuracy for others' actions. Results show that infants who were proficient crawlers but inexperienced walkers predicted crawling more accurately than walking, whereas age groups mastering both skills (i.e. toddlers and adults) were equally accurate in predicting walking and crawling. Regardless of experience, human movements were predicted more accurately by all age groups than non-human movement control stimuli. This suggests that for predictions to be accurate, the observed act needs to be established in the motor repertoire of the observer. Through the acquisition of new motor skills, we also become better at predicting others' actions. The findings thus stress the relevance of motor experience for social-cognitive development. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Risk terrain modeling predicts child maltreatment.

    PubMed

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
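Schematically, RTM overlays same-shape grids of environmental risk factors and combines them cell-wise into a composite risk surface. Real applications weight each layer by its estimated effect size; this unweighted sketch is illustrative only, and the function name is hypothetical.

```python
def risk_terrain(layers):
    """Combine binary risk-factor layers (same-shape 2D grids of 0/1)
    into a composite risk surface by cell-wise summation; higher cell
    values indicate more co-located risk factors."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(layer[r][c] for layer in layers) for c in range(cols)]
            for r in range(rows)]
```

Cells with the highest composite scores are the candidate high-risk areas to prioritize for prevention resources.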

  7. Predicting lettuce canopy photosynthesis with statistical and neural network models

    NASA Technical Reports Server (NTRS)

    Frick, J.; Precetti, C.; Mitchell, C. A.

    1998-01-01

An artificial neural network (NN) and a statistical regression model were developed to predict canopy photosynthetic rates (Pn) for 'Waldman's Green' leaf lettuce (Lactuca sativa L.). All data used to develop and test the models were collected for crop stands grown hydroponically and under controlled-environment conditions. In the NN and regression models, canopy Pn was predicted as a function of three independent variables: shoot-zone CO2 concentration (600 to 1500 µmol mol-1), photosynthetic photon flux (PPF) (600 to 1100 µmol m-2 s-1), and canopy age (10 to 20 days after planting). The models were used to determine the combinations of CO2 and PPF setpoints required each day to maintain maximum canopy Pn. The statistical model (a third-order polynomial) predicted Pn more accurately than the simple NN (a three-layer, fully connected net). Over an 11-day validation period, average percent difference between predicted and actual Pn was 12.3% and 24.6% for the statistical and NN models, respectively. Both models lost considerable accuracy when used to determine relatively long-range Pn predictions (> or = 6 days into the future).
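A third-order polynomial in three inputs can be fit by ordinary least squares over an expanded set of monomial features, roughly as follows. This is an illustrative sketch of the model class, not the authors' fitted model; function names are hypothetical.

```python
import numpy as np
from itertools import combinations_with_replacement

def cubic_features(X):
    """Expand the columns of X into all monomials of degree <= 3
    (constant, x_i, x_i*x_j, x_i*x_j*x_k)."""
    cols = [np.ones(len(X))]
    for degree in (1, 2, 3):
        for idx in combinations_with_replacement(range(X.shape[1]), degree):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

def fit_poly3(X, y):
    """Least-squares fit of a full third-order polynomial surface."""
    coef, *_ = np.linalg.lstsq(cubic_features(X), y, rcond=None)
    return coef

def predict_poly3(coef, X):
    return cubic_features(X) @ coef
```

With three inputs (CO2, PPF, canopy age) this yields 20 coefficients; scanning the fitted surface over candidate CO2/PPF setpoints for a given canopy age is how daily setpoint combinations maximizing predicted Pn could be selected.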

  8. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks.

    PubMed

    Manikandan, Narayanan; Subha, Srinivasan

    2016-01-01

The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. In particular, software developed for prediction-based results is always a big challenge for designers. Time series forecasting of currency exchange rates, stock prices, and weather is an area where extensive research has been ongoing for the last three decades. Initially, the problems of financial analysis and prediction were solved by statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and get accurate results in prediction of future trends and prices. This paper addresses some architectural design issues for performance improvement through vectorising the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive, hybrid methodology for predicting exchange rates. This framework is tested for the accuracy and performance of the parallel algorithms used.

  9. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks

    PubMed Central

    Manikandan, Narayanan; Subha, Srinivasan

    2016-01-01

The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. In particular, software developed for prediction-based results is always a big challenge for designers. Time series forecasting of currency exchange rates, stock prices, and weather is an area where extensive research has been ongoing for the last three decades. Initially, the problems of financial analysis and prediction were solved by statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and get accurate results in prediction of future trends and prices. This paper addresses some architectural design issues for performance improvement through vectorising the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive, hybrid methodology for predicting exchange rates. This framework is tested for the accuracy and performance of the parallel algorithms used. PMID:26881271

  10. Simulations of eddy kinetic energy transport in barotropic turbulence

    NASA Astrophysics Data System (ADS)

    Grooms, Ian

    2017-11-01

Eddy energy transport in rotating two-dimensional turbulence is investigated using numerical simulation. Stochastic forcing is used to generate an inhomogeneous field of turbulence and the time-mean energy profile is diagnosed. An advective-diffusive model for the transport is fit to the simulation data by requiring the model to accurately predict the observed time-mean energy distribution. Isotropic harmonic diffusion of energy is found to be an accurate model in the case of uniform, solid-body background rotation (the f plane), with a diffusivity that scales reasonably well with a mixing-length law κ ∝ Vℓ, where V and ℓ are characteristic eddy velocity and length scales. Passive tracer dynamics are added and it is found that the energy diffusivity is 75% of the tracer diffusivity. The addition of a differential background rotation with constant vorticity gradient β leads to significant changes to the energy transport. The eddies generate and interact with a mean flow that advects the eddy energy. Mean advection plus anisotropic diffusion (with reduced diffusivity in the direction of the background vorticity gradient) is moderately accurate for flows with scale separation between the eddies and mean flow, but anisotropic diffusion becomes a much less accurate model of the transport when scale separation breaks down. Finally, it is observed that the time-mean eddy energy does not look like the actual eddy energy distribution at any instant of time. In the future, stochastic models of the eddy energy transport may prove more useful than models of the mean transport for predicting realistic eddy energy distributions.
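Schematically, the advective-diffusive model fit to these simulations balances mean advection and (possibly anisotropic) diffusion of the time-mean eddy energy E against the stochastic forcing F, with a mixing-length closure for the diffusivity (a generic form of such models, written here for illustration rather than quoted from the paper):

```latex
\nabla \cdot \left( \bar{\mathbf{u}}\, E \right)
  = \nabla \cdot \left( \boldsymbol{\kappa}\, \nabla E \right) + F,
\qquad
\kappa \propto V \ell
```

On the f plane the diffusivity tensor reduces to an isotropic scalar κ; with a background vorticity gradient β, the fit requires anisotropic κ with a reduced component in the direction of the gradient.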

  11. Heavy dark matter annihilation from effective field theory.

    PubMed

    Ovanesyan, Grigory; Slatyer, Tracy R; Stewart, Iain W

    2015-05-29

    We formulate an effective field theory description for SU(2)_{L} triplet fermionic dark matter by combining nonrelativistic dark matter with gauge bosons in the soft-collinear effective theory. For a given dark matter mass, the annihilation cross section to line photons is obtained with 5% precision by simultaneously including Sommerfeld enhancement and the resummation of electroweak Sudakov logarithms at next-to-leading logarithmic order. Using these results, we present more accurate and precise predictions for the gamma-ray line signal from annihilation, updating both existing constraints and the reach of future experiments.

  12. Physiological and biochemical basis of clinical liver function tests: a review.

    PubMed

    Hoekstra, Lisette T; de Graaf, Wilmar; Nibourg, Geert A A; Heger, Michal; Bennink, Roelof J; Stieger, Bruno; van Gulik, Thomas M

    2013-01-01

    To review the literature on the most clinically relevant and novel liver function tests used for the assessment of hepatic function before liver surgery. Postoperative liver failure is the major cause of mortality and morbidity after partial liver resection and develops as a result of insufficient remnant liver function. Therefore, accurate preoperative assessment of the future remnant liver function is mandatory in the selection of candidates for safe partial liver resection. A MEDLINE search was performed using the key words "liver function tests," "functional studies in the liver," "compromised liver," "physiological basis," and "mechanistic background," with and without Boolean operators. Passive liver function tests, including biochemical parameters and clinical grading systems, are not accurate enough in predicting outcome after liver surgery. Dynamic quantitative liver function tests, such as the indocyanine green test and galactose elimination capacity, are more accurate as they measure the elimination process of a substance that is cleared and/or metabolized almost exclusively by the liver. However, these tests only measure global liver function. Nuclear imaging techniques ((99m)Tc-galactosyl serum albumin scintigraphy and (99m)Tc-mebrofenin hepatobiliary scintigraphy) can measure both total and future remnant liver function and potentially identify patients at risk for postresectional liver failure. Because of the complexity of liver function, one single test does not represent overall liver function. In addition to computed tomography volumetry, quantitative liver function tests should be used to determine whether a safe resection can be performed. Presently, (99m)Tc-mebrofenin hepatobiliary scintigraphy seems to be the most valuable quantitative liver function test, as it can measure multiple aspects of liver function in, specifically, the future remnant liver.
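For example, the indocyanine green (ICG) test assumes first-order plasma clearance of the dye; the commonly reported plasma disappearance rate (PDR, in %/min) and 15-minute retention (ICG-R15) follow directly from the fitted decay constant k (standard clinical definitions, shown for context):

```latex
C(t) = C_0\, e^{-k t},
\qquad
\mathrm{PDR} = 100\, k \ \left(\%/\mathrm{min}\right),
\qquad
R_{15} = 100\, e^{-15 k} \ \left(\%\right)
```

Because such tests aggregate clearance over the whole organ, they measure only global function, which is why regional techniques like (99m)Tc-mebrofenin hepatobiliary scintigraphy are needed to assess the future remnant liver specifically.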

  13. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    PubMed

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. 
For example, random-effects meta-analyses of the performance of the 'Gail 2' model showed the average C statistic was 0.63 (95% CI 0.59-0.67), and the expected/observed ratio of events varied considerably across studies (95% prediction interval for E/O ratio when the model was applied in practice was 0.75-1.19). There is a need for models with better predictive performance but, given the large amount of work already conducted, further improvement of existing models based on conventional risk factors is perhaps unlikely. Research to identify new risk factors with large additionally predictive ability is therefore needed, alongside clearer reporting and continual validation of new models as they develop.
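The two performance measures quoted above can be computed directly: the C statistic is the probability that a randomly chosen case was assigned a higher predicted risk than a randomly chosen non-case, and the E/O ratio compares total expected events to total observed events. A minimal sketch with hypothetical function names:

```python
def c_statistic(risk, outcome):
    """Concordance (C) statistic: probability that a randomly chosen
    case received a higher predicted risk than a randomly chosen
    non-case (0.5 = chance, 1.0 = perfect discrimination)."""
    cases = [r for r, o in zip(risk, outcome) if o]
    controls = [r for r, o in zip(risk, outcome) if not o]
    concordant = 0.0
    for c in cases:
        for n in controls:
            if c > n:
                concordant += 1.0
            elif c == n:
                concordant += 0.5   # ties count half
    return concordant / (len(cases) * len(controls))

def eo_ratio(risk, outcome):
    """Expected/observed ratio: total predicted events over observed events."""
    return sum(risk) / sum(outcome)
```

On this scale the 'Gail 2' average of 0.63 sits well below the 1.0 of a perfectly discriminating model, and an E/O prediction interval of 0.75-1.19 means the model may over- or under-predict total events by roughly 20% in a new population.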

  14. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  15. Measurement of Muon Neutrino Quasielastic Scattering on Carbon

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A. A.; Bazarko, A. O.; Brice, S. J.; Brown, B. C.; Bugel, L.; Cao, J.; Coney, L.; Conrad, J. M.; Cox, D. C.; Curioni, A.; Djurcic, Z.; Finley, D. A.; Fleming, B. T.; Ford, R.; Garcia, F. G.; Garvey, G. T.; Green, C.; Green, J. A.; Hart, T. L.; Hawker, E.; Imlay, R.; Johnson, R. A.; Kasper, P.; Katori, T.; Kobilarcik, T.; Kourbanis, I.; Koutsoliotas, S.; Laird, E. M.; Link, J. M.; Liu, Y.; Liu, Y.; Louis, W. C.; Mahn, K. B. M.; Marsh, W.; Martin, P. S.; McGregor, G.; Metcalf, W.; Meyers, P. D.; Mills, F.; Mills, G. B.; Monroe, J.; Moore, C. D.; Nelson, R. H.; Nienaber, P.; Ouedraogo, S.; Patterson, R. B.; Perevalov, D.; Polly, C. C.; Prebys, E.; Raaf, J. L.; Ray, H.; Roe, B. P.; Russell, A. D.; Sandberg, V.; Schirato, R.; Schmitz, D.; Shaevitz, M. H.; Shoemaker, F. C.; Smith, D.; Sorel, M.; Spentzouris, P.; Stancu, I.; Stefanski, R. J.; Sung, M.; Tanaka, H. A.; Tayloe, R.; Tzanov, M.; van de Water, R.; Wascko, M. O.; White, D. H.; Wilking, M. J.; Yang, H. J.; Zeller, G. P.; Zimmerman, E. D.

    2008-01-01

The observation of neutrino oscillations is clear evidence for physics beyond the standard model. To make precise measurements of this phenomenon, neutrino oscillation experiments, including MiniBooNE, require an accurate description of neutrino charged current quasielastic (CCQE) cross sections to predict signal samples. Using a high-statistics sample of νμ CCQE events, MiniBooNE finds that a simple Fermi gas model, with appropriate adjustments, accurately characterizes the CCQE events observed in a carbon-based detector. The extracted parameters include an effective axial mass, M_A^eff = 1.23 ± 0.20 GeV, that describes the four-momentum dependence of the axial-vector form factor of the nucleon, and a Pauli-suppression parameter, κ = 1.019 ± 0.011. Such a modified Fermi gas model may also be used by future accelerator-based experiments measuring neutrino oscillations on nuclear targets.
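The effective axial mass enters through the conventional dipole parameterization of the nucleon axial-vector form factor, in which M_A sets how quickly the form factor falls with four-momentum transfer squared Q²:

```latex
F_A(Q^2) = \frac{g_A}{\left(1 + Q^2 / M_A^2\right)^2},
\qquad
g_A = F_A(0) \approx 1.267
```

A larger extracted M_A^eff, as in the MiniBooNE carbon data, corresponds to a slower fall-off of the CCQE cross section with Q² than the free-nucleon value would give.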

  16. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Taxi-Out Time Prediction for Departures at Charlotte Airport Using Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong; Malik, Waqar; Jung, Yoon C.

    2016-01-01

    Predicting the taxi-out times of departures accurately is important for improving airport efficiency and takeoff time predictability. In this paper, we attempt to apply machine learning techniques to actual traffic data at Charlotte Douglas International Airport for taxi-out time prediction. To find the key factors affecting aircraft taxi times, surface surveillance data is first analyzed. From this analysis, several variables, including terminal concourse, spot, runway, departure fix, and weight class, are selected for taxi time prediction. Then, various machine learning methods, such as linear regression, support vector machines, k-nearest neighbors, random forests, and neural network models, are applied to actual flight data. Different traffic flow and weather conditions at Charlotte airport are also taken into account for more accurate prediction. The taxi-out time prediction results show that linear regression and random forest techniques provide the most accurate predictions in terms of root-mean-square error. We also discuss the operational complexity and uncertainties that make it difficult to predict the taxi times accurately.
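
    The comparison described above can be sketched with scikit-learn. The features and data below are illustrative stand-ins (a hypothetical queue length, runway index, and weight class), not the actual Charlotte surveillance variables:

```python
# Hedged sketch: comparing linear regression and random forest for
# taxi-out time prediction, evaluated by RMSE. All data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Illustrative predictors (stand-ins for concourse, runway, weight class, etc.)
queue = rng.integers(0, 20, n)       # departure queue length at pushback
runway = rng.integers(0, 3, n)       # runway index
weight = rng.integers(0, 4, n)       # aircraft weight class
X = np.column_stack([queue, runway, weight])
# Synthetic taxi-out time (minutes): baseline + queue delay + noise
y = 8.0 + 0.9 * queue + 1.5 * runway + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "linear": LinearRegression(),
    "forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
rmse = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse[name] = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
```

    Comparing the two entries of `rmse` on held-out data mirrors the paper's model comparison, with lower RMSE indicating a more accurate predictor.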

  18. Predicted and experienced affective responses to the outcome of the 2008 U.S. presidential election.

    PubMed

    Kitchens, Michael B; Corser, Grant C; Gohm, Carol L; VonWaldner, Kristen L; Foreman, Elizabeth L

    2010-12-01

    People typically have intense feelings about politics. It was therefore no surprise that the campaign and eventual election of Barack Obama were highly anticipated and emotionally charged events, making the election and the emotions experienced afterward a useful situation in which to replicate prior research showing that people typically overestimate the intensity and duration of their future affective states. Consequently, it was expected that both Obama supporters and McCain supporters might overestimate the intensity of their affective responses to the outcome of the election. The data showed that while McCain supporters underestimated how happy they would be following the election, Obama supporters accurately predicted how happy they would be. These data provide descriptive information on the accuracy of people's predicted reactions to the 2008 U.S. presidential election. The findings are discussed in the context of the broader literature and this specific and unique event.

  19. Arthroplasty Utilization in the United States is Predicted by Age-Specific Population Groups.

    PubMed

    Bashinskaya, Bronislava; Zimmerman, Ryan M; Walcott, Brian P; Antoci, Valentin

    2012-01-01

    Osteoarthritis is a common indication for hip and knee arthroplasty. An accurate assessment of current trends in healthcare utilization as they relate to arthroplasty may predict the needs of a growing elderly population in the United States. First, incidence data were queried from the United States Nationwide Inpatient Sample from 1993 to 2009, and patients undergoing total knee and hip arthroplasty were identified. The United States Census Bureau was then queried for population data from the same study period as well as for future projections. Arthroplasty utilization followed linear regression models against the population group >64 years for both the hip and knee groups. Based on these models, procedure incidence in the year 2050 was projected to be 1,859,553 cases (hip) and 4,174,554 cases (knee). Given population growth predictions, the need for hip and knee arthroplasty is expected to grow significantly in the upcoming years.
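
    The projection method described, fitting a linear regression of procedure volume against the >64 population and extrapolating to a future census estimate, can be sketched as follows. All numbers below are invented for illustration; they are not the study's data:

```python
# Illustrative sketch of a population-driven projection: fit a linear
# model of annual arthroplasty volume against the >64 population, then
# project to a hypothetical 2050 census estimate. Numbers are made up.
import numpy as np

# Hypothetical series: population >64 (millions) and knee arthroplasties
pop_over_64 = np.array([33.0, 35.0, 37.5, 40.0, 43.0])
knee_cases = np.array([250_000, 310_000, 390_000, 460_000, 550_000])

# Least-squares linear fit: cases = slope * population + intercept
slope, intercept = np.polyfit(pop_over_64, knee_cases, 1)

# Project with a hypothetical 2050 estimate of 83.7 million people >64
projected_2050 = slope * 83.7 + intercept
```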

  20. Pollen dispersal slows geographical range shift and accelerates ecological niche shift under climate change

    PubMed Central

    Aguilée, Robin; Raoul, Gaël; Rousset, François; Ronce, Ophélie

    2016-01-01

    Species may survive climate change by migrating to track favorable climates and/or adapting to different climates. Several quantitative genetics models predict that species escaping extinction will change their geographical distribution while keeping the same ecological niche. We introduce pollen dispersal in these models, which affects gene flow but not directly colonization. We show that plant populations may escape extinction because of both spatial range and ecological niche shifts. Exact analytical formulas predict that increasing pollen dispersal distance slows the expected spatial range shift and accelerates the ecological niche shift. There is an optimal distance of pollen dispersal, which maximizes the sustainable rate of climate change. These conclusions hold in simulations relaxing several strong assumptions of our analytical model. Our results imply that, for plants with long distance of pollen dispersal, models assuming niche conservatism may not accurately predict their future distribution under climate change. PMID:27621443

  1. Pollen dispersal slows geographical range shift and accelerates ecological niche shift under climate change.

    PubMed

    Aguilée, Robin; Raoul, Gaël; Rousset, François; Ronce, Ophélie

    2016-09-27

    Species may survive climate change by migrating to track favorable climates and/or adapting to different climates. Several quantitative genetics models predict that species escaping extinction will change their geographical distribution while keeping the same ecological niche. We introduce pollen dispersal in these models, which affects gene flow but not directly colonization. We show that plant populations may escape extinction because of both spatial range and ecological niche shifts. Exact analytical formulas predict that increasing pollen dispersal distance slows the expected spatial range shift and accelerates the ecological niche shift. There is an optimal distance of pollen dispersal, which maximizes the sustainable rate of climate change. These conclusions hold in simulations relaxing several strong assumptions of our analytical model. Our results imply that, for plants with long distance of pollen dispersal, models assuming niche conservatism may not accurately predict their future distribution under climate change.

  2. Exploring the knowledge behind predictions in everyday cognition: an iterated learning study.

    PubMed

    Stephens, Rachel G; Dunn, John C; Rao, Li-Lin; Li, Shu

    2015-10-01

    Making accurate predictions about events is an important but difficult task. Recent work suggests that people are adept at this task, making predictions that reflect surprisingly accurate knowledge of the distributions of real quantities. Across three experiments, we used an iterated learning procedure to explore the basis of this knowledge: to what extent is domain experience critical to accurate predictions and how accurate are people when faced with unfamiliar domains? In Experiment 1, two groups of participants, one resident in Australia, the other in China, predicted the values of quantities familiar to both (movie run-times), unfamiliar to both (the lengths of Pharaoh reigns), and familiar to one but unfamiliar to the other (cake baking durations and the lengths of Beijing bus routes). While predictions from both groups were reasonably accurate overall, predictions were inaccurate in the selectively unfamiliar domains and, surprisingly, predictions by the China-resident group were also inaccurate for a highly familiar domain: local bus route lengths. Focusing on bus routes, two follow-up experiments with Australia-resident groups clarified the knowledge and strategies that people draw upon, plus important determinants of accurate predictions. For unfamiliar domains, people appear to rely on extrapolating from (not simply directly applying) related knowledge. However, we show that people's predictions are subject to two sources of error: in the estimation of quantities in a familiar domain and extension to plausible values in an unfamiliar domain. We propose that the key to successful predictions is not simply domain experience itself, but explicit experience of relevant quantities.

  3. Current State and Future Perspectives in QSAR Models to Predict Blood-Brain Barrier Penetration in Central Nervous System Drug R&D.

    PubMed

    Morales, Juan F; Montoto, Sebastian Scioli; Fagiolino, Pietro; Ruiz, Maria E

    2017-01-01

    The Blood-Brain Barrier (BBB) is a physical and biochemical barrier that restricts the entry of certain drugs to the Central Nervous System (CNS), while allowing the passage of others. The ability to predict the permeability of a given molecule through the BBB is a key aspect in CNS drug discovery and development, since neurotherapeutic agents with molecular targets in the CNS should be able to cross the BBB, whereas peripherally acting agents should not, to minimize the risk of CNS adverse effects. In this review we examine and discuss QSAR approaches and current availability of experimental data for the construction of BBB permeability predictive models, focusing on the modeling of the biorelevant parameter unbound partitioning coefficient (Kp,uu). Emphasis is made on two possible strategies to overcome the current limitations of in silico models: considering the prediction of brain penetration as a multifactorial problem, and increasing experimental datasets through accurate and standardized experimental techniques.
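
    As a hedged illustration of the QSAR approach discussed, the sketch below fits a regularized linear model relating common molecular descriptors to a brain-penetration parameter. The descriptor values, target, and coefficients are entirely synthetic, chosen only to loosely mimic known trends (high polar surface area and many H-bond donors hinder penetration); this is not a real BBB dataset or a validated model:

```python
# Minimal QSAR-style sketch: ridge regression from molecular descriptors
# to a synthetic brain-penetration parameter (e.g., a log Kp,uu proxy).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 200
# Common QSAR descriptors: molecular weight, logP, polar surface area, H-bond donors
mw = rng.uniform(150, 600, n)
logp = rng.uniform(-2, 6, n)
tpsa = rng.uniform(20, 140, n)
hbd = rng.integers(0, 6, n)
X = np.column_stack([mw, logp, tpsa, hbd])
# Synthetic target loosely mimicking known structure-penetration trends
y = 0.3 * logp - 0.02 * tpsa - 0.2 * hbd - 0.001 * mw + rng.normal(0, 0.3, n)

model = Ridge(alpha=1.0).fit(X, y)
r2 = r2_score(y, model.predict(X))
```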

  4. Predicting Long-Term Cognitive Outcome Following Breast Cancer with Pre-Treatment Resting State fMRI and Random Forest Machine Learning.

    PubMed

    Kesler, Shelli R; Rao, Arvind; Blayney, Douglas W; Oakley-Girvan, Ingrid A; Karuturi, Meghan; Palesh, Oxana

    2017-01-01

    We aimed to determine if resting state functional magnetic resonance imaging (fMRI) acquired at pre-treatment baseline could accurately predict breast cancer-related cognitive impairment at long-term follow-up. We evaluated 31 patients with breast cancer (age 34-65) prior to any treatment, post-chemotherapy and 1 year later. Cognitive testing scores were normalized based on data obtained from 43 healthy female controls and then used to categorize patients as impaired or not based on longitudinal changes. We measured clustering coefficient, a measure of local connectivity, by applying graph theory to baseline resting state fMRI and entered these metrics along with relevant patient-related and medical variables into random forest classification. Incidence of cognitive impairment at 1 year follow-up was 55% and was predicted by classification algorithms with up to 100% accuracy (p < 0.0001). The neuroimaging-based model was significantly more accurate than a model involving patient-related and medical variables (p = 0.005). Hub regions belonging to several distinct functional networks were the most important predictors of cognitive outcome. Characteristics of these hubs indicated potential spread of brain injury from default mode to other networks over time. These findings suggest that resting state fMRI is a promising tool for predicting future cognitive impairment associated with breast cancer. This information could inform treatment decision making by identifying patients at highest risk for long-term cognitive impairment.
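
    A minimal sketch of this kind of pipeline, computing per-node clustering coefficients from a connectivity graph and feeding them (with a clinical covariate) to a random forest classifier, might look as follows. The graphs, covariate, and labels are random stand-ins, not real resting-state fMRI data, and the class structure is artificial:

```python
# Hedged sketch: graph-theoretic features (clustering coefficients) plus
# a covariate, classified with a random forest. All inputs are synthetic.
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_patients, n_regions = 30, 20

features, labels = [], []
for i in range(n_patients):
    # Stand-in for a thresholded functional connectivity network
    g = nx.erdos_renyi_graph(n_regions, p=0.2 + 0.02 * (i % 2), seed=i)
    cc = nx.clustering(g)                 # per-node clustering coefficient
    age = rng.uniform(34, 65)             # example clinical covariate
    features.append([cc[node] for node in range(n_regions)] + [age])
    labels.append(i % 2)                  # impaired vs. not (synthetic)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features, labels)
train_acc = clf.score(features, labels)
```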

  5. Predicting Long-Term Cognitive Outcome Following Breast Cancer with Pre-Treatment Resting State fMRI and Random Forest Machine Learning

    PubMed Central

    Kesler, Shelli R.; Rao, Arvind; Blayney, Douglas W.; Oakley-Girvan, Ingrid A.; Karuturi, Meghan; Palesh, Oxana

    2017-01-01

    We aimed to determine if resting state functional magnetic resonance imaging (fMRI) acquired at pre-treatment baseline could accurately predict breast cancer-related cognitive impairment at long-term follow-up. We evaluated 31 patients with breast cancer (age 34–65) prior to any treatment, post-chemotherapy and 1 year later. Cognitive testing scores were normalized based on data obtained from 43 healthy female controls and then used to categorize patients as impaired or not based on longitudinal changes. We measured clustering coefficient, a measure of local connectivity, by applying graph theory to baseline resting state fMRI and entered these metrics along with relevant patient-related and medical variables into random forest classification. Incidence of cognitive impairment at 1 year follow-up was 55% and was predicted by classification algorithms with up to 100% accuracy (p < 0.0001). The neuroimaging-based model was significantly more accurate than a model involving patient-related and medical variables (p = 0.005). Hub regions belonging to several distinct functional networks were the most important predictors of cognitive outcome. Characteristics of these hubs indicated potential spread of brain injury from default mode to other networks over time. These findings suggest that resting state fMRI is a promising tool for predicting future cognitive impairment associated with breast cancer. This information could inform treatment decision making by identifying patients at highest risk for long-term cognitive impairment. PMID:29187817

  6. Life prediction technologies for aeronautical propulsion systems

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.

    1990-01-01

    Fatigue and fracture problems continue to occur in aeronautical gas turbine engines. Components whose useful life is limited by these failure modes include turbine hot-section blades, vanes, and disks. Safety considerations dictate that catastrophic failures be avoided, while economic considerations dictate that noncatastrophic failures occur as infrequently as possible. The design decision is therefore a tradeoff between engine performance and durability. LeRC has contributed to the aeropropulsion industry in the area of life prediction technology for over 30 years, developing creep and fatigue life prediction methodologies for hot-section materials. At present, emphasis is being placed on the development of methods capable of handling both thermal and mechanical fatigue under severe environments. Recent accomplishments include the development of more accurate creep-fatigue life prediction methods, such as the total strain version of LeRC's strain-range partitioning (SRP) and the HOST-developed cyclic damage accumulation (CDA) model. Another example is a more accurate cumulative fatigue damage rule, the double damage curve approach (DDCA), which provides greatly improved accuracy compared with the usual cumulative fatigue design rules. Accomplishments in the area of high-temperature fatigue crack growth are also noted. Finally, looking to the future, research is beginning on the advanced methods that will be required for the development of advanced materials and propulsion systems over the next 10-20 years.

  7. Ensemble positive unlabeled learning for disease gene identification.

    PubMed

    Yang, Peng; Li, Xiaoli; Chua, Hon-Nian; Kwoh, Chee-Keong; Ng, See-Kiong

    2014-01-01

    An increasing number of genes have been experimentally confirmed in recent years as causative genes of various human diseases. This newly available knowledge can be exploited by machine learning methods to discover additional unknown genes that are likely to be associated with diseases. In particular, positive unlabeled learning (PU learning) methods, which require only a positive training set P (confirmed disease genes) and an unlabeled set U (the unknown candidate genes) instead of a negative training set N, have been shown to be effective in uncovering new disease genes in this scenario. However, using only a single source of data for prediction is susceptible to bias due to incompleteness and noise in the genomic data, and a single machine learning predictor is prone to bias caused by the inherent limitations of individual methods. In this paper, we propose an effective PU learning framework that integrates multiple biological data sources and an ensemble of powerful machine learning classifiers for disease gene identification. Our proposed method integrates data from multiple biological sources for training PU learning classifiers. A novel ensemble-based PU learning method, EPU, is then used to integrate multiple PU learning classifiers to achieve accurate and robust disease gene predictions. Our evaluation experiments across six disease groups showed that EPU achieved significantly better results compared with various state-of-the-art prediction methods as well as ensemble learning classifiers. By integrating multiple biological data sources for training and the outputs of an ensemble of PU learning classifiers for prediction, we are able to minimize the potential bias and errors in individual data sources and machine learning algorithms and achieve more accurate and robust disease gene predictions. 
In the future, our EPU method provides an effective framework to integrate the additional biological and computational resources for better disease gene predictions.
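
    A generic "bagging PU" sketch conveys the flavor of ensemble PU learning (this is not the authors' exact EPU algorithm): repeatedly treat a random subsample of the unlabeled set U as pseudo-negatives, train a base classifier against the positives P, and average the scores each unlabeled item receives when held out. All "gene" feature vectors below are synthetic:

```python
# Bagging-style PU learning sketch on synthetic data: hidden positives
# in U should accumulate higher average held-out scores than the rest.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Synthetic feature vectors: positives cluster away from most of U
P = rng.normal(loc=2.0, size=(40, 5))               # confirmed disease genes
U = np.vstack([rng.normal(loc=2.0, size=(10, 5)),   # hidden positives
               rng.normal(loc=-1.0, size=(90, 5))]) # likely negatives

scores = np.zeros(len(U))
counts = np.zeros(len(U))
for t in range(50):
    idx = rng.choice(len(U), size=len(P), replace=False)  # pseudo-negatives
    X = np.vstack([P, U[idx]])
    y = np.r_[np.ones(len(P)), np.zeros(len(P))]
    clf = DecisionTreeClassifier(max_depth=3, random_state=t).fit(X, y)
    held_out = np.setdiff1d(np.arange(len(U)), idx)
    scores[held_out] += clf.predict_proba(U[held_out])[:, 1]
    counts[held_out] += 1

avg_score = scores / np.maximum(counts, 1)
# The first 10 rows of U (hidden positives) should rank highest
```

    Averaging over many random pseudo-negative draws is what makes the ensemble robust: no single mislabeled draw dominates the final ranking.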

  8. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems

    PubMed Central

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy for the near future. However, solar energy and other renewable energy sources share a drawback: the difficulty of predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in combination with other resources, so reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 measured meteorological and solar radiation variables to build the model. The experimentation and prediction results are detailed: the accuracy of the prediction, in terms of root mean square error, was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% using the first model. These results demonstrate good modeling accuracy for the second model, despite the training and testing of the proposed models being carried out using spatially and temporally independent data. PMID:28806754

  9. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems.

    PubMed

    Almaraashi, Majid

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy for the near future. However, solar energy and other renewable energy sources share a drawback: the difficulty of predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in combination with other resources, so reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 measured meteorological and solar radiation variables to build the model. The experimentation and prediction results are detailed: the accuracy of the prediction, in terms of root mean square error, was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% using the first model. These results demonstrate good modeling accuracy for the second model, despite the training and testing of the proposed models being carried out using spatially and temporally independent data.
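
    As a much-simplified, hypothetical stand-in for the second model, the sketch below uses simulated annealing (SciPy's `dual_annealing`) to tune a small next-day GHI predictor against previous-day observations. The paper's fuzzy-logic machinery is replaced here by a three-parameter linear predictor, and the data are synthetic:

```python
# Hedged sketch: tune predictor parameters by simulated annealing to
# minimize next-day GHI prediction RMSE. Data and model are synthetic.
import numpy as np
from scipy.optimize import dual_annealing

rng = np.random.default_rng(0)
days = 200
prev_ghi = rng.uniform(3.0, 8.0, days)   # previous-day GHI (kWh/m^2)
cloud = rng.uniform(0.0, 1.0, days)      # previous-day cloud fraction
next_ghi = 0.8 * prev_ghi - 2.0 * cloud + 1.5 + rng.normal(0, 0.2, days)

def rmse(params):
    a, b, c = params
    pred = a * prev_ghi + b * cloud + c
    return float(np.sqrt(np.mean((pred - next_ghi) ** 2)))

result = dual_annealing(rmse, bounds=[(-5, 5)] * 3, maxiter=200)
best_rmse = result.fun
```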

  10. Monitoring and regulation of learning in medical education: the need for predictive cues.

    PubMed

    de Bruin, Anique B H; Dunlosky, John; Cavalcanti, Rodrigo B

    2017-06-01

    Being able to accurately monitor learning activities is a key element in self-regulated learning in all settings, including medical schools. Yet students' ability to monitor their progress is often limited, leading to inefficient use of study time. Interventions that improve the accuracy of students' monitoring can optimise self-regulated learning, leading to higher achievement. This paper reviews findings from cognitive psychology and explores potential applications in medical education, as well as areas for future research. Effective monitoring depends on students' ability to generate information ('cues') that accurately reflects their knowledge and skills. The ability of these 'cues' to predict achievement is referred to as 'cue diagnosticity'. Interventions that improve the ability of students to elicit predictive cues typically fall into two categories: (i) self-generation of cues and (ii) generation of cues that is delayed after self-study. Providing feedback and support is useful when cues are predictive but may be too complex to be readily used. Limited evidence exists about interventions to improve the accuracy of self-monitoring among medical students or trainees. Developing interventions that foster use of predictive cues can enhance the accuracy of self-monitoring, thereby improving self-study and clinical reasoning. First, insight should be gained into the characteristics of predictive cues used by medical students and trainees. Next, predictive cue prompts should be designed and tested to improve monitoring and regulation of learning. Finally, the use of predictive cues should be explored in relation to teaching and learning clinical reasoning. Improving self-regulated learning is important to help medical students and trainees efficiently acquire knowledge and skills necessary for clinical practice. Interventions that help students generate and use predictive cues hold the promise of improved self-regulated learning and achievement. 
This framework is applicable to learning in several areas, including the development of clinical reasoning. © 2017 The Authors Medical Education published by Association for the Study of Medical Education and John Wiley & Sons Ltd.

  11. Future fundamental combustion research for aeropropulsion systems

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.

    1985-01-01

    Physical fluid mechanics, heat transfer, and chemical kinetic processes occurring in the combustion chamber of aeropropulsion systems were investigated. With component requirements becoming more severe for future engines, current design methodology needs new tools to obtain the optimum configuration in a reasonable design and development cycle. Research efforts in the last few years have been encouraging, but to achieve these benefits, research into the fundamental aerothermodynamic processes of combustion is required. It is recommended that research continue in the areas of flame stabilization, combustor aerodynamics, heat transfer, multiphase flow and atomization, turbulent reacting flows, and chemical kinetics. Associated with each of these engineering sciences is the need for research into computational methods to accurately describe and predict these complex physical processes. Research needs in each of these areas are highlighted.

  12. The Uniform Pattern of Growth and Skeletal Maturation during the Human Adolescent Growth Spurt.

    PubMed

    Sanders, James O; Qiu, Xing; Lu, Xiang; Duren, Dana L; Liu, Raymond W; Dang, Debbie; Menendez, Mariano E; Hans, Sarah D; Weber, David R; Cooperman, Daniel R

    2017-12-01

    Humans are one of the few species undergoing an adolescent growth spurt. Because children enter the spurt at different ages, making age a poor maturity measure, longitudinal studies are necessary to identify growth patterns and commonalities in adolescent growth. The standard maturity determinant, the timing of peak height velocity (PHV), is difficult to estimate in individuals due to diurnal, postural, and measurement variation. Using prospective longitudinal cohorts of healthy children from two North American populations, we compared the timing of the adolescent growth spurt's peak height velocity to normalized heights and hand skeletal maturity radiographs. We found that in healthy children, the adolescent growth spurt is standardized at 90% of final height, with similar patterns for children of both sexes beginning at the initiation of the growth spurt. Once children enter the growth spurt, their growth pattern is consistent across children, with peak growth at 90% of final height and skeletal maturity closely reflecting growth remaining. The ability to use 90% of final height as an easily identified maturity standard, with its close relationship to skeletal maturity, represents a significant advance, allowing accurate prediction of future growth for individual children and accurate maturity comparisons for future studies of children's growth.
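
    The paper's central finding yields a simple worked calculation: if peak height velocity occurs at 90% of final height, a child's height at PHV predicts adult stature directly. The height below is illustrative, not from the study:

```python
# Worked example of the 90%-of-final-height standard (illustrative value).
height_at_phv_cm = 153.0      # measured height at peak height velocity
fraction_of_final = 0.90      # the paper's uniform maturity standard

predicted_final_cm = height_at_phv_cm / fraction_of_final   # 170.0 cm
growth_remaining_cm = predicted_final_cm - height_at_phv_cm  # 17.0 cm
```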

  13. The DSCOVR Solar Wind Mission and Future Space Weather Products

    NASA Astrophysics Data System (ADS)

    Cash, M. D.; Biesecker, D. A.; Reinard, A. A.

    2012-12-01

    The Deep Space Climate Observatory (DSCOVR) mission, scheduled for launch in mid-2014, will provide real-time solar wind thermal plasma and magnetic measurements to ensure continuous monitoring for space weather forecasting. DSCOVR will orbit L1 and will serve as a follow-on mission to NASA's Advanced Composition Explorer (ACE), which was launched in 1997. DSCOVR will have a total of six instruments, two of which will provide real-time data necessary for space weather forecasting: a Faraday cup to measure the proton and alpha components of the solar wind, and a triaxial fluxgate magnetometer to measure the magnetic field in three dimensions. Real-time data provided by DSCOVR will include Vx, Vy, Vz, n, T, Bx, By, and Bz. Such real-time L1 data is used in generating space weather applications and products that have been demonstrated to be highly accurate and provide actionable information for customers. We evaluate current space weather products driven by ACE and discuss future products under development for DSCOVR. New space weather products under consideration include: automated shock detection, more accurate L1 to Earth delay time, and prediction of rotations in solar wind Bz within magnetic clouds. Suggestions from the community on product ideas are welcome.
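
    One product mentioned above, automated shock detection, could in principle flag simultaneous sharp jumps in solar wind speed and density. The sketch below is a hypothetical illustration with made-up thresholds and synthetic data, not an operational NOAA algorithm:

```python
# Hypothetical shock-detection sketch: flag samples where both solar
# wind speed and proton density jump above thresholds. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
minutes = 120
speed = 400 + rng.normal(0, 5, minutes)     # km/s
density = 5 + rng.normal(0, 0.5, minutes)   # protons/cm^3
# Inject a synthetic interplanetary shock at t = 60
speed[60:] += 120
density[60:] += 8

def detect_shocks(v, n, dv=50.0, dn=3.0):
    """Indices where both speed and density jump above the thresholds."""
    jumps_v = np.diff(v) > dv
    jumps_n = np.diff(n) > dn
    return np.flatnonzero(jumps_v & jumps_n) + 1

shock_times = detect_shocks(speed, density)
```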

  14. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    PubMed Central

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community-level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. 
We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account for the effects of future climate change on parasitic infection. PMID:23951194

  15. Modeling of Sensor Placement Strategy for Shape Sensing and Structural Health Monitoring of a Wing-Shaped Sandwich Panel Using Inverse Finite Element Method.

    PubMed

    Kefal, Adnan; Yildiz, Mehmet

    2017-11-30

    This paper investigated the effect of sensor density and alignment for three-dimensional shape sensing of an airplane-wing-shaped thick panel subjected to three different loading conditions, i.e., bending, torsion, and membrane loads. For shape sensing analysis of the panel, the Inverse Finite Element Method (iFEM) was used together with the Refined Zigzag Theory (RZT), in order to enable accurate predictions for transverse deflection and through-the-thickness variation of interfacial displacements. In this study, the iFEM-RZT algorithm is implemented by utilizing a novel three-node C0-continuous inverse-shell element, known as i3-RZT. The discrete strain data is generated numerically by performing a high-fidelity finite element analysis on the wing-shaped panel. This numerical strain data represents experimental strain readings obtained from surface-patched strain gauges or embedded fiber Bragg grating (FBG) sensors. Three different sensor placement configurations with varying density and alignment of strain data were examined and their corresponding displacement contours were compared with those of reference solutions. The results indicate that a sparse distribution of FBG sensors (uniaxial strain measurements), aligned in only the longitudinal direction, is sufficient for predicting accurate full-field membrane and bending responses (deformed shapes) of the panel, including a true zigzag representation of interfacial displacements. On the other hand, a sparse deployment of strain rosettes (triaxial strain measurements) is sufficient to produce torsion shapes that are as accurate as those predicted by a dense sensor placement configuration. Hence, the potential applicability and practical aspects of the i3-RZT/iFEM methodology are demonstrated for three-dimensional shape sensing of future aerospace structures.

  16. Dynamic Load Predictions for Launchers Using Extra-Large Eddy Simulations X-Les

    NASA Astrophysics Data System (ADS)

    Maseland, J. E. J.; Soemarwoto, B. I.; Kok, J. C.

    2005-02-01

    Flow-induced unsteady loads can have a strong impact on the performance and flight characteristics of aerospace vehicles and therefore play a crucial role in their design and operation. Complementary to costly flight tests and delicate wind-tunnel experiments, unsteady loads can be calculated using time-accurate Computational Fluid Dynamics. A capability to accurately predict the dynamic loads on aerospace structures at flight Reynolds numbers can be of great value for the design and analysis of aerospace vehicles. Advanced space launchers are subject to dynamic loads in the base region during the ascent to space. In particular, the engine and nozzle experience aerodynamic pressure fluctuations resulting from massive flow separations. Understanding these phenomena is essential for performance enhancements of future launchers, which will operate larger nozzles. A new hybrid RANS-LES turbulence modelling approach termed eXtra-Large Eddy Simulations (X-LES) holds the promise of capturing the flow structures associated with massive separations, enabling prediction of the broad-band spectrum of dynamic loads. Such methods, which reduce the cost of full LES, have become a focal point, driven by the demand for applicability in an industrial environment. The industrial feasibility of X-LES simulations is demonstrated by computing the unsteady aerodynamic loads on the main-engine nozzle of a generic space launcher configuration. The potential to calculate the dynamic loads is qualitatively assessed for transonic flow conditions in a comparison to wind-tunnel experiments. In terms of turn-around times, X-LES computations are already feasible within the time frames of the development process to support structural design. Key words: massive separated flows; buffet loads; nozzle vibrations; space launchers; time-accurate CFD; composite RANS-LES formulation.

  17. Comparison of numerical weather prediction based deterministic and probabilistic wind resource assessment methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Draxl, Caroline; Hopson, Thomas

    Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate-resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.
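
    The analog-ensemble step in method (ii) can be sketched in a few lines of Python. This is a toy illustration with synthetic numbers, not the authors' implementation: archived coarse forecasts most similar to the current one are found, and their matched observations form the ensemble.

```python
def analog_ensemble(history, current_fcst, n_analogs=5):
    """history: (coarse_forecast, observed_wind) pairs from a training archive."""
    # rank archived forecasts by similarity to the current coarse forecast
    ranked = sorted(history, key=lambda h: abs(h[0] - current_fcst))
    ensemble = [obs for _, obs in ranked[:n_analogs]]
    mean = sum(ensemble) / len(ensemble)
    return mean, ensemble  # deterministic estimate + probabilistic spread

# Synthetic archive: observations run roughly 1.2x the coarse forecast.
history = [(f, 1.2 * f + 0.1 * ((f * 7) % 3 - 1))
           for f in [3.0, 4.5, 5.0, 5.5, 6.0, 7.2, 8.0, 9.5]]
mean, ens = analog_ensemble(history, current_fcst=5.4)
```

    The ensemble spread gives the probabilistic assessment, while its mean serves as the downscaled deterministic estimate, all at negligible computational cost.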

  18. Relationships Between the External and Internal Training Load in Professional Soccer: What Can We Learn From Machine Learning?

    PubMed

    Jaspers, Arne; De Beéck, Tim Op; Brink, Michel S; Frencken, Wouter G P; Staes, Filip; Davis, Jesse J; Helsen, Werner F

    2018-05-01

    Machine learning may contribute to understanding the relationship between the external load and internal load in professional soccer. Therefore, the relationship between external load indicators (ELIs) and the rating of perceived exertion (RPE) was examined using machine learning techniques on a group and individual level. Training data were collected from 38 professional soccer players over 2 seasons. The external load was measured using global positioning system technology and accelerometry. The internal load was obtained using the RPE. Predictive models were constructed using 2 machine learning techniques, artificial neural networks and least absolute shrinkage and selection operator (LASSO) models, and 1 naive baseline method. The predictions were based on a large set of ELIs. Using each technique, 1 group model involving all players and 1 individual model for each player were constructed. These models' performance on predicting the reported RPE values for future training sessions was compared with the naive baseline's performance. Both the artificial neural network and LASSO models outperformed the baseline. In addition, the LASSO model made more accurate predictions for the RPE than did the artificial neural network model. Furthermore, decelerations were identified as important ELIs. Regardless of the applied machine learning technique, the group models resulted in equivalent or better predictions for the reported RPE values than the individual models. Machine learning techniques may have added value in predicting RPE for future sessions to optimize training design and evaluation. These techniques may also be used in conjunction with expert knowledge to select key ELIs for load monitoring.
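
    The LASSO technique named above can be illustrated with a minimal coordinate-descent sketch (not the authors' code; the session data and the "irrelevant indicator" column are synthetic). The soft-thresholding step is what lets LASSO zero out weak predictors and so select a small set of key ELIs:

```python
import random

def lasso_fit(X, y, lam=0.05, iters=200):
    """Coordinate descent with soft-thresholding (columns assumed ~standardized)."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # residual with feature j's contribution removed
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # soft-threshold: small correlations are driven to exactly zero
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w

# Toy sessions: RPE driven by two load indicators plus one carrying no signal.
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [2.0 * x[0] + 1.5 * x[1] for x in X]  # third column is irrelevant
w = lasso_fit(X, y)
```

    On this toy data the fitted weights recover the two informative indicators while the irrelevant one is shrunk to (near) zero.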

  19. Affective forecasting: an unrecognized challenge in making serious health decisions.

    PubMed

    Halpern, Jodi; Arnold, Robert M

    2008-10-01

    Patients facing medical decisions that will impact quality of life make assumptions about how they will adjust emotionally to living with health declines and disability. Despite abundant research on decision-making, we have no direct research on how accurately patients envision their future well-being and how this influences their decisions. Outside medicine, psychological research on "affective forecasting" consistently shows that people poorly predict their future ability to adapt to adversity. This finding is important for medicine, since many serious health decisions hinge on quality-of-life judgments. We describe three specific mechanisms for affective forecasting errors that may influence health decisions: focalism, in which people focus more on what will change than on what will stay the same; immune neglect, in which they fail to envision how their own coping skills will lessen their unhappiness; and failure to predict adaptation, in which people fail to envision shifts in what they value. We discuss emotional and social factors that interact with these cognitive biases. We describe how caregivers can recognize these biases in the clinical setting and suggest interventions to help patients recognize and address affective forecasting errors.

  20. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  1. TANDI: threat assessment of network data and information

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh Jay; Sudit, Moises

    2006-04-01

    Current practice for combating cyber attacks typically uses Intrusion Detection Sensors (IDSs) to passively detect and block multi-stage attacks. This work leverages Level-2 fusion that correlates IDS alerts belonging to the same attacker, and proposes a threat assessment algorithm to predict potential future attacker actions. The algorithm, TANDI, reduces the problem complexity by separating the models of the attacker's capability and opportunity, and fuses the two to determine the attacker's intent. Unlike traditional Bayesian-based approaches, which require assigning a large number of edge probabilities, the proposed Level-3 fusion procedure uses only 4 parameters. TANDI has been implemented and tested with randomly created attack sequences. The results demonstrate that TANDI predicts future attack actions accurately as long as the attack is not part of a coordinated attack and contains no insider threats. In the presence of abnormal attack events, TANDI will alarm the network analyst for further analysis. The attempt to evaluate a threat assessment algorithm via simulation is the first in the literature, and shall open up a new avenue in the area of high level fusion.

  2. A time series based sequence prediction algorithm to detect activities of daily living in smart home.

    PubMed

    Marufuzzaman, M; Reaz, M B I; Ali, M A M; Rahman, L F

    2015-01-01

    The goal of smart homes is to create an intelligent environment that adapts to the inhabitants' needs and assists people who need special care and safety in their daily lives. This can be achieved by collecting ADL (activities of daily living) data and analyzing them within existing computing elements. In this research, a recent algorithm named sequence prediction via enhanced episode discovery (SPEED) is modified to include a time component in order to improve accuracy. The modified SPEED, or M-SPEED, is a sequence prediction algorithm that extends SPEED by using the time duration of appliances' ON-OFF states to decide the next state. M-SPEED discovered periodic episodes of inhabitant behavior, trained on the learned episodes, and made decisions based on the obtained knowledge. The results showed that M-SPEED achieves 96.8% prediction accuracy, which is better than other time prediction algorithms such as PUBS, ALZ with temporal rules, and the previous SPEED. Since human behavior shows natural temporal patterns, duration times can be used to predict future events more accurately. This inhabitant activity prediction system will certainly improve smart homes by ensuring safety and better care for elderly and handicapped people.
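
    The core idea behind SPEED-style prediction — count how often each event follows a given context, then predict the most frequent successor — can be sketched as below. The published M-SPEED uses ON/OFF duration times directly; this toy version simply tags each event with a duration bucket, an assumption made for illustration:

```python
from collections import defaultdict

class SequencePredictor:
    """Predict the next home event as the most frequent successor of a context."""

    def __init__(self, order=2):
        self.order = order
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, events):
        # events: list of (appliance_state, duration_bucket) tuples
        for i in range(self.order, len(events)):
            ctx = tuple(events[i - self.order:i])
            self.counts[ctx][events[i]] += 1

    def predict(self, recent):
        successors = self.counts.get(tuple(recent[-self.order:]))
        if not successors:
            return None  # unseen context
        return max(successors, key=successors.get)

# Toy routine repeated over many evenings.
day = [("lamp_on", "long"), ("tv_on", "long"),
       ("tv_off", "short"), ("lamp_off", "short")] * 10
p = SequencePredictor(order=2)
p.train(day)
nxt = p.predict([("lamp_on", "long"), ("tv_on", "long")])  # → ("tv_off", "short")
```

    Conditioning the context on duration buckets is what lets temporal regularities (e.g., a long lamp-on period preceding TV use) sharpen the next-state prediction.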

  3. Accurate prediction of subcellular location of apoptosis proteins combining Chou's PseAAC and PsePSSM based on wavelet denoising.

    PubMed

    Yu, Bin; Li, Shan; Qiu, Wen-Ying; Chen, Cheng; Chen, Rui-Xin; Wang, Lei; Wang, Ming-Hui; Zhang, Yan

    2017-12-08

    Information on the subcellular localization of apoptosis proteins is very important for understanding the mechanism of programmed cell death and for drug development. Predicting the subcellular localization of an apoptosis protein remains a challenging task, and such predictions help in understanding protein function and the role of metabolic processes. In this paper, we propose a novel method for protein subcellular localization prediction. First, the features of the protein sequence are extracted by combining Chou's pseudo amino acid composition (PseAAC) and the pseudo-position-specific scoring matrix (PsePSSM); the extracted feature information is then denoised by two-dimensional (2-D) wavelet denoising. Finally, the optimal feature vectors are input to the SVM classifier to predict the subcellular location of apoptosis proteins. Quite promising predictions are obtained using the jackknife test on three widely used datasets and compared with other state-of-the-art methods. The results indicate that the proposed method can remarkably improve the prediction accuracy of apoptosis protein subcellular localization and will be a supplementary tool for future proteomics research.
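
    The first stage of the pipeline, Chou's PseAAC, is straightforward to sketch: 20 amino-acid composition terms plus a few sequence-order correlation factors. The hydrophobicity scale below is a placeholder for illustration, not the normalized physicochemical property values used in the paper:

```python
AA = "ACDEFGHIKLMNPQRSTVWY"
# Placeholder property scale (illustrative only).
HYDRO = {a: i / 19.0 for i, a in enumerate(AA)}

def pse_aac(seq, lam=3, weight=0.05):
    """Return 20 composition terms + `lam` sequence-order correlation factors."""
    n = len(seq)
    comp = [seq.count(a) / n for a in AA]  # classic amino-acid composition
    theta = []
    for k in range(1, lam + 1):
        # average squared property difference between residues k positions apart
        t = sum((HYDRO[seq[i]] - HYDRO[seq[i + k]]) ** 2
                for i in range(n - k)) / (n - k)
        theta.append(t)
    denom = 1.0 + weight * sum(theta)
    return [c / denom for c in comp] + [weight * t / denom for t in theta]

v = pse_aac("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")  # 20 + 3 = 23 features
```

    In the full method these vectors are concatenated with PsePSSM features, wavelet-denoised, and fed to the SVM classifier.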

  4. Accurate prediction of subcellular location of apoptosis proteins combining Chou’s PseAAC and PsePSSM based on wavelet denoising

    PubMed Central

    Chen, Cheng; Chen, Rui-Xin; Wang, Lei; Wang, Ming-Hui; Zhang, Yan

    2017-01-01

    Information on the subcellular localization of apoptosis proteins is very important for understanding the mechanism of programmed cell death and for drug development. Predicting the subcellular localization of an apoptosis protein remains a challenging task, and such predictions help in understanding protein function and the role of metabolic processes. In this paper, we propose a novel method for protein subcellular localization prediction. First, the features of the protein sequence are extracted by combining Chou's pseudo amino acid composition (PseAAC) and the pseudo-position-specific scoring matrix (PsePSSM); the extracted feature information is then denoised by two-dimensional (2-D) wavelet denoising. Finally, the optimal feature vectors are input to the SVM classifier to predict the subcellular location of apoptosis proteins. Quite promising predictions are obtained using the jackknife test on three widely used datasets and compared with other state-of-the-art methods. The results indicate that the proposed method can remarkably improve the prediction accuracy of apoptosis protein subcellular localization and will be a supplementary tool for future proteomics research. PMID:29296195

  5. Prediction and control of neural responses to pulsatile electrical stimulation

    NASA Astrophysics Data System (ADS)

    Campbell, Luke J.; Sly, David James; O'Leary, Stephen John

    2012-04-01

    This paper aims to predict and control the probability of firing of a neuron in response to pulsatile electrical stimulation of the type delivered by neural prostheses such as the cochlear implant, bionic eye or in deep brain stimulation. Using the cochlear implant as a model, we developed an efficient computational model that predicts the responses of auditory nerve fibers to electrical stimulation and evaluated the model's accuracy by comparing the model output with pooled responses from a group of guinea pig auditory nerve fibers. It was found that the model accurately predicted the changes in neural firing probability over time to constant and variable amplitude electrical pulse trains, including speech-derived signals, delivered at rates up to 889 pulses per second. A simplified version of the model that did not incorporate adaptation was used to adaptively predict, within its limitations, the pulsatile electrical stimulus required to cause a desired response from neurons up to 250 pulses per second. Future stimulation strategies for cochlear implants and other neural prostheses may be enhanced using similar models that account for the way that neural responses are altered by previous stimulation.

  6. Short- versus long-term responses to changing CO2 in a coastal dinoflagellate bloom: implications for interspecific competitive interactions and community structure.

    PubMed

    Tatters, Avery O; Schnetzer, Astrid; Fu, Feixue; Lie, Alle Y A; Caron, David A; Hutchins, David A

    2013-07-01

    Increasing pCO2 (partial pressure of CO2) in an "acidified" ocean will affect phytoplankton community structure, but manipulation experiments with assemblages briefly acclimated to simulated future conditions may not accurately predict the long-term evolutionary shifts that could affect inter-specific competitive success. We assessed community structure changes in a natural mixed dinoflagellate bloom incubated at three pCO2 levels (230, 433, and 765 ppm) in a short-term experiment (2 weeks). The four dominant species were then isolated from each treatment into clonal cultures, and maintained at all three pCO2 levels for approximately 1 year. Periodically (4, 8, and 12 months), these pCO2-conditioned clones were recombined into artificial communities, and allowed to compete at their conditioning pCO2 level or at higher and lower levels. The dominant species in these artificial communities of CO2-conditioned clones differed from those in the original short-term experiment, but individual species relative abundance trends across pCO2 treatments were often similar. Specific growth rates showed no strong evidence for fitness increases attributable to conditioning pCO2 level. Although pCO2 significantly structured our experimental communities, conditioning time and biotic interactions like mixotrophy also had major roles in determining competitive outcomes. New methods of carrying out extended mixed species experiments are needed to accurately predict future long-term phytoplankton community responses to changing pCO2. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  7. Inverse modeling using PS-InSAR for improved calibration of hydraulic parameters and prediction of future subsidence for Las Vegas Valley, USA

    NASA Astrophysics Data System (ADS)

    Burbey, T. J.; Zhang, M.

    2015-11-01

    Las Vegas Valley has had a long history of surface deformation due to groundwater pumping that began in the early 20th century. After nearly 80 years of pumping, PS-InSAR interferograms have revealed detailed and complex spatial patterns of subsidence in the Las Vegas Valley area that do not coincide with major pumping regions. High spatial and temporal resolution subsidence observations from InSAR and hydraulic head data were used to inversely calibrate transmissivities (T), elastic and inelastic skeletal storage coefficients (Ske and Skv) of the developed-zone aquifer and conductance (CR) of the basin-fill faults for the entire Las Vegas basin. The results indicate that the subsidence observations from PS-InSAR are extremely beneficial for accurately quantifying hydraulic parameters, and the model calibration results are far more accurate than when using only water levels and just a few random subsidence observations. Future predictions of land subsidence to year 2030 were made on the basis of existing pumping patterns and rates. Simulation results suggest that subsidence will continue in the northwest subsidence bowl area, which is expected to undergo an additional 11.3 cm of subsidence. Even mitigation measures that include artificial recharge and reduced pumping do not significantly reduce the compaction in the northwest subsidence bowl. This is due to the slow draining of thick confining units in the region. However, a small amount of uplift of 0.4 cm is expected in the North and Central bowl areas over the next 20 years.

  8. The role of lymphadenectomy in endometrial cancer: was the ASTEC trial doomed by design and are we destined to repeat that mistake?

    PubMed

    Naumann, R Wendel

    2012-07-01

    This study examines the design of previous and future trials of lymph node dissection in endometrial cancer. Data from previous trials were used to construct a decision analysis modeling the risk of lymphatic spread and the effects of treatment on patients with endometrial cancer. This model was then applied to previous trials as well as other future trial designs that might be used to address this subject. Comparing the predicted and actual results in the ASTEC trial, the model closely mimics the survival results with and without lymph node dissection for the low and high risk groups. The model suggests a survival difference of less than 2% between the experimental and control arms of the ASTEC trial under all circumstances. Sensitivity analyses reveal that these conclusions are robust. Future trial designs were also modeled with hysterectomy only, hysterectomy with radiation in intermediate risk patients, and staging with radiation only with node positive patients. Predicted outcomes for these approaches yield survival rates of 88%, 90%, and 93% in clinical stage I patients who have a risk of pelvic node involvement of approximately 7%. These estimates were 78%, 82%, and 89% in intermediate risk patients who have a risk of nodal spread of approximately 15%. This model accurately predicts the outcome of previous trials and demonstrates that even if lymph node dissection was therapeutic, these trials would have been negative due to study design. Furthermore, future trial designs that are being considered would need to be conducted in high-intermediate risk patients to detect any difference. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Interactions of predominant insects and diseases with climate change in Douglas-fir forests of western Oregon and Washington, U.S.A.

    PubMed

    Agne, Michelle C; Beedlow, Peter A; Shaw, David C; Woodruff, David R; Lee, E Henry; Cline, Steven P; Comeleo, Randy L

    2018-02-01

    Forest disturbance regimes are beginning to show evidence of climate-mediated changes, such as increasing severity of droughts and insect outbreaks. We review the major insects and pathogens affecting the disturbance regime for coastal Douglas-fir forests in western Oregon and Washington State, USA, and ask how future climate changes may influence their role in disturbance ecology. Although the physiological constraints of light, temperature, and moisture largely control tree growth, episodic and chronic disturbances interacting with biological factors have substantial impacts on the structure and functioning of forest ecosystems in this region. Understanding insect and disease interactions is critical to predicting forest response to climate change and the consequences for ecosystem services, such as timber, clean water, fish and wildlife. We focused on future predictions for warmer wetter winters, hotter drier summers, and elevated atmospheric CO2 to hypothesize the response of Douglas-fir forests to the major insects and diseases influencing this forest type: Douglas-fir beetle, Swiss needle cast, black stain root disease, and laminated root rot. We hypothesize that 1) Douglas-fir beetle and black stain root disease could become more prevalent with increasing fire, temperature stress, and moisture stress, 2) future impacts of Swiss needle cast are difficult to predict due to uncertainties in May-July leaf wetness, but warmer winters could contribute to intensification at higher elevations, and 3) laminated root rot will be influenced primarily by forest management, rather than climatic change. Furthermore, these biotic disturbance agents interact in complex ways that are poorly understood. Consequently, to inform management decisions, insect and disease influences on disturbance regimes must be characterized specifically by forest type and region in order to accurately capture these interactions in light of future climate-mediated changes.

  10. Accurate predictions of population-level changes in sequence and structural properties of HIV-1 Env using a volatility-controlled diffusion model

    PubMed Central

    DeLeon, Orlando; Hodis, Hagit; O’Malley, Yunxia; Johnson, Jacklyn; Salimi, Hamid; Zhai, Yinjie; Winter, Elizabeth; Remec, Claire; Eichelberger, Noah; Van Cleave, Brandon; Puliadi, Ramya; Harrington, Robert D.; Stapleton, Jack T.; Haim, Hillel

    2017-01-01

    The envelope glycoproteins (Envs) of HIV-1 continuously evolve in the host by random mutations and recombination events. The resulting diversity of Env variants circulating in the population and their continuing diversification process limit the efficacy of AIDS vaccines. We examined the historic changes in Env sequence and structural features (measured by integrity of epitopes on the Env trimer) in a geographically defined population in the United States. As expected, many Env features were relatively conserved during the 1980s. From this state, some features diversified whereas others remained conserved across the years. We sought to identify “clues” to predict the observed historic diversification patterns. Comparison of viruses that cocirculate in patients at any given time revealed that each feature of Env (sequence or structural) exists at a defined level of variance. The in-host variance of each feature is highly conserved among individuals but can vary between different HIV-1 clades. We designate this property “volatility” and apply it to model evolution of features as a linear diffusion process that progresses with increasing genetic distance. Volatilities of different features are highly correlated with their divergence in longitudinally monitored patients. Volatilities of features also correlate highly with their population-level diversification. Using volatility indices measured from a small number of patient samples, we accurately predict the population diversity that developed for each feature over the course of 30 years. Amino acid variants that evolved at key antigenic sites are also predicted well. Therefore, small “fluctuations” in feature values measured in isolated patient samples accurately describe their potential for population-level diversification. These tools will likely contribute to the design of population-targeted AIDS vaccines by effectively capturing the diversity of currently circulating strains and addressing properties of variants expected to appear in the future. PMID:28384158
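
    The diffusion model's key property — feature variance growing linearly with genetic distance, so a short-range "volatility" estimate extrapolates to long-range diversity — can be checked numerically with a toy random walk (synthetic data, not the authors' Env measurements):

```python
import random

random.seed(2)

def endpoint(steps, volatility):
    """Random-walk drift of one feature over `steps` units of genetic distance."""
    x = 0.0
    for _ in range(steps):
        x += random.gauss(0, volatility)
    return x

vol = 0.3  # hidden per-step volatility of the feature
# Estimate the per-step variance from many short "in-host" trajectories...
short = [endpoint(5, vol) for _ in range(2000)]
est_step_var = sum(v * v for v in short) / len(short) / 5
# ...then extrapolate linearly to a 20x longer divergence and compare.
predicted_var = est_step_var * 100
long_run = [endpoint(100, vol) for _ in range(2000)]
actual_var = sum(v * v for v in long_run) / len(long_run)
```

    Because the walk's variance accumulates linearly, the short-range estimate predicts the long-range variance well; this is the sense in which small in-host "fluctuations" forecast population-level diversification.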

  11. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial statements oriented evaluation results based on logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We surprisingly find that the opinions extracted from both posts and commentaries surpass opinions of analysts in terms of credit risk prediction. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Application of two neural network paradigms to the study of voluntary employee turnover.

    PubMed

    Somers, M J

    1999-04-01

    Two neural network paradigms--multilayer perceptron and learning vector quantization--were used to study voluntary employee turnover with a sample of 577 hospital employees. The objectives of the study were twofold. The 1st was to assess whether neural computing techniques offered greater predictive accuracy than did conventional turnover methodologies. The 2nd was to explore whether computer models of turnover based on neural network technologies offered new insights into turnover processes. When compared with logistic regression analysis, both neural network paradigms provided considerably more accurate predictions of turnover behavior, particularly with respect to the correct classification of leavers. In addition, these neural network paradigms captured nonlinear relationships that are relevant for theory development. Results are discussed in terms of their implications for future research.
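
    The logistic-regression benchmark that the neural network paradigms are compared against can be sketched with plain stochastic gradient descent (one synthetic predictor; an illustration, not the study's model or data):

```python
import math
import random

def train_logreg(X, y, lr=0.1, epochs=200):
    """Plain SGD logistic regression (the benchmark technique, not the study's code)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Toy data: the chance of leaving rises with one standardized predictor
# (think of it as a hypothetical "withdrawal intention" score).
random.seed(1)
X = [[random.gauss(0, 1)] for _ in range(300)]
y = [1 if x[0] + random.gauss(0, 0.5) > 0 else 0 for x in X]

w, b = train_logreg(X, y)
acc = sum(int((w[0] * x[0] + b > 0) == bool(t)) for x, t in zip(X, y)) / len(X)
```

    A multilayer perceptron replaces the single linear logit with stacked nonlinear layers, which is what lets it capture the nonlinear turnover relationships the study reports.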

  13. Effect of Counterflow Jet on a Supersonic Reentry Capsule

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan; Venkatachari, Balaji Shankar; Cheng, Gary C.

    2006-01-01

    Recent NASA initiatives for space exploration have reinvigorated research on Apollo-like capsule vehicles. Aerothermodynamic characteristics of these capsule configurations during reentry play a crucial role in the performance and safety of the planetary entry probes and the crew exploration vehicles. At issue are the forebody thermal shield protection and afterbody aeroheating predictions. Due to the lack of flight or wind tunnel measurements at hypersonic speed, design decisions on such vehicles would rely heavily on computational results. Validation of current computational tools against experimental measurement thus becomes one of the most important tasks for general hypersonic research. This paper is focused on time-accurate numerical computations of hypersonic flows over a set of capsule configurations, which employ a counterflow jet to offset the detached bow shock. The accompanying increased shock stand-off distance and modified heat transfer characteristics associated with the counterflow jet may provide guidance for future design of hypersonic reentry capsules. The newly emerged space-time conservation element solution element (CESE) method is used to perform time-accurate, unstructured mesh Navier-Stokes computations for all cases investigated. The results show good agreement between experimental and numerical Schlieren pictures. Surface heat flux and aerodynamic force predictions of the capsule configurations are discussed in detail.

  14. Epidemiology of Recurrent Acute and Chronic Pancreatitis: Similarities and Differences.

    PubMed

    Machicado, Jorge D; Yadav, Dhiraj

    2017-07-01

    Emerging data in the past few years suggest that acute, recurrent acute (RAP), and chronic pancreatitis (CP) represent a disease continuum. This review discusses the similarities and differences in the epidemiology of RAP and CP. RAP is a high-risk group, comprised of individuals at varying risk of progression. The premise is that RAP is an intermediary stage in the pathogenesis of CP, and a subset of RAP patients during their natural course transition to CP. Although many clinical factors have been identified, accurately predicting the probability of disease course in individual patients remains difficult. Future studies should focus on providing more precise estimates of the risk of disease transition in a cohort of patients, quantification of clinical events during the natural course of disease, and discovery of biomarkers of the different stages of the disease continuum. Availability of clinically relevant endpoints and linked biomarkers will allow more accurate prediction of the natural course of disease over the intermediate or long term, based on the characteristics of an individual patient. These endpoints will also provide objective measures for use in clinical trials of interventions that aim to alter the natural course of disease.

  15. Skeletal assessment with finite element analysis: relevance, pitfalls and interpretation.

    PubMed

    Campbell, Graeme Michael; Glüer, Claus-C

    2017-07-01

    Finite element models simulate the mechanical response of bone under load, enabling noninvasive assessment of strength. Models generated from quantitative computed tomography (QCT) incorporate the geometry and spatial distribution of bone mineral density (BMD) to simulate physiological and traumatic loads as well as orthopaedic implant behaviour. The present review discusses the current strengths and weaknesses of finite element models for application to skeletal biomechanics. In cadaver studies, finite element models provide better estimates of strength than BMD. Data from clinical studies are encouraging; however, the superiority of finite element models over BMD measures for fracture prediction has not been shown conclusively, and may be sex and site dependent. Therapeutic effects on bone strength are larger than those on BMD; however, model validation has only been performed on untreated bone. High-resolution modalities and novel image processing methods may enhance the structural representation and predictive ability. Despite extensive use of finite element models to study orthopaedic implant stability, accurate simulation of the bone-implant interface and fracture progression remains a significant challenge. Skeletal finite element models provide noninvasive assessments of strength and implant stability. Improved structural representation and implant surface interaction may enable more accurate models of fragility in the future.

  16. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball.

    PubMed

    Cheshin, Arik; Heerdink, Marc W; Kossakowski, Jolanda J; Van Kleef, Gerben A

    2016-01-01

    Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from the Major League Baseball World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  17. Crude oil price analysis and forecasting based on variational mode decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    E, Jianwei; Bao, Yanling; Ye, Jimin

    2017-10-01

    As one of the most vital energy resources in the world, crude oil plays a significant role in the international economic market, and the fluctuation of its price has attracted academic and commercial attention. Many methods exist for forecasting the trend of the crude oil price, but traditional models have failed to predict it accurately. This paper therefore proposes a hybrid method that combines variational mode decomposition (VMD), independent component analysis (ICA) and the autoregressive integrated moving average (ARIMA), called VMD-ICA-ARIMA. The purpose of this study is to analyze the factors that influence the crude oil price and to predict its future values. The major steps are as follows. First, the VMD model is applied to the original signal (the crude oil price) to decompose it adaptively into mode functions. Second, independent components are separated by ICA, and their influence on the crude oil price is analyzed. Finally, the crude oil price is forecast with the ARIMA model; the forecast trend shows that the crude oil price declines periodically. Compared with the benchmark ARIMA and EEMD-ICA-ARIMA models, VMD-ICA-ARIMA forecasts the crude oil price more accurately.
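
    The decompose/forecast/recombine pattern behind VMD-ICA-ARIMA can be sketched as follows. This is an illustrative stand-in only: a moving average takes the place of VMD and a least-squares AR(1) takes the place of ARIMA; function names and parameters are assumptions, not from the paper.

```python
import numpy as np

def ar1_forecast(x, steps):
    """Fit x[t] = c + phi * x[t-1] by least squares, then iterate forward."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    c, phi = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    out, last = [], x[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return np.array(out)

def hybrid_forecast(price, steps=5, window=12):
    """Decompose the series into a smooth component plus a residual,
    forecast each component separately, and recombine -- the same
    decompose/forecast/recombine pattern as VMD-ICA-ARIMA, with a moving
    average standing in for VMD and AR(1) standing in for ARIMA."""
    kernel = np.ones(window) / window
    smooth = np.convolve(price, kernel, mode="same")
    resid = price - smooth
    return ar1_forecast(smooth, steps) + ar1_forecast(resid, steps)
```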

  18. A Synthesis of Solar Cycle Prediction Techniques

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.; Wilson, Robert M.; Reichmann, Edwin J.

    1999-01-01

    A number of techniques currently in use for predicting solar activity on a solar cycle timescale are tested with historical data. Some techniques, e.g., regression and curve fitting, work well as solar activity approaches maximum and provide a month-by-month description of future activity, while others, e.g., geomagnetic precursors, work well near solar minimum but only provide an estimate of the amplitude of the cycle. A synthesis of different techniques is shown to provide a more accurate and useful forecast of solar cycle activity levels. A combination of two uncorrelated geomagnetic precursor techniques provides a more accurate prediction for the amplitude of a solar activity cycle at a time well before activity minimum. This combined precursor method gives a smoothed sunspot number maximum of 154 plus or minus 21 at the 95% level of confidence for the next cycle maximum. A mathematical function dependent on the time of cycle initiation and the cycle amplitude is used to describe the level of solar activity month by month for the next cycle. As the time of cycle maximum approaches a better estimate of the cycle activity is obtained by including the fit between previous activity levels and this function. This Combined Solar Cycle Activity Forecast gives, as of January 1999, a smoothed sunspot maximum of 146 plus or minus 20 at the 95% level of confidence for the next cycle maximum.
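
    One plausible way to combine two uncorrelated precursor predictions, as the combined precursor method above does, is inverse-variance weighting; the specific scheme and the sample values below (chosen to echo the quoted 154 plus or minus 21 scale) are illustrative assumptions, not the authors' exact procedure.

```python
def combine_estimates(x1, s1, x2, s2):
    """Inverse-variance weighted combination of two independent estimates;
    the combined uncertainty is always smaller than either input's."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    s = (w1 + w2) ** -0.5
    return x, s
```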

  19. Radiation from advanced solid rocket motor plumes

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-01-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  20. Quantitative contrast-enhanced optical coherence tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winetraub, Yonatan; SoRelle, Elliott D. (Bio-X Program, Stanford University)

    2016-01-11

    We have developed a model to accurately quantify the signals produced by exogenous scattering agents used for contrast-enhanced Optical Coherence Tomography (OCT). This model predicts distinct concentration-dependent signal trends that arise from the underlying physics of OCT detection. Accordingly, we show that real scattering particles can be described as simplified ideal scatterers with modified scattering intensity and concentration. The relation between OCT signal and particle concentration is approximately linear at concentrations lower than 0.8 particles per imaging voxel. However, at higher concentrations, interference effects cause the signal to increase with a square-root dependence on the number of particles within a voxel. Finally, high particle concentrations cause enough light attenuation to saturate the detected signal. Predictions were validated by comparison with measured OCT signals from gold nanorods (GNRs) prepared in water at concentrations ranging over five orders of magnitude (50 fM to 5 nM). In addition, we validated that our model accurately predicts the signal responses of GNRs in highly heterogeneous scattering environments including whole blood and living animals. By enabling particle quantification, this work provides a valuable tool for current and future contrast-enhanced in vivo OCT studies. More generally, the model described herein may inform the interpretation of detected signals in modalities that rely on coherence-based detection or are susceptible to interference effects.
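
    The concentration-dependent trends described above (linear below roughly 0.8 particles per voxel, square-root above, saturation at high concentration) can be sketched as a piecewise model; k, crossover and saturation are hypothetical parameters for illustration, not values from the paper.

```python
import numpy as np

def oct_signal(concentration, k=1.0, crossover=0.8, saturation=50.0):
    """Piecewise trend: linear in particles-per-voxel below `crossover`,
    square-root above it (interference regime), capped by attenuation-driven
    saturation. k, crossover and saturation are hypothetical parameters."""
    c = np.asarray(concentration, dtype=float)
    linear = k * c
    sqrt_regime = k * crossover * np.sqrt(c / crossover)  # continuous at crossover
    signal = np.where(c < crossover, linear, sqrt_regime)
    return np.minimum(signal, saturation)
```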

  1. Mechanical behaviour of a fibrous scaffold for ligament tissue engineering: finite elements analysis vs. X-ray tomography imaging.

    PubMed

    Laurent, Cédric P; Latil, Pierre; Durville, Damien; Rahouadj, Rachid; Geindreau, Christian; Orgéas, Laurent; Ganghoffer, Jean-François

    2014-12-01

    The use of biodegradable scaffolds seeded with cells in order to regenerate functional tissue-engineered substitutes offers an interesting alternative to common medical approaches to ligament repair. In particular, the finite element (FE) method makes it possible to predict and optimise both the macroscopic behaviour of these scaffolds and the local mechanical signals that control cell activity. In this study, we investigate the ability of a dedicated FE code to predict the geometrical evolution of a new braided and biodegradable polymer scaffold for ligament tissue engineering by comparing scaffold geometries obtained from FE simulations and from X-ray tomographic imaging during a tensile test. Moreover, we compare two types of FE simulations whose initial geometries are derived either from X-ray imaging or from a computed idealised configuration. We report that the dedicated FE simulations from an idealised reference configuration can reasonably be used in the future to predict the global and local mechanical behaviour of the braided scaffold. A valuable and original dialogue between the fields of experimental and numerical characterisation of such fibrous media is thus achieved. In the future, this approach should enable more accurate characterisation of the local and global behaviour of tissue-engineering scaffolds. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Multiclass cancer diagnosis using tumor gene expression signatures

    DOE PAGES

    Ramaswamy, S.; Tamayo, P.; Rifkin, R.; ...

    2001-12-11

    The optimal treatment of patients with cancer depends on establishing accurate diagnoses by using a complex combination of clinical and histopathological data. In some instances, this task is difficult or impossible because of atypical clinical presentation or histopathology. To determine whether the diagnosis of multiple common adult malignancies could be achieved purely by molecular classification, we subjected 218 tumor samples, spanning 14 common tumor types, and 90 normal tissue samples to oligonucleotide microarray gene expression analysis. The expression levels of 16,063 genes and expressed sequence tags were used to evaluate the accuracy of a multiclass classifier based on a support vector machine algorithm. Overall classification accuracy was 78%, far exceeding the accuracy of random classification (9%). Poorly differentiated cancers resulted in low-confidence predictions and could not be accurately classified according to their tissue of origin, indicating that they are molecularly distinct entities with dramatically different gene expression patterns compared with their well differentiated counterparts. Taken together, these results demonstrate the feasibility of accurate, multiclass molecular cancer classification and suggest a strategy for future clinical implementation of molecular cancer diagnostics.
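
    The decision rule of a one-vs-rest multiclass scheme of the kind described, with a top-two margin that can flag low-confidence predictions such as those for poorly differentiated cancers, might look like this sketch (illustrative only, not the authors' implementation):

```python
def one_vs_rest_predict(scores):
    """Each binary classifier scores the sample against its own class;
    predict the argmax, and report the top-two margin so low-confidence
    calls can be flagged for manual review."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    top_two = sorted(scores)[-2:]
    return best, top_two[1] - top_two[0]
```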

  3. Prediction of Lunar Reconnaissance Orbiter Reaction Wheel Assembly Angular Momentum Using Regression Analysis

    NASA Technical Reports Server (NTRS)

    DeHart, Russell

    2017-01-01

    This study determines the feasibility of creating a tool that can accurately predict Lunar Reconnaissance Orbiter (LRO) reaction wheel assembly (RWA) angular momentum, weeks or even months into the future. LRO is a three-axis stabilized spacecraft that was launched on June 18, 2009. While typically nadir-pointing, LRO conducts many types of slews to enable novel science collection. Momentum unloads have historically been performed approximately once every two weeks with the goal of maintaining system total angular momentum below 70 Nms; however, flight experience shows the models developed before launch are overly conservative, with many momentum unloads being performed before system angular momentum surpasses 50 Nms. A more accurate model of RWA angular momentum growth would improve momentum unload scheduling and decrease the frequency of these unloads. Since some LRO instruments must be deactivated during momentum unloads (and, in the case of one instrument, decontaminated for 24 hours thereafter), a decrease in the frequency of unloads increases science collection. This study develops a new model to predict LRO RWA angular momentum. Regression analysis of data from October 2014 to October 2015 was used to develop relationships between solar beta angle, slew specifications, and RWA angular momentum growth. The resulting model predicts RWA angular momentum using input solar beta angle and mission schedule data. This model was used to predict RWA angular momentum from October 2013 to October 2014. Predictions agree well with telemetry; of the 23 momentum unloads performed from October 2013 to October 2014, the mean and median magnitude of the RWA total angular momentum prediction error at the time of the momentum unloads were 3.7 and 2.7 Nms, respectively. The magnitude of the largest RWA total angular momentum prediction error was 10.6 Nms. Development of a tool that uses the models presented herein is currently underway.
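
    A regression of the kind described, relating solar beta angle and slew data to angular momentum growth, could be fit by ordinary least squares; the two-feature design below is a hypothetical simplification of the paper's inputs.

```python
import numpy as np

def fit_momentum_model(features, growth):
    """Ordinary least squares: angular-momentum growth per interval as a
    linear function of features such as solar beta angle and slew counts
    (a hypothetical simplification of the paper's regression)."""
    X = np.column_stack([np.ones(len(features)), features])
    coef, *_ = np.linalg.lstsq(X, growth, rcond=None)
    return coef

def predict_growth(coef, features):
    """Apply the fitted intercept-plus-slopes model to new feature rows."""
    X = np.column_stack([np.ones(len(features)), features])
    return X @ coef
```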

  4. Prediction of high incidence of dengue in the Philippines.

    PubMed

    Buczak, Anna L; Baugher, Benjamin; Babin, Steven M; Ramac-Thomas, Liane C; Guven, Erhan; Elbert, Yevgeniy; Koshute, Phillip T; Velasco, John Mark S; Roque, Vito G; Tayag, Enrique A; Yoon, In-Kyu; Lewis, Sheri H

    2014-04-01

    Accurate prediction of dengue incidence levels weeks in advance of an outbreak may reduce the morbidity and mortality associated with this neglected disease. Therefore, models were developed to predict high and low dengue incidence in order to provide timely forewarnings in the Philippines. Model inputs were chosen based on studies indicating variables that may impact dengue incidence. The method first uses Fuzzy Association Rule Mining techniques to extract association rules from these historical epidemiological, environmental, and socio-economic data, as well as climate data indicating future weather patterns. Selection criteria were used to choose a subset of these rules for a classifier, thereby generating a Prediction Model. The models predicted high or low incidence of dengue in a Philippines province four weeks in advance. The threshold between high and low was determined relative to historical incidence data. Model accuracy is described by Positive Predictive Value (PPV), Negative Predictive Value (NPV), Sensitivity, and Specificity computed on test data not previously used to develop the model. Selecting a model using the F0.5 measure, which gives PPV more importance than Sensitivity, gave these results: PPV = 0.780, NPV = 0.938, Sensitivity = 0.547, Specificity = 0.978. Using the F3 measure, which gives Sensitivity more importance than PPV, the selected model had PPV = 0.778, NPV = 0.948, Sensitivity = 0.627, Specificity = 0.974. The decision as to which model has greater utility depends on how the predictions will be used in a particular situation. This method builds prediction models for future dengue incidence in the Philippines and is capable of being modified for use in different situations; for diseases other than dengue; and for regions beyond the Philippines. 
The Philippines dengue prediction models predicted high or low incidence of dengue four weeks in advance of an outbreak with high accuracy, as measured by PPV, NPV, Sensitivity, and Specificity.
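
    The F-measures used for model selection above follow the standard F-beta definition, which trades off PPV (precision) against Sensitivity (recall); the snippet below evaluates it on the test-set metrics reported in the abstract.

```python
def f_beta(ppv, sensitivity, beta):
    """F-beta score: beta < 1 weights PPV (precision) more heavily,
    beta > 1 weights sensitivity (recall) more heavily."""
    b2 = beta ** 2
    return (1 + b2) * ppv * sensitivity / (b2 * ppv + sensitivity)

# Test-set scores of the two reported models under the criteria used to select them.
f05 = f_beta(0.780, 0.547, beta=0.5)  # PPV-weighted
f3 = f_beta(0.778, 0.627, beta=3.0)   # sensitivity-weighted
```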

  5. Prediction of High Incidence of Dengue in the Philippines

    PubMed Central

    Buczak, Anna L.; Baugher, Benjamin; Babin, Steven M.; Ramac-Thomas, Liane C.; Guven, Erhan; Elbert, Yevgeniy; Koshute, Phillip T.; Velasco, John Mark S.; Roque, Vito G.; Tayag, Enrique A.; Yoon, In-Kyu; Lewis, Sheri H.

    2014-01-01

    Background Accurate prediction of dengue incidence levels weeks in advance of an outbreak may reduce the morbidity and mortality associated with this neglected disease. Therefore, models were developed to predict high and low dengue incidence in order to provide timely forewarnings in the Philippines. Methods Model inputs were chosen based on studies indicating variables that may impact dengue incidence. The method first uses Fuzzy Association Rule Mining techniques to extract association rules from these historical epidemiological, environmental, and socio-economic data, as well as climate data indicating future weather patterns. Selection criteria were used to choose a subset of these rules for a classifier, thereby generating a Prediction Model. The models predicted high or low incidence of dengue in a Philippines province four weeks in advance. The threshold between high and low was determined relative to historical incidence data. Principal Findings Model accuracy is described by Positive Predictive Value (PPV), Negative Predictive Value (NPV), Sensitivity, and Specificity computed on test data not previously used to develop the model. Selecting a model using the F0.5 measure, which gives PPV more importance than Sensitivity, gave these results: PPV = 0.780, NPV = 0.938, Sensitivity = 0.547, Specificity = 0.978. Using the F3 measure, which gives Sensitivity more importance than PPV, the selected model had PPV = 0.778, NPV = 0.948, Sensitivity = 0.627, Specificity = 0.974. The decision as to which model has greater utility depends on how the predictions will be used in a particular situation. Conclusions This method builds prediction models for future dengue incidence in the Philippines and is capable of being modified for use in different situations; for diseases other than dengue; and for regions beyond the Philippines. 
The Philippines dengue prediction models predicted high or low incidence of dengue four weeks in advance of an outbreak with high accuracy, as measured by PPV, NPV, Sensitivity, and Specificity. PMID:24722434

  6. Spatial-altitudinal and temporal variation of Degree Day Factors (DDFs) in the Upper Indus Basin

    NASA Astrophysics Data System (ADS)

    Khan, Asif; Attaullah, Haleema; Masud, Tabinda; Khan, Mujahid

    2017-04-01

    Melt contribution from snow and ice in the Hindukush-Karakoram-Himalayan (HKH) region could account for more than 80% of annual river flows in the Upper Indus Basin (UIB). Increases or decreases in precipitation, energy input and glacier reserves can significantly affect the water resources of this region. Improved hydrological modelling and accurate prediction of future water resources are therefore urgently needed, being vital for food production and hydro-power generation for millions of people living downstream. In mountain regions, Degree Day Factors (DDFs) vary significantly with location and altitude, and are primary inputs of temperature-based hydrological modelling. However, previous studies have used different DDFs as calibration parameters without due attention to the physical meaning of the values employed, and these estimates carry significant variability and uncertainty. This study provides estimates of DDFs for various altitudinal zones in the UIB at sub-basin level. Snow, clean ice and debris-covered ice melt at different rates (i.e., have different DDFs); therefore, areally averaged DDFs based on snow, clean-ice and debris-covered-ice classes in various altitudinal zones have been estimated for all sub-basins of the UIB. Zonal estimates of DDFs in the current study differ significantly from previously adopted DDFs, suggesting a revisit of earlier hydrological modelling studies. The DDFs presented here have been validated by using the Snowmelt Runoff Model (SRM) in various sub-basins, with good Nash-Sutcliffe coefficients (R2 > 0.85) and low volumetric errors (Dv < 10%). The DDFs and methods provided in the current study can be used in future improved hydrological modelling and to provide accurate predictions of changes in future river flows. The methodology used for estimation of DDFs is robust and can be adopted to produce such estimates in other regions of the world, particularly in other nearby HKH basins.
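
    The degree-day approach underlying these estimates is simple to state: melt is proportional to positive degree days, with class-specific DDFs averaged by area across snow, clean-ice and debris-covered-ice classes. A minimal sketch follows (units and values are illustrative):

```python
def degree_day_melt(ddf, daily_temps, base_temp=0.0):
    """Daily melt (mm w.e.) = DDF * max(T - T_base, 0), summed over the period."""
    return sum(ddf * max(t - base_temp, 0.0) for t in daily_temps)

def areal_ddf(areas, ddfs):
    """Area-weighted average DDF across snow, clean-ice and
    debris-covered-ice classes within an altitudinal zone."""
    return sum(a * d for a, d in zip(areas, ddfs)) / sum(areas)
```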

  7. An accurate halo model for fitting non-linear cosmological power spectra and baryonic feedback models

    NASA Astrophysics Data System (ADS)

    Mead, A. J.; Peacock, J. A.; Heymans, C.; Joudaki, S.; Heavens, A. F.

    2015-12-01

    We present an optimized variant of the halo model, designed to produce accurate matter power spectra well into the non-linear regime for a wide range of cosmological models. To do this, we introduce physically motivated free parameters into the halo-model formalism and fit these to data from high-resolution N-body simulations. For a variety of Λ cold dark matter (ΛCDM) and wCDM models, the halo-model power is accurate to ≃ 5 per cent for k ≤ 10h Mpc-1 and z ≤ 2. An advantage of our new halo model is that it can be adapted to account for the effects of baryonic feedback on the power spectrum. We demonstrate this by fitting the halo model to power spectra from the OWLS (OverWhelmingly Large Simulations) hydrodynamical simulation suite via parameters that govern halo internal structure. We are able to fit all feedback models investigated at the 5 per cent level using only two free parameters, and we place limits on the range of these halo parameters for feedback models investigated by the OWLS simulations. Accurate predictions to high k are vital for weak-lensing surveys, and these halo parameters could be considered nuisance parameters to marginalize over in future analyses to mitigate uncertainty regarding the details of feedback. Finally, we investigate how lensing observables predicted by our model compare to those from simulations and from HALOFIT for a range of k-cuts and feedback models and quantify the angular scales at which these effects become important. Code to calculate power spectra from the model presented in this paper can be found at https://github.com/alexander-mead/hmcode.
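
    The halo-model power spectrum is the sum of a two-halo (large-scale) term and a one-halo (small-scale) term, and HMcode-style variants smooth the transition with a tunable exponent. A minimal sketch of that combination (the exponent here stands in, illustratively, for one of the physically motivated free parameters fitted to simulations):

```python
def halo_model_power(p_two_halo, p_one_halo, alpha=1.0):
    """Smoothed combination of the two-halo (linear, large-scale) and
    one-halo (small-scale) terms; alpha = 1 recovers the plain halo-model
    sum, while other values soften the transition between regimes."""
    return (p_two_halo ** alpha + p_one_halo ** alpha) ** (1.0 / alpha)
```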

  8. Astrometric cosmology .

    NASA Astrophysics Data System (ADS)

    Lattanzi, M. G.

    The accurate measurement of the motions of stars in our Galaxy can provide access to the cosmological signatures in the disk and halo, while astrometric experiments from within our Solar System can uniquely probe possible deviations from General Relativity. This article introduces the idea that astrometry, thanks also to impressive technological advancements, has the potential to become a key player in the field of local cosmology. For example, accurate absolute kinematics at the scale of the Milky Way can, for the first time in situ, test the predictions made by the cold dark matter model for the Galactic halo, and eventually map out the distribution of dark matter, or other formation mechanisms, required to explain the signatures recently identified in the old component of the thick disk. Final notes dwell on the extent to which Gaia can fulfill the expectations of astrometric cosmology, and on what must instead be left to future, specifically designed astrometric experiments.

  9. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    USGS Publications Warehouse

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.
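
    At its core, the Bayesian-network approach propagates uncertain inputs through Bayes' rule over discretized variables; a minimal single-node update over binned wave heights looks like this (an illustration of the principle only, not the authors' network):

```python
def bayes_update(prior, likelihood):
    """Discrete Bayes rule over binned wave heights: the posterior is
    proportional to prior times likelihood, renormalised. Sparse or noisy
    boundary data enter as a likelihood rather than as a fixed input."""
    post = [p * l for p, l in zip(prior, likelihood)]
    total = sum(post)
    return [p / total for p in post]
```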

  10. The influence of El Niño-Southern Oscillation regimes on eastern African vegetation and its future implications under the RCP8.5 warming scenario

    NASA Astrophysics Data System (ADS)

    Fer, Istem; Tietjen, Britta; Jeltsch, Florian; Wolff, Christian

    2017-09-01

    The El Niño-Southern Oscillation (ENSO) is the main driver of the interannual variability in eastern African rainfall, with a significant impact on vegetation and agriculture and dire consequences for food and social security. In this study, we identify and quantify the ENSO contribution to eastern African rainfall variability in order to forecast the future vegetation response to the rainfall variability associated with a predicted intensified ENSO. To isolate the vegetation variability due to ENSO, we removed the ENSO signal from the climate data using empirical orthogonal teleconnection (EOT) analysis. We then simulated the ecosystem carbon and water fluxes under the historical climate without the components related to ENSO teleconnections. We found ENSO-driven patterns in the vegetation response and confirmed that EOT analysis can successfully recover the coupled tropical Pacific sea surface temperature-eastern African rainfall teleconnection from observed datasets. We further simulated the eastern African vegetation response under future climate change as projected by climate models, and under future climate change combined with a predicted increase in ENSO intensity. Our EOT analysis highlights that climate simulations still capture ENSO-driven rainfall variability poorly, and, as we show here, future vegetation would differ from what is simulated under climate model outputs that lack an accurate ENSO contribution. We simulated considerable differences in eastern African vegetation growth under an intensified ENSO regime, which will bring further environmental stress to a region with a reduced capacity to adapt to the effects of global climate change and to maintain food security.

  11. The synergistic use of models and observations: understanding the mechanisms behind observed biomass dynamics at 14 Amazonian field sites and the implications for future biomass change

    NASA Astrophysics Data System (ADS)

    Levine, N. M.; Galbraith, D.; Christoffersen, B. J.; Imbuzeiro, H. A.; Restrepo-Coupe, N.; Malhi, Y.; Saleska, S. R.; Costa, M. H.; Phillips, O.; Andrade, A.; Moorcroft, P. R.

    2011-12-01

    The Amazonian rainforests play a vital role in global water, energy and carbon cycling. The sensitivity of this system to natural and anthropogenic disturbances therefore has important implications for the global climate. Some global models have predicted large-scale forest dieback and the savannization of Amazonia over the next century [Meehl et al., 2007]. While several studies have demonstrated the sensitivity of dynamic global vegetation models to changes in temperature, precipitation, and dry season length [e.g. Galbraith et al., 2010; Good et al., 2011], the ability of these models to accurately reproduce ecosystem dynamics of present-day transitional or low biomass tropical forests has not been demonstrated. A model-data intercomparison was conducted with four state-of-the-art terrestrial ecosystem models to evaluate the ability of these models to accurately represent structure, function, and long-term biomass dynamics over a range of Amazonian ecosystems. Each modeling group conducted a series of simulations for 14 sites including mature forest, transitional forest, savannah, and agricultural/pasture sites. All models were run using standard physical parameters and the same initialization procedure. Model results were compared against forest inventory and dendrometer data in addition to flux tower measurements. While the models compared well against field observations for the mature forest sites, significant differences were observed between predicted and measured ecosystem structure and dynamics for the transitional forest and savannah sites. The length of the dry season and soil sand content were good predictors of model performance. In addition, for the big leaf models, model performance was highest for sites dominated by late successional trees and lowest for sites with predominantly early and mid-successional trees. 
This study provides insight into tropical forest function and sensitivity to environmental conditions that will aid in predictions of the response of the Amazonian rainforest to future anthropogenically induced changes.

  12. Numerical Simulation of Non-Rotating and Rotating Coolant Channel Flow Fields. Part 1

    NASA Technical Reports Server (NTRS)

    Rigby, David L.

    2000-01-01

    Future generations of ultra high bypass-ratio jet engines will require far higher pressure ratios and operating temperatures than those of current engines. For the foreseeable future, engine materials will not be able to withstand the high temperatures without some form of cooling. In particular the turbine blades, which are under high thermal as well as mechanical loads, must be cooled. Cooling of turbine blades is achieved by bleeding air from the compressor stage of the engine through complicated internal passages in the turbine blades (internal cooling, including jet-impingement cooling) and by bleeding small amounts of air into the boundary layer of the external flow through small discrete holes on the surface of the blade (film cooling and transpiration cooling). The cooling must be done using a minimum amount of air or any increases in efficiency gained through higher operating temperature will be lost due to added load on the compressor stage. Turbine cooling schemes have traditionally been based on extensive empirical data bases, quasi-one-dimensional computational fluid dynamics (CFD) analysis, and trial and error. With improved capabilities of CFD, these traditional methods can be augmented by full three-dimensional simulations of the coolant flow to predict in detail the heat transfer and metal temperatures. Several aspects of turbine coolant flows make such application of CFD difficult, thus a highly effective CFD methodology must be used. First, high resolution of the flow field is required to attain the needed accuracy for heat transfer predictions, making highly efficient flow solvers essential for such computations. Second, the geometries of the flow passages are complicated but must be modeled accurately in order to capture all important details of the flow. This makes grid generation and grid quality important issues. 
Finally, since coolant flows are turbulent and separated, the effects of turbulence must be modeled with a low-Reynolds-number turbulence model to accurately predict the details of heat transfer.

  13. Obtaining Accurate Probabilities Using Classifier Calibration

    ERIC Educational Resources Information Center

    Pakdaman Naeini, Mahdi

    2016-01-01

    Learning probabilistic classification and prediction models that generate accurate probabilities is essential in many prediction and decision-making tasks in machine learning and data mining. One way to achieve this goal is to post-process the output of classification models to obtain more accurate probabilities. These post-processing methods are…
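
    The post-processing idea can be illustrated with histogram binning, one of the simplest calibration methods in this literature. The sketch below is hypothetical: the scores, labels, and bin count are invented, and the functions are illustrative rather than the dissertation's actual method.

```python
# Histogram-binning calibration: map raw classifier scores to empirical
# probabilities estimated on a held-out calibration set.

def fit_binned_calibrator(scores, labels, n_bins=5):
    """Return {bin index: empirical positive rate} from (score, label) pairs."""
    bins = {}
    for s, y in zip(scores, labels):
        b = min(int(s * n_bins), n_bins - 1)
        pos, total = bins.get(b, (0, 0))
        bins[b] = (pos + y, total + 1)
    return {b: pos / total for b, (pos, total) in bins.items()}

def calibrate(score, table, n_bins=5):
    b = min(int(score * n_bins), n_bins - 1)
    return table.get(b, score)  # fall back to the raw score for empty bins

# Toy calibration set: raw scores and true binary outcomes.
scores = [0.1, 0.15, 0.5, 0.55, 0.9, 0.95]
labels = [0, 0, 0, 1, 1, 1]
table = fit_binned_calibrator(scores, labels)
print(calibrate(0.92, table))  # -> 1.0 (every high score in this bin was positive)
```

    Richer methods (Platt scaling, isotonic regression, Bayesian binning) replace the per-bin average with a fitted mapping, but the post-processing shape is the same.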

  14. WE-H-BRA-07: Mechanistic Modelling of the Relative Biological Effectiveness of Heavy Charged Particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, S; Queen’s University, Belfast, Belfast; McNamara, A

    2016-06-15

    Purpose: Uncertainty in the Relative Biological Effectiveness (RBE) of heavy charged particles compared to photons remains one of the major uncertainties in particle therapy. As RBEs depend strongly on clinical variables such as tissue type, dose, and radiation quality, more accurate individualised models are needed to fully optimise treatments. Methods: We have developed a model of DNA damage and repair following X-ray irradiation in a number of settings, incorporating mechanistic descriptions of DNA repair pathways, geometric effects on DNA repair, cell cycle effects, and cell death. Our model has previously been shown to accurately predict a range of biological endpoints including chromosome aberrations, mutations, and cell death. This model was combined with nanodosimetric models of individual ion tracks to calculate the additional probability of lethal damage forming within a single track. These lethal damage probabilities can be used to predict survival and RBE for cells irradiated with ions of different Linear Energy Transfer (LET). Results: By combining the X-ray response model with nanodosimetry information, predictions of RBE can be made without cell-line-specific fitting. The model's RBE predictions were found to agree well with empirical proton RBE models (mean absolute difference between models of 1.9% and 1.8% for cells with α/β ratios of 9 and 1.4, respectively, for LETs between 0 and 15 keV/µm). The model also accurately recovers the impact of high-LET carbon ion exposures, showing both the reduced efficacy of ions at extremely high LET and the impact of defects in non-homologous end joining on RBE values in Chinese Hamster Ovary cells. Conclusion: Our model predicts RBE without the inclusion of empirical LET fitting parameters for a range of experimental conditions. This approach has the potential to deliver improved personalisation of particle therapy, with future developments allowing for the calculation of individualised RBEs.
SJM is supported by a Marie Curie International Outgoing Fellowship from the European Commission's FP7 program (EC FP7 MC-IOF-623630).
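
    The abstract's mechanistic model is not given in closed form, but the iso-effect definition of RBE can be illustrated with the standard linear-quadratic (LQ) survival model. This is a hedged sketch: the function names and the α/β values below are assumptions, not parameters from the paper.

```python
import math

# Linear-quadratic survival S(D) = exp(-alpha*D - beta*D**2); RBE at an
# iso-effect level is the photon dose divided by the ion dose that gives
# the same survival. Parameter values are illustrative, not the paper's.

def dose_for_survival(alpha, beta, survival):
    """Invert S(D) for the positive dose D."""
    effect = -math.log(survival)
    if beta == 0:
        return effect / alpha
    return (-alpha + math.sqrt(alpha ** 2 + 4 * beta * effect)) / (2 * beta)

def rbe(alpha_x, beta_x, alpha_ion, beta_ion, survival=0.1):
    return (dose_for_survival(alpha_x, beta_x, survival)
            / dose_for_survival(alpha_ion, beta_ion, survival))

# A higher ion alpha (more lethal damage per unit dose) yields RBE > 1.
print(round(rbe(0.15, 0.05, 0.45, 0.05), 2))  # -> 1.5
```

    The paper's contribution is precisely to predict the ion-specific parameters from nanodosimetry rather than fit them per cell line, as this toy version does.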

  15. Initializing carbon cycle predictions from the Community Land Model by assimilating global biomass observations

    NASA Astrophysics Data System (ADS)

    Fox, A. M.; Hoar, T. J.; Smith, W. K.; Moore, D. J.

    2017-12-01

    The locations and longevity of terrestrial carbon sinks remain uncertain; however, it is clear that in order to predict long-term climate changes, the role of the biosphere in surface energy and carbon balance must be understood and incorporated into earth system models (ESMs). Aboveground biomass, the amount of carbon stored in vegetation, is a key component of the terrestrial carbon cycle, representing the balance of uptake through gross primary productivity (GPP) and losses from respiration, senescence, and mortality over hundreds of years. The best predictions of current and future land-atmosphere fluxes are likely to come from the integration of process-based knowledge contained in models with information from observations of changes in carbon stocks using data assimilation (DA). By exploiting long time series, it is possible to accurately detect variability and change in carbon cycle dynamics by monitoring ecosystem states, for example biomass derived from vegetation optical depth (VOD), and to use this information to initialize models before making predictions. To make maximum use of information about the current state of global ecosystems, we have developed a system that combines the Community Land Model (CLM) with the Data Assimilation Research Testbed (DART), a community tool for ensemble DA. This DA system is highly innovative in its complexity, completeness, and capabilities. Here we describe a series of activities, using both Observing System Simulation Experiments (OSSEs) and real observations, that have allowed us to quantify the potential impact of assimilating VOD data into CLM-DART on future land-atmosphere fluxes. VOD data are particularly suitable for this activity due to their long temporal coverage and appropriate scale when combined with CLM, but their absolute values rely on many assumptions.
Therefore, we have had to assess the implications of the VOD retrieval algorithms, with an emphasis on detecting uncertainty due to assumptions and inputs in the algorithms that are incompatible with those encoded within CLM. It is probable that VOD describes changes in biomass more accurately than absolute values, so in addition to sequential assimilation of the observations, we have tested alternative filter algorithms and the assimilation of VOD anomalies.
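
    As a rough illustration of ensemble data assimilation in the spirit of DART, the sketch below applies a scalar ensemble Kalman-style update that nudges each ensemble member toward a hypothetical biomass observation. All numbers are invented, and the simplified per-member update also shrinks ensemble spread; real DART filters are considerably more sophisticated.

```python
# Scalar ensemble Kalman-style update (a sketch, not the CLM-DART code).

def enkf_update(ensemble, obs, obs_var):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # ensemble variance
    gain = var / (var + obs_var)                            # Kalman gain
    return [x + gain * (obs - x) for x in ensemble]

# Prior ensemble of aboveground biomass (kg C m^-2) and one observation.
prior = [4.0, 5.0, 6.0]
posterior = enkf_update(prior, obs=7.0, obs_var=1.0)
print(posterior)  # -> [5.5, 6.0, 6.5]
```

    The gain weights the observation by the ratio of ensemble variance to total variance, which is why a confident ensemble moves little and an uncertain one moves far.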

  16. Crucial nesting habitat for gunnison sage-grouse: A spatially explicit hierarchical approach

    USGS Publications Warehouse

    Aldridge, Cameron L.; Saher, D.J.; Childers, T.M.; Stahlnecker, K.E.; Bowen, Z.H.

    2012-01-01

    Gunnison sage-grouse (Centrocercus minimus) is a species of special concern and is currently considered a candidate species under the Endangered Species Act. Careful management is therefore required to ensure that suitable habitat is maintained, particularly because much of the species' current distribution is faced with exurban development pressures. We assessed hierarchical nest site selection patterns of Gunnison sage-grouse inhabiting the western portion of the Gunnison Basin, Colorado, USA, at multiple spatial scales, using logistic regression-based resource selection functions. Models were selected using the Akaike Information Criterion corrected for small sample sizes (AICc), and predictive surfaces were generated using model-averaged relative probabilities. Landscape-scale factors that had the most influence on nest site selection included the proportion of sagebrush cover >5%, mean productivity, and density of 2-wheel-drive roads. The landscape-scale predictive surface captured 97% of known Gunnison sage-grouse nests within the top 5 of 10 prediction bins, implicating 57% of the basin as crucial nesting habitat. Crucial habitat identified by the landscape model was used to define the extent for patch-scale modeling efforts. Patch-scale variables that had the greatest influence on nest site selection were the proportion of big sagebrush cover >10%, distance to residential development, distance to high-volume paved roads, and mean productivity. This model accurately predicted independent nest locations. The unique hierarchical structure of our models more accurately captures the nested nature of habitat selection and allowed for increased discrimination within larger landscapes of suitable habitat. We extrapolated the landscape-scale model to the entire Gunnison Basin because of conservation concerns for this species.
We believe this predictive surface is a valuable tool that can be incorporated into land-use and conservation planning, as well as the assessment of future land-use scenarios. © 2011 The Wildlife Society.
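
    The AICc and model-averaging machinery used above can be sketched as follows. The log-likelihoods, parameter counts, and sample size are invented for illustration; only the AICc and Akaike-weight formulas follow standard usage.

```python
import math

# AICc and Akaike weights, as used to select and average resource
# selection models. Log-likelihoods and sample size are invented.

def aicc(log_lik, k, n):
    """AIC with the small-sample correction: AIC + 2k(k+1)/(n-k-1)."""
    return -2 * log_lik + 2 * k + (2 * k * (k + 1)) / (n - k - 1)

def akaike_weights(scores):
    best = min(scores)
    rel = [math.exp(-0.5 * (s - best)) for s in scores]
    total = sum(rel)
    return [r / total for r in rel]

# Three candidate models: (log-likelihood, number of parameters); n = 50 nests.
models = [(-100.0, 3), (-98.0, 5), (-97.5, 8)]
weights = akaike_weights([aicc(ll, k, n=50) for ll, k in models])
print([round(w, 3) for w in weights])  # the small-sample penalty sinks the 8-parameter model
```

    Model-averaged predictive surfaces weight each model's predicted relative probability by its Akaike weight, which is what produces the averaged surfaces described in the abstract.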

  17. Stellar Occultations by TNOs and Centaurs: first results in the “Gaia era”

    NASA Astrophysics Data System (ADS)

    Rossi, Gustavo; Vieira-Martins, Roberto; Sicardy, Bruno; Ortiz, Jose Luis; Rio Group, Lucky Star Occultation Team, Granada Occultation Team

    2017-10-01

    After the first release of the Gaia catalog (in September 2016), stellar positions are now known with unprecedented accuracy, reaching values of the order of milliarcseconds. This improvement translates into stunning accuracy in the astrometry of moving objects, such as TNOs. Unfortunately, Gaia stellar proper motions will only be available in the second data release (DR2) next year, so there is still a need to use hybrid stellar catalogs for occultation predictions until then. Despite that, stellar occultation predictions are now much more accurate, and the biggest uncertainties come mainly from the object ephemerides. This issue will be overcome by large surveys such as the LSST, which will provide positions for the known TNOs and is expected to increase the number of known TNOs by nearly 40,000, with an unprecedented amount of acquired information. This huge amount of data also opens a new era in stellar occultations: predictions will be very accurate, and the participation of professional astronomers, laboratories, and the amateur community will be crucial to observe the predicted events; observation campaigns will need to be selected according to a specific scientific purpose, such as the probability of detecting rings or arcs around a body, the presence of an atmosphere, or even the detection of topographic features; and software capable of reducing the data more efficiently, along with an easier method of coordinating observation campaigns, is needed. Here we present some impressive results obtained from predictions and observed occultations in 2017 (among them Pluto, Chariklo, and Haumea), the problems we are starting to face at the beginning of the "Gaia era", and the future challenges of stellar occultations.

  18. An Experimental and Computational Investigation of Oscillating Airfoil Unsteady Aerodynamics at Large Mean Incidence

    NASA Technical Reports Server (NTRS)

    Capece, Vincent R.; Platzer, Max F.

    2003-01-01

    A major challenge in the design and development of turbomachine airfoils for gas turbine engines is high cycle fatigue failures due to flutter and aerodynamically induced forced vibrations. In order to predict the aeroelastic response of gas turbine airfoils early in the design phase, accurate unsteady aerodynamic models are required. However, accurate predictions of flutter and forced vibration stress at all operating conditions have remained elusive. The overall objectives of this research program are to develop a transition model suitable for unsteady separated flow and quantify the effects of transition on airfoil steady and unsteady aerodynamics for attached and separated flow using this model. Furthermore, the capability of current state-of-the-art unsteady aerodynamic models to predict the oscillating airfoil response of compressor airfoils over a range of realistic reduced frequencies, Mach numbers, and loading levels will be evaluated through correlation with benchmark data. This comprehensive evaluation will assess the assumptions used in unsteady aerodynamic models. The results of this evaluation can be used to direct improvement of current models and the development of future models. The transition modeling effort will also make strides in improving predictions of steady flow performance of fan and compressor blades at off-design conditions. This report summarizes the progress and results obtained in the first year of this program. These include: installation and verification of the operation of the parallel version of TURBO; the grid generation and initiation of steady flow simulations of the NASA/Pratt&Whitney airfoil at a Mach number of 0.5 and chordal incidence angles of 0 and 10 deg.; and the investigation of the prediction of laminar separation bubbles on a NACA 0012 airfoil.

  19. Sweat loss prediction using a multi-model approach

    NASA Astrophysics Data System (ADS)

    Xu, Xiaojiang; Santee, William R.

    2011-07-01

    A new multi-model approach (MMA) for sweat loss prediction is proposed to improve prediction accuracy. MMA was computed as the average of the sweat loss predicted by two existing thermoregulation models: the rational model SCENARIO and the empirical model Heat Strain Decision Aid (HSDA). Three independent physiological datasets, a total of 44 trials, were used to compare predictions by MMA, SCENARIO, and HSDA. The observed sweat losses were collected under different combinations of uniform ensembles, environmental conditions (15-40°C, RH 25-75%), and exercise intensities (250-600 W). Root mean square deviation (RMSD), residual plots, and paired t tests were used to compare predictions with observations. Overall, MMA reduced RMSD by 30-39% in comparison with either SCENARIO or HSDA, and increased the prediction accuracy to 66% from 34% or 55%. Of the MMA predictions, 70% fell within the range of the mean observed value ± SD, while only 43% of SCENARIO and 50% of HSDA predictions fell within the same range. Paired t tests showed that differences between observations and MMA predictions were not significant, whereas differences between observations and SCENARIO or HSDA predictions were significant for two datasets. Thus, MMA predicted sweat loss more accurately than either of the two single models for the three datasets used. Future work will evaluate MMA using additional physiological data to expand the scope of populations and conditions.
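
    The MMA construction itself is simple enough to sketch: average the two models' predictions and score everything by RMSD. The predictions and observations below are made-up numbers, not the study's data.

```python
import math

# Multi-model average (MMA): the mean of two models' predictions, scored
# against observations by RMSD. All numbers here are illustrative.

def rmsd(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

scenario = [520.0, 610.0, 480.0]   # hypothetical sweat-loss predictions (g)
hsda     = [460.0, 550.0, 520.0]
observed = [495.0, 575.0, 505.0]

mma = [(a + b) / 2 for a, b in zip(scenario, hsda)]
for name, pred in [("SCENARIO", scenario), ("HSDA", hsda), ("MMA", mma)]:
    print(name, round(rmsd(pred, observed), 1))
```

    In this toy case the two models err in opposite directions, so their errors partially cancel in the average, which is the intuition behind MMA beating both parents.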

  20. Proactive Supply Chain Performance Management with Predictive Analytics

    PubMed Central

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unified model. It presents a supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model, which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605

  1. Dynamic prediction in functional concurrent regression with an application to child growth.

    PubMed

    Leroux, Andrew; Xiao, Luo; Crainiceanu, Ciprian; Checkley, William

    2018-04-15

    In many studies, it is of interest to predict the future trajectory of subjects based on their historical data, referred to as dynamic prediction. Mixed effects models have traditionally been used for dynamic prediction. However, the commonly used random intercept and slope model is often not sufficiently flexible for modeling subject-specific trajectories. In addition, there may be useful exposures/predictors of interest that are measured concurrently with the outcome, complicating dynamic prediction. To address these problems, we propose a dynamic functional concurrent regression model to handle the case where both the functional response and the functional predictors are irregularly measured. Currently, such a model cannot be fit by existing software. We apply the model to dynamically predict children's length conditional on prior length, weight, and baseline covariates. Inference on model parameters and subject-specific trajectories is conducted using the mixed effects representation of the proposed model. An extensive simulation study shows that the dynamic functional regression model provides more accurate estimation and inference than existing methods. Methods are supported by fast, flexible, open source software that uses heavily tested smoothing techniques. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  3. VCSEL-based fiber optic link for avionics: implementation and performance analyses

    NASA Astrophysics Data System (ADS)

    Shi, Jieqin; Zhang, Chunxi; Duan, Jingyuan; Wen, Huaitao

    2006-11-01

    A Gb/s fiber optic link with built-in test (BIT) capability, based on vertical-cavity surface-emitting laser (VCSEL) sources and intended for next-generation military avionics buses, is presented in this paper. To accurately predict link performance, statistical methods and Bit Error Rate (BER) measurements have been examined. The results show that the 1 Gb/s fiber optic link meets the BER requirement and that the link margin can reach up to 13 dB. Analysis shows that the suggested photonic network may provide a high-performance, low-cost interconnection alternative for future military avionics.
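
    A link margin like the quoted 13 dB is conventionally obtained from a first-order optical power budget. The sketch below uses assumed transmitter power, receiver sensitivity, and loss values; none of them are taken from the paper.

```python
# First-order optical power budget for a fiber link.

def link_margin_db(tx_power_dbm, rx_sensitivity_dbm, losses_db):
    """Margin = transmit power - total losses - receiver sensitivity."""
    return tx_power_dbm - sum(losses_db) - rx_sensitivity_dbm

# Assumed values: VCSEL launch power, receiver sensitivity, and losses from
# fiber attenuation, four connectors, and coupling (all in dB).
losses = [1.5, 0.5 * 4, 2.0]
print(link_margin_db(tx_power_dbm=-3.0, rx_sensitivity_dbm=-21.5,
                     losses_db=losses))  # -> 13.0
```

    Statistical link analyses of the kind the paper describes replace these fixed worst-case numbers with distributions, but the budget arithmetic is the same.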

  4. Studying depression using imaging and machine learning methods.

    PubMed

    Patel, Meenal J; Khalaf, Alexander; Aizenstein, Howard J

    2016-01-01

    Depression is a complex clinical entity that can pose challenges for clinicians regarding both accurate diagnosis and effective timely treatment. These challenges have prompted the development of multiple machine learning methods to help improve the management of this disease. These methods utilize anatomical and physiological data acquired from neuroimaging to create models that can identify depressed patients vs. non-depressed patients and predict treatment outcomes. This article (1) presents a background on depression, imaging, and machine learning methodologies; (2) reviews methodologies of past studies that have used imaging and machine learning to study depression; and (3) suggests directions for future depression-related studies.

  5. Automation Rover for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Sauder, Jonathan; Hilgemann, Evan; Johnson, Michael; Parness, Aaron; Hall, Jeffrey; Kawata, Jessie; Stack, Kathryn

    2017-01-01

    Almost 2,300 years ago the ancient Greeks built the Antikythera automaton. This purely mechanical computer accurately predicted past and future astronomical events long before electronics existed. Automata have been credibly used for hundreds of years as computers, art pieces, and clocks. However, in the past several decades automata have become less popular as the capabilities of electronics increased, leaving them an unexplored solution for robotic spacecraft. The Automaton Rover for Extreme Environments (AREE) proposes an exciting paradigm shift from electronics to a fully mechanical system, enabling longitudinal exploration of the most extreme environments within the solar system.

  6. Alexander Hegedus Lightning Talk: Integrating Measurements to Optimize Space Weather Strategies

    NASA Astrophysics Data System (ADS)

    Hegedus, A. M.

    2017-12-01

    Alexander Hegedus is a PhD Candidate at the University of Michigan, and won an Outstanding Student Paper Award at the AGU 2016 Fall Meeting for his poster "Simulating 3D Spacecraft Constellations for Low Frequency Radio Imaging." In this short talk, Alex outlines his current research of analyzing data from both real and simulated instruments to answer Heliophysical questions. He then sketches out future plans to simulate science pipelines in a real-time data assimilation model that uses a Bayesian framework to integrate information from different instruments to determine the efficacy of future Space Weather Alert systems. MHD simulations made with Michigan's own Space Weather Model Framework will provide input to simulated instruments, acting as an Observing System Simulation Experiment to verify that a certain set of measurements can accurately predict different classes of Space Weather events.

  7. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in the medicine and health care sector. Within machine learning, classification and prediction constitute a major field of AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most prominent machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of the related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, while existing predictive models are essential, current methods must be improved.

  8. Possibility of Predicting Serotonin Transporter Occupancy From the In Vitro Inhibition Constant for Serotonin Transporter, the Clinically Relevant Plasma Concentration of Unbound Drugs, and Their Profiles for Substrates of Transporters.

    PubMed

    Yahata, Masahiro; Chiba, Koji; Watanabe, Takao; Sugiyama, Yuichi

    2017-09-01

    Accurate prediction of target occupancy facilitates central nervous system drug development. In this review, we discuss the predictability of serotonin transporter (SERT) occupancy in the human brain estimated from in vitro Ki values for human SERT and plasma concentrations of unbound drug (Cu,plasma), as well as the impact of drug transporters in the blood-brain barrier. First, the geometric means of in vitro Ki values were compared with the means of in vivo Ki values (Ki,u,plasma), which were calculated as the Cu,plasma values at 50% occupancy of SERT obtained from previous clinical positron emission tomography/single photon emission computed tomography imaging studies for 6 selective serotonin reuptake inhibitors and 3 serotonin-norepinephrine reuptake inhibitors. The in vitro Ki values for 7 drugs were comparable to their in vivo Ki,u,plasma values within a 3-fold difference. SERT occupancy was overestimated for 5 drugs (P-glycoprotein substrates) and underestimated for 2 drugs (presumably uptake transporter substrates, although no evidence exists as yet). In conclusion, prediction of human SERT occupancy from in vitro Ki values and Cu,plasma was successful for drugs that are not transporter substrates, and will become possible in the future even for transporter substrates once transporter activities can be accurately estimated from in vitro experiments. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
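
    The occupancy prediction rests on the standard one-site binding relation, occupancy = Cu / (Cu + Ki), which is why Ki,u,plasma is defined as the unbound concentration at 50% occupancy. The Ki and concentration values below are assumptions for illustration.

```python
# One-site binding model: occupancy = Cu / (Cu + Ki); Cu = Ki gives 50%.

def occupancy(c_unbound, ki):
    return c_unbound / (c_unbound + ki)

ki = 2.0                      # hypothetical in vitro Ki (nM)
for c in (0.5, 2.0, 8.0):     # hypothetical unbound plasma concentrations (nM)
    print(c, "nM ->", round(100 * occupancy(c, ki)), "% occupancy")
```

    The review's over- and underestimates arise because efflux or uptake transporters at the blood-brain barrier make the unbound brain concentration differ from Cu,plasma, breaking the assumption behind this simple relation.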

  9. Pediatric trauma BIG score: predicting mortality in children after military and civilian trauma.

    PubMed

    Borgman, Matthew A; Maegele, Marc; Wade, Charles E; Blackbourne, Lorne H; Spinella, Philip C

    2011-04-01

    To develop a validated mortality prediction score for children with traumatic injuries, we identified all children (<18 years of age) in the US military's Joint Theater Trauma Registry from 2002 to 2009 who were admitted to combat-support hospitals with traumatic injuries in Iraq and Afghanistan. We identified factors associated with mortality using univariate and then multivariate regression modeling. The developed mortality prediction score was then validated on a data set of pediatric patients (≤18 years of age) from the German Trauma Registry, 2002-2007. Admission base deficit, international normalized ratio, and Glasgow Coma Scale were independently associated with mortality in 707 patients from the derivation set and 1101 patients in the validation set. These variables were combined into the pediatric "BIG" score (base deficit + [2.5 × international normalized ratio] + [15 - Glasgow Coma Scale]), which yielded an area under the curve of 0.89 (95% confidence interval: 0.83-0.95) and 0.89 (95% confidence interval: 0.87-0.92) on the derivation and validation sets, respectively. The pediatric trauma BIG score is a simple method that can be performed rapidly on admission to evaluate severity of illness and predict mortality in children with traumatic injuries. The score has been shown to be accurate in both penetrating-injury and blunt-injury populations and may have significant utility in comparing severity of injury in future pediatric trauma research and quality-assurance studies. In addition, this score may be used to determine inclusion criteria on admission for prospective studies when accurately estimating mortality for sample size calculation is required.
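
    The BIG score formula quoted above is easy to compute directly; the admission values in this sketch are hypothetical.

```python
# Pediatric trauma BIG score, as defined in the abstract:
# BIG = base deficit + 2.5 * INR + (15 - Glasgow Coma Scale).

def big_score(base_deficit, inr, gcs):
    return base_deficit + 2.5 * inr + (15 - gcs)

# Hypothetical admission values: base deficit 6 mmol/L, INR 1.4, GCS 9.
print(round(big_score(6.0, 1.4, 9), 1))  # -> 15.5
```

    Each term rises with physiological derangement (acidosis, coagulopathy, depressed consciousness), so higher BIG values indicate a higher predicted mortality.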

  10. Predicting watershed post-fire sediment yield with the InVEST sediment retention model: Accuracy and uncertainties

    USGS Publications Warehouse

    Sankey, Joel B.; McVay, Jason C.; Kreitler, Jason R.; Hawbaker, Todd J.; Vaillant, Nicole; Lowe, Scott

    2015-01-01

    Increased sedimentation following wildland fire can negatively impact water supply and water quality. Understanding how changing fire frequency, extent, and location will affect watersheds and the ecosystem services they supply to communities is of great societal importance in the western USA and throughout the world. In this work we assess the utility of the InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) Sediment Retention Model for accurately characterizing erosion and sedimentation of burned watersheds. InVEST was developed by the Natural Capital Project at Stanford University (Tallis et al., 2014) and is a suite of GIS-based implementations of common process models, engineered for high-end computing to allow faster simulation of larger landscapes and incorporation into decision-making. The InVEST Sediment Retention Model is based on common soil erosion models (e.g., the Universal Soil Loss Equation, USLE); it determines which areas of the landscape contribute the greatest sediment loads to a hydrological network and, conversely, evaluates the ecosystem service of sediment retention on a watershed basis. In this study, we evaluate the accuracy and uncertainties of InVEST predictions of increased sedimentation after fire, using measured post-fire sediment yields available for many watersheds throughout the western USA from an existing, published large database. We show that the model can be parameterized in a relatively simple fashion to predict post-fire sediment yield with accuracy. Our ultimate goal is to use the model to accurately predict variability in post-fire sediment yield at a watershed scale as a function of future wildfire conditions.
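
    Since the InVEST Sediment Retention Model builds on USLE-type erosion models, the core calculation can be sketched as the USLE factor product. All factor values below are illustrative assumptions, and treating fire as an increase in the cover-management factor C is a common simplification, not necessarily InVEST's exact parameterization.

```python
# USLE factor product A = R * K * LS * C * P (annual soil loss per unit area).
# All factor values are illustrative; fire is represented crudely here
# as an increase in the cover-management factor C after vegetation loss.

def usle(r, k, ls, c, p):
    return r * k * ls * c * p

pre_fire  = usle(r=120.0, k=0.3, ls=1.6, c=0.01, p=1.0)
post_fire = usle(r=120.0, k=0.3, ls=1.6, c=0.20, p=1.0)
print(round(pre_fire, 3), round(post_fire, 2))  # -> 0.576 11.52
```

    Even this toy version shows why burned hillslopes can dominate a watershed's sediment budget: the multiplicative form lets one factor raise predicted loss by an order of magnitude.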

  11. Preschoolers can make highly accurate judgments of learning.

    PubMed

    Lipowski, Stacy L; Merriman, William E; Dunlosky, John

    2013-08-01

    Preschoolers' ability to make judgments of learning (JOLs) was examined in 3 experiments in which they were taught proper names for animals. In Experiment 1, when judgments were made immediately after studying, nearly every child predicted subsequent recall of every name. When judgments were made after a delay, fewer showed this response tendency. The delayed JOLs of those who predicted at least 1 recall failure were still overconfident, however, and were not correlated with final recall. In Experiment 2, children received a second study trial with feedback, made JOLs after a delay, and completed an additional forced-choice judgment task. In this task, an animal whose name had been recalled was pitted against an animal whose name had not been recalled, and the children chose the one they were more likely to remember later. Compared with Experiment 1, more children predicted at least 1 recall failure and predictions were moderately accurate. In the forced-choice task, animal names that had just been successfully recalled were typically chosen over ones that had not. Experiment 3 examined the effect of providing an additional retrieval attempt on delayed JOLs. Half of the children received a single study session, and half received an additional study session with feedback. Children in the practice group showed less overconfidence than those in the no-practice group. Taken together, the results suggest that, with minimal task experience, most preschoolers understand that they will not remember everything and that if they cannot recall something at present, they are unlikely to recall it in the future. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  12. A computational approach for predicting off-target toxicity of antiviral ribonucleoside analogues to mitochondrial RNA polymerase.

    PubMed

    Freedman, Holly; Winter, Philip; Tuszynski, Jack; Tyrrell, D Lorne; Houghton, Michael

    2018-06-22

    In the development of antiviral drugs that target viral RNA-dependent RNA polymerases, off-target toxicity caused by the inhibition of the human mitochondrial RNA polymerase (POLRMT) is a major liability. Therefore, it is essential that all new ribonucleoside analogue drugs be accurately screened for POLRMT inhibition. A computational tool that can accurately predict NTP binding to POLRMT could assist in evaluating any potential toxicity and in designing possible salvaging strategies. Using the available crystal structure of POLRMT bound to an RNA transcript, here we created a model of POLRMT with an NTP molecule bound in the active site. Furthermore, we implemented a computational screening procedure that determines the relative binding free energy of an NTP analogue to POLRMT by free energy perturbation (FEP), i.e. a simulation in which the natural NTP molecule is slowly transformed into the analogue and back. In each direction, the transformation was performed over 40 ns of simulation on our IBM Blue Gene Q supercomputer. This procedure was validated across a panel of drugs for which experimental dissociation constants were available, showing that NTP relative binding free energies could be predicted to within 0.97 kcal/mol of the experimental values on average. These results demonstrate for the first time that free-energy simulation can be a useful tool for predicting binding affinities of NTP analogues to a polymerase. We expect that our model, together with similar models of viral polymerases, will be very useful in the screening and future design of NTP inhibitors of viral polymerases that have no mitochondrial toxicity. © 2018 Freedman et al.
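
    The FEP calculation described above rests on the Zwanzig relation, ΔF = -kT ln⟨exp(-ΔU/kT)⟩. The sketch below applies it to synthetic Gaussian energy differences; the sample statistics and kT value are assumptions for illustration, not outputs of the POLRMT simulations.

```python
import math
import random

# Zwanzig free-energy perturbation: dF = -kT * ln(<exp(-dU / kT)>),
# averaged over samples drawn from the reference (natural NTP) state.

def fep_delta_f(delta_u_samples, kt):
    avg = sum(math.exp(-du / kt) for du in delta_u_samples) / len(delta_u_samples)
    return -kt * math.log(avg)

random.seed(0)
kt = 0.596  # kcal/mol at roughly 300 K
# Synthetic energy differences dU = U_analogue - U_natural (kcal/mol).
samples = [random.gauss(1.0, 0.3) for _ in range(10000)]
print(round(fep_delta_f(samples, kt), 2))
```

    For Gaussian dU this estimator approaches mu - sigma^2 / (2 kT), slightly below the mean energy gap; production FEP runs transform the molecule gradually over many intermediate windows, as in the 40 ns simulations described above, because a single-step estimate converges poorly for large perturbations.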

  13. Predicting dense nonaqueous phase liquid dissolution using a simplified source depletion model parameterized with partitioning tracers

    NASA Astrophysics Data System (ADS)

    Basu, Nandita B.; Fure, Adrian D.; Jawitz, James W.

    2008-07-01

    Simulations of nonpartitioning and partitioning tracer tests were used to parameterize the equilibrium stream tube model (ESM) that predicts the dissolution dynamics of dense nonaqueous phase liquids (DNAPLs) as a function of the Lagrangian properties of DNAPL source zones. Lagrangian, or stream-tube-based, approaches characterize source zones with as few as two trajectory-integrated parameters, in contrast to the potentially thousands of parameters required to describe the point-by-point variability in permeability and DNAPL in traditional Eulerian modeling approaches. The spill and subsequent dissolution of DNAPLs were simulated in two-dimensional domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1, and 3) using the multiphase flow and transport simulator UTCHEM. Nonpartitioning and partitioning tracers were used to characterize the Lagrangian properties (travel time and trajectory-integrated DNAPL content statistics) of DNAPL source zones, which were in turn shown to be sufficient for accurate prediction of source dissolution behavior using the ESM throughout the relatively broad range of hydraulic conductivity variances tested here. The results were found to be relatively insensitive to travel time variability, suggesting that dissolution could be accurately predicted even if the travel time variance was only coarsely estimated. Estimation of the ESM parameters was also demonstrated using an approximate technique based on Eulerian data in the absence of tracer data; however, determining the minimum amount of such data required remains for future work. Finally, the stream tube model was shown to be a more unique predictor of dissolution behavior than approaches based on the ganglia-to-pool model for source zone characterization.

  14. Accurate Prediction of Motor Failures by Application of Multi CBM Tools: A Case Study

    NASA Astrophysics Data System (ADS)

    Dutta, Rana; Singh, Veerendra Pratap; Dwivedi, Jai Prakash

    2018-02-01

    Motor failures are very difficult to predict accurately with a single condition-monitoring tool because the electrical and mechanical systems are closely coupled. Electrical problems, such as phase unbalance or stator winding insulation failure, can at times lead to vibration problems, while mechanical failures, such as bearing failure, can lead to rotor eccentricity. In this case study of a 550 kW blower motor, a rotor bar crack was detected by current signature analysis, and vibration monitoring confirmed the finding. In later months, in a similar motor, vibration monitoring predicted a bearing failure and current signature analysis confirmed it. In both cases, the predictions were found to be accurate after the motors were dismantled. In this paper we discuss the accurate prediction of motor failures through the use of multiple condition-monitoring tools, illustrated with two case studies.

  15. Aggregation Trade Offs in Family Based Recommendations

    NASA Astrophysics Data System (ADS)

    Berkovsky, Shlomo; Freyne, Jill; Coombe, Mac

    Personalized information access tools are frequently based on collaborative filtering recommendation algorithms. Collaborative filtering recommender systems typically suffer from a data sparsity problem, where systems do not have sufficient user data to generate accurate and reliable predictions. Prior research suggested using group-based user data in the collaborative filtering recommendation process to generate group-based predictions and partially resolve the sparsity problem. Although group recommendations are less accurate than personalized recommendations, they are more accurate than general non-personalized recommendations, which are the natural fallback when personalized recommendations cannot be generated. In this work we present initial results of a study that exploits the browsing logs of real families of users gathered from an eHealth portal. The browsing logs allowed us to experimentally compare the accuracy of two group-based recommendation strategies: aggregated group models and aggregated predictions. Our results showed that aggregating individual models into group models resulted in more accurate predictions than aggregating individual predictions into group predictions.
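    The two aggregation strategies being compared can be sketched with a toy baseline predictor, where a profile's mean rating stands in for a full collaborative-filtering model; the family profiles are invented for illustration:

    ```python
    def predict_from_model(ratings):
        """Toy stand-in for a CF predictor: the mean rating of a profile."""
        return sum(ratings) / len(ratings)

    def group_model_prediction(profiles):
        # Strategy 1: merge the individual profiles into one group model,
        # then predict from the merged model.
        merged = [r for ratings in profiles for r in ratings]
        return predict_from_model(merged)

    def aggregated_predictions(profiles):
        # Strategy 2: predict for each member separately, then average.
        return sum(predict_from_model(p) for p in profiles) / len(profiles)

    # Invented family: one heavy user and one light user.
    family = [[5, 4, 5, 4], [2]]
    ```

    Even this toy example shows why the strategies differ: merging models implicitly weights members by how much data they contribute, whereas averaging predictions weights every member equally.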

  16. A hypothetical model for predicting the toxicity of high aspect ratio nanoparticles (HARN)

    NASA Astrophysics Data System (ADS)

    Tran, C. L.; Tantra, R.; Donaldson, K.; Stone, V.; Hankin, S. M.; Ross, B.; Aitken, R. J.; Jones, A. D.

    2011-12-01

    The ability to predict nanoparticle (dimensional structures less than 100 nm in size) toxicity through the use of a suitable model is an important goal if nanoparticles are to be regulated in terms of exposures and toxicological effects. Recently, a model to predict the toxicity of nanoparticles with a high aspect ratio has been put forward by a consortium of scientists. The high aspect ratio nanoparticle (HARN) model is a platform that relates the physical dimensions of HARN (specifically the length-to-diameter ratio) and their biopersistence to their toxicity in biological environments. Potentially, this model is of great public health and economic importance, as it can be used not only to predict toxicological activity but also to classify the toxicity of various fibrous nanoparticles, without the need to carry out time-consuming and expensive toxicology studies. However, this model of toxicity is currently hypothetical in nature and is based solely on similarities in dimensional geometry with asbestos and synthetic vitreous fibres. The aim of this review is two-fold: (a) to present findings from past literature on the physicochemical property and pathogenicity bioassay testing of HARN; (b) to identify some of the challenges and future research steps that are crucial before the HARN model can be accepted as a predictive model. By presenting what has been done, we are able to identify the scientific challenges and research directions needed for the HARN model to gain public acceptance.
    Our recommendations for future research include the need to: (a) accurately link physicochemical data with corresponding pathogenicity assay data, through the use of suitable reference standards and standardised protocols; (b) develop better tools and techniques for physicochemical characterisation; (c) develop better ways of monitoring HARN in the workplace; (d) reliably measure dose exposure levels, in order to support future epidemiological studies.

  17. Predicting future major depression and persistent depressive symptoms: Development of a prognostic screener and PHQ-4 cutoffs in breast cancer patients.

    PubMed

    Weihs, Karen L; Wiley, Joshua F; Crespi, Catherine M; Krull, Jennifer L; Stanton, Annette L

    2018-02-01

    Create a brief, self-report screener for recently diagnosed breast cancer patients to identify patients at risk of future depression. Breast cancer patients (N = 410) within 2 ± 1 months after diagnosis provided data on depression vulnerability. Depression outcomes were defined as a high depressive symptom trajectory or a major depressive episode during 16 months after diagnosis. Stochastic gradient boosting of regression trees identified 7 items highly predictive for the depression outcomes from a pool of 219 candidate depression vulnerability items. Three of the 7 items were from the Patient Health Questionnaire 4 (PHQ-4), a validated screener for current anxiety/depressive disorder that has not been tested to identify risk for future depression. Thresholds classifying patients as high or low risk on the new Depression Risk Questionnaire 7 (DRQ-7) and the PHQ-4 were obtained. Predictive performance of the DRQ-7 and PHQ-4 was assessed on a holdout validation subsample. DRQ-7 items assess loneliness, irritability, persistent sadness, and low acceptance of emotion as well as 3 items from the PHQ-4 (anhedonia, depressed mood, and worry). A DRQ-7 score of ≥6/23 identified depression outcomes with 0.73 specificity, 0.83 sensitivity, 0.68 positive predictive value, and 0.86 negative predictive value. A PHQ-4 score of ≥3/12 performed moderately well but less accurately than the DRQ-7 (net reclassification improvement = 10%; 95% CI [0.5-16]). The DRQ-7 and the PHQ-4 with a new cutoff score are clinically accessible screeners for risk of depression in newly diagnosed breast cancer patients. Use of the screener to select patients for preventive interventions awaits validation of the screener in other samples. Copyright © 2017 John Wiley & Sons, Ltd.
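    The operating characteristics reported above (sensitivity, specificity, PPV, NPV) are the standard quantities computed from a 2x2 confusion matrix. A minimal sketch with hypothetical counts, not the study's data:

    ```python
    def screening_metrics(tp, fp, fn, tn):
        """Standard screening statistics from a 2x2 confusion matrix."""
        return {
            "sensitivity": tp / (tp + fn),  # fraction of true cases flagged
            "specificity": tn / (tn + fp),  # fraction of non-cases cleared
            "ppv": tp / (tp + fp),          # positive predictive value
            "npv": tn / (tn + fn),          # negative predictive value
        }

    # Hypothetical counts for a screener evaluated on 170 patients.
    m = screening_metrics(tp=40, fp=30, fn=10, tn=90)
    ```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of the outcome in the evaluation sample, which is why a cutoff validated in one population needs revalidation in others.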

  18. Choosing the appropriate forecasting model for predictive parameter control.

    PubMed

    Aleti, Aldeida; Moser, Irene; Meedeniya, Indika; Grunske, Lars

    2014-01-01

    All commonly used stochastic optimisation algorithms have to be parameterised to perform effectively. Adaptive parameter control (APC) is an effective method used for this purpose. APC repeatedly adjusts parameter values during the optimisation process for optimal algorithm performance. The assignment of parameter values for a given iteration is based on previously measured performance. In recent research, time series prediction has been proposed as a method of projecting the probabilities to use for parameter value selection. In this work, we examine the suitability of a variety of prediction methods for the projection of future parameter performance based on previous data. All considered prediction methods make assumptions that the time series data must satisfy for the method to provide accurate projections. Looking specifically at parameters of evolutionary algorithms (EAs), we find that all standard EA parameters, with the exception of population size, conform largely to the assumptions made by the considered prediction methods. Evaluating the performance of these prediction methods, we find that linear regression provides the best results by a very small and statistically insignificant margin. Regardless of the prediction method, predictive parameter control outperforms state-of-the-art parameter control methods when the performance data adheres to the assumptions made by the prediction method. When a parameter's performance data does not adhere to the assumptions made by the forecasting method, the use of prediction does not have a notable adverse impact on the algorithm's performance.
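    A minimal sketch of the linear-regression projection at the heart of predictive parameter control: fit a trend to each parameter value's recent performance and pick the value with the best projected performance. The setting names and performance series are invented:

    ```python
    def linear_forecast(history):
        """Least-squares fit of y = a + b * t, projected one step ahead."""
        n = len(history)
        t_mean = (n - 1) / 2
        y_mean = sum(history) / n
        b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history))
             / sum((t - t_mean) ** 2 for t in range(n)))
        a = y_mean - b * t_mean
        return a + b * n   # projected performance at the next iteration

    # Invented recent performance (success rates) of two mutation-rate settings.
    performance = {"mut_low": [0.20, 0.25, 0.30, 0.35],
                   "mut_high": [0.50, 0.45, 0.40, 0.35]}
    best_setting = max(performance, key=lambda k: linear_forecast(performance[k]))
    ```

    The point of projecting rather than averaging is visible here: the two settings tie on their latest measurement, but the trend favors the improving one.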

  19. Multidimensional severity assessment in bronchiectasis: an analysis of seven European cohorts.

    PubMed

    McDonnell, M J; Aliberti, S; Goeminne, P C; Dimakou, K; Zucchetti, S C; Davidson, J; Ward, C; Laffey, J G; Finch, S; Pesci, A; Dupont, L J; Fardon, T C; Skrbic, D; Obradovic, D; Cowman, S; Loebinger, M R; Rutherford, R M; De Soyza, A; Chalmers, J D

    2016-12-01

    Bronchiectasis is a multidimensional disease associated with substantial morbidity and mortality. Two disease-specific clinical prediction tools have been developed, the Bronchiectasis Severity Index (BSI) and the FACED score, both of which stratify patients into severity risk categories to predict the probability of mortality. We aimed to compare the predictive utility of BSI and FACED in assessing clinically relevant disease outcomes across seven European cohorts independent of their original validation studies. The combined cohorts totalled 1612 patients. Pooled analysis showed that both scores had a good discriminatory predictive value for mortality (pooled area under the curve (AUC) 0.76, 95% CI 0.74 to 0.78 for both scores), with the BSI demonstrating a higher sensitivity (65% vs 28%) but lower specificity (70% vs 93%) compared with the FACED score. Calibration analysis suggested that the BSI performed consistently well across all cohorts, while FACED consistently overestimated mortality in 'severe' patients (pooled OR 0.33 (0.23 to 0.48), p<0.0001). The BSI accurately predicted hospitalisations (pooled AUC 0.82, 95% CI 0.78 to 0.84), exacerbations, quality of life (QoL) and respiratory symptoms across all risk categories. FACED had poor discrimination for hospital admissions (pooled AUC 0.65, 95% CI 0.63 to 0.67) with low sensitivity at 16% and did not consistently predict future risk of exacerbations, QoL or respiratory symptoms. No association was observed with FACED and 6 min walk distance (6MWD) or lung function decline. The BSI accurately predicts mortality, hospital admissions, exacerbations, QoL, respiratory symptoms, 6MWD and lung function decline in bronchiectasis, providing a clinically relevant evaluation of disease severity. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
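    The pooled AUC values quoted above measure discrimination: the probability that a randomly chosen patient who experienced the outcome scores higher than one who did not. It can be computed directly as the normalised Mann-Whitney statistic; the severity scores below are invented for illustration:

    ```python
    def auc(scores_pos, scores_neg):
        """AUC as the Mann-Whitney statistic: P(positive case scores higher),
        counting ties as half a win."""
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Invented severity scores: patients who died vs. those who survived.
    died = [8, 6, 7]
    survived = [2, 5, 6, 1]
    discrimination = auc(died, survived)
    ```

    An AUC of 0.5 means the score ranks patients no better than chance; the 0.76 reported for both indices means the scores order patient pairs correctly about three times out of four.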

  20. Prediction of household and commercial BMW generation according to socio-economic and other factors for the Dublin region.

    PubMed

    Purcell, M; Magette, W L

    2009-04-01

    Both planning and design of integrated municipal solid waste management systems require accurate prediction of waste generation. This research predicted the quantity and distribution of biodegradable municipal waste (BMW) generation within a diverse 'landscape' of residential areas, as well as from a variety of commercial establishments (restaurants, hotels, hospitals, etc.) in the Dublin (Ireland) region. Socio-economic variables, housing types, and the sizes and main activities of commercial establishments were hypothesized as the key determinants contributing to the spatial variability of BMW generation. A geographical information system (GIS) 'model' of BMW generation was created using ArcMap, a component of ArcGIS 9. Statistical data including socio-economic status and household size were mapped on an electoral district basis. Historical research and data from scientific literature were used to assign BMW generation rates to residential and commercial establishments. These predictions were combined to give overall BMW estimates for the region, which can aid waste planning and policy decisions. This technique will also aid the design of future waste management strategies, leading to policy and practice alterations as a function of demographic changes and development. The household prediction technique gave a more accurate overall estimate of household waste generation than did the social class technique. Both techniques produced estimates that differed from the reported local authority data; however, given that local authority reported figures for the region are below the national average, with some of the waste generated from apartment complexes being reported as commercial waste, predictions arising from this research are believed to be closer to actual waste generation than a comparison to reported data would suggest. By changing the input data, this estimation tool can be adapted for use in other locations. 
    Although focused on waste in the Dublin region, this method of waste prediction could have significant potential benefits if a universal method can be found to apply it effectively.
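    The rate-assignment step of such a model reduces to multiplying each district's household count by the generation rate assigned to its socio-economic band and summing over districts. The bands, rates, and district figures below are purely illustrative, not the Dublin data:

    ```python
    # Hypothetical per-household BMW generation rates (kg/week) by band.
    rates = {"A": 6.2, "B": 5.4, "C": 4.8}

    # Hypothetical electoral districts: (name, dominant band, households).
    districts = [("Dist-1", "A", 1200),
                 ("Dist-2", "C", 3400),
                 ("Dist-3", "B", 2100)]

    # Regional estimate: sum of households x band rate over all districts.
    total_kg_per_week = sum(rates[band] * households
                            for _, band, households in districts)
    ```

    Adapting the tool to another region, as the abstract suggests, amounts to swapping in local district statistics and locally calibrated rates.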

  1. Punishing an error improves learning: the influence of punishment magnitude on error-related neural activity and subsequent learning.

    PubMed

    Hester, Robert; Murphy, Kevin; Brown, Felicity L; Skilleter, Ashley J

    2010-11-17

    Punishing an error to shape subsequent performance is a major tenet of individual and societal level behavioral interventions. Recent work examining error-related neural activity has identified that the magnitude of activity in the posterior medial frontal cortex (pMFC) is predictive of learning from an error, whereby greater activity in this region predicts adaptive changes in future cognitive performance. It remains unclear how punishment influences error-related neural mechanisms to effect behavior change, particularly in key regions such as pMFC, which previous work has demonstrated to be insensitive to punishment. Using an associative learning task that provided monetary reward and punishment for recall performance, we observed that when recall errors were categorized by subsequent performance--whether the failure to accurately recall a number-location association was corrected at the next presentation of the same trial--the magnitude of error-related pMFC activity predicted future correction. However, the pMFC region was insensitive to the magnitude of punishment an error received and it was the left insula cortex that predicted learning from the most aversive outcomes. These findings add further evidence to the hypothesis that error-related pMFC activity may reflect more than a prediction error in representing the value of an outcome. The novel role identified here for the insular cortex in learning from punishment appears particularly compelling for our understanding of psychiatric and neurologic conditions that feature both insular cortex dysfunction and a diminished capacity for learning from negative feedback or punishment.

  2. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
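    The key insight above, retrospective detection of a threshold crossing in the recent past, can be sketched for a leaky integrate-and-fire neuron: integrate on a fixed grid, and when the membrane potential is found above threshold, interpolate backwards within the step for a precise spike time. Parameters are illustrative, not from the paper:

    ```python
    def lif_spike_times(I, dt=0.1, tau=10.0, v_th=1.0, t_end=50.0):
        """Time-driven leaky integrate-and-fire simulation. Spike times are
        found retrospectively: once the updated potential exceeds threshold,
        linearly interpolate within the step that bracketed the crossing."""
        v, t, spikes = 0.0, 0.0, []
        while t < t_end:
            v_new = v + dt * (-v / tau + I)   # forward-Euler membrane update
            if v_new >= v_th:                 # crossing happened in (t, t + dt]
                frac = (v_th - v) / (v_new - v)
                spikes.append(t + frac * dt)
                v_new = 0.0                   # reset; no refractory period here
            v, t = v_new, t + dt
        return spikes

    spikes = lif_spike_times(I=0.2)           # constant drive above threshold
    ```

    The interpolation is the whole trick: detecting a crossing that has already happened needs only the two bracketing samples, whereas an event-driven scheme must predict the crossing time in advance.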

  3. Characterization Approaches to Place Invariant Sites on SI-Traceable Scales

    NASA Technical Reports Server (NTRS)

    Thome, Kurtis

    2012-01-01

    The effort to understand the Earth's climate system requires a complete integration of remote sensing imager data across time and multiple countries. Such an integration necessarily requires ensuring inter-consistency between multiple sensors to create the data sets needed to understand the climate system. Past efforts at inter-consistency have forced agreement between two sensors using sources that are viewed by both sensors at nearly the same time, and thus tend to be near polar regions over snow and ice. The current work describes a method that would provide an absolute radiometric calibration of a sensor rather than an inter-consistency of a sensor relative to another. The approach also relies on defensible error budgets that eventually provide a cross-comparison of sensors without systematic errors. The basis of the technique is a model-based, SI-traceable prediction of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The effort effectively works to characterize the sites as sources with known top-of-atmosphere radiance, allowing accurate intercomparison of sensor data without the need for coincident views. Data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), and Moderate Resolution Imaging Spectroradiometer (MODIS) are used to demonstrate the difficulties of cross calibration as applied to current sensors. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The radiance comparisons lead to significant differences created by the specific solar model used for each sensor. The paper also proposes methods to mitigate the largest error sources in future systems.
The results from these historical intercomparisons provide the basis for a set of recommendations to ensure future SI-traceable cross calibration using future missions such as CLARREO and TRUTHS. The paper describes a proposed approach that relies on model-based, SI-traceable predictions of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The basis of the method is highly accurate measurements of at-sensor radiance of sufficient quality to understand the spectral and BRDF characteristics of the site and sufficient historical data to develop an understanding of temporal effects from changing surface and atmospheric conditions.

  4. Comparison and validation of injury risk classifiers for advanced automated crash notification systems.

    PubMed

    Kusano, Kristofer; Gabler, Hampton C

    2014-01-01

    The odds of death for a seriously injured crash victim are drastically reduced if he or she received care at a trauma center. Advanced automated crash notification (AACN) algorithms are postcrash safety systems that use data measured by the vehicles during the crash to predict the likelihood of occupants being seriously injured. The accuracy of these models is crucial to the success of an AACN. The objective of this study was to compare the predictive performance of competing injury risk models and algorithms: logistic regression, random forest, AdaBoost, naïve Bayes, support vector machine, and classification k-nearest neighbors. This study compared machine learning algorithms to the widely adopted logistic regression modeling approach. Machine learning algorithms have not been commonly studied in the motor vehicle injury literature. Machine learning algorithms may have higher predictive power than logistic regression, despite the drawback of lacking the ability to perform statistical inference. To evaluate the performance of these algorithms, data on 16,398 vehicles involved in non-rollover collisions were extracted from the NASS-CDS. Vehicles with any occupants having an Injury Severity Score (ISS) of 15 or greater were defined as those requiring victims to be treated at a trauma center. The performance of each model was evaluated using cross-validation. Cross-validation assesses how a model will perform in the future given new data not used for model training. The crash ΔV (change in velocity during the crash), damage side (struck side of the vehicle), seat belt use, vehicle body type, number of events, occupant age, and occupant sex were used as predictors in each model. Logistic regression slightly outperformed the machine learning algorithms based on sensitivity and specificity of the models.
    Previous studies on AACN risk curves used the same data to train and test the models and as a result reported higher sensitivity than the cross-validated results from this study. Future studies should account for future data, for example by using cross-validation, or risk presenting optimistic predictions of field performance. Past algorithms have been criticized for relying on age and sex, which are difficult for vehicle sensors to measure, and for inaccuracies in classifying damage side. The models with accurate damage side and age/sex included did outperform models with less accurate damage side and without age/sex, but the differences were small, suggesting that the success of AACN is not reliant on these predictors.
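    Cross-validation as used in the study can be sketched with a toy injury classifier; the single-feature threshold learner and the synthetic ΔV data below are illustrative stand-ins, not the NASS-CDS models:

    ```python
    import random

    def fit_threshold(train):
        """Toy learner: threshold at the midpoint of the two class means of delta-V."""
        pos = [dv for dv, severe in train if severe]
        neg = [dv for dv, severe in train if not severe]
        return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

    def cross_validate(data, k=5):
        """k-fold CV: train on k-1 folds, score on the held-out fold, average."""
        folds = [data[i::k] for i in range(k)]
        accuracies = []
        for i in range(k):
            held_out = folds[i]
            train = [row for j, fold in enumerate(folds) if j != i for row in fold]
            thr = fit_threshold(train)
            correct = sum((dv > thr) == severe for dv, severe in held_out)
            accuracies.append(correct / len(held_out))
        return sum(accuracies) / k

    # Synthetic crashes: (delta-V in km/h, whether any occupant had ISS 15+).
    random.seed(0)
    crashes = ([(random.gauss(45, 8), True) for _ in range(50)] +
               [(random.gauss(20, 8), False) for _ in range(50)])
    random.shuffle(crashes)
    held_out_accuracy = cross_validate(crashes)
    ```

    Because every accuracy figure comes from a fold the threshold never saw during fitting, the estimate approximates field performance on new crashes, which is exactly the property train-equals-test evaluations lack.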

  5. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Constraints on the neutrino flux in NOvA using the near detector data

    DOE PAGES

    Maan, Kuldeep K.

    2016-12-19

    NOvA, a long-baseline neutrino oscillation experiment at Fermilab, is designed to measure electron-neutrino appearance and muon-neutrino disappearance in the NuMI beam. NOvA comprises two finely segmented liquid scintillator detectors located 14 mrad off-axis in the NuMI beam. An accurate prediction of the neutrino flux is needed for precision oscillation and cross-section measurements. Data from the hadron-production experiments and, importantly, from the NOvA Near Detector provide powerful constraints on the muon-neutrino and electron-neutrino fluxes. In particular, the measurement of neutrino-electron elastic scattering provides an in situ constraint on the absolute flux. Lastly, this poster presents the data-driven predictions of the NOvA muon-neutrino and electron-neutrino flux, and outlines future improvements in the flux determination.

  7. Uncertainties in radiation effect predictions for the natural radiation environments of space.

    PubMed

    McNulty, P J; Stassinopoulos, E G

    1994-10-01

    Future manned missions beyond low earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena in which individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risks of such events appear to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is the rate at which more than a threshold amount of energy is deposited in the critical microvolumes. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.
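    Once a per-unit-time rate of over-threshold energy depositions has been estimated, the single-event risk over a mission follows from Poisson statistics. A minimal sketch with a hypothetical event rate, not a value from the paper:

    ```python
    import math

    def p_at_least_one(rate_per_day, mission_days):
        """Poisson probability of at least one over-threshold deposition event."""
        lam = rate_per_day * mission_days   # expected number of events
        return 1.0 - math.exp(-lam)

    # Hypothetical: 0.01 over-threshold events/day in a critical microvolume.
    risk_100_days = p_at_least_one(0.01, 100)
    ```

    This is why the environmental rate estimate dominates the uncertainty: the mission risk depends exponentially on the expected event count.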

  8. Model-based redesign of global transcription regulation

    PubMed Central

    Carrera, Javier; Rodrigo, Guillermo; Jaramillo, Alfonso

    2009-01-01

    Synthetic biology aims to design or redesign biological systems. In particular, one possible goal could be the rewiring of the transcription regulation network by exchanging the endogenous promoters. To achieve this objective, we have adapted current methods to infer a model based on ordinary differential equations that is able to predict the network response after a major change in its topology. Our procedure uses microarray data for training. We have experimentally validated our inferred global regulatory model in Escherichia coli by predicting transcriptomic profiles under new perturbations. We have also tested our methodology in silico by providing accurate predictions of the underlying networks from expression data generated with artificial genomes. In addition, we have shown the predictive power of our methodology by obtaining the gene expression profile in experimental redesigns of the E. coli genome, in which the transcriptional network was rewired by knocking out master regulators or by upregulating transcription factors controlled by different promoters. Our approach is compatible with most network inference methods, allowing computational exploration of future genome-wide redesign experiments in synthetic biology. PMID:19188257
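    The kind of ODE-based prediction described above can be illustrated with a single Hill-regulated gene. This Euler-integrated sketch (with invented parameters, not the inferred E. coli model) shows how such a model predicts the new steady state after a perturbation such as a regulator knockout:

    ```python
    def steady_expression(tf_level, beta=2.0, gamma=0.5, K=1.0, n=2,
                          dt=0.01, t_end=50.0):
        """Euler integration of dx/dt = beta * tf^n / (K^n + tf^n) - gamma * x:
        Hill-type activation by a transcription factor plus first-order decay."""
        activation = tf_level ** n / (K ** n + tf_level ** n)
        x = 0.0
        for _ in range(int(t_end / dt)):
            x += dt * (beta * activation - gamma * x)
        return x

    wild_type = steady_expression(tf_level=1.0)   # activator present at K
    knockout = steady_expression(tf_level=0.0)    # master regulator deleted
    ```

    Rewiring a promoter corresponds to changing which regulator's level enters the Hill term; the fitted ODE system then predicts the genome-wide expression response, as in the validation experiments above.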

  9. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  10. Predicting St. Louis encephalitis virus epidemics: lessons from recent, and not so recent, outbreaks.

    PubMed

    Day, J F

    2001-01-01

    St. Louis encephalitis virus was first identified as the cause of human disease in North America after a large urban epidemic in St. Louis, Missouri, during the summer of 1933. Since then, numerous outbreaks of St. Louis encephalitis have occurred throughout the continent. In south Florida, a 1990 epidemic lasted from August 1990 through January 1991 and resulted in 226 clinical cases and 11 deaths in 28 counties. This epidemic severely disrupted normal activities throughout the southern half of the state for 5 months and adversely impacted tourism in the affected region. The accurate forecasting of mosquito-borne arboviral epidemics will help minimize their impact on urban and rural population centers. Epidemic predictability would help focus control efforts and public education about epidemic risks, transmission patterns, and elements of personal protection that reduce the probability of arboviral infection. Research associated with arboviral outbreaks has provided an understanding of the strengths and weaknesses associated with epidemic prediction. The purpose of this paper is to review lessons from past arboviral epidemics and determine how these observations might aid our ability to predict and respond to future outbreaks.

  11. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C.

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  12. Uncertainties in radiation effect predictions for the natural radiation environments of space

    NASA Technical Reports Server (NTRS)

    Mcnulty, P. J.; Stassinopoulos, E. G.

    1994-01-01

    Future manned missions beyond low Earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena in which individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risk of such events appears to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is how often more than a threshold amount of energy will be deposited in the critical microvolumes. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.

  13. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs

    PubMed Central

    2017-01-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package. PMID:29107980

  14. Interacting stressors and the potential for adaptation in a changing world: responses of populations and individuals

    PubMed Central

    French, Susannah S.; Brodie, Edmund D.

    2017-01-01

    To accurately predict the impact of environmental change, it is necessary to assay effects of key interacting stressors on vulnerable organisms, and the potential resiliency of their populations. Yet, for the most part, these critical data are missing. We examined the effects of two common abiotic stressors predicted to interact with climate change, salinity and temperature, on the embryonic survival and development of a model freshwater vertebrate, the rough-skinned newt (Taricha granulosa) from different populations. We found that salinity and temperature significantly interacted to affect newt embryonic survival and development, with the negative effects of salinity most pronounced at temperature extremes. We also found significant variation among, and especially within, populations, with different females varying in the performance of their eggs at different salinity–temperature combinations, possibly providing the raw material for future natural selection. Our results highlight the complex nature of predicting responses to climate change in space and time, and provide critical data towards that aim. PMID:28680662

  15. Differential evolution-based multi-objective optimization for the definition of a health indicator for fault diagnostics and prognostics

    NASA Astrophysics Data System (ADS)

    Baraldi, P.; Bonfanti, G.; Zio, E.

    2018-03-01

    The identification of the current degradation state of an industrial component and the prediction of its future evolution are fundamental steps in the development of condition-based and predictive maintenance approaches. The objective of the present work is to propose a general method for extracting a health indicator that measures the amount of component degradation from a set of signals measured during operation. The proposed method is based on the combined use of feature extraction techniques, such as Empirical Mode Decomposition and Auto-Associative Kernel Regression, and a multi-objective Binary Differential Evolution (BDE) algorithm for selecting the subset of features optimal for the definition of the health indicator. The objectives of the optimization are desired characteristics of the health indicator, such as monotonicity, trendability and prognosability. A case study is considered, concerning the prediction of the remaining useful life of turbofan engines. The obtained results confirm that the method is capable of extracting health indicators suitable for accurate prognostics.
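    Two of the indicator characteristics named above, monotonicity and trendability, have simple and widely used definitions in the prognostics literature; a minimal sketch follows. These are generic textbook formulations, not necessarily the paper's exact scoring, and the BDE search itself is not reproduced.

```python
import numpy as np

def monotonicity(hi):
    """|#positive steps - #negative steps| / (n - 1): equals 1 for a
    strictly monotone health indicator, 0 for a trendless one."""
    d = np.sign(np.diff(hi))
    return abs(d.sum()) / (len(hi) - 1)

def trendability(hi):
    """Absolute Pearson correlation of the indicator with time."""
    t = np.arange(len(hi))
    return abs(np.corrcoef(t, hi)[0, 1])

# Toy degradation trajectory: steadily rising, so both scores are high.
hi = np.array([0.1, 0.2, 0.35, 0.5, 0.8])
# A multi-objective search such as the paper's BDE would score each
# candidate feature subset on criteria like these simultaneously.
```

    In a multi-objective setting these scores conflict with, for example, noise robustness, which is why an evolutionary search over feature subsets is a natural fit.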

  16. De Novo Chromosome Structure Prediction

    NASA Astrophysics Data System (ADS)

    di Pierro, Michele; Cheng, Ryan R.; Lieberman-Aiden, Erez; Wolynes, Peter G.; Onuchic, José N.

    Chromatin consists of DNA and hundreds of proteins that interact with the genetic material. In vivo, chromatin folds into nonrandom structures. The physical mechanism leading to these characteristic conformations, however, remains poorly understood. We recently introduced MiChroM, a model that generates chromosome conformations by using the idea that chromatin can be subdivided into types based on its biochemical interactions. Here we extend and complete our previous findings by showing that structural chromatin types can be inferred from ChIP-Seq data. Chromatin types, which are distinct from DNA sequence, are partially epigenetically controlled and change during cell differentiation, thus constituting a link between epigenetics, chromosomal organization, and cell development. We show that, for GM12878 lymphoblastoid cells, we are able to predict accurate chromosome structures with genomic data as the only input. The degree of accuracy achieved by our prediction supports the viability of the proposed physical mechanism of chromatin folding and makes the computational model a powerful tool for future investigations.

  17. Space can substitute for time in predicting climate-change effects on biodiversity

    USGS Publications Warehouse

    Blois, Jessica L.; Williams, John W.; Fitzpatrick, Matthew C.; Jackson, Stephen T.; Ferrier, Simon

    2013-01-01

    “Space-for-time” substitution is widely used in biodiversity modeling to infer past or future trajectories of ecological systems from contemporary spatial patterns. However, the foundational assumption—that drivers of spatial gradients of species composition also drive temporal changes in diversity—rarely is tested. Here, we empirically test the space-for-time assumption by constructing orthogonal datasets of compositional turnover of plant taxa and climatic dissimilarity through time and across space from Late Quaternary pollen records in eastern North America, then modeling climate-driven compositional turnover. Predictions relying on space-for-time substitution were ∼72% as accurate as “time-for-time” predictions. However, space-for-time substitution performed poorly during the Holocene when temporal variation in climate was small relative to spatial variation and required subsampling to match the extent of spatial and temporal climatic gradients. Despite this caution, our results generally support the judicious use of space-for-time substitution in modeling community responses to climate change.

  18. Corneal cell culture models: a tool to study corneal drug absorption.

    PubMed

    Dey, Surajit

    2011-05-01

    In recent times, there has been an ever-increasing demand for ocular drugs to treat sight-threatening diseases such as glaucoma, age-related macular degeneration and diabetic retinopathy. As more drugs are developed, there is a great need to test the in vitro permeability of these drugs to predict their efficacy and bioavailability in vivo. Corneal cell culture models are the only tool that can predict drug absorption across ocular layers accurately and rapidly. Cell culture studies are also valuable in reducing the number of animals needed for in vivo studies, which can increase the cost of the drug development process. Currently, rabbit corneal cell culture models are used to predict human corneal absorption due to the difficulty of human corneal studies. More recently, a three-dimensional human corneal equivalent has been developed using three different cell types to mimic the human cornea. In the future, human corneal cell culture systems need to be developed to be used as a standardized model for drug permeation.

  19. Space can substitute for time in predicting climate-change effects on biodiversity.

    PubMed

    Blois, Jessica L; Williams, John W; Fitzpatrick, Matthew C; Jackson, Stephen T; Ferrier, Simon

    2013-06-04

    "Space-for-time" substitution is widely used in biodiversity modeling to infer past or future trajectories of ecological systems from contemporary spatial patterns. However, the foundational assumption--that drivers of spatial gradients of species composition also drive temporal changes in diversity--rarely is tested. Here, we empirically test the space-for-time assumption by constructing orthogonal datasets of compositional turnover of plant taxa and climatic dissimilarity through time and across space from Late Quaternary pollen records in eastern North America, then modeling climate-driven compositional turnover. Predictions relying on space-for-time substitution were ∼72% as accurate as "time-for-time" predictions. However, space-for-time substitution performed poorly during the Holocene when temporal variation in climate was small relative to spatial variation and required subsampling to match the extent of spatial and temporal climatic gradients. Despite this caution, our results generally support the judicious use of space-for-time substitution in modeling community responses to climate change.

  20. Predicting Urban Medical Services Demand in China: An Improved Grey Markov Chain Model by Taylor Approximation.

    PubMed

    Duan, Jinli; Jiao, Feng; Zhang, Qishan; Lin, Zhibin

    2017-08-06

    The sharp increase in the aging population has raised the pressure on the currently limited medical resources in China. To better allocate resources, a more accurate prediction of medical service demand is urgently needed. This study aims to improve the prediction of medical service demand in China. To achieve this aim, the study incorporates a Taylor approximation into the grey Markov chain model, and develops a new model named Taylor-Markov Chain GM(1,1) (T-MCGM(1,1)). The new model has been tested on historical data covering medical services for the treatment of diabetes, heart disease, and cerebrovascular disease from 1997 to 2015 in China. The model provides a prediction of medical service demand for these three types of disease up to 2022. The results reveal an enormous growth of urban medical service demand in the future. The findings provide practical implications for the Health Administrative Department in allocating medical resources, and help hospitals manage investments in medical facilities.
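    For context, the classical GM(1,1) grey model that T-MCGM(1,1) builds on can be sketched as follows. This is the textbook base model only; the paper's Markov-chain residual correction and Taylor approximation are not reproduced, and the demand series is a toy example.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classical GM(1,1) grey forecast: fit the development coefficient a
    and grey input b on the accumulated series, then difference the fitted
    exponential back to the original scale."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])             # adjacent means of x1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
    return x0_hat[len(x0):]                   # the `steps` out-of-sample values

demand = [100.0, 110.0, 121.0, 133.1]         # toy series growing ~10% per year
next_year = gm11_forecast(demand, steps=1)[0]
```

    On a near-exponential series like this toy example the base model already extrapolates well; the Markov-chain layer in T-MCGM(1,1) exists to correct the residual errors that remain when real demand data fluctuate around such a trend.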

  1. RACER a Coarse-Grained RNA Model for Capturing Folding Free Energy in Molecular Dynamics Simulations

    NASA Astrophysics Data System (ADS)

    Cheng, Sara; Bell, David; Ren, Pengyu

    RACER is a coarse-grained RNA model that can be used in molecular dynamics simulations to predict native structures and the sequence-specific variation of free energy of various RNA structures. RACER is capable of accurate prediction of the native structures of duplexes and hairpins (average RMSD of 4.15 angstroms), and it can capture sequence-specific variation of free energy in excellent agreement with experimentally measured stabilities (r-squared = 0.98). The RACER model implements a new effective non-bonded potential and a re-parameterization of the hydrogen bond and Debye-Huckel potentials. Insights from the RACER model include the importance of treating pairing and stacking interactions separately in order to distinguish folded and unfolded states, and the identification of hydrogen-bonding, base stacking, and electrostatic interactions as essential driving forces for RNA folding. Future applications of the RACER model include predicting free energy landscapes of more complex RNA structures and the use of RACER for multiscale simulations.

  2. Heart rate during basketball game play and volleyball drills accurately predicts oxygen uptake and energy expenditure.

    PubMed

    Scribbans, T D; Berg, K; Narazaki, K; Janssen, I; Gurd, B J

    2015-09-01

    There is currently little information regarding the ability of metabolic prediction equations to accurately predict oxygen uptake and exercise intensity from heart rate (HR) during intermittent sport. The purpose of the present study was to develop and cross-validate equations for accurately predicting oxygen cost (VO2) and energy expenditure from HR during intermittent sport participation. Eleven healthy adult males (19.9 ± 1.1 yrs) were recruited to establish the relationship between %VO2peak and %HRmax during low-intensity steady-state endurance (END), moderate-intensity interval (MOD) and high-intensity interval exercise (HI), as performed on a cycle ergometer. Three equations (END, MOD, and HI) for predicting %VO2peak based on %HRmax were developed. HR and VO2 were directly measured during basketball games (6 male, 20.8 ± 1.0 yrs; 6 female, 20.0 ± 1.3 yrs) and volleyball drills (12 female; 20.8 ± 1.0 yrs). Comparisons were made between measured and predicted VO2 and energy expenditure using the 3 equations developed and 2 previously published equations. The END and MOD equations accurately predicted VO2 and energy expenditure, while the HI equation underestimated, and the previously published equations systematically overestimated, VO2 and energy expenditure. Intermittent sport VO2 and energy expenditure can be accurately predicted from heart rate data using either the END (%VO2peak = %HRmax × 1.008 − 17.17) or MOD (%VO2peak = %HRmax × 1.2 − 32) equations. These two simple equations provide an accessible and cost-effective method for accurate estimation of exercise intensity and energy expenditure during intermittent sport.
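    The two reported equations are straightforward to apply; a minimal sketch follows. The function names are illustrative, and the ~5 kcal per litre of O2 conversion is a common physiological approximation, not a value taken from the paper.

```python
def predict_vo2_fraction(hr_fraction_pct, mode="END"):
    """Predicted %VO2peak from %HRmax, using the two equations reported
    in the abstract (percentages given as numbers in 0-100)."""
    if mode == "END":    # low-intensity steady-state endurance
        return hr_fraction_pct * 1.008 - 17.17
    if mode == "MOD":    # moderate-intensity interval
        return hr_fraction_pct * 1.2 - 32.0
    raise ValueError("mode must be 'END' or 'MOD'")

def energy_expenditure_kcal_per_min(vo2_l_per_min):
    """Rough conversion: ~5 kcal per litre of O2 consumed (a common
    approximation; converting %VO2peak to absolute VO2 additionally
    requires the individual's measured VO2peak)."""
    return 5.0 * vo2_l_per_min

# A player at 80% of HRmax, by the END equation:
pct_vo2_end = predict_vo2_fraction(80.0, "END")   # ≈ 63.5 %VO2peak
```

    Note that both equations output a *fraction* of VO2peak, so an individual calibration (a measured or estimated VO2peak) is still needed before energy expenditure can be computed in absolute terms.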

  3. Lessons Learned from Radiative Transfer Simulations of the Venus Atmosphere

    NASA Technical Reports Server (NTRS)

    Arney, G.; Meadows, V. S.; Lincowski, A.

    2017-01-01

    The Venus atmosphere is extremely complex, and because of this the spectrum of Earth's sister planet is likewise intricate and a challenge to model accurately. However, accurate modeling of the Venus spectrum opens up multiple opportunities to better understand the planet next door, and even to understand Venus-like planets beyond our solar system. Near-infrared (1-2.5 um, NIR) spectral windows observable on the Venus nightside present the opportunity to probe beneath the Venusian cloud deck and measure thermal emission from the surface and lower atmosphere remotely from Earth or from orbit. These nightside spectral windows were discovered by Allen and Crawford (1984) and have since been used to measure trace gas abundances in the Venus lower atmosphere (less than 45 km), map surface emissivity variations, and measure properties of the lower cloud deck. These windows sample radiation from below the cloud base at roughly 45 km, and pressures in this region range from roughly Earthlike (approx. 1 bar) up to 90 bars at the surface. Temperatures in this region are high: they range from about 400 K at the base of the cloud deck up to about 740 K at the surface. This high temperature and pressure presents several challenges, which we will review, to modelers attempting radiative transfer simulations of this region of the atmosphere. Spectrally modeling Venus is also important for predicting the remote observables of Venus-like exoplanets in anticipation of data from future observatories. Venus-like planets are likely one of the most common types of terrestrial planets, so simulations of them are valuable for planning the observatory and detector properties of future telescopes being designed, as well as for predicting the types of observations required to characterize them.

  4. The Dorsal Visual System Predicts Future and Remembers Past Eye Position

    PubMed Central

    Morris, Adam P.; Bremmer, Frank; Krekelberg, Bart

    2016-01-01

    Eye movements are essential to primate vision but introduce potentially disruptive displacements of the retinal image. To maintain stable vision, the brain is thought to rely on neurons that carry both visual signals and information about the current direction of gaze in their firing rates. We have shown previously that these neurons provide an accurate representation of eye position during fixation, but whether they are updated fast enough during saccadic eye movements to support real-time vision remains controversial. Here we show that not only do these neurons carry a fast and accurate eye-position signal, but also that they support in parallel a range of time-lagged variants, including predictive and postdictive signals. We recorded extracellular activity in four areas of the macaque dorsal visual cortex during a saccade task, including the lateral and ventral intraparietal areas (LIP, VIP), and the middle temporal (MT) and medial superior temporal (MST) areas. As reported previously, neurons showed tonic eye-position-related activity during fixation. In addition, they showed a variety of transient changes in activity around the time of saccades, including relative suppression, enhancement, and pre-saccadic bursts for one saccade direction over another. We show that a hypothetical neuron that pools this rich population activity through a weighted sum can produce an output that mimics the true spatiotemporal dynamics of the eye. Further, with different pooling weights, this downstream eye position signal (EPS) could be updated long before (<100 ms) or after (<200 ms) an eye movement. The results suggest a flexible coding scheme in which downstream computations have access to past, current, and future eye positions simultaneously, providing a basis for visual stability and delay-free visually-guided behavior. PMID:26941617

  5. Novel sensing technology in fall risk assessment in older adults: a systematic review.

    PubMed

    Sun, Ruopeng; Sosnoff, Jacob J

    2018-01-16

    Falls are a major health problem for older adults, with significant physical and psychological consequences. A first step in successful fall prevention is to identify those at risk of falling. Recent advancements in sensing technology offer the possibility of objective, low-cost and easy-to-implement fall risk assessment. The objective of this systematic review is to assess the current state of sensing technology in providing objective fall risk assessment in older adults. A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Twenty-two studies out of 855 articles were systematically identified and included in this review. Pertinent methodological features (sensing technique, assessment activities, outcome variables, and fall discrimination/prediction models) were extracted from each article. Four major sensing technologies (inertial sensors, video/depth cameras, pressure sensing platforms and laser sensing) were reported to provide accurate fall risk diagnostics in older adults. Steady-state walking, static/dynamic balance, and functional mobility were used as the assessment activities. A diverse range of diagnostic accuracy across studies (47.9% to 100%) was reported, due to variation in the measured kinematic/kinetic parameters and modelling techniques. A wide range of sensor technologies has been utilized in fall risk assessment in older adults. Overall, these devices have the potential to provide an accurate, inexpensive, and easy-to-implement fall risk assessment. However, the variation in measured parameters, assessment tools, sensor sites, movement tasks, and modelling techniques precludes a firm conclusion on their ability to predict future falls. Future work is needed to determine a clinically meaningful and easy-to-interpret fall risk diagnosis utilizing sensing technology. Additionally, the gap between functional evaluation and user experience of the technology should be addressed.

  6. Relationship between the Prediction Accuracy of Tsunami Inundation and Relative Distribution of Tsunami Source and Observation Arrays: A Case Study in Tokyo Bay

    NASA Astrophysics Data System (ADS)

    Takagawa, T.

    2017-12-01

    A rapid and precise tsunami forecast based on offshore monitoring is gaining attention as a way to reduce human losses due to devastating tsunami inundation. We developed a forecast method based on the combination of hierarchical Bayesian inversion with a pre-computed database and rapid post-computing of tsunami inundation. The method was applied to Tokyo Bay to evaluate the efficiency of observation arrays against three tsunamigenic earthquakes: a scenario earthquake at the Nankai trough, and the historic Genroku (1703) and Enpo (1677) earthquakes. In general, a rich observation array near the tsunami source has an advantage in both the accuracy and rapidness of a tsunami forecast. To examine the effect of observation time length, we used four types of data with lengths of 5, 10, 20 and 45 minutes after the earthquake occurrences. Prediction accuracy of tsunami inundation was evaluated on the simulated tsunami inundation areas around Tokyo Bay due to the target earthquakes. The shortest time length for accurate prediction varied with the target earthquake; here, accurate prediction means that the simulated values fall within the 95% credible intervals of the prediction. In the Enpo case, 5 minutes of observation is enough for accurate prediction for Tokyo Bay, but 10 and 45 minutes are needed in the Nankai trough and Genroku cases, respectively. The shortest time length for accurate prediction is strongly related to the relative distance between the tsunami source and the observation arrays. In the Enpo case, offshore tsunami observation points are densely distributed even in the source region, so accurate prediction can be achieved within 5 minutes; such a rapid, precise prediction is useful for early warnings. Even in the worst case, Genroku, where fewer observation points are available near the source, accurate prediction can be obtained within 45 minutes. This information can be useful for outlining the hazard in an early stage of the response.

  7. Priorities for future research into asthma diagnostic tools: A PAN-EU consensus exercise from the European asthma research innovation partnership (EARIP).

    PubMed

    Garcia-Marcos, L; Edwards, J; Kennington, E; Aurora, P; Baraldi, E; Carraro, S; Gappa, M; Louis, R; Moreno-Galdo, A; Peroni, D G; Pijnenburg, M; Priftis, K N; Sanchez-Solis, M; Schuster, A; Walker, S

    2018-02-01

    The diagnosis of asthma is currently based on clinical history, physical examination and lung function, and to date, there are no accurate objective tests either to confirm the diagnosis or to discriminate between different types of asthma. This consensus exercise reviews the state of the art in asthma diagnosis to identify opportunities for future investment based on the likelihood of their successful development, potential for widespread adoption and their perceived impact on asthma patients. Using a two-stage e-Delphi process and a summarizing workshop, a group of European asthma experts including health professionals, researchers, people with asthma and industry representatives ranked the potential impact of research investment in each technique or tool for asthma diagnosis and monitoring. After a systematic review of the literature, 21 statements were extracted and subjected to the two-stage Delphi process. Eleven statements scored 3 or more and were further discussed and ranked in a face-to-face workshop. The three most important diagnostic/predictive tools ranked were as follows: "New biological markers of asthma (eg genomics, proteomics and metabolomics) as a tool for diagnosis and/or monitoring," "Prediction of future asthma in preschool children with reasonable accuracy" and "Tools to measure volatile organic compounds (VOCs) in exhaled breath." © 2018 John Wiley & Sons Ltd.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dierickx, Marion I. P.; Loeb, Abraham, E-mail: mdierickx@cfa.harvard.edu, E-mail: aloeb@cfa.harvard.edu

    The extensive span of the Sagittarius (Sgr) stream makes it a promising tool for studying the gravitational potential of the Milky Way (MW). Characterizing its stellar kinematics can constrain halo properties and provide a benchmark for the paradigm of galaxy formation from cold dark matter. Accurate models of the disruption dynamics of the Sgr progenitor are necessary to employ this tool. Using a combination of analytic modeling and N -body simulations, we build a new model of the Sgr orbit and resulting stellar stream. In contrast to previous models, we simulate the full infall trajectory of the Sgr progenitor from the time it first crossed the MW virial radius 8 Gyr ago. An exploration of the parameter space of initial phase-space conditions yields tight constraints on the angular momentum of the Sgr progenitor. Our best-fit model is the first to accurately reproduce existing data on the 3D positions and radial velocities of the debris detected 100 kpc away in the MW halo. In addition to replicating the mapped stream, the simulation also predicts the existence of several arms of the Sgr stream extending to hundreds of kiloparsecs. The two most distant stars known in the MW halo coincide with the predicted structure. Additional stars in the newly predicted arms can be found with future data from the Large Synoptic Survey Telescope. Detecting a statistical sample of stars in the most distant Sgr arms would provide an opportunity to constrain the MW potential out to unprecedented Galactocentric radii.

  9. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    PubMed

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, due to their location, dimensions and fast evolution, can cause important functional and esthetic sequelae. To avoid these unfortunate consequences, it is necessary to establish the exact appropriate moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations correlated with imaging data, and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final results, from the esthetic and functional point of view, for a certain type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment management they received: medical therapy, sclerotherapy, surgical excision and no treatment. Serial clinical observation was performed each month and all the data were processed using the CAD system. The project goal was to create software that incorporated advanced methods to accurately measure the specific IH lesions, integrating medical information, statistical methods and computational methods to correlate this information with that obtained from the processing of images. Based on these correlations, a prediction mechanism for the evolution of the hemangioma was established, which helped determine the best method of therapeutic intervention to minimize further complications.

  10. A Simple and Accurate Model to Predict Responses to Multi-electrode Stimulation in the Retina

    PubMed Central

    Maturana, Matias I.; Apollo, Nicholas V.; Hadjinicolaou, Alex E.; Garrett, David J.; Cloherty, Shaun L.; Kameneva, Tatiana; Grayden, David B.; Ibbotson, Michael R.; Meffin, Hamish

    2016-01-01

    Implantable electrode arrays are widely used in therapeutic stimulation of the nervous system (e.g. cochlear, retinal, and cortical implants). Currently, most neural prostheses use serial stimulation (i.e. one electrode at a time) despite this severely limiting the repertoire of stimuli that can be applied. Methods to reliably predict the outcome of multi-electrode stimulation have not been available. Here, we demonstrate that a linear-nonlinear model accurately predicts neural responses to arbitrary patterns of stimulation using in vitro recordings from single retinal ganglion cells (RGCs) stimulated with a subretinal multi-electrode array. In the model, the stimulus is projected onto a low-dimensional subspace and then undergoes a nonlinear transformation to produce an estimate of spiking probability. The low-dimensional subspace is estimated using principal components analysis, which gives the neuron’s electrical receptive field (ERF), i.e. the electrodes to which the neuron is most sensitive. Our model suggests that stimulation proportional to the ERF yields a higher efficacy given a fixed amount of power when compared to equal amplitude stimulation on up to three electrodes. We find that the model captures the responses of all the cells recorded in the study, suggesting that it will generalize to most cell types in the retina. The model is computationally efficient to evaluate and, therefore, appropriate for future real-time applications including stimulation strategies that make use of recorded neural activity to improve the stimulation strategy. PMID:27035143
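
    The linear-nonlinear pipeline described above (project the stimulus onto a low-dimensional subspace, then apply a nonlinearity to obtain a spiking probability) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the electrode count, ERF weights, and sigmoid threshold are all assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 500 stimulation patterns on 20 electrodes; spikes are
# generated from a "ground truth" electrical receptive field (ERF) so the
# recovery step has something to find.
n_trials, n_electrodes = 500, 20
stimuli = rng.normal(size=(n_trials, n_electrodes))
true_erf = np.zeros(n_electrodes)
true_erf[:3] = [1.0, 0.6, 0.3]          # neuron sensitive to three electrodes
drive = stimuli @ true_erf
spikes = rng.random(n_trials) < 1.0 / (1.0 + np.exp(-(drive - 1.0)))

# Step 1: estimate the low-dimensional subspace from the spike-triggered
# stimulus ensemble; its leading singular vector approximates the ERF.
_, _, vt = np.linalg.svd(stimuli[spikes], full_matrices=False)
erf_estimate = vt[0] * np.sign(vt[0] @ true_erf)   # resolve sign ambiguity

# Step 2: a static sigmoid nonlinearity maps the projection onto the ERF
# to an estimated spiking probability.
def predict_spike_prob(stimulus, erf, threshold=1.0):
    projection = stimulus @ erf
    return 1.0 / (1.0 + np.exp(-(projection - threshold)))
```

    Stimulating "proportionally to the ERF" then corresponds to choosing stimulus vectors aligned with `erf_estimate`, which maximizes the projection for a fixed stimulus power.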

  11. Macaques can predict social outcomes from facial expressions.

    PubMed

    Waller, Bridget M; Whitehouse, Jamie; Micheletta, Jérôme

    2016-09-01

    There is widespread acceptance that facial expressions are useful in social interactions, but empirical demonstration of their adaptive function has remained elusive. Here, we investigated whether macaques can use the facial expressions of others to predict the future outcomes of social interaction. Crested macaques (Macaca nigra) were shown an approach between two unknown individuals on a touchscreen and were required to choose between one of two potential social outcomes. The facial expressions of the actors were manipulated in the last frame of the video. One subject reached the experimental stage and accurately predicted different social outcomes depending on which facial expressions the actors displayed. The bared-teeth display (homologue of the human smile) was most strongly associated with predicted friendly outcomes. Contrary to our predictions, screams and threat faces were not associated more with conflict outcomes. Overall, therefore, the presence of any facial expression (compared to neutral) caused the subject to choose friendly outcomes more than negative outcomes. Facial expression in general, therefore, indicated a reduced likelihood of social conflict. The findings dispute traditional theories that view expressions only as indicators of present emotion and instead suggest that expressions form part of complex social interactions where individuals think beyond the present.

  12. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Ding, Fei; Zhang, Yingchen

    In a traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of the load forecasting technique can provide an accurate prediction of the load power that will happen in a future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; thus, the distribution system operator can use this information to better operate the system reconfiguration and achieve optimal solutions. This paper proposes a short-term load forecasting approach to automatically reconfigure distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a forecaster based on support vector regression and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum amount of loss at the future time. The simulation results validate and evaluate the proposed approach.
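
    The forecasting stage can be illustrated with a toy linear support-vector regression trained on lagged load features. This is a sketch on synthetic hourly data: the daily-cycle series, lag count, and training hyperparameters are assumptions, and the paper's parallel parameter optimization is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly load series with a daily cycle (all numbers illustrative).
t = np.arange(400)
load = 100.0 + 20.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0.0, 1.0, t.size)

# Lagged features: predict the next hour from the previous 24 hours.
lags = 24
X = np.stack([load[i:i + lags] for i in range(load.size - lags)])
y = load[lags:]
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize features
y_mean, y_std = y.mean(), y.std()
y = (y - y_mean) / y_std

def fit_linear_svr(X, y, epsilon=0.1, C=1.0, lr=0.01, epochs=500):
    """Toy linear SVR: subgradient descent on the epsilon-insensitive loss
    plus an L2 penalty on the weights."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        resid = X @ w + b - y
        active = np.sign(resid) * (np.abs(resid) > epsilon)
        w -= lr * (w + C * active @ X / y.size)
        b -= lr * C * active.mean()
    return w, b

w, b = fit_linear_svr(X, y)
forecast = (X @ w + b) * y_std + y_mean        # back to original units (kW)
```

    A kernelized SVR with a proper hyperparameter search would replace this toy trainer in practice; the forecasted series then feeds the reconfiguration optimizer.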

  13. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Ding, Fei; Zhang, Yingchen

    In the traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of load forecasting techniques can provide accurate predictions of the load power that will occur at a future time and more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; this information allows the distribution system operator (DSO) to better operate the system reconfiguration and achieve optimal solutions. Thus, this paper proposes a short-term load forecasting based approach for automatically reconfiguring distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a support vector regression (SVR) based forecaster and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum loss at the future time. The simulation results validate and evaluate the proposed approach.

  15. Uncertainty is associated with increased selective attention and sustained stimulus processing.

    PubMed

    Dieterich, Raoul; Endrass, Tanja; Kathmann, Norbert

    2016-06-01

    Uncertainty about future threat has been found to be associated with an overestimation of threat probability and is hypothesized to elicit additional allocation of attention. We used event-related potentials to examine uncertainty-related dynamics in attentional allocation, exploiting brain potentials' high temporal resolution and sensitivity to attention. Thirty participants performed a picture-viewing task in which cues indicated the subsequent picture valence. A certain-neutral and a certain-aversive cue accurately predicted subsequent picture valence, whereas an uncertain cue did not. Participants overestimated the effective frequency of aversive pictures following the uncertain cue, both during and after the task, signifying expectancy and covariation biases, and they tended to express lower subjective valences for aversive pictures presented after the uncertain cue. Pictures elicited increased P2 and LPP amplitudes when their valence could not be predicted from the cue. For the LPP, this effect was more pronounced in response to neutral pictures. Uncertainty appears to enhance the engagement of early phasic and sustained attention for uncertainly cued targets. Thus, defensive motivation related to uncertainty about future threat elicits specific attentional dynamics implicating prioritization at various processing stages, especially for nonthreatening stimuli that tend to violate expectations.

  16. DNA methylation-based forensic age prediction using artificial neural networks and next generation sequencing.

    PubMed

    Vidaki, Athina; Ballard, David; Aliferi, Anastasia; Miller, Thomas H; Barron, Leon P; Syndercombe Court, Denise

    2017-05-01

    The ability to estimate the age of the donor from recovered biological material at a crime scene can be of substantial value in forensic investigations. Aging can be complex and is associated with various molecular modifications in cells that accumulate over a person's lifetime, including epigenetic patterns. The aim of this study was to use age-specific DNA methylation patterns to generate an accurate model for the prediction of chronological age using data from whole blood. In total, 45 age-associated CpG sites were selected based on their reported age coefficients in a previous extensive study and investigated using publicly available methylation data obtained from 1156 whole blood samples (aged 2-90 years) analysed with Illumina's genome-wide methylation platforms (27K/450K). Applying stepwise regression for variable selection, 23 of these CpG sites were identified that could significantly contribute to age prediction modelling, and multiple regression analysis carried out with these markers provided an accurate prediction of age (R² = 0.92, mean absolute error (MAE) = 4.6 years). However, applying machine learning, and more specifically a generalised regression neural network model, the age prediction significantly improved (R² = 0.96), with an MAE of 3.3 years for the training set and 4.4 years for a blind test set of 231 cases. The machine learning approach used 16 CpG sites, located in 16 different genomic regions, with the top 3 predictors of age belonging to the genes NHLRC1, SCGN and CSNK1D. The proposed model was further tested using independent cohorts of 53 monozygotic twins (MAE = 7.1 years) and a cohort of 1011 disease state individuals (MAE = 7.2 years). Furthermore, we highlighted the age markers' potential applicability in samples other than blood by predicting age with similar accuracy in 265 saliva samples (R² = 0.96), with an MAE of 3.2 years (training set) and 4.0 years (blind test). In an attempt to create a sensitive and accurate age prediction test, a next generation sequencing (NGS)-based method able to quantify the methylation status of the selected 16 CpG sites was developed using the Illumina MiSeq® platform. The method was validated using DNA standards of known methylation levels, and the age prediction accuracy was initially assessed in a set of 46 whole blood samples. Although the resulting prediction accuracy using the NGS data was lower than that of the original model (MAE = 7.5 years), it is expected that future optimization of our strategy to account for technical variation, as well as increasing the sample size, will improve both the prediction accuracy and reproducibility. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
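
    The multiple-regression baseline of such an age model can be sketched on synthetic data. Only the count of 16 CpG sites comes from the abstract; the methylation values, coefficients, and noise level are invented, and the generalised regression neural network stage is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training data: methylation beta values (0-1) at 16 CpG sites
# for 200 donors whose ages are a noisy linear function of those values.
n_donors, n_cpgs = 200, 16
betas = rng.uniform(0.0, 1.0, size=(n_donors, n_cpgs))
true_coef = rng.normal(0.0, 10.0, n_cpgs)
ages = 40.0 + betas @ true_coef + rng.normal(0.0, 3.0, n_donors)

# Multiple linear regression: intercept column plus one coefficient per CpG.
design = np.column_stack([np.ones(n_donors), betas])
coef, *_ = np.linalg.lstsq(design, ages, rcond=None)

def predict_age(beta_profile):
    """Predict chronological age from a single methylation profile."""
    return coef[0] + beta_profile @ coef[1:]

# Training-set mean absolute error, the accuracy metric used in the abstract.
mae = np.mean(np.abs(design @ coef - ages))
```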

  17. DNDO Report: Predicting Solar Modulation Potentials for Modeling Cosmic Background Radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behne, Patrick Alan

    The modeling of the detectability of special nuclear material (SNM) at ports and border crossings requires accurate knowledge of the background radiation at those locations. Background radiation originates from two main sources, cosmic and terrestrial. Cosmic background is produced by high-energy galactic cosmic rays (GCR) entering the atmosphere and inducing a cascade of particles that eventually impact the earth’s surface. The solar modulation potential represents one of the primary inputs to modeling cosmic background radiation. Usoskin et al. formally define solar modulation potential as “the mean energy loss [per unit charge] of a cosmic ray particle inside the heliosphere…” Modulation potential, a function of elevation, location, and time, shares an inverse relationship with cosmic background radiation. As a result, radiation detector thresholds require adjustment to account for differing background levels, caused partly by differing solar modulations. Failure to do so can result in higher rates of false positives and failed detection of SNM for low and high levels of solar modulation potential, respectively. This study focuses on solar modulation’s time dependence, and seeks the best method to predict modulation for future dates using Python. To address the task of predicting future solar modulation, we utilize both non-linear least squares sinusoidal curve fitting and cubic spline interpolation. This material will be published in the transactions of the ANS winter meeting of November 2016.
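
    When the solar-cycle period is treated as known, a sinusoidal fit becomes linear in its parameters, so ordinary least squares suffices, as this sketch on synthetic monthly data shows. The ~11-year period, 650 MV mean level, and 250 MV amplitude are illustrative assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical monthly solar modulation potential (MV): an ~11-year cycle
# around a mean level, plus observational noise.
years = np.arange(0.0, 40.0, 1.0 / 12.0)
omega = 2.0 * np.pi / 11.0
phi_obs = 650.0 + 250.0 * np.sin(omega * years + 0.7) + rng.normal(0.0, 20.0, years.size)

# With the angular frequency fixed, phi(t) = c0 + a*sin(wt) + b*cos(wt)
# is linear in (c0, a, b), so a least-squares solve replaces non-linear fitting.
design = np.column_stack([np.ones_like(years),
                          np.sin(omega * years),
                          np.cos(omega * years)])
params, *_ = np.linalg.lstsq(design, phi_obs, rcond=None)

def predict_modulation(t_years):
    """Extrapolate the fitted cycle to a future date (years from epoch)."""
    return (params[0]
            + params[1] * np.sin(omega * t_years)
            + params[2] * np.cos(omega * t_years))
```

    Cubic spline interpolation, the report's second method, would instead smooth between observed points and is better suited to short-range prediction.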

  18. Understanding and predicting the dynamics of tokamak discharges during startup and rampdown

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, G. L.; Politzer, P. A.; Humphreys, D. A.

    Understanding the dynamics of plasma startup and termination is important for present tokamaks and for predictive modeling of future burning plasma devices such as ITER. We report on experiments in the DIII-D tokamak that explore the plasma startup and rampdown phases and on the benchmarking of transport models. Key issues have been examined such as plasma initiation and burnthrough with limited inductive voltage and achieving flattop and maximum burn within the technical limits of coil systems and their actuators while maintaining the desired q profile. Successful rampdown requires scenarios consistent with technical limits, including controlled H-L transitions, while avoiding vertical instabilities, additional Ohmic transformer flux consumption, and density limit disruptions. Discharges were typically initiated with an inductive electric field typical of ITER, 0.3 V/m, most with second harmonic electron cyclotron assist. A fast framing camera was used during breakdown and burnthrough of low Z impurity charge states to study the formation physics. An improved 'large aperture' ITER startup scenario was developed, and aperture reduction in rampdown was found to be essential to avoid instabilities. Current evolution using neoclassical conductivity in the CORSICA code agrees with rampup experiments, but the prediction of the temperature and internal inductance evolution using the Coppi-Tang model for electron energy transport is not yet accurate enough to allow extrapolation to future devices.

  19. Dynamic modeling of green algae cultivation in a photobioreactor for sustainable biodiesel production.

    PubMed

    Del Rio-Chanona, Ehecatl A; Liu, Jiao; Wagner, Jonathan L; Zhang, Dongda; Meng, Yingying; Xue, Song; Shah, Nilay

    2018-02-01

    Biodiesel produced from microalgae has been extensively studied due to its potentially outstanding advantages over traditional transportation fuels. In order to facilitate its industrialization and improve the process profitability, it is vital to construct highly accurate models capable of predicting the complex behavior of the investigated biosystem for process optimization and control, which forms the current research goal. Three original contributions are described in this paper. Firstly, a dynamic model is constructed to simulate the complicated effects of light intensity, nutrient supply and light attenuation on both biomass growth and biolipid production. Secondly, chlorophyll fluorescence, an instantly measurable variable and indicator of photosynthetic activity, is embedded into the model to monitor and update model accuracy, especially for the purpose of future process optimal control, and its correlation with intracellular nitrogen content is quantified, which to the best of our knowledge has never been addressed before. Thirdly, a thorough experimental verification is conducted under different scenarios, including both continuous illumination and light/dark cycle conditions, to test the model's predictive capability, particularly for long-term operation; it is concluded that the current model is characterized by a high level of predictive capability. Based on the model, the optimal light intensity for algal biomass growth and lipid synthesis is estimated. This work therefore paves the way for future process design and real-time optimization. © 2017 Wiley Periodicals, Inc.
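
    A dynamic model of this kind couples biomass growth to nutrient depletion. The sketch below uses light- and nitrogen-limited Monod kinetics integrated with explicit Euler steps; the model structure and every parameter value are assumptions for illustration, not the authors' calibrated model (which also tracks lipid synthesis and light attenuation).

```python
def simulate_culture(light, days=10.0, dt=0.01,
                     mu_max=1.0, k_i=100.0, k_n=0.05, yield_n=0.1):
    """Euler integration of a toy light- and nitrogen-limited growth model.

    light   incident intensity (umol m^-2 s^-1, assumed constant)
    mu_max  maximum specific growth rate (1/day)
    k_i,k_n half-saturation constants for light and nitrogen
    yield_n nitrogen consumed per unit of new biomass (g N / g biomass)
    Returns final (biomass, nitrogen) in g/L.
    """
    biomass, nitrogen = 0.1, 0.5          # initial biomass and nitrate, g/L
    for _ in range(int(days / dt)):
        # Growth rate limited multiplicatively by light and nitrogen.
        mu = mu_max * (light / (light + k_i)) * (nitrogen / (nitrogen + k_n))
        growth = mu * biomass
        biomass += dt * growth
        nitrogen = max(0.0, nitrogen - dt * yield_n * growth)
    return biomass, nitrogen
```

    Running the model at two light levels shows the expected behavior: higher light accelerates growth until the nitrogen supply, not light, becomes limiting.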

  20. A dual-process account of auditory change detection.

    PubMed

    McAnally, Ken I; Martin, Russell L; Eramudugolla, Ranmalee; Stuart, Geoffrey W; Irvine, Dexter R F; Mattingley, Jason B

    2010-08-01

    Listeners can be "deaf" to a substantial change in a scene comprising multiple auditory objects unless their attention has been directed to the changed object. It is unclear whether auditory change detection relies on identification of the objects in pre- and post-change scenes. We compared the rates at which listeners correctly identify changed objects with those predicted by change-detection models based on signal detection theory (SDT) and high-threshold theory (HTT). Detected changes were not identified as accurately as predicted by models based on either theory, suggesting that some changes are detected by a process that does not support change identification. Undetected changes were identified as accurately as predicted by the HTT model but much less accurately than predicted by the SDT models. The process underlying change detection was investigated further by determining receiver-operating characteristics (ROCs). The ROCs did not conform to those predicted by either an SDT or an HTT model but were well modeled by a dual-process model that incorporated HTT and SDT components. The dual-process model also accurately predicted the rates at which detected and undetected changes were correctly identified.
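
    The dual-process idea can be made concrete: an all-or-none high-threshold detection component (probability R) is mixed with a continuous equal-variance SDT component (sensitivity d'), producing asymmetric ROCs. The sketch below generates hit/false-alarm pairs from that mixture; the parameter values are illustrative, not estimates from the study.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dual_process_roc(recollect_p, d_prime, criteria):
    """Hit/false-alarm pairs for a dual-process model: with probability
    recollect_p a change is detected outright (high-threshold component);
    otherwise detection follows an equal-variance SDT decision at each
    criterion c."""
    points = []
    for c in criteria:
        hit = recollect_p + (1.0 - recollect_p) * phi(d_prime - c)
        false_alarm = phi(-c)
        points.append((false_alarm, hit))
    return points

# Sweep the criterion from strict to lax to trace out the ROC.
roc = dual_process_roc(recollect_p=0.3, d_prime=1.0,
                       criteria=[2.0, 1.0, 0.0, -1.0])
```

    The threshold component lifts the ROC's left end above the SDT curve, which is the signature the authors used to reject the single-process models.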

  1. Review on applications of artificial intelligence methods for dam and reservoir-hydro-environment models.

    PubMed

    Allawi, Mohammed Falah; Jaafar, Othman; Mohamad Hamzah, Firdaus; Abdullah, Sharifah Mastura Syed; El-Shafie, Ahmed

    2018-05-01

    Efficacious operation of dam and reservoir systems can not only provide a defense against natural hazards but also identify rules to meet water demand. Successful operation of dam and reservoir systems to ensure optimal use of water resources could be unattainable without accurate and reliable simulation models. Given the highly stochastic nature of hydrologic parameters, developing accurate predictive models that efficiently mimic such complex patterns is a growing domain of research. During the last two decades, artificial intelligence (AI) techniques have been significantly utilized for attaining robust modeling of different stochastic hydrological parameters, and have also shown considerable progress in finding optimal rules for reservoir operation. This review explores the history of developing AI for reservoir inflow forecasting and prediction of evaporation from a reservoir, as the major components of reservoir simulation. In addition, a critical assessment of the advantages and disadvantages of integrating AI simulation methods with optimization methods is reported. Future research on the potential of utilizing new innovative AI-based methods for reservoir simulation and optimization models is also discussed. Finally, a proposal for a new mathematical procedure to accomplish a realistic evaluation of overall optimization model performance (reliability, resilience, and vulnerability indices) is recommended.

  2. Processing LiDAR Data to Predict Natural Hazards

    NASA Technical Reports Server (NTRS)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
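
    The edge-detection step that flags features such as faults and headwall scarps can be sketched with a plain Sobel gradient over an elevation grid. The toy digital terrain model below is invented; production code would operate on the ELF-derived DTM rasters.

```python
def sobel_magnitude(dtm):
    """Sobel gradient magnitude over a 2-D elevation grid (list of lists).
    Large values flag sharp breaks in slope, e.g. landslide headwall scarps."""
    rows, cols = len(dtm), len(dtm[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = (dtm[r-1][c+1] + 2*dtm[r][c+1] + dtm[r+1][c+1]
                  - dtm[r-1][c-1] - 2*dtm[r][c-1] - dtm[r+1][c-1])
            gy = (dtm[r+1][c-1] + 2*dtm[r+1][c] + dtm[r+1][c+1]
                  - dtm[r-1][c-1] - 2*dtm[r-1][c] - dtm[r-1][c+1])
            out[r][c] = (gx * gx + gy * gy) ** 0.5
    return out

# Toy DTM: a flat 10 m terrace dropping to 0 m across a sharp scarp.
dtm = [[10, 10, 10, 0, 0]] * 5
edges = sobel_magnitude(dtm)
```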

  3. Improving efficacy of metastatic tumor segmentation to facilitate early prediction of ovarian cancer patients' response to chemotherapy

    NASA Astrophysics Data System (ADS)

    Danala, Gopichandh; Wang, Yunzhi; Thai, Theresa; Gunderson, Camille C.; Moxley, Katherine M.; Moore, Kathleen; Mannel, Robert S.; Cheng, Samuel; Liu, Hong; Zheng, Bin; Qiu, Yuchen

    2017-02-01

    Accurate tumor segmentation is a critical step in the development of computer-aided detection (CAD) based quantitative image analysis schemes for early-stage prognostic evaluation of ovarian cancer patients. The purpose of this investigation is to assess the efficacy of several different methods to segment metastatic tumors occurring in different organs of ovarian cancer patients. In this study, we developed a segmentation scheme consisting of eight different algorithms, which can be divided into three groups: 1) region growth based methods; 2) Canny operator based methods; and 3) partial differential equation (PDE) based methods. A total of 138 tumors acquired from 30 ovarian cancer patients were used to test the performance of these eight segmentation algorithms. The results demonstrate that each of the tested tumors can be successfully segmented by at least one of the eight algorithms without manual boundary correction. Furthermore, the modified region growth, classical Canny detector, fast marching, and threshold level set algorithms are suggested for the future development of ovarian cancer related CAD schemes. This study may provide a meaningful reference for developing novel quantitative image feature analysis schemes to more accurately predict the response of ovarian cancer patients to chemotherapy at an early stage.
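
    The first family of methods, region growing, can be sketched in a few lines: starting from a seed pixel inside the tumor, 4-connected neighbours are absorbed while their intensity stays close to the seed's. The toy image and tolerance below are illustrative only, not the paper's modified algorithm.

```python
from collections import deque

def region_grow(image, seed, tolerance):
    """Grow a region from a seed pixel, adding 4-connected neighbours whose
    intensity is within `tolerance` of the seed intensity (BFS traversal)."""
    rows, cols = len(image), len(image[0])
    seed_value = image[seed[0]][seed[1]]
    segmented = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in segmented
                    and abs(image[nr][nc] - seed_value) <= tolerance):
                segmented.add((nr, nc))
                queue.append((nr, nc))
    return segmented

# Toy "CT slice": a bright 2x2 tumour region on a dark background.
slice_ = [[10, 10, 10, 10],
          [10, 90, 95, 10],
          [10, 92, 88, 10],
          [10, 10, 10, 10]]
tumour = region_grow(slice_, seed=(1, 1), tolerance=15)
```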

  4. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball

    PubMed Central

    Cheshin, Arik; Heerdink, Marc W.; Kossakowski, Jolanda J.; Van Kleef, Gerben A.

    2016-01-01

    Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers’ facial displays influence how pitches are assessed and responded to. Using footage from the Major League Baseball World Series finals, we isolated incidents where the pitcher’s face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers’ facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports. PMID:26909062

  5. A probabilistic and adaptive approach to modeling performance of pavement infrastructure

    DOT National Transportation Integrated Search

    2007-08-01

    Accurate prediction of pavement performance is critical to pavement management agencies. Reliable and accurate predictions of pavement infrastructure performance can save significant amounts of money for pavement infrastructure management agencies th...

  6. The Perfect Burrow, but for What? Identifying Local Habitat Conditions Promoting the Presence of the Host and Vector Species in the Kazakh Plague System

    PubMed Central

    Wilschut, Liesbeth; Addink, Elisabeth; Ageyev, Vladimir; Yeszhanov, Aidyn; Sapozhnikov, Valerij; Belayev, Alexander; Davydova, Tania; Eagle, Sally; Begon, Mike

    2015-01-01

    Introduction The wildlife plague system in the Pre-Balkhash desert of Kazakhstan has been a subject of study for many years. Much progress has been made in generating a method of predicting outbreaks of the disease (infection by the gram-negative bacterium Yersinia pestis), but existing methods are not yet accurate enough to inform public health planning. The present study aimed to identify characteristics of individual mammalian host (Rhombomys opimus) burrows related to and potentially predictive of the presence of R. opimus and the dominant flea vectors (Xenopsylla spp.). Methods Over four seasons, burrow characteristics, their current occupancy status, and flea and tick burden of the occupants were recorded in the field. A second data set was generated of long term occupancy trends by recording the occupancy status of specific burrows over multiple occasions. Generalised linear mixed models were constructed to identify potential burrow properties predictive of either occupancy or flea burden. Results At the burrow level, it was identified that a burrow being occupied by Rhombomys, and remaining occupied, were both related to the characteristics of the sediment in which the burrow was constructed. The flea burden of Rhombomys in a burrow was found to be related to the tick burden. Further larger scale properties were also identified as being related to both Rhombomys and flea presence, including latitudinal position and the season. Conclusions Therefore, in advancing our current predictions of plague in Kazakhstan, we must consider the landscape at this local level to increase our accuracy in predicting the dynamics of gerbil and flea populations. Furthermore, this demonstrates that in other zoonotic systems, it may be useful to consider the distribution and location of suitable habitat for both host and vector species at this fine scale to accurately predict future epizootics. PMID:26325073

  7. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    NASA Astrophysics Data System (ADS)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.

  8. Does ADHD in adults affect the relative accuracy of metamemory judgments?

    PubMed

    Knouse, Laura E; Paradise, Matthew J; Dunlosky, John

    2006-11-01

    Prior research suggests that individuals with ADHD overestimate their performance across domains despite performing more poorly in these domains. The authors introduce measures of accuracy from the larger realm of judgment and decision making--namely, relative accuracy and calibration--to the study of self-evaluative judgment accuracy in adults with ADHD. Twenty-eight adults with ADHD and 28 matched controls participate in a computer-administered paired-associate learning task and predict their future recall using immediate and delayed judgments of learning (JOLs). Retrospective confidence judgments are also collected. Groups perform equally in terms of judgment magnitude and absolute judgment accuracy as measured by discrepancy scores and calibration curves. Both groups benefit equally from making their JOLs at a delay, and the group with ADHD shows higher relative accuracy for delayed judgments. Results suggest that under certain circumstances, adults with ADHD can make accurate judgments about their future memory.
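
    Relative accuracy in this literature is typically scored with the Goodman-Kruskal gamma correlation between judgments of learning and later recall: pairs of items where the higher-rated item was also better recalled count as concordant. A minimal sketch, with invented JOL values:

```python
def goodman_kruskal_gamma(judgments, outcomes):
    """Gamma correlation used to score relative metamemory accuracy:
    (concordant - discordant) / (concordant + discordant) over item pairs,
    ignoring ties."""
    concordant = discordant = 0
    n = len(judgments)
    for i in range(n):
        for j in range(i + 1, n):
            product = (judgments[i] - judgments[j]) * (outcomes[i] - outcomes[j])
            if product > 0:
                concordant += 1
            elif product < 0:
                discordant += 1
    if concordant + discordant == 0:
        return 0.0
    return (concordant - discordant) / (concordant + discordant)

# JOLs (0-100) for five word pairs and whether each was later recalled.
jols = [80, 60, 90, 20, 40]
recalled = [1, 1, 1, 0, 0]
gamma = goodman_kruskal_gamma(jols, recalled)
```

    Here every recalled item received a higher JOL than every forgotten one, so gamma is 1.0 (perfect relative accuracy); gamma near 0 would indicate judgments carrying no predictive information.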

  9. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1987-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows, with the long-term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate, simpler parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomenon of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.

  10. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1990-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows, with the long-term goal of establishing these reliable computer codes. Our approach to understanding chemical reacting flows is to look at separate, simpler parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.

  11. A Personalized Approach in Progressive Multiple Sclerosis: The Current Status of Disease Modifying Therapies (DMTs) and Future Perspectives

    PubMed Central

    D’Amico, Emanuele; Patti, Francesco; Zanghì, Aurora; Zappia, Mario

    2016-01-01

    Using the term progressive multiple sclerosis (PMS), we considered a combined population of persons with secondary progressive MS (SPMS) and primary progressive MS (PPMS). These forms of MS cannot yet be treated effectively with licensed therapies. In recent years, several measures of risk estimation have been developed for predicting clinical course in MS, but none is specific to the PMS forms. Personalized medicine is a therapeutic approach based on identifying the best therapy for an individual patient, taking the risk profile into account. We need to achieve more accurate estimates of useful predictors in PMS, including unconventional and qualitative markers that are not yet available or practicable in routine diagnostics. The evaluation of an individual patient is based on the profile of disease activity. Within the neurology field, PMS is one of the fastest-moving areas going into the future. PMID:27763513

  12. Short-Term State Forecasting-Based Optimal Voltage Regulation in Distribution Systems: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Rui; Jiang, Huaiguang; Zhang, Yingchen

    2017-05-17

    A novel short-term state forecasting-based optimal power flow (OPF) approach for distribution system voltage regulation is proposed in this paper. An extreme learning machine (ELM) based state forecaster is developed to accurately predict system states (voltage magnitudes and angles) in the near future. Based on the forecast system states, a dynamically weighted three-phase AC OPF problem is formulated to minimize the voltage violations, with higher penalization on buses which are forecast to have higher voltage violations in the near future. By solving the proposed OPF problem, the controllable resources in the system are optimally coordinated to alleviate the potential severe voltage violations and improve the overall voltage profile. The proposed approach has been tested on a 12-bus distribution system, and simulation results are presented to demonstrate the performance of the proposed approach.
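    The extreme learning machine used as the state forecaster above trains only its output layer: input weights are drawn at random and the output weights come from a single (ridge-regularised) least-squares solve. A minimal sketch on a synthetic one-step-ahead voltage series (the network size, ridge term, and toy data are illustrative assumptions, not the authors' configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, n_hidden=50, ridge=1e-6):
    """Fit an extreme learning machine: random hidden layer, least-squares output weights."""
    n_in = X.shape[1]
    W = rng.normal(size=(n_in, n_hidden))   # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                  # hidden-layer activations
    # Ridge-regularised least squares for the output weights
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy example: forecast the next voltage magnitude from the two previous steps
t = np.arange(200)
v = 1.0 + 0.02 * np.sin(0.3 * t)            # synthetic per-unit voltage series
X = np.column_stack([v[:-2], v[1:-1]])      # lagged inputs
Y = v[2:]                                   # one-step-ahead target
W, b, beta = elm_fit(X[:150], Y[:150])
err = np.max(np.abs(elm_predict(X[150:], W, b, beta) - Y[150:]))
```

    Because training reduces to one linear solve, the forecaster is cheap to refit as new measurements arrive, which is part of what makes ELMs attractive for short-term state forecasting.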

  13. Reservoir adaptive operating rules based on both of historical streamflow and future projections

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Liu, Pan; Wang, Hao; Chen, Jie; Lei, Xiaohui; Feng, Maoyuan

    2017-10-01

    Climate change is affecting hydrological variables and consequently is impacting water resources management. Historical strategies are no longer applicable under climate change. Therefore, adaptive management, especially adaptive operating rules for reservoirs, has been developed to mitigate the possible adverse effects of climate change. However, to date, adaptive operating rules are generally based on future projections involving uncertainties under climate change, while ignoring historical information. To address this, we propose an approach for deriving adaptive operating rules that considers both historical information and future projections, namely historical and future operating rules (HAFOR). A robustness index was developed by comparing benefits from HAFOR with benefits from conventional operating rules (COR). For both historical and future streamflow series, maximizations of both average benefits and the robustness index were employed as objectives, and four trade-offs were implemented to solve the multi-objective problem. Based on the integrated objective, the simulation-based optimization method was used to optimize the parameters of HAFOR. Using the Dongwushi Reservoir in China as a case study, HAFOR was demonstrated to be an effective and robust method for developing adaptive operating rules under an uncertain, changing environment. Compared with historical or projected future operating rules (HOR or FPOR), HAFOR can reduce the uncertainty and increase the robustness of future projections, especially regarding reservoir releases and volumes. HAFOR, therefore, facilitates adaptive management in a context where climate change is difficult to predict accurately.
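    The robustness index above compares benefits under HAFOR with benefits under conventional rules across streamflow scenarios. As a hedged illustration only (the paper's exact formula is not reproduced here; the fraction-of-scenarios-won definition and all benefit numbers below are invented for the sketch):

```python
import numpy as np

def robustness_index(benefits_adaptive, benefits_conventional):
    """Hypothetical robustness index: fraction of streamflow scenarios in which
    the adaptive rules (HAFOR) outperform the conventional rules (COR).
    Illustrative definition, not the paper's exact formulation."""
    wins = np.asarray(benefits_adaptive) > np.asarray(benefits_conventional)
    return wins.mean()

# Toy benefit series over 10 projected streamflow scenarios (arbitrary units)
hafor = [9.1, 8.7, 9.5, 8.2, 9.0, 8.8, 9.3, 8.9, 9.4, 8.6]
cor   = [8.8, 8.9, 9.0, 8.4, 8.7, 8.5, 9.1, 9.0, 9.2, 8.3]
ri = robustness_index(hafor, cor)   # HAFOR wins 7 of 10 scenarios
```

    Maximizing such an index alongside average benefits is one way to trade expected performance against insensitivity to projection uncertainty.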

  14. Limitations to the use of two-dimensional thermal modeling of a nuclear waste repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, B.W.

    1979-01-04

    Thermal modeling of a nuclear waste repository is basic to most waste management predictive models. It is important that the modeling techniques accurately determine the time-dependent temperature distribution of the waste emplacement media. Recent modeling studies show that the time-dependent temperature distribution can be accurately modeled in the far-field using a 2-dimensional (2-D) planar numerical model; however, the near-field cannot be modeled accurately enough by either 2-D axisymmetric or 2-D planar numerical models for repositories in salt. The accuracy limits of 2-D modeling were defined by comparing results from 3-dimensional (3-D) TRUMP modeling with results from both 2-D axisymmetric and 2-D planar models. Both TRUMP and ADINAT were employed as modeling tools. Two-dimensional results from the finite element code ADINAT were compared with 2-D results from the finite difference code TRUMP; they showed almost perfect correspondence in the far-field. This result adds substantially to confidence in future use of ADINAT and its companion stress code ADINA for thermal stress analysis. ADINAT was found to be somewhat sensitive to time step and mesh aspect ratio. 13 figures, 4 tables.

  15. Number of repetitions for evaluating technological traits in cotton genotypes.

    PubMed

    Carvalho, L P; Farias, F J C; Morello, C L; Rodrigues, J I S; Teodoro, P E

    2016-08-19

    With the changes in spinning technology, technological cotton traits, such as fiber length, fiber uniformity, fiber strength, fineness, fiber maturity, percentage of fibers, and short fiber index, are of great importance for selecting cotton genotypes. However, for accurate discrimination of genotypes, it is important that these traits are evaluated with the best possible accuracy. The aim of this study was to determine the number of measurements (repetitions) needed to accurately assess technological traits of cotton genotypes. Seven experiments were conducted in four Brazilian States (Ceará, Rio Grande do Norte, Goiás, and Mato Grosso do Sul). We used nine brown and two white colored fiber lines in a randomized block design with four replications. After verifying the assumptions of residual normality and homogeneity of variances, analysis of variance was performed to estimate the repeatability coefficient and calculate the number of repetitions. Trials with four replications were found to be sufficient to identify superior cotton genotypes for all measured traits except the short fiber index, with a selective accuracy >90% and at least 81% accuracy in predicting their actual value. These results allow more accurate and reliable evaluation of technological traits of cotton genotypes in future research.
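    The required number of repetitions is typically derived from the repeatability coefficient estimated via ANOVA. A sketch of the standard calculation (the mean-square values are hypothetical, and the formulas are the usual intraclass-correlation expressions, which may differ in detail from this study's):

```python
def repeatability(ms_genotype, ms_error, k):
    """Intraclass repeatability coefficient from one-way ANOVA mean squares
    (k = number of replications per genotype)."""
    var_g = (ms_genotype - ms_error) / k      # genotypic variance component
    return var_g / (var_g + ms_error)

def reps_needed(r, R2):
    """Replications needed to predict the true genotypic value with
    coefficient of determination R2, given repeatability r."""
    return R2 * (1 - r) / (r * (1 - R2))

# Hypothetical mean squares for one fibre trait, estimated from 4 replications
r = repeatability(ms_genotype=12.0, ms_error=1.0, k=4)   # r = 0.733
n = reps_needed(r, R2=0.90)                              # n = 3.27 -> 4 reps
```

    With these (invented) mean squares, about 3.3 replications give 90% determination, so rounding up reproduces the paper's conclusion that four replications suffice.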

  16. Fuzzy association rule mining and classification for the prediction of malaria in South Korea.

    PubMed

    Buczak, Anna L; Baugher, Benjamin; Guven, Erhan; Ramac-Thomas, Liane C; Elbert, Yevgeniy; Babin, Steven M; Lewis, Sheri H

    2015-06-18

    Malaria is the world's most prevalent vector-borne disease. Accurate prediction of malaria outbreaks may lead to public health interventions that mitigate disease morbidity and mortality. We describe an application of a method for creating prediction models utilizing Fuzzy Association Rule Mining to extract relationships between epidemiological, meteorological, climatic, and socio-economic data from Korea. These relationships are in the form of rules, from which the best set of rules is automatically chosen and forms a classifier. Two classifiers have been built and their results fused to become a malaria prediction model. Future malaria cases are predicted as Low, Medium or High, where these classes are defined as a total of 0-2, 3-16, and 17 or more cases, respectively, for a region in South Korea during a two-week period. Based on user recommendations, High is considered an outbreak. Model accuracy is described by Positive Predictive Value (PPV), Sensitivity, and F-score for each class, computed on test data not previously used to develop the model. For predictions made 7-8 weeks in advance, model PPV and Sensitivity are 0.842 and 0.681, respectively, for the High class. The F0.5 and F3 scores (which combine PPV and Sensitivity) are 0.804 and 0.694, respectively, for the High class. The overall FARM results (as measured by F-scores) are significantly better than those obtained by Decision Tree, Random Forest, Support Vector Machine, and Holt-Winters methods for the High class. For the Medium class, Random Forest and FARM obtain comparable results, with FARM being better at F0.5, and Random Forest obtaining a higher F3. A previously described method for creating disease prediction models has been modified and extended to build models for predicting malaria. In addition, some new input variables were used, including indicators of intervention measures. The South Korea malaria prediction models predict Low, Medium or High cases 7-8 weeks in the future. This paper demonstrates that our data-driven approach can be used for the prediction of different diseases.
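    The F0.5 and F3 scores quoted above are instances of the F-beta score, which combines PPV (precision) and sensitivity (recall) with a weighting parameter beta. Computing them from the reported High-class PPV and sensitivity reproduces the paper's values:

```python
def f_beta(ppv, sensitivity, beta):
    """F-beta score: weighted harmonic mean of PPV (precision) and sensitivity
    (recall). beta > 1 weights sensitivity more heavily; beta < 1 weights PPV."""
    b2 = beta ** 2
    return (1 + b2) * ppv * sensitivity / (b2 * ppv + sensitivity)

# High-class scores reported in the abstract (PPV 0.842, sensitivity 0.681)
f05 = f_beta(0.842, 0.681, beta=0.5)   # -> 0.804
f3 = f_beta(0.842, 0.681, beta=3.0)    # -> 0.694
```

    Reporting both F0.5 and F3 shows performance under two priorities: F0.5 rewards few false alarms, while F3 rewards catching as many true outbreaks as possible.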

  17. Predictability of the 2012 Great Arctic Cyclone on medium-range timescales

    NASA Astrophysics Data System (ADS)

    Yamagami, Akio; Matsueda, Mio; Tanaka, Hiroshi L.

    2018-03-01

    Arctic Cyclones (ACs) can have a significant impact on the Arctic region. Therefore, the accurate prediction of ACs is important in anticipating their associated environmental and societal costs. This study investigates the predictability of the 2012 Great Arctic Cyclone (AC12) that exhibited a minimum central pressure of 964 hPa on 6 August 2012, using five medium-range ensemble forecasts. We show that the development and position of AC12 were better predicted in forecasts initialized on and after 4 August 2012. In addition, the position of AC12 was more predictable than its development. A comparison of ensemble members, classified by the error in predictability of the development and position of AC12, revealed that an accurate prediction of upper-level fields, particularly temperature, was important for the prediction of this event. The predicted position of AC12 was influenced mainly by the prediction of the polar vortex, whereas the predicted development of AC12 was dependent primarily on the prediction of the merging of upper-level warm cores. Consequently, an accurate prediction of the polar vortex position and the development of the warm core through merging resulted in better prediction of AC12.

  18. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hao; Economou, Ioannis G.; Panagiotopoulos, Athanassios Z.

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent accurately these fluid properties, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement to available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties on the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion–ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Lastly, another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  19. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines.

    PubMed

    Jiang, Hao; Economou, Ioannis G; Panagiotopoulos, Athanassios Z

    2017-04-18

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent accurately these fluid properties, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement to available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties on the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion-ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  20. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines

    DOE PAGES

    Jiang, Hao; Economou, Ioannis G.; Panagiotopoulos, Athanassios Z.

    2017-02-24

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent accurately these fluid properties, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement to available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties on the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion–ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Lastly, another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  1. Proceedings of the Ship Control Systems Symposium (9th) Held in Bethesda, Maryland on 10-14 September 1990. Theme: Automation in Surface Ship Control Systems, Today’s Applications and Future Trends. Volume 2

    DTIC Science & Technology

    1990-09-14

    residual fuel oil on board a ship (Ref 1,2), has indicated that expert systems are a very powerful communications tool in that new developments ...characteristics of new designs will increase as will the need for more accurate methods to make those predictions. 2.152 4. PANEL H-10 DEVELOPMENT OF THE... New York Metropolitan Section, March 11, 1981. 7. Cojeen, H. P., Landsburg, A. C., MacFarlane, A. A., "One Approach to the Development and

  2. Gender Differences in Hip Anatomy: Possible Implications for Injury Tolerance in Frontal Collisions

    PubMed Central

    Wang, Stewart C.; Brede, Chris; Lange, David; Poster, Craig S.; Lange, Aaron W.; Kohoyda-Inglis, Carla; Sochor, Mark R.; Ipaktchi, Kyros; Rowe, Stephen A.; Patel, Smita; Garton, Hugh J.

    2004-01-01

    Male occupants in frontal motor vehicle collisions have a lower tolerance for hip fracture than female occupants in similar crashes. We studied 92 adult pelvic CT scans and found significant gender differences in bony pelvic geometry, including acetabular socket depth and femoral head width. Significant differences were also noted in the presentation angle of the acetabular socket to frontal loading. The observed differences provide biomechanical insight into why hip injury tolerance may differ with gender. These findings have implications for the future design of vehicle countermeasures as well as finite element models capable of more accurately predicting body tolerances to injury. PMID:15319131

  3. Tools and techniques for developing policies for complex and uncertain systems.

    PubMed

    Bankes, Steven C

    2002-05-14

    Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best-estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.

  4. Features of πΔ photoproduction at high energies

    DOE PAGES

    Nys, Jannes; Mathieu, V.; Fernandez-Ramirez, C.; ...

    2018-02-02

    Hybrid/exotic meson spectroscopy searches at Jefferson Lab require the accurate theoretical description of the production mechanism in peripheral photoproduction. We develop a model for πΔ photoproduction at high energies (5 ≤ E_lab ≤ 16 GeV) that incorporates both the absorbed pion and natural-parity cut contributions. We fit the available observables, providing a good description of the energy and angular dependencies of the experimental data. Finally, we also provide predictions for the photon beam asymmetry of charged pions at E_lab = 9 GeV, which is expected to be measured by the GlueX and CLAS12 experiments in the near future.

  5. Features of πΔ photoproduction at high energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nys, Jannes; Mathieu, V.; Fernandez-Ramirez, C.

    Hybrid/exotic meson spectroscopy searches at Jefferson Lab require the accurate theoretical description of the production mechanism in peripheral photoproduction. We develop a model for πΔ photoproduction at high energies (5 ≤ E_lab ≤ 16 GeV) that incorporates both the absorbed pion and natural-parity cut contributions. We fit the available observables, providing a good description of the energy and angular dependencies of the experimental data. Finally, we also provide predictions for the photon beam asymmetry of charged pions at E_lab = 9 GeV, which is expected to be measured by the GlueX and CLAS12 experiments in the near future.

  6. Evaluation of Turbulence-Model Performance as Applied to Jet-Noise Prediction

    NASA Technical Reports Server (NTRS)

    Woodruff, S. L.; Seiner, J. M.; Hussaini, M. Y.; Erlebacher, G.

    1998-01-01

    The accurate prediction of jet noise is possible only if the jet flow field can be predicted accurately. Predictions for the mean velocity and turbulence quantities in the jet flowfield are typically the product of a Reynolds-averaged Navier-Stokes solver coupled with a turbulence model. To evaluate the effectiveness of solvers and turbulence models in predicting those quantities most important to jet noise prediction, two CFD codes and several turbulence models were applied to a jet configuration over a range of jet temperatures for which experimental data is available.

  7. Validity of the BodyGem calorimeter and prediction equations for the assessment of resting energy expenditure in overweight and obese Saudi males.

    PubMed

    Almajwal, Ali M; Williams, Peter G; Batterham, Marijka J

    2011-07-01

    To assess the accuracy of resting energy expenditure (REE) measurement in a sample of overweight and obese Saudi males, using the BodyGem device (BG) with whole room calorimetry (WRC) as a reference, and to evaluate the accuracy of predictive equations. Thirty-eight subjects (mean +/- SD: age 26.8 +/- 3.7 years, body mass index 31.0 +/- 4.8) were recruited during the period from 5 February 2007 to 28 March 2008. Resting energy expenditure was measured using WRC and the BG device, and also calculated using 7 prediction equations. Mean differences, bias, percent of bias (%bias), accurate estimation, underestimation and overestimation were calculated. Repeated measures with the BG were not significantly different (accurate prediction: 81.6%; %bias 1.1 +/- 6.3, p>0.24), with limits of agreement ranging from +242 to -200 kcal. Resting energy expenditure measured by BG was significantly less than WRC values (accurate prediction: 47.4%; %bias: 11.0 +/- 14.6, p = 0.0001), with unacceptably wide limits of agreement. The Harris-Benedict, Schofield and World Health Organization equations were the most accurate, estimating REE within 10% of measured REE, but none seems appropriate for predicting the REE of individuals. There was poor agreement between the REE measured by WRC and that measured by BG or estimated by predictive equations. The BG assessed REE accurately in 47.4% of the subjects on an individual level.
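    For reference, the Harris-Benedict equation mentioned above predicts REE from weight, height, and age. A sketch using the original coefficients for men (the example subject is constructed to roughly match the study's reported means and is not from the study's data):

```python
def harris_benedict_male(weight_kg, height_cm, age_yr):
    """Original Harris-Benedict REE equation for men, in kcal/day."""
    return 66.4730 + 13.7516 * weight_kg + 5.0033 * height_cm - 6.7550 * age_yr

# Illustrative subject near the study means: 27 y, BMI ~31 at 1.75 m (~95 kg)
ree = harris_benedict_male(weight_kg=95.0, height_cm=175.0, age_yr=27)  # ~2066 kcal/day
```

    A "within 10% of measured" criterion for such a subject spans roughly a 400 kcal/day window, which is why an equation can be accurate on average yet unreliable for individuals.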

  8. Wind and fairness in ski jumping: A computer modelling analysis.

    PubMed

    Jung, Alexander; Müller, Wolfram; Staat, Manfred

    2018-06-25

    Wind is closely associated with the discussion of fairness in ski jumping. To counteract its influence on jump length, the International Ski Federation (FIS) has introduced a wind compensation approach. We applied three differently accurate computer models of the flight phase with wind (M1, M2, and M3) to study the jump-length effects of various wind scenarios. The previously used model M1 is accurate for wind blowing in the direction of the flight path, but inaccuracies are to be expected for wind directions deviating from the tangent to the flight path. M2 considers the change of airflow direction, but it does not consider the associated change in the angle of attack of the skis, which additionally modifies the drag and lift area time functions. M3 predicts the length effect for all wind directions within the plane of the flight trajectory without any mathematical simplification. Prediction errors of M3 are determined only by the quality of the input data: wind velocity, drag and lift area functions, take-off velocity, and weight. For comparing the three models, drag and lift area functions of an optimized reference jump were used. Results obtained with M2, which is much easier to handle than M3, did not deviate noticeably from predictions of the reference model M3. Therefore, we suggest using M2 in future applications. A comparison of M2 predictions with the FIS wind compensation system showed substantial discrepancies, for instance: in the first flight phase, tailwind can increase jump length and headwind can decrease it; this is the opposite of what had been anticipated and is not considered in the current wind compensation system in ski jumping. Copyright © 2018 Elsevier Ltd. All rights reserved.
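    The distinction between the models can be illustrated with the aerodynamic force calculation: an M2-style model forms the full relative airspeed vector, so wind alters both the magnitude and the direction of the airflow, and hence both drag and lift. A two-dimensional sketch (drag and lift areas, velocities, and air density are illustrative values, not FIS or measured data):

```python
import numpy as np

def aero_force_m2(v_jumper, v_wind, cdA, clA, rho=1.225):
    """M2-style aerodynamic force: drag and lift computed from the full
    relative airspeed vector, so wind changes both the speed and the
    direction of the airflow seen by the jumper.
    cdA, clA are drag and lift areas (m^2); values here are illustrative."""
    v_rel = v_jumper - v_wind              # airflow relative to the jumper
    speed = np.linalg.norm(v_rel)
    v_hat = v_rel / speed
    q = 0.5 * rho * speed ** 2             # dynamic pressure
    drag = -q * cdA * v_hat                # drag opposes the relative flow
    lift = q * clA * np.array([-v_hat[1], v_hat[0]])  # perpendicular to flow
    return drag + lift

# Forward-and-descending flight; a head wind increases the relative airspeed
v_jumper = np.array([25.0, -5.0])          # m/s (horizontal, vertical)
calm = aero_force_m2(v_jumper, np.array([0.0, 0.0]), cdA=0.5, clA=0.8)
head = aero_force_m2(v_jumper, np.array([-3.0, 0.0]), cdA=0.5, clA=0.8)
```

    With this rotation convention, lift points upward for descending flight; a head wind raises the dynamic pressure and tilts the flow, increasing the upward force, consistent with head wind lengthening jumps.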

  9. Evaluation of a numerical model's ability to predict bed load transport observed in braided river experiments

    NASA Astrophysics Data System (ADS)

    Javernick, Luke; Redolfi, Marco; Bertoldi, Walter

    2018-05-01

    New data collection techniques offer numerical modelers the ability to gather and utilize high quality data sets with high spatial and temporal resolution. Such data sets are currently needed for calibration, verification, and to fuel future model development, particularly morphological simulations. This study explores the use of high quality spatial and temporal data sets of observed bed load transport in braided river flume experiments to evaluate the ability of a two-dimensional model, Delft3D, to predict bed load transport. This study uses a fixed bed model configuration and examines the model's shear stress calculations, which are the foundation for predicting the sediment fluxes necessary for morphological simulations. The evaluation was conducted for three flow rates, and the model setup used highly accurate Structure-from-Motion (SfM) topography and discharge boundary conditions. The model was hydraulically calibrated using bed roughness, and performance was evaluated based on depth and inundation agreement. Model bed load performance was evaluated in terms of critical shear stress exceedance area compared to maps of observed bed mobility in the flume. Following the standard hydraulic calibration, bed load performance was tested for sensitivity to horizontal eddy viscosity parameterization and bed morphology updating. Simulations produced depth errors equal to the inherent SfM errors, inundation agreement of 77-85%, and critical shear stress exceedance in agreement with 49-68% of the observed active area. This study provides insight into the ability of physically based, two-dimensional simulations to accurately predict bed load, as well as the effects of horizontal eddy viscosity and bed updating. Further, this study highlights how using high spatial and temporal resolution data to capture the physical processes at work during flume experiments can help to improve morphological modeling.
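The critical shear stress exceedance test at the heart of this evaluation can be sketched with a depth-slope stress estimate and a Shields criterion; the grain size, slope, Shields parameter, and depth field below are illustrative assumptions, not the flume values:

```python
import numpy as np

rng = np.random.default_rng(8)
rho, g, s = 1000.0, 9.81, 0.01        # water density (kg/m^3), gravity, slope
rho_s, d50 = 2650.0, 0.001            # sediment density, median grain size (m)
theta_cr = 0.047                      # assumed critical Shields parameter

# hypothetical modelled depth field over a small grid (m)
depth = rng.uniform(0.0, 0.05, (50, 50))

tau = rho * g * depth * s             # depth-slope bed shear stress (Pa)
tau_cr = theta_cr * (rho_s - rho) * g * d50   # critical shear stress (Pa)
active = tau > tau_cr                 # cells predicted to mobilize the bed
active_fraction = active.mean()       # exceedance area fraction
```

Comparing the `active` map against an observed bed-mobility map, cell by cell, gives the agreement percentages reported above.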

  10. Performance Evaluation of a High Bandwidth Liquid Fuel Modulation Valve for Active Combustion Control

    NASA Technical Reports Server (NTRS)

    Saus, Joseph R.; DeLaat, John C.; Chang, Clarence T.; Vrnak, Daniel R.

    2012-01-01

    At the NASA Glenn Research Center, a characterization rig was designed and constructed for the purpose of evaluating high bandwidth liquid fuel modulation devices to determine their suitability for active combustion control research. Incorporated into the rig's design are features that approximate conditions similar to those that would be encountered by a candidate device if it were installed on an actual combustion research rig. The characterized dynamic performance measures obtained through testing in the rig are intended to be accurate indicators of expected performance in an actual combustion testing environment. To evaluate how well the characterization rig predicts fuel modulator dynamic performance, characterization rig data were compared with performance data for a fuel modulator candidate when the candidate was in operation during combustion testing. Specifically, the nominal and off-nominal performance data for a magnetostrictive-actuated proportional fuel modulation valve are described. Valve performance data were collected with the characterization rig configured to emulate two different combustion rig fuel feed systems. Fuel mass flows and pressures, fuel feed line lengths, and fuel injector orifice sizes were approximated in the characterization rig. Valve performance data were also collected with the valve modulating the fuel into the two combustor rigs. Comparison of the predicted and actual valve performance data shows that when the valve is operated near its design condition, the characterization rig can appropriately predict the installed performance of the valve. Improvements to the characterization rig and accompanying modeling activities are underway to more accurately predict performance, especially for the devices under development to modulate fuel into the much smaller fuel injectors anticipated in future lean-burning low-emissions aircraft engine combustors.

  11. Building factorial regression models to explain and predict nitrate concentrations in groundwater under agricultural land

    NASA Astrophysics Data System (ADS)

    Stigter, T. Y.; Ribeiro, L.; Dill, A. M. M. Carvalho

    2008-07-01

    Factorial regression models, based on correspondence analysis, are built to explain the high nitrate concentrations in groundwater beneath an agricultural area in the south of Portugal, exceeding 300 mg/l, as a function of chemical variables, electrical conductivity (EC), land use and hydrogeological setting. Two important advantages of the proposed methodology are that qualitative parameters can be involved in the regression analysis and that multicollinearity is avoided. Regression is performed on eigenvectors extracted from the data similarity matrix, the first of which clearly reveals the impact of agricultural practices and hydrogeological setting on the groundwater chemistry of the study area. Significant correlation exists between the response variable NO3- and the explanatory variables Ca2+, Cl-, SO42-, depth to water, aquifer media and land use. Substituting Cl- with EC results in the most accurate regression model for nitrate, when disregarding the four largest outliers (model A). When built solely on land use and hydrogeological setting, the regression model (model B) is less accurate but more interesting from a practical viewpoint, as it is based on easily obtainable data and can be used to predict nitrate concentrations in groundwater in other areas with similar conditions. This is particularly useful for conservative contaminants, where risk and vulnerability assessment methods, based on assumed rather than established correlations, generally produce erroneous results. Another purpose of the models can be to predict the future evolution of nitrate concentrations under the influence of changes in land use or fertilization practices, which occur in compliance with policies such as the Nitrates Directive. Model B predicts a 40% decrease in nitrate concentrations in groundwater of the study area when horticulture is replaced by other land use with much lower fertilization and irrigation rates.
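The methodological core, regressing the response on orthogonal eigenvectors extracted from the data similarity structure so that multicollinearity cannot arise, can be sketched as follows, with synthetic stand-ins for the hydrochemical variables:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical standardized samples: Ca, Cl, SO4, depth-to-water
X = rng.normal(size=(40, 4))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=40)   # strongly collinear pair
nitrate = 2.0 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=40)

# eigen-decomposition of the variable similarity (correlation) matrix
S = np.corrcoef(X, rowvar=False)
w, V = np.linalg.eigh(S)
order = np.argsort(w)[::-1]
scores = X @ V[:, order[:2]]          # project onto the top-2 eigenvectors

# ordinary least squares on the orthogonal scores: no multicollinearity
design = np.c_[np.ones(40), scores]
beta, *_ = np.linalg.lstsq(design, nitrate, rcond=None)
pred = design @ beta
r2 = 1 - np.sum((nitrate - pred) ** 2) / np.sum((nitrate - nitrate.mean()) ** 2)
```

Because the eigenvector scores are uncorrelated, the regression coefficients are stable even though two of the raw variables are nearly collinear.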

  12. Predicting Transport of 3,5,6-Trichloro-2-Pyridinol Into Saliva Using a Combination Experimental and Computational Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Jordan Ned; Carver, Zana A.; Weber, Thomas J.

    A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport under non-physiological (standard cell culture medium) and physiological (surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy), a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma, resulting in different TCPy transport behaviors in the two experimental conditions. In the non-physiological transport experiment, TCPy reached equilibrium at equivalent concentrations in the apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in the apical and basolateral chambers reached equilibrium faster than at lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than under non-physiological conditions, and equilibrium was achieved at different concentrations in the apical and basolateral chambers, at a ratio (0.034) comparable to what was previously measured in rats dosed with TCPy (saliva:blood ratio 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics; it accurately simulated all transport experiments using different permeability coefficients for the two experimental conditions (1.4 vs 0.4 cm/h for the non-physiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic (PBPK) model and accurately predicted TCPy concentrations in the saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predict chemical transport into saliva, potentially increasing the utility of salivary biomonitoring in the future.
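A minimal version of the cellular transport model is a two-chamber ODE in which only the unbound fraction crosses the monolayer, so the equilibrium apical:basolateral ratio equals the ratio of unbound fractions. The permeability below matches the non-physiological estimate quoted above (1.4 cm/h); the unbound fractions, chamber volumes, and dose are illustrative assumptions:

```python
def simulate_transport(P=1.4, fu_apical=1.0, fu_basolateral=0.2,
                       A=1.0, V_a=0.5, V_b=1.5, C_b0=100.0,
                       t_end=48.0, dt=0.01):
    """Two-chamber monolayer transport where only unbound chemical crosses.

    P: permeability (cm/h), fu_*: unbound fractions (protein binding lowers
    fu), A: monolayer area (cm^2), V_*: chamber volumes (mL), C_b0: initial
    basolateral concentration. Values other than P are illustrative.
    """
    C_a, C_b = 0.0, C_b0
    for _ in range(int(t_end / dt)):
        # flux driven by the unbound concentration gradient
        J = P * A * (fu_basolateral * C_b - fu_apical * C_a)
        C_a += J * dt / V_a
        C_b -= J * dt / V_b
    return C_a, C_b

C_a, C_b = simulate_transport()
ratio = C_a / C_b        # apical:basolateral ratio at (near) equilibrium
```

With these assumed unbound fractions the simulated equilibrium ratio is 0.2; the study's measured ratio (0.034) reflects its own fitted binding kinetics.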

  13. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    PubMed

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard (Cox PH) models) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate adaptive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
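The validation protocol itself, fit candidate models on one event and score them out-of-sample, is easy to sketch. Below, a k-nearest-neighbour predictor stands in for the tree-based methods and a climatological mean serves as the null model; covariates and outage durations are simulated, not the Hurricane Ivan data:

```python
import numpy as np

rng = np.random.default_rng(2)
# hypothetical covariates per grid cell: wind speed, tree density, asset age
X = rng.uniform(size=(400, 3))
outage_h = 12 * X[:, 0] ** 2 + 6 * X[:, 1] + rng.exponential(1.0, 400)

# holdout split for out-of-sample evaluation
train, test = np.arange(300), np.arange(300, 400)

def knn_predict(Xtr, ytr, Xte, k=10):
    """Predict each test point as the mean outcome of its k nearest neighbours."""
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

pred_mean = np.full(test.size, outage_h[train].mean())   # null baseline
pred_knn = knn_predict(X[train], outage_h[train], X[test])

def mae(p):
    return np.abs(outage_h[test] - p).mean()
```

Comparing `mae(pred_knn)` against `mae(pred_mean)` on the held-out cells mirrors the paper's out-of-sample ranking of methods, here on toy data.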

  14. Prediction of oxygen uptake dynamics by machine learning analysis of wearable sensors during activities of daily living

    PubMed Central

    Beltrame, T.; Amelard, R.; Wong, A.; Hughson, R. L.

    2017-01-01

    Currently, oxygen uptake (VO2) is the most precise means of investigating aerobic fitness and level of physical activity; however, VO2 can only be directly measured in supervised conditions. With the advancement of new wearable sensor technologies and data processing approaches, it is possible to accurately infer work rate and predict VO2 during activities of daily living (ADL). The main objective of this study was to develop and verify the methods required to predict and investigate the VO2 dynamics during ADL. The variables derived from the wearable sensors were used to create a VO2 predictor based on a random forest method. The temporal VO2 dynamics were assessed by the mean normalized gain amplitude (MNG) obtained from frequency domain analysis. The MNG provides a means to assess aerobic fitness. The predicted VO2 during ADL was strongly correlated (r = 0.87, P < 0.001) with the measured VO2, and the prediction bias was 0.2 ml·min−1·kg−1. The MNG calculated based on predicted VO2 was strongly correlated (r = 0.71, P < 0.001) with the MNG calculated based on measured data. This new technology provides an important advance in ambulatory and continuous assessment of aerobic fitness with potential for future applications such as the early detection of deterioration of physical health. PMID:28378815
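One way to read the MNG computation: estimate the frequency-domain gain between work rate and the VO2 response, normalize it, and average over the band of slow aerobic dynamics. The sketch below assumes a first-order VO2 response with a 40-s time constant and a synthetic work-rate signal; the paper's exact MNG definition may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(3)
n, dt = 2048, 1.0                       # 1-s samples
work = rng.normal(size=n).cumsum()      # synthetic work-rate signal
work -= work.mean()

# simulate VO2 as a first-order response (assumed time constant tau, s)
tau = 40.0
vo2 = np.zeros(n)
for i in range(1, n):
    vo2[i] = vo2[i - 1] + dt / tau * (work[i - 1] - vo2[i - 1])

# empirical frequency response via FFT ratio, then mean normalized gain
W, V = np.fft.rfft(work), np.fft.rfft(vo2)
freqs = np.fft.rfftfreq(n, dt)
band = (freqs > 0) & (freqs < 0.02)     # band of slow aerobic dynamics
gain = np.abs(V[band] / W[band])
mng = (gain / gain.max()).mean()        # mean normalized gain in the band
```

A faster (smaller tau) VO2 response yields a flatter gain curve and a larger `mng`, which is why the index tracks aerobic fitness.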

  15. Real-time stylistic prediction for whole-body human motions.

    PubMed

    Matsubara, Takamitsu; Hyon, Sang-Ho; Morimoto, Jun

    2012-01-01

    The ability to predict human motion is crucial in several contexts such as human tracking by computer vision and the synthesis of human-like computer graphics. Previous work has focused on off-line processes with well-segmented data; however, many applications such as robotics require real-time control with efficient computation. In this paper, we propose a novel approach called real-time stylistic prediction for whole-body human motions to satisfy these requirements. This approach uses a novel generative model to represent whole-body human motions, including rhythmic motion (e.g., walking) and discrete motion (e.g., jumping). The generative model is composed of low-dimensional state (phase) dynamics and a two-factor observation model, allowing it to capture the diversity of motion styles in humans. A real-time adaptation algorithm was derived to estimate both the state variables and the style parameter of the model from non-stationary, unlabeled sequential observations. Moreover, with a simple modification, the algorithm allows real-time adaptation even from incomplete (partial) observations. Based on the estimated state and style, a future motion sequence can be accurately predicted. In our implementation, both adaptation and prediction take less than 15 ms at each observation. Our real-time stylistic prediction was evaluated for human walking, running, and jumping behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Forensic DNA Phenotyping: Predicting human appearance from crime scene material for investigative purposes.

    PubMed

    Kayser, Manfred

    2015-09-01

    Forensic DNA Phenotyping refers to the prediction of appearance traits of unknown sample donors, or unknown deceased (missing) persons, directly from biological materials found at the scene. The "biological witness" outcomes of Forensic DNA Phenotyping can provide investigative leads to trace unknown persons who are unidentifiable with current comparative DNA profiling. This intelligence application of DNA marks a forensic use of genetic material substantially different from that of current DNA profiling presented in the courtroom. Currently, group-specific pigmentation traits are already predictable from DNA with reasonably high accuracies, while several other externally visible characteristics are under genetic investigation. Until individual-specific appearance becomes accurately predictable from DNA, conventional DNA profiling needs to be performed subsequent to appearance DNA prediction. Notably, and this is where Forensic DNA Phenotyping shows great promise, profiling then targets a (much) smaller group of potential suspects who match the appearance characteristics DNA-predicted from the crime scene stain or from the deceased person's remains. Provided sufficient funding is made available, future research to better understand the genetic basis of human appearance is expected to lead to a substantially more detailed description of an unknown person's appearance from DNA, delivering increased value for police investigations in criminal and missing person cases involving unknowns. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. Prediction of oxygen uptake dynamics by machine learning analysis of wearable sensors during activities of daily living.

    PubMed

    Beltrame, T; Amelard, R; Wong, A; Hughson, R L

    2017-04-05

    Currently, oxygen uptake (VO2) is the most precise means of investigating aerobic fitness and level of physical activity; however, VO2 can only be directly measured in supervised conditions. With the advancement of new wearable sensor technologies and data processing approaches, it is possible to accurately infer work rate and predict VO2 during activities of daily living (ADL). The main objective of this study was to develop and verify the methods required to predict and investigate the VO2 dynamics during ADL. The variables derived from the wearable sensors were used to create a VO2 predictor based on a random forest method. The temporal VO2 dynamics were assessed by the mean normalized gain amplitude (MNG) obtained from frequency domain analysis. The MNG provides a means to assess aerobic fitness. The predicted VO2 during ADL was strongly correlated (r = 0.87, P < 0.001) with the measured VO2, and the prediction bias was 0.2 ml·min−1·kg−1. The MNG calculated based on predicted VO2 was strongly correlated (r = 0.71, P < 0.001) with the MNG calculated based on measured data. This new technology provides an important advance in ambulatory and continuous assessment of aerobic fitness with potential for future applications such as the early detection of deterioration of physical health.

  18. Semi-supervised protein subcellular localization.

    PubMed

    Xu, Qian; Hu, Derek Hao; Xue, Hong; Yu, Weichuan; Yang, Qiang

    2009-01-30

    Protein subcellular localization is concerned with predicting the location of a protein within a cell using computational methods. The location information can indicate key functionalities of proteins. Accurate predictions of the subcellular localizations of proteins can aid the prediction of protein function and genome annotation, as well as the identification of drug targets. Computational methods based on machine learning, such as support vector machine approaches, have already been widely used in the prediction of protein subcellular localization. A major drawback of these machine learning-based approaches, however, is that a large amount of data must be labeled for the prediction system to learn a classifier with good generalization ability. In real-world cases, it is laborious, expensive and time-consuming to experimentally determine the subcellular localization of a protein and prepare instances of labeled data. In this paper, we present an approach based on a new learning framework, semi-supervised learning, which can use far fewer labeled instances to construct a high-quality prediction model. We first construct an initial classifier using a small set of labeled examples, and then use unlabeled instances to refine the classifier for future predictions. Experimental results show that our methods can effectively reduce the workload for labeling data by using the unlabeled data. Our method is shown to enhance the state-of-the-art prediction results of SVM classifiers by more than 10%.
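Self-training is one concrete instance of the semi-supervised framework described here: build an initial classifier from the few labeled examples, pseudo-label the unlabeled pool, and refit. A sketch with a nearest-centroid classifier on synthetic two-class data (the paper itself uses SVM-based classifiers on protein features):

```python
import numpy as np

rng = np.random.default_rng(4)
# two hypothetical "subcellular location" classes in a 2-D feature space
n = 200
X = np.vstack([rng.normal(-2, 1, (n, 2)), rng.normal(2, 1, (n, 2))])
y = np.repeat([0, 1], n)
labeled = np.concatenate([np.arange(5), n + np.arange(5)])  # 5 labels/class
unlabeled = np.setdiff1d(np.arange(2 * n), labeled)

def centroid_fit(X, y):
    return np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# 1) initial classifier from the few labeled examples
cent = centroid_fit(X[labeled], y[labeled])
# 2) self-training: pseudo-label the unlabeled pool, then refit centroids
pseudo = predict(cent, X[unlabeled])
cent = centroid_fit(np.vstack([X[labeled], X[unlabeled]]),
                    np.concatenate([y[labeled], pseudo]))
acc = (predict(cent, X) == y).mean()
```

The refit step lets the large unlabeled pool sharpen the decision boundary that ten labels alone only roughly locate.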

  19. ASTRAL, DRAGON and SEDAN scores predict stroke outcome more accurately than physicians.

    PubMed

    Ntaios, G; Gioulekas, F; Papavasileiou, V; Strbian, D; Michel, P

    2016-11-01

    ASTRAL, SEDAN and DRAGON scores are three well-validated scores for stroke outcome prediction. Whether these scores predict stroke outcome more accurately than physicians interested in stroke was investigated. Physicians interested in stroke were invited to an online anonymous survey to provide outcome estimates in randomly allocated structured scenarios of recent real-life stroke patients. Their estimates were compared to the scores' predictions for the same scenarios. An estimate was considered accurate if it was within the 95% confidence interval of the actual outcome. In all, 244 participants from 32 different countries responded, assessing 720 real scenarios and 2636 outcomes. The majority of physicians' estimates were inaccurate (1422/2636, 53.9%). 400 (56.8%) of physicians' estimates of the percentage probability of a 3-month modified Rankin score (mRS) > 2 were accurate, compared with 609 (86.5%) of ASTRAL score estimates (P < 0.0001). 394 (61.2%) of physicians' estimates of the percentage probability of post-thrombolysis symptomatic intracranial haemorrhage were accurate, compared with 583 (90.5%) of SEDAN score estimates (P < 0.0001). 160 (24.8%) of physicians' estimates of the post-thrombolysis 3-month percentage probability of mRS 0-2 were accurate, compared with 240 (37.3%) of DRAGON score estimates (P < 0.0001). 260 (40.4%) of physicians' estimates of the percentage probability of post-thrombolysis mRS 5-6 were accurate, compared with 518 (80.4%) of DRAGON score estimates (P < 0.0001). ASTRAL, DRAGON and SEDAN scores predict the outcome of acute ischaemic stroke patients with higher accuracy than physicians interested in stroke. © 2016 EAN.
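The accuracy criterion, an estimate counts as accurate if it falls within the 95% confidence interval of the actual outcome, can be made concrete with a normal-approximation interval; the exact interval construction used in the study may differ:

```python
import math

def within_95ci(estimate, successes, n):
    """Return True if a probability estimate lies inside the
    normal-approximation 95% CI of the observed outcome proportion.
    (A sketch of the accuracy criterion; the study's interval may differ.)
    """
    p = successes / n
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return p - half <= estimate <= p + half
```

For example, with 28 of 100 patients reaching the outcome, an estimate of 30% falls inside the interval [19.2%, 36.8%] while an estimate of 60% does not.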

  20. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    PubMed

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area (z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus <2.0 (12 [16%] versus 3 [2%], respectively, P < 0.001). Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
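The reported sensitivity (80%) and negative predictive value (98%) follow directly from the 2×2 counts in the abstract: 12 of 77 patients at or above the zMax threshold developed CAA, versus 3 of 184 below it:

```python
# counts reconstructed from the abstract: 261 patients, 77 with baseline
# zMax >= 2.0 (12 later CAA) and 184 with zMax < 2.0 (3 later CAA)
tp, fn = 12, 3        # CAA cases above / below the threshold
fp = 77 - 12          # zMax >= 2.0 but no CAA
tn = 184 - 3          # zMax < 2.0 and no CAA

sensitivity = tp / (tp + fn)   # 12/15 = 0.80
specificity = tn / (tn + fp)
npv = tn / (tn + fn)           # 181/184 ~ 0.98
ppv = tp / (tp + fp)
```

The high NPV is what makes a baseline zMax <2.0 reassuring, while the modest PPV (12/77) explains why the threshold flags far more children than eventually develop CAA.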

  1. Modeling Interdependent and Periodic Real-World Action Sequences

    PubMed Central

    Kurashima, Takeshi; Althoff, Tim; Leskovec, Jure

    2018-01-01

    Mobile health applications, including those that track activities such as exercise, sleep, and diet, are becoming widely used. Accurately predicting human actions in the real world is essential for targeted recommendations that could improve our health and for personalization of these applications. However, making such predictions is extremely difficult due to the complexities of human behavior, which consists of a large number of potential actions that vary over time, depend on each other, and are periodic. Previous work has not jointly modeled these dynamics and has largely focused on item consumption patterns instead of broader types of behaviors such as eating, commuting or exercising. In this work, we develop a novel statistical model, called TIPAS, for Time-varying, Interdependent, and Periodic Action Sequences. Our approach is based on personalized, multivariate temporal point processes that model time-varying action propensities through a mixture of Gaussian intensities. Our model captures short-term and long-term periodic interdependencies between actions through Hawkes process-based self-excitations. We evaluate our approach on two activity logging datasets comprising 12 million real-world actions (e.g., eating, sleep, and exercise) taken by 20 thousand users over 17 months. We demonstrate that our approach allows us to make successful predictions of future user actions and their timing. Specifically, TIPAS improves predictions of actions, and their timing, over existing methods across multiple datasets by up to 156%, and up to 37%, respectively. Performance improvements are particularly large for relatively rare and periodic actions such as walking and biking, improving over baselines by up to 256%. This demonstrates that explicit modeling of dependencies and periodicities in real-world behavior enables successful predictions of future actions, with implications for modeling human behavior, app personalization, and targeting of health interventions. 
PMID:29780977
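The self-excitation component is the Hawkes-process idea: each past action temporarily raises the intensity of future actions. A one-action sketch with an exponential kernel (TIPAS itself uses mixtures of Gaussian intensities and cross-action dependencies, so this is a deliberate simplification):

```python
import math

def hawkes_intensity(t, history, mu=0.1, alpha=0.5, beta=1.0):
    """Self-exciting intensity: baseline mu plus exponentially decaying
    contributions from past event times. mu, alpha, beta are illustrative."""
    return mu + alpha * sum(math.exp(-beta * (t - s))
                            for s in history if s < t)

events = [1.0, 1.5, 4.0]          # hypothetical past action times (hours)
lam = hawkes_intensity(5.0, events)
```

Shortly after an action the intensity is elevated above the baseline `mu`, and it decays back toward `mu` as the gap since the last action grows, which is how the model captures bursty, interdependent behavior.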

  2. Predicting vapor-liquid phase equilibria with augmented ab initio interatomic potentials

    NASA Astrophysics Data System (ADS)

    Vlasiuk, Maryna; Sadus, Richard J.

    2017-06-01

    The ability of ab initio interatomic potentials to accurately predict vapor-liquid phase equilibria is investigated. Monte Carlo simulations are reported for the vapor-liquid equilibria of argon and krypton using recently developed accurate ab initio interatomic potentials. Seventeen interatomic potentials are studied, formulated from different combinations of two-body plus three-body terms. The simulation results are compared to either experimental or reference data for conditions ranging from the triple point to the critical point. It is demonstrated that the use of ab initio potentials enables systematic improvements to the accuracy of predictions via the addition of theoretically based terms. The contribution of three-body interactions is accounted for using the Axilrod-Teller-Muto plus other multipole contributions and the effective Marcelli-Wang-Sadus potentials. The results indicate that the predictive ability of recent interatomic potentials, obtained from quantum chemical calculations, is comparable to that of accurate empirical models. It is demonstrated that the Marcelli-Wang-Sadus potential can be used in combination with accurate two-body ab initio models for the computationally inexpensive and accurate estimation of vapor-liquid phase equilibria.
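The Axilrod-Teller-Muto triple-dipole term mentioned above has a closed form per atom triplet, repulsive for near-equilateral geometries and attractive for near-linear ones. A sketch in reduced units, with the dispersion coefficient set to 1 rather than a fitted argon or krypton value:

```python
import math

def atm_energy(r1, r2, r3, nu=1.0):
    """Axilrod-Teller-Muto triple-dipole energy for one atom triplet.

    r1, r2, r3: 2-D or 3-D coordinate tuples; nu: dispersion coefficient
    (1.0 here, i.e. reduced units rather than a fitted noble-gas value).
    """
    r12, r13, r23 = math.dist(r1, r2), math.dist(r1, r3), math.dist(r2, r3)
    # cosines of the interior angles via the law of cosines
    c1 = (r12**2 + r13**2 - r23**2) / (2 * r12 * r13)
    c2 = (r12**2 + r23**2 - r13**2) / (2 * r12 * r23)
    c3 = (r13**2 + r23**2 - r12**2) / (2 * r13 * r23)
    return nu * (1 + 3 * c1 * c2 * c3) / (r12 * r13 * r23) ** 3
```

For an equilateral triangle of unit side the energy is +1.375 in these reduced units, while a linear arrangement gives -0.25, matching the expected signs of the three-body correction.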

  3. Predicting vapor-liquid phase equilibria with augmented ab initio interatomic potentials.

    PubMed

    Vlasiuk, Maryna; Sadus, Richard J

    2017-06-28

    The ability of ab initio interatomic potentials to accurately predict vapor-liquid phase equilibria is investigated. Monte Carlo simulations are reported for the vapor-liquid equilibria of argon and krypton using recently developed accurate ab initio interatomic potentials. Seventeen interatomic potentials are studied, formulated from different combinations of two-body plus three-body terms. The simulation results are compared to either experimental or reference data for conditions ranging from the triple point to the critical point. It is demonstrated that the use of ab initio potentials enables systematic improvements to the accuracy of predictions via the addition of theoretically based terms. The contribution of three-body interactions is accounted for using the Axilrod-Teller-Muto plus other multipole contributions and the effective Marcelli-Wang-Sadus potentials. The results indicate that the predictive ability of recent interatomic potentials, obtained from quantum chemical calculations, is comparable to that of accurate empirical models. It is demonstrated that the Marcelli-Wang-Sadus potential can be used in combination with accurate two-body ab initio models for the computationally inexpensive and accurate estimation of vapor-liquid phase equilibria.

  4. Multi-level emulation of a volcanic ash transport and dispersion model to quantify sensitivity to uncertain parameters

    NASA Astrophysics Data System (ADS)

    Harvey, Natalie J.; Huntley, Nathan; Dacre, Helen F.; Goldstein, Michael; Thomson, David; Webster, Helen

    2018-01-01

    Following the disruption to European airspace caused by the eruption of Eyjafjallajökull in 2010 there has been a move towards producing quantitative predictions of volcanic ash concentration using volcanic ash transport and dispersion simulators. However, there is no formal framework for determining the uncertainties of these predictions, and performing many simulations using these complex models is computationally expensive. In this paper a Bayesian linear emulation approach is applied to the Numerical Atmospheric-dispersion Modelling Environment (NAME) to better understand the influence of source and internal model parameters on the simulator output. Emulation is a statistical method for predicting the output of a computer simulator at new parameter choices without actually running the simulator. A multi-level emulation approach is applied using two configurations of NAME with different numbers of model particles. Information from many evaluations of the computationally faster configuration is combined with results from relatively few evaluations of the slower, more accurate, configuration. This approach is effective when it is not possible to run the accurate simulator many times and when there is also little prior knowledge about the influence of parameters. The approach is applied to the mean ash column loading in 75 geographical regions on 14 May 2010. Through this analysis it has been found that the parameters that contribute the most to the output uncertainty are initial plume rise height, mass eruption rate, free tropospheric turbulence levels and precipitation threshold for wet deposition. This information can be used to inform future model development, observational campaigns and routine monitoring. The analysis presented here suggests the need for further observational and theoretical research into the parameterisation of atmospheric turbulence. Furthermore, it can also be used to inform the most important parameter perturbations for a small operational ensemble of simulations. The use of an emulator also identifies the input and internal parameters that do not contribute significantly to simulator uncertainty. Finally, the analysis highlights that the faster, less accurate configuration of NAME can, on its own, provide useful information for the problem of predicting average column load over large areas.
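In its simplest form, an emulator is a Bayesian regression fitted to a handful of simulator runs and then queried in place of the simulator. The sketch below uses a polynomial basis and a cheap stand-in function for the expensive simulator; NAME outputs and the multi-level combination of configurations are beyond this illustration:

```python
import numpy as np

def expensive_sim(x):
    # stand-in for a costly dispersion-simulator output (illustrative)
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0.0, 1.0, 8)     # only a few affordable runs
y_train = expensive_sim(X_train)

def basis(x, degree=5):
    return np.vander(x, degree + 1, increasing=True)

Phi = basis(X_train)
prior_var, noise_var = 10.0, 1e-6
# Gaussian posterior over basis weights (Bayesian linear regression)
A = Phi.T @ Phi / noise_var + np.eye(Phi.shape[1]) / prior_var
w_mean = np.linalg.solve(A, Phi.T @ y_train / noise_var)

x_new = np.array([0.35, 0.8])
emulated = basis(x_new) @ w_mean       # cheap surrogate prediction
```

Once fitted, the emulator answers "what would the simulator return here?" in microseconds, which is what makes large sensitivity analyses over many parameter choices affordable.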

  5. Wind Turbine Gust Prediction Using Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Towers, Paul; Jones, Bryn

    2013-11-01

    Offshore wind energy is a growing energy source as governments around the world look for environmentally friendly solutions to potential future energy shortages. In order to capture more energy from the wind, larger turbines are being designed, leading to the structures becoming increasingly vulnerable to damage caused by violent gusts of wind. Advance knowledge of such gusts will enable turbine control systems to take preventative action, reducing turbine maintenance costs. We present a system which can accurately forecast the velocity profile of an oncoming wind, given only limited spatial measurements from light detection and ranging (LiDAR) units, which are currently operational in industry. Our method combines nonlinear state estimation techniques with low-order models of atmospheric boundary-layer flows to generate flow-field estimates. We discuss the accuracy of our velocity profile predictions by direct comparison to data derived from large eddy simulations of the atmospheric boundary layer.
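The state-estimation ingredient can be illustrated with its simplest relative, a scalar Kalman filter that fuses noisy LiDAR-like wind samples with a random-walk wind model; the noise levels and gust shape below are invented for illustration, not LES-derived:

```python
import numpy as np

rng = np.random.default_rng(6)

# invented upstream wind: slow trend plus a Gaussian-shaped gust
t = np.arange(200)
true_wind = 8.0 + 0.01 * t + 2.0 * np.exp(-0.5 * ((t - 120) / 10.0) ** 2)
lidar = true_wind + rng.normal(0, 0.8, t.size)   # noisy line-of-sight samples

# scalar Kalman filter: random-walk wind model, LiDAR as the measurement
x, P = lidar[0], 1.0
Q, R = 0.05, 0.8 ** 2        # process / measurement noise variances (assumed)
estimates = []
for z in lidar:
    P += Q                   # predict step (random-walk state)
    K = P / (P + R)          # Kalman gain
    x += K * (z - x)         # update with the LiDAR sample
    P *= (1 - K)
    estimates.append(x)

rmse_raw = np.sqrt(np.mean((lidar - true_wind) ** 2))
rmse_kf = np.sqrt(np.mean((np.array(estimates) - true_wind) ** 2))
```

The filtered estimate tracks the gust while suppressing measurement noise; the paper's approach replaces the random-walk model with low-order boundary-layer flow models and extends the state to a full velocity profile.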

  6. Solution x-ray scattering and structure formation in protein dynamics

    NASA Astrophysics Data System (ADS)

    Nasedkin, Alexandr; Davidsson, Jan; Niemi, Antti J.; Peng, Xubiao

    2017-12-01

    We propose a computationally effective approach that builds on Landau mean-field theory in combination with modern nonequilibrium statistical mechanics to model and interpret protein dynamics and structure formation in small- to wide-angle x-ray scattering (S/WAXS) experiments. We develop the methodology by analyzing experimental data for the Engrailed homeodomain protein as an example. We demonstrate how to interpret S/WAXS data qualitatively with good precision over an extended temperature range. We explain experimental observations in terms of protein phase structure, and we make predictions for future experiments and for how to analyze data at different ambient temperatures. We conclude that the approach we propose has the potential to become a highly accurate, computationally effective, and predictive tool for analyzing S/WAXS data. To this end, we compare our results with those obtained previously in an all-atom molecular dynamics simulation.

  7. Hydrophobic potential of mean force as a solvation function for protein structure prediction.

    PubMed

    Lin, Matthew S; Fawzi, Nicolas Lux; Head-Gordon, Teresa

    2007-06-01

    We have developed a solvation function that combines a Generalized Born model for polarization of protein charge by the high dielectric solvent, with a hydrophobic potential of mean force (HPMF) as a model for hydrophobic interaction, to aid in the discrimination of native structures from other misfolded states in protein structure prediction. We find that our energy function outperforms other reported scoring functions in terms of correct native ranking for 91% of proteins and low Z scores for a variety of decoy sets, including the challenging Rosetta decoys. This work shows that the stabilizing effect of hydrophobic exposure to aqueous solvent that defines the HPMF hydration physics is an apparent improvement over solvent-accessible surface area models that penalize hydrophobic exposure. Decoys generated by thermal sampling around the native-state basin reveal a potentially important role for side-chain entropy in the future development of even more accurate free energy surfaces.

  8. Predicting differences in the perceived relevance of crime's costs and benefits in a test of rational choice theory.

    PubMed

    Bouffard, Jeffrey A

    2007-08-01

    Previous hypothetical scenario tests of rational choice theory have presented all participants with the same set of consequences, implicitly assuming that these consequences would be relevant for each individual. Recent research demonstrates that those researcher-presented consequences do not accurately reflect those considered by study participants and that there is individual variation in the relevance of various consequences. Despite this, and despite some theoretical propositions that such differences should exist, little empirical research has explored the possibility of predicting such variation. This study allows participants to develop their own set of relevant consequences for three hypothetical offenses and examines how several demographic and theoretical variables impact those consequences' relevance. Exploratory results suggest individual factors impact the perceived relevance of several cost and benefit types, even among a relatively homogeneous sample of college students. Implications for future tests of rational choice theory, as well as policy implications, are discussed.

  9. Parsimonious description for predicting high-dimensional dynamics

    PubMed Central

    Hirata, Yoshito; Takeuchi, Tomoya; Horai, Shunsuke; Suzuki, Hideyuki; Aihara, Kazuyuki

    2015-01-01

    When we observe a system, we often cannot observe all of its variables and may have access to only a limited set of measurements. Under such circumstances, delay coordinates, vectors made of successive measurements, are useful for reconstructing the state of the whole system. Although the method of delay coordinates is theoretically supported for high-dimensional dynamical systems, it faces a practical limitation: the calculations become more expensive as the dimension of the delay coordinates grows. Here, we propose a parsimonious description of virtually infinite-dimensional delay coordinates, obtained by evaluating their distances with exponentially decaying weights. This description enables us to predict the future values of the measurements faster, because the calculated distances can be reused, and more accurately, because the description naturally reduces the bias of classical delay coordinates toward the stable directions. We demonstrate the proposed method with toy models of the atmosphere and real datasets related to renewable energy. PMID:26510518
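The reuse of calculated distances described above follows from a simple recursion: with weight lam**k on the measurement k steps in the past, the weighted squared distance between the delay vectors ending at times t and s satisfies D[t][s] = (x[t] - x[s])**2 + lam * D[t-1][s-1]. A minimal sketch (the decay parameter lam and all names are illustrative, not from the paper):

```python
import numpy as np

def weighted_delay_distances(x, lam=0.9):
    """Pairwise weighted squared distances between 'virtually
    infinite-dimensional' delay coordinates of a scalar time series x.
    Each entry reuses the previously computed distance one step back,
    so the whole table costs O(n^2) regardless of the delay dimension."""
    n = len(x)
    D = np.zeros((n, n))
    for t in range(n):
        for s in range(n):
            d = (x[t] - x[s]) ** 2
            if t > 0 and s > 0:
                d += lam * D[t - 1, s - 1]  # reuse the earlier distance
            D[t, s] = d
    return D

D = weighted_delay_distances([0.0, 1.0, 0.0], lam=0.5)
```

A nearest-neighbor forecast would then pick, for the current time t, the past time s minimizing D[t, s] and predict the measurement that followed s.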

  10. High-fidelity plasma codes for burn physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, James; Graziani, Frank; Marinak, Marty

    Accurate predictions of equation of state (EOS) and of ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both the experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they may have on improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  11. Lessons Learned from the Wide Field Camera 3 TV1 Test Campaign and Correlation Effort

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Stavley, Richard; Bast, William

    2007-01-01

    In January 2004, shortly after the Columbia accident, future servicing missions to the Hubble Space Telescope (HST) were cancelled. In response, further work on the Wide Field Camera 3 instrument ceased. Given the maturity of the design, a characterization thermal test (TV1) was completed in case the mission was reinstated or an alternate mission was found on which to fly the instrument. This thermal test yielded valuable lessons learned with respect to testing configurations and modeling/correlation practices, including: (1) ensure that the thermal design can be tested; (2) ensure that the model has sufficient detail for accurate predictions; (3) ensure that the power associated with all active control devices is predicted; and (4) avoid unit changes for existing models. This paper documents the difficulties presented when these recommendations were not followed.

  12. Measuring populations to improve vaccination coverage

    NASA Astrophysics Data System (ADS)

    Bharti, Nita; Djibo, Ali; Tatem, Andrew J.; Grenfell, Bryan T.; Ferrari, Matthew J.

    2016-10-01

    In low-income settings, vaccination campaigns supplement routine immunization but often fail to achieve coverage goals due to uncertainty about target population size and distribution. Accurate, updated estimates of target populations are rare but critical; short-term fluctuations can greatly impact population size and susceptibility. We use satellite imagery to quantify population fluctuations and the coverage achieved by a measles outbreak response vaccination campaign in urban Niger and compare campaign estimates to measurements from a post-campaign survey. Vaccine coverage was overestimated because the campaign underestimated resident numbers and seasonal migration further increased the target population. We combine satellite-derived measurements of fluctuations in population distribution with high-resolution measles case reports to develop a dynamic model that illustrates the potential improvement in vaccination campaign coverage if planners account for predictable population fluctuations. Satellite imagery can improve retrospective estimates of vaccination campaign impact and future campaign planning by synchronizing interventions with predictable population fluxes.

  13. Attenuation of low-frequency underwater sound using an array of air-filled balloons and comparison to effective medium theory.

    PubMed

    Lee, Kevin M; Wilson, Preston S; Wochner, Mark S

    2017-12-01

    The ultimate goal of this work is to accurately predict the attenuation through a collection of large (on the order of 10-cm-radius) tethered encapsulated bubbles used in underwater noise abatement systems. Measurements of underwater sound attenuation were performed during a set of lake experiments, where a low-frequency compact electromechanical sound source was surrounded by different arrays of encapsulated bubbles with various individual bubbles sizes and void fractions. The measurements were compared with an existing predictive model [Church, J. Acoust. Soc. Am. 97, 1510-1521 (1995)] of the dispersion relation for linear propagation in liquid containing encapsulated bubbles. Although the model was originally intended to describe ultrasound contrast agents, it is evaluated here for large bubbles, and hence low frequencies, as a design tool for future underwater noise abatement systems, and there is good quantitative agreement between the observations and the model.

  14. Measuring populations to improve vaccination coverage

    PubMed Central

    Bharti, Nita; Djibo, Ali; Tatem, Andrew J.; Grenfell, Bryan T.; Ferrari, Matthew J.

    2016-01-01

    In low-income settings, vaccination campaigns supplement routine immunization but often fail to achieve coverage goals due to uncertainty about target population size and distribution. Accurate, updated estimates of target populations are rare but critical; short-term fluctuations can greatly impact population size and susceptibility. We use satellite imagery to quantify population fluctuations and the coverage achieved by a measles outbreak response vaccination campaign in urban Niger and compare campaign estimates to measurements from a post-campaign survey. Vaccine coverage was overestimated because the campaign underestimated resident numbers and seasonal migration further increased the target population. We combine satellite-derived measurements of fluctuations in population distribution with high-resolution measles case reports to develop a dynamic model that illustrates the potential improvement in vaccination campaign coverage if planners account for predictable population fluctuations. Satellite imagery can improve retrospective estimates of vaccination campaign impact and future campaign planning by synchronizing interventions with predictable population fluxes. PMID:27703191

  15. Comprehensive interpretation of thermal dileptons measured at the CERN super proton synchrotron.

    PubMed

    van Hees, Hendrik; Rapp, Ralf

    2006-09-08

    Employing thermal dilepton rates based on a medium-modified electromagnetic correlation function, we show that recent dimuon spectra of the NA60 Collaboration in central In-In collisions at the CERN SPS can be understood in terms of radiation from a hot and dense hadronic medium. Previously calculated in-medium rho-meson spectral functions provide an accurate description of the data up to dimuon invariant masses of about M ≈ 0.9 GeV, with good sensitivity to the predicted rho-meson line shape, identifying baryon-induced modifications as the prevalent ones. A reliable evaluation of this contribution enables the study of further medium effects: at masses M > 0.9 GeV, 4-pion-type annihilation accounts for the experimentally observed excess (possibly augmented by effects of "chiral mixing"), while predictions for thermal emission from in-medium omega and phi mesons may be tested in the future.

  16. Testing the skill of numerical hydraulic modeling to simulate spatiotemporal flooding patterns in the Logone floodplain, Cameroon

    NASA Astrophysics Data System (ADS)

    Fernández, Alfonso; Najafi, Mohammad Reza; Durand, Michael; Mark, Bryan G.; Moritz, Mark; Jung, Hahn Chul; Neal, Jeffrey; Shastry, Apoorva; Laborde, Sarah; Phang, Sui Chian; Hamilton, Ian M.; Xiao, Ningchuan

    2016-08-01

    Recent innovations in hydraulic modeling have enabled global simulation of rivers, including simulation of their coupled wetlands and floodplains. Accurate simulations of floodplains using these approaches may imply tremendous advances in global hydrologic studies and in biogeochemical cycling. One such innovation is to explicitly treat sub-grid channels within two-dimensional models, given only remotely sensed data in areas with limited data availability. However, predicting inundated area in floodplains using a sub-grid model has not been rigorously validated. In this study, we applied the LISFLOOD-FP hydraulic model using a sub-grid channel parameterization to simulate inundation dynamics on the Logone River floodplain in northern Cameroon from 2001 to 2007. Our goal was to determine whether floodplain dynamics could be simulated with sufficient accuracy to understand human and natural contributions to current and future inundation patterns. Model inputs in this data-sparse region include in situ river discharge, satellite-derived rainfall, and Shuttle Radar Topography Mission (SRTM) floodplain elevation. We found that the model accurately simulated total floodplain inundation, with a Pearson correlation coefficient greater than 0.9 and an RMSE of less than 700 km2, compared to peak inundation greater than 6000 km2. Predicted discharge downstream of the floodplain matched measurements (Nash-Sutcliffe efficiency of 0.81) and indicated that net flow from the channel to the floodplain was modeled accurately. However, the spatial pattern of inundation was not well simulated, apparently due to uncertainties in SRTM elevations. We evaluated model results at 250, 500 and 1000-m spatial resolutions, and found that the results are insensitive to spatial resolution. We also compared the model output against results from a run of LISFLOOD-FP in which the sub-grid channel parameterization was disabled, finding that the sub-grid parameterization simulated more realistic dynamics. These results suggest that analysis of global inundation is feasible using a sub-grid model, but that spatial patterns at sub-kilometer resolutions still need to be adequately predicted.
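The skill score quoted above for the downstream discharge, the Nash-Sutcliffe efficiency, is straightforward to compute: 1 indicates a perfect match and 0 means the model is no better than always predicting the observed mean. A minimal sketch:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - (residual variance) /
    (variance of observations about their mean)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    spread = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / spread
```

Negative values are possible and indicate a model worse than the observed mean.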

  17. A rapid estimation of tsunami run-up based on finite fault models

    NASA Astrophysics Data System (ADS)

    Campos, J.; Fuentes, M. A.; Hayes, G. P.; Barrientos, S. E.; Riquelme, S.

    2014-12-01

    Many efforts have been made to estimate the maximum run-up height of tsunamis associated with large earthquakes. This is a difficult task because of the time it takes to construct a tsunami model using real-time data from the source. It is possible to construct a database of potential seismic sources and their corresponding tsunamis a priori. However, such models are generally based on uniform slip distributions and thus oversimplify our knowledge of the earthquake source. Instead, we can use finite fault models of earthquakes to give a more accurate prediction of the tsunami run-up. Here we show how to accurately predict tsunami run-up from any seismic source model using an analytic solution found by Fuentes et al. (2013), calculated especially for zones with a very well defined strike, e.g., Chile, Japan, and Alaska. The main idea of this work is to produce a tool for emergency response, trading off accuracy for speed. Our solutions for three large earthquakes are promising. Here we compute models of the run-up for the 2010 Mw 8.8 Maule earthquake, the 2011 Mw 9.0 Tohoku earthquake, and the recent 2014 Mw 8.2 Iquique earthquake. Our maximum run-up predictions are consistent with measurements made inland after each event, with peaks of 15 to 20 m for Maule, 40 m for Tohoku, and 2.1 m for Iquique. Considering recent advances in the analysis of real-time GPS data and the ability to rapidly resolve the finiteness of a large earthquake close to existing GPS networks, it will be possible in the near future to perform these calculations within the first five minutes after the occurrence of any such event. Such calculations will thus provide more accurate run-up information than is otherwise available from existing uniform-slip seismic source databases.

  18. A viscoelastic model for the prediction of transcranial ultrasound propagation: application for the estimation of shear acoustic properties in the human skull

    NASA Astrophysics Data System (ADS)

    Pichardo, Samuel; Moreno-Hernández, Carlos; Drainville, Robert Andrew; Sin, Vivian; Curiel, Laura; Hynynen, Kullervo

    2017-09-01

    A better understanding of ultrasound transmission through the human skull is fundamental to developing optimal imaging and therapeutic applications. In this study, we present global attenuation values and functions that correlate apparent density, calculated from computed tomography scans, to shear speed of sound. For this purpose, we used a model for sound propagation based on the viscoelastic wave equation (VWE), assuming isotropic conditions. The model was validated using a series of measurements with plates of different plastic materials and angles of incidence of 0°, 15° and 50°. The optimal functions for transcranial ultrasound propagation were established using the VWE, scan measurements of transcranial propagation with an angle of incidence of 40°, and a genetic optimization algorithm. Ten locations over three skulls were used for ultrasound frequencies of 270 kHz and 836 kHz. Results with plastic materials demonstrated that the viscoelastic model predicted both longitudinal and shear propagation with an average (±s.d.) delay error of 9(±7)% of the wavelength and an error of 6.7(±5)% in the estimation of transmitted power. Using the new optimal functions of speed of sound and global attenuation for the human skull, the proposed model predicted transcranial ultrasound transmission at 270 kHz with an expected delay error of 5(±2.7)% of the wavelength. The model predicted sound propagation accurately regardless of whether shear or longitudinal transmission dominated. At 836 kHz, the model's predictions had an average delay error of 17(±16)% of the wavelength. The results indicate the importance of voxel-level information for better understanding ultrasound transmission through the skull. These results and the new model will be valuable tools for the future development of transcranial applications of ultrasound therapy and imaging.

  19. Using Functional Data Analysis Models to Estimate Future Time Trends in Age-Specific Breast Cancer Mortality for the United States and England–Wales

    PubMed Central

    Erbas, Bircan; Akram, Muhammed; Gertig, Dorota M; English, Dallas; Hopper, John L.; Kavanagh, Anne M; Hyndman, Rob

    2010-01-01

    Background Mortality/incidence predictions are used for allocating public health resources and should accurately reflect age-related changes through time. We present a new forecasting model for estimating future trends in age-related breast cancer mortality for the United States and England–Wales. Methods We used functional data analysis techniques both to model breast cancer mortality-age relationships in the United States from 1950 through 2001 and England–Wales from 1950 through 2003 and to estimate 20-year predictions using a new forecasting method. Results In the United States, trends for women aged 45 to 54 years have continued to decline since 1980. In contrast, trends in women aged 60 to 84 years increased in the 1980s and declined in the 1990s. For England–Wales, trends for women aged 45 to 74 years slightly increased before 1980, but declined thereafter. The greatest age-related changes for both regions were during the 1990s. For both the United States and England–Wales, trends are expected to decline and then stabilize, with the greatest decline in women aged 60 to 70 years. Forecasts suggest relatively stable trends for women older than 75 years. Conclusions Prediction of age-related changes in mortality/incidence can be used for planning and targeting programs for specific age groups. Currently, these models are being extended to incorporate other variables that may influence age-related changes in mortality/incidence trends. In their current form, these models will be most useful for modeling and projecting future trends of diseases for which there has been very little advancement in treatment and minimal cohort effects (e.g., lethal cancers). PMID:20139657

  20. The prediction of drug metabolism, tissue distribution, and bioavailability of 50 structurally diverse compounds in rat using mechanism-based absorption, distribution, and metabolism prediction tools.

    PubMed

    De Buck, Stefan S; Sinha, Vikash K; Fenu, Luca A; Gilissen, Ron A; Mackie, Claire E; Nijsen, Marjoleen J

    2007-04-01

    The aim of this study was to assess a physiologically based modeling approach for predicting drug metabolism, tissue distribution, and bioavailability in rat for a structurally diverse set of neutral and moderate-to-strong basic compounds (n = 50). Hepatic blood clearance (CL(h)) was projected using microsomal data and shown to be well predicted, irrespective of the type of hepatic extraction model (80% within 2-fold). Best predictions of CL(h) were obtained disregarding both plasma and microsomal protein binding, whereas strong bias was seen using either blood binding only or both plasma and microsomal protein binding. Two mechanistic tissue composition-based equations were evaluated for predicting volume of distribution (V(dss)) and tissue-to-plasma partitioning (P(tp)). A first approach, which accounted for ionic interactions with acidic phospholipids, resulted in accurate predictions of V(dss) (80% within 2-fold). In contrast, a second approach, which disregarded ionic interactions, was a poor predictor of V(dss) (60% within 2-fold). The first approach also yielded accurate predictions of P(tp) in muscle, heart, and kidney (80% within 3-fold), whereas in lung, liver, and brain, predictions ranged from 47% to 62% within 3-fold. Using the second approach, P(tp) prediction accuracy in muscle, heart, and kidney was on average 70% within 3-fold, and ranged from 24% to 54% in all other tissues. Combining all methods for predicting V(dss) and CL(h) resulted in accurate predictions of the in vivo half-life (70% within 2-fold). Oral bioavailability was well predicted using CL(h) data and GastroPlus software (80% within 2-fold). These results illustrate that physiologically based prediction tools can provide accurate predictions of rat pharmacokinetics.
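The "X% within n-fold" accuracy criterion used throughout the abstract above can be made concrete: a prediction counts as within n-fold of the observed value when the ratio predicted/observed lies between 1/n and n. A minimal sketch (function name is illustrative):

```python
import numpy as np

def fraction_within_fold(predicted, observed, fold=2.0):
    """Fraction of predictions whose ratio to the observed value lies
    within [1/fold, fold], the usual 'within n-fold' PK criterion."""
    ratio = np.asarray(predicted, dtype=float) / np.asarray(observed, dtype=float)
    within = (ratio >= 1.0 / fold) & (ratio <= fold)
    return float(np.mean(within))

# Example: one prediction 4x too high fails the 2-fold criterion.
score = fraction_within_fold([1.0, 4.0, 1.5], [1.0, 1.0, 1.0], fold=2.0)
```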

  1. Spatial analysis of plague in California: niche modeling predictions of the current distribution and potential response to climate change

    PubMed Central

    Holt, Ashley C; Salkeld, Daniel J; Fritz, Curtis L; Tucker, James R; Gong, Peng

    2009-01-01

    Background Plague, caused by the bacterium Yersinia pestis, is a public and wildlife health concern in California and the western United States. This study explores the spatial characteristics of positive plague samples in California and tests Maxent, a machine-learning method that can be used to develop niche-based models from presence-only data, for mapping the potential distribution of plague foci. Maxent models were constructed using geocoded seroprevalence data from surveillance of California ground squirrels (Spermophilus beecheyi) as case points and Worldclim bioclimatic data as predictor variables, and compared and validated using area under the receiver operating characteristic curve (AUC) statistics. Additionally, model results were compared to locations of positive and negative coyote (Canis latrans) samples, in order to determine the correlation between Maxent model predictions and areas of plague risk as determined via wild carnivore surveillance. Results Models of plague activity in California ground squirrels, based on recent climate conditions, accurately identified case locations (AUC of 0.913 to 0.948) and were significantly correlated with coyote samples. The final models were used to identify potential plague risk areas based on an ensemble of six future climate scenarios. These models suggest that by 2050, climate conditions may reduce plague risk in the southern parts of California and increase risk along the northern coast and Sierras. Conclusion Because different modeling approaches can yield substantially different results, care should be taken when interpreting future model predictions. Nonetheless, niche modeling can be a useful tool for exploring and mapping the potential response of plague activity to climate change. The final models in this study were used to identify potential plague risk areas based on an ensemble of six future climate scenarios, which can help public health managers decide where to allocate surveillance resources. In addition, Maxent model results were significantly correlated with coyote samples, indicating that carnivore surveillance programs will continue to be important for tracking the response of plague to future climate conditions. PMID:19558717
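The AUC statistic reported above (0.913 to 0.948) has a simple probabilistic reading: it is the probability that a randomly chosen positive site receives a higher model score than a randomly chosen negative site. A minimal sketch of that computation via the Mann-Whitney formulation (not the tool used in the study, just an illustration of the metric):

```python
def auc_score(pos_scores, neg_scores):
    """AUC as the probability that a random positive outscores a random
    negative; ties count one half (Mann-Whitney U / (n_pos * n_neg))."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC of 0.5 corresponds to a model that ranks positives and negatives at random; values above roughly 0.9, as here, indicate strong discrimination.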

  2. When the Sun's Away, N2O5 Comes Out to Play: An Updated Analysis of Ambient N2O5 Heterogeneous Chemistry

    NASA Astrophysics Data System (ADS)

    McDuffie, E. E.; Brown, S. S.

    2017-12-01

    The heterogeneous chemistry of N2O5 impacts the budget of tropospheric oxidants, which directly controls air quality at Earth's surface. The reaction between gas-phase N2O5 and aerosol particles occurs largely at night, and is therefore more important during the less-intensively-studied winter season. Though N2O5-aerosol interactions are vital for the accurate understanding and simulation of tropospheric chemistry and air quality, many uncertainties persist in our understanding of how various environmental factors influence the reaction rate and probability. Quantitative and accurate evaluation of these factors directly improves the predictive capabilities of atmospheric models, used to inform mitigation strategies for wintertime air pollution. In an update to last year's presentation, The Wintertime Fate of N2O5: Observations and Box Model Analysis for the 2015 WINTER Aircraft Campaign, this presentation will focus on recent field results regarding new information about N2O5 heterogeneous chemistry and future research directions.

  3. The 7th lung cancer TNM classification and staging system: Review of the changes and implications.

    PubMed

    Mirsadraee, Saeed; Oswal, Dilip; Alizadeh, Yalda; Caulo, Andrea; van Beek, Edwin

    2012-04-28

    Lung cancer is the most common cause of death from cancer in males, accounting for more than 1.4 million deaths in 2008. It is a growing concern in China, Asia and Africa as well. Accurate staging of the disease is an important part of management, as it provides an estimate of the patient's prognosis and identifies treatment strategies. It also helps to build a database for future staging projects. A major revision of lung cancer staging has been announced, with effect from January 2010. The new classification is based on a larger surgical and non-surgical cohort of patients, and is thus more accurate in terms of outcome prediction than the previous classification. Several original papers on this new classification give a comprehensive description of the methodology, the changes in staging and the statistical analysis. This overview is a simplified description of the changes in the new classification and their potential impact on patients' treatment and prognosis.

  4. Comparative investigations of manual action representations: evidence that chimpanzees represent the costs of potential future actions involving tools.

    PubMed

    Frey, Scott H; Povinelli, Daniel J

    2012-01-12

    The ability to adjust one's ongoing actions in the anticipation of forthcoming task demands is considered as strong evidence for the existence of internal action representations. Studies of action selection in tool use reveal that the behaviours that we choose in the present moment differ depending on what we intend to do next. Further, they point to a specialized role for mechanisms within the human cerebellum and dominant left cerebral hemisphere in representing the likely sensory costs of intended future actions. Recently, the question of whether similar mechanisms exist in other primates has received growing, but still limited, attention. Here, we present data that bear on this issue from a species that is a natural user of tools, our nearest living relative, the chimpanzee. In experiment 1, a subset of chimpanzees showed a non-significant tendency for their grip preferences to be affected by anticipation of the demands associated with bringing a tool's baited end to their mouths. In experiment 2, chimpanzees' initial grip preferences were consistently affected by anticipation of the forthcoming movements in a task that involves using a tool to extract a food reward. The partial discrepancy between the results of these two studies is attributed to the ability to accurately represent differences between the motor costs associated with executing the two response alternatives available within each task. These findings suggest that chimpanzees are capable of accurately representing the costs of intended future actions, and using those predictions to select movements in the present even in the context of externally directed tool use.

  5. Comparative investigations of manual action representations: evidence that chimpanzees represent the costs of potential future actions involving tools

    PubMed Central

    Frey, Scott H.; Povinelli, Daniel J.

    2012-01-01

    The ability to adjust one's ongoing actions in the anticipation of forthcoming task demands is considered as strong evidence for the existence of internal action representations. Studies of action selection in tool use reveal that the behaviours that we choose in the present moment differ depending on what we intend to do next. Further, they point to a specialized role for mechanisms within the human cerebellum and dominant left cerebral hemisphere in representing the likely sensory costs of intended future actions. Recently, the question of whether similar mechanisms exist in other primates has received growing, but still limited, attention. Here, we present data that bear on this issue from a species that is a natural user of tools, our nearest living relative, the chimpanzee. In experiment 1, a subset of chimpanzees showed a non-significant tendency for their grip preferences to be affected by anticipation of the demands associated with bringing a tool's baited end to their mouths. In experiment 2, chimpanzees' initial grip preferences were consistently affected by anticipation of the forthcoming movements in a task that involves using a tool to extract a food reward. The partial discrepancy between the results of these two studies is attributed to the ability to accurately represent differences between the motor costs associated with executing the two response alternatives available within each task. These findings suggest that chimpanzees are capable of accurately representing the costs of intended future actions, and using those predictions to select movements in the present even in the context of externally directed tool use. PMID:22106426

  6. The Optimal Screening for Prediction of Referral and Outcome (OSPRO) in patients with musculoskeletal pain conditions: a longitudinal validation cohort from the USA

    PubMed Central

    George, Steven Z; Beneciuk, Jason M; Lentz, Trevor A; Wu, Samuel S

    2017-01-01

    Purpose There is an increased need for determining which patients with musculoskeletal pain benefit from additional diagnostic testing or psychologically informed intervention. The Optimal Screening for Prediction of Referral and Outcome (OSPRO) cohort studies were designed to develop and validate standard assessment tools for review of systems and yellow flags. This cohort profile paper provides a description of and future plans for the validation cohort. Participants Patients (n=440) with primary complaint of spine, shoulder or knee pain were recruited into the OSPRO validation cohort via a national Orthopaedic Physical Therapy-Investigative Network. Patients were followed up at 4 weeks, 6 months and 12 months for pain, functional status and quality of life outcomes. Healthcare utilisation outcomes were also collected at 6 and 12 months. Findings to date There are no longitudinal findings reported to date from the ongoing OSPRO validation cohort. The previously completed cross-sectional OSPRO development cohort yielded two assessment tools that were investigated in the validation cohort. Future plans Follow-up data collection was completed in January 2017. Primary analyses will investigate how accurately the OSPRO review of systems and yellow flag tools predict 12-month pain, functional status, quality of life and healthcare utilisation outcomes. Planned secondary analyses include prediction of pain interference and/or development of chronic pain, investigation of treatment expectation on patient outcomes and analysis of patient satisfaction following an episode of physical therapy. Trial registration number The OSPRO validation cohort was not registered. PMID:28600371

  7. Energy prediction equations are inadequate for obese Hispanic youth.

    PubMed

    Klein, Catherine J; Villavicencio, Stephan A; Schweitzer, Amy; Bethepu, Joel S; Hoffman, Heather J; Mirza, Nazrat M

    2011-08-01

    Assessing energy requirements is a fundamental activity in clinical dietetics practice. A study was designed to determine whether published linear regression equations were accurate for predicting resting energy expenditure (REE) in fasted Hispanic children with obesity (aged 7 to 15 years). REE was measured using indirect calorimetry; body composition was estimated with whole-body air displacement plethysmography. REE was predicted using four equations: Institute of Medicine for healthy-weight children (IOM-HW), IOM for overweight and obese children (IOM-OS), Harris-Benedict, and Schofield. Accuracy of the prediction was calculated as the absolute value of the difference between the measured and predicted REE divided by the measured REE, expressed as a percentage. Predicted values within 85% to 115% of measured were defined as accurate. Participants (n=58; 53% boys) were mean age 11.8±2.1 years, had 43.5%±5.1% body fat, and had a body mass index of 31.5±5.8 (98.6±1.1 body mass index percentile). Measured REE was 2,339±680 kcal/day; predicted REE was 1,815±401 kcal/day (IOM-HW), 1,794±311 kcal/day (IOM-OS), 1,151±300 kcal/day (Harris-Benedict), and 1,771±316 kcal/day (Schofield). Measured REE adjusted for body weight averaged 32.0±8.4 kcal/kg/day (95% confidence interval 29.8 to 34.2). Published equations predicted REE within 15% accuracy for only 36% to 40% of the 58 participants, except for Harris-Benedict, which did not achieve accuracy for any participant. The most frequently accurate values were obtained using IOM-HW, which predicted REE within 15% accuracy for 55% (17/31) of boys. Published equations did not accurately predict REE for youth in the study sample. Further studies are warranted to formulate accurate energy prediction equations for this population. Copyright © 2011 American Dietetic Association. Published by Elsevier Inc. All rights reserved.
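The study's accuracy criterion is simple enough to state in code. A minimal sketch (a hypothetical helper, not the study's analysis scripts), assuming measured and predicted REE in kcal/day:

```python
def ree_prediction_accurate(measured_kcal, predicted_kcal):
    """The study's criterion: predicted REE within 85% to 115% of measured REE,
    i.e. an absolute relative error of at most 15%."""
    error_fraction = abs(measured_kcal - predicted_kcal) / measured_kcal
    return error_fraction <= 0.15
```

Applied to the reported group means (2,339 kcal/day measured vs. 1,815 kcal/day predicted by IOM-HW), the relative error is roughly 22%, which falls outside the 15% band.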

  8. GMAT versus Alternatives: Predictive Validity Evidence from Central Europe and the Middle East

    ERIC Educational Resources Information Center

    Koys, Daniel

    2010-01-01

    The author found that the GPA at the end of the MBA program is most accurately predicted by the Graduate Management Admission Test (GMAT) and the Test of English as a Foreign Language (TOEFL). MBA GPA is also predicted, though less accurately, by the Scholastic Level Exam, a mathematics test, undergraduate GPA, and previous career progression. If…

  9. Prediction of chemo-response in serous ovarian cancer.

    PubMed

    Gonzalez Bosquet, Jesus; Newtson, Andreea M; Chung, Rebecca K; Thiel, Kristina W; Ginader, Timothy; Goodheart, Michael J; Leslie, Kimberly K; Smith, Brian J

    2016-10-19

    Nearly one-third of serous ovarian cancer (OVCA) patients will not respond to initial treatment with surgery and chemotherapy and die within one year of diagnosis. If patients who are unlikely to respond to current standard therapy can be identified up front, enhanced tumor analyses and treatment regimens could potentially be offered. Using the Cancer Genome Atlas (TCGA) serous OVCA database, we previously identified a robust molecular signature of 422 genes associated with chemo-response. Our objective was to test whether this signature is an accurate and sensitive predictor of chemo-response in serous OVCA. We first constructed models to predict chemo-response using our previously described 422-gene signature that was associated with response to treatment in serous OVCA. Performance of all prediction models was measured with areas under the curve (AUCs, a measure of the model's accuracy) and their respective confidence intervals (CIs). To optimize the prediction process, we determined which elements of the signature contributed most to chemo-response prediction. All prediction models were replicated and validated using six publicly available independent gene expression datasets. The 422-gene prediction models predicted chemo-response with AUCs of approximately 70%. Optimization of the prediction models identified the 34 most important genes in chemo-response prediction. These 34-gene models had improved performance, with AUCs approaching 80%. Both the 422-gene and 34-gene prediction models were replicated and validated in six independent datasets. These prediction models serve as the foundation for the future development and implementation of a diagnostic tool to predict response to chemotherapy for serous OVCA patients.
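The AUC figures above (roughly 70% for the 422-gene models, approaching 80% for the 34-gene models) can be read as the probability that a randomly chosen responder is ranked above a randomly chosen non-responder by the model. A minimal, dependency-free sketch of that computation (illustrative only; the study's actual modeling pipeline is not shown in the abstract):

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney identity: the fraction of (positive, negative)
    pairs in which the positive case receives the higher score; ties count 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ranking yields an AUC of 1.0, and a completely uninformative one yields 0.5, which is why the move from ~0.70 to ~0.80 is a meaningful gain.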

  10. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  11. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    1999-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25% of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust-drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  12. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  13. Control surface hinge moment prediction using computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Simpson, Christopher David

    The following research determines the feasibility of predicting control surface hinge moments using various computational methods. A detailed analysis is conducted using a 2D GA(W)-1 airfoil with a 20% plain flap. Simple hinge moment prediction methods are tested, including empirical Datcom relations and XFOIL. Steady-state and time-accurate turbulent, viscous, Navier-Stokes solutions are computed using Fun3D. Hinge moment coefficients are computed. Mesh construction techniques are discussed. An adjoint-based mesh adaptation case is also evaluated. An NACA 0012 45-degree swept horizontal stabilizer with a 25% elevator is also evaluated using Fun3D. Results are compared with experimental wind-tunnel data obtained from references. Finally, the costs of various solution methods are estimated. Results indicate that while a steady-state Navier-Stokes solution can accurately predict control surface hinge moments for small angles of attack and deflection angles, a time-accurate solution is necessary to accurately predict hinge moments in the presence of flow separation. The ability to capture the unsteady vortex shedding behavior present in moderate to large control surface deflections is found to be critical to hinge moment prediction accuracy. Adjoint-based mesh adaptation is shown to give hinge moment predictions similar to a globally-refined mesh for a steady-state 2D simulation.

  14. Efficient Third Harmonic Generation for Wind Lidar Applications

    NASA Technical Reports Server (NTRS)

    Mordaunt, David W.; Cheung, Eric C.; Ho, James G.; Palese, Stephen P.

    1998-01-01

    The characterization of atmospheric winds on a global basis is a key parameter required for accurate weather prediction. The use of a space based lidar system for remote measurement of wind speed would provide detailed and highly accurate data for future weather prediction models. This paper reports the demonstration of efficient third harmonic conversion of a 1 micrometer laser to provide an ultraviolet (UV) source suitable for a wind lidar system based on atmospheric molecular scattering. Although infrared based lidars using aerosol scattering have been demonstrated to provide accurate wind measurement, a UV based system using molecular or Rayleigh scattering will provide accurate global wind measurements, even in those areas of the atmosphere where the aerosol density is too low to yield good infrared backscatter signals. The overall objective of this work is to demonstrate the maturity of the laser technology and its suitability for a near term flight aboard the space shuttle. The laser source is based on diode-pumped solid-state laser technology which has been extensively demonstrated at TRW in a variety of programs and internal development efforts. The pump laser used for the third harmonic demonstration is a breadboard system, designated the Laser for Risk Reduction Experiments (LARRE), which has been operating regularly for over 5 years. The laser technology has been further refined in an engineering model designated as the Compact Advanced Pulsed Solid-State Laser (CAPSSL), in which the laser head was packaged into an 8 x 8 x 18 inch volume with a weight of approximately 61 pounds. The CAPSSL system is a ruggedized configuration suitable for typical military applications. The LARRE and CAPSSL systems are based on Nd:YAG with an output wavelength of 1064 nm. The current work proves the viability of converting the Nd:YAG fundamental to the third harmonic wavelength at 355 nm for use in a direct detection wind lidar based on atmospheric Rayleigh scattering.

  15. Accuracy of three-dimensional facial soft tissue simulation in post-traumatic zygoma reconstruction.

    PubMed

    Li, P; Zhou, Z W; Ren, J Y; Zhang, Y; Tian, W D; Tang, W

    2016-12-01

    The aim of this study was to evaluate the accuracy of novel software, CMF-preCADS, for the prediction of soft tissue changes following repositioning surgery for zygomatic fractures. Twenty patients who had sustained an isolated zygomatic fracture accompanied by facial deformity and who were treated with repositioning surgery participated in this study. Cone beam computed tomography (CBCT) scans and three-dimensional (3D) stereophotographs were acquired preoperatively and postoperatively. The 3D skeletal model from the preoperative CBCT data was matched with the postoperative one, and the fractured zygomatic fragments were segmented and aligned to the postoperative position for prediction. Then, the predicted model was matched with the postoperative 3D stereophotograph for quantification of the simulation error. The mean absolute error in the zygomatic soft tissue region between the predicted model and the real one was 1.42 ± 1.56 mm across all cases. The accuracy of the prediction (mean absolute error ≤ 2 mm) was 87%. In the subjective assessment, the majority of evaluators considered the predicted model and the postoperative model to be 'very similar'. CMF-preCADS can provide a realistic, accurate prediction of the facial soft tissue appearance after repositioning surgery for zygomatic fractures. The reliability of the software for other types of maxillofacial fracture repositioning surgery should be validated in the future. Copyright © 2016. Published by Elsevier Ltd.
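The two headline numbers above, a group mean error of 1.42 mm and an 87% accuracy rate, follow directly from per-patient error summaries. A hypothetical sketch of that tally (not the CMF-preCADS code), assuming each patient's mean absolute surface error is already computed in millimetres:

```python
def prediction_accuracy(case_errors_mm, tolerance_mm=2.0):
    """Given each patient's mean absolute surface error (mm), return the group
    mean error and the fraction of patients within the tolerance band."""
    mean_error = sum(case_errors_mm) / len(case_errors_mm)
    fraction_ok = sum(e <= tolerance_mm for e in case_errors_mm) / len(case_errors_mm)
    return mean_error, fraction_ok
```

With the study's 2 mm tolerance, an 87% accuracy means roughly 17 or 18 of the 20 patients fell within the band.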

  16. Performance and robustness of penalized and unpenalized methods for genetic prediction of complex human disease.

    PubMed

    Abraham, Gad; Kowalczyk, Adam; Zobel, Justin; Inouye, Michael

    2013-02-01

    A central goal of medical genetics is to accurately predict complex disease from genotypes. Here, we present a comprehensive analysis of simulated and real data using lasso and elastic-net penalized support-vector machine models, a mixed-effects linear model, a polygenic score, and unpenalized logistic regression. In simulation, the sparse penalized models achieved lower false-positive rates and higher precision than the other methods for detecting causal SNPs. The common practice of prefiltering SNP lists for subsequent penalized modeling was examined and shown to substantially reduce the ability to recover the causal SNPs. Using genome-wide SNP profiles across eight complex diseases within cross-validation, lasso and elastic-net models achieved substantially better predictive ability in celiac disease, type 1 diabetes, and Crohn's disease, and equivalent predictive ability in the rest, with the results in celiac disease strongly replicating between independent datasets. We investigated the effect of linkage disequilibrium on the predictive models, showing that the penalized methods leverage this information to their advantage, compared with methods that assume SNP independence. Our findings show that sparse penalized approaches are robust across different disease architectures, producing phenotype predictions and variance explained as good as or better than those of the other methods. This has fundamental ramifications for the selection and future development of methods to genetically predict human disease. © 2012 WILEY PERIODICALS, INC.
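Of the baselines the abstract compares against, the polygenic score is the simplest to sketch: a weighted sum of risk-allele counts across SNPs. A minimal illustration (hypothetical effect sizes; not the study's code), with genotypes in the standard 0/1/2 additive coding:

```python
def polygenic_score(genotypes, effect_sizes):
    """Classical additive polygenic score: genotypes coded as 0/1/2 copies of
    the risk allele at each SNP, weighted by per-SNP effect sizes
    (e.g. GWAS log odds ratios)."""
    return sum(g * w for g, w in zip(genotypes, effect_sizes))
```

Unlike the penalized models in the study, this score fixes its weights in advance and ignores linkage disequilibrium between SNPs, which is one reason the abstract finds penalized methods competitive or better.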

  17. The Optimization of Trained and Untrained Image Classification Algorithms for Use on Large Spatial Datasets

    NASA Technical Reports Server (NTRS)

    Kocurek, Michael J.

    2005-01-01

    The HARVIST project seeks to automatically provide an accurate, interactive interface to predict crop yield over the entire United States. In order to accomplish this goal, large images must be quickly and automatically classified by crop type. Current trained and untrained classification algorithms, while accurate, are highly inefficient when operating on large datasets. This project sought to develop new variants of two standard trained and untrained classification algorithms that are optimized to take advantage of the spatial nature of image data. The first algorithm, harvist-cluster, utilizes divide-and-conquer techniques to precluster an image in the hopes of increasing overall clustering speed. The second algorithm, harvistSVM, utilizes support vector machines (SVMs), a type of trained classifier. It seeks to increase classification speed by applying a "meta-SVM" to a quick (but inaccurate) SVM to approximate a slower, yet more accurate, SVM. Speedups were achieved by tuning the algorithm to quickly identify when the quick SVM was incorrect, and then reclassifying low-confidence pixels as necessary. Comparing the classification speeds of both algorithms to known baselines showed a slight speedup for large values of k (the number of clusters) for harvist-cluster, and a significant speedup for harvistSVM. Future work aims to automate the parameter tuning process required for harvistSVM, and further improve classification accuracy and speed. Additionally, this research will move documents created in Canvas into ArcGIS. The launch of the Mars Reconnaissance Orbiter (MRO) will provide a wealth of image data such as global maps of Martian weather and high resolution global images of Mars. The ability to store this new data in a georeferenced format will support future Mars missions by providing data for landing site selection and the search for water on Mars.
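The meta-SVM idea described above, trusting a fast classifier only where it is confident and falling back to a slower, more accurate one otherwise, can be sketched generically (hypothetical interfaces, not the harvistSVM code). Each model is assumed to return a (label, confidence) pair per pixel:

```python
def cascade_classify(pixels, fast_model, slow_model, confidence_threshold=0.8):
    """Two-stage cascade: accept the fast model's label when it is confident,
    otherwise reclassify the pixel with the slower, more accurate model."""
    labels = []
    for pixel in pixels:
        label, confidence = fast_model(pixel)
        if confidence < confidence_threshold:
            label, _ = slow_model(pixel)  # fall back only on hard pixels
        labels.append(label)
    return labels
```

The speedup comes from the slow model seeing only the low-confidence fraction of pixels; tuning `confidence_threshold` trades accuracy against runtime, which matches the parameter-tuning burden the abstract flags as future work.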

  18. Predicting problems in school performance from preschool health, developmental and behavioural assessments.

    PubMed Central

    Cadman, D; Walter, S D; Chambers, L W; Ferguson, R; Szatmari, P; Johnson, N; McNamee, J

    1988-01-01

    To determine the accuracy of various predictors of school problems, we conducted a 3-year prospective study of 1999 children who began school in the Niagara region of Ontario in 1980. During the year before school entry the parents gave a health, developmental and behavioural history during an interview with a community health nurse, and the children underwent vision and hearing screening tests and the Denver Developmental Screening Test (DDST). At the end of the 1980-81 school year the kindergarten teachers rated the children's learning problems. At the end of the 1982-83 school year the presence of school problems was ascertained, and the predictive accuracy of items from the preschool history and examination and of the kindergarten teachers' ratings was calculated. The health, developmental and behavioural history with or without the DDST was found to predict later school problems with acceptable accuracy. The kindergarten teachers' ratings gave slightly more accurate predictions. We conclude that in communities where prompt diagnostic evaluation and effective therapeutic or preventive help can be provided to children identified as being at high risk, health professionals may play a useful role in screening for future school problems. PMID:3383038

  19. Corrosion Prediction with Parallel Finite Element Modeling for Coupled Hygro-Chemo Transport into Concrete under Chloride-Rich Environment

    PubMed Central

    Na, Okpin; Cai, Xiao-Chuan; Xi, Yunping

    2017-01-01

    Prediction of chloride-induced corrosion is important because of its impact on the service life of concrete structures. To simulate the durability performance of concrete structures more realistically, complex scientific methods and more accurate material models are needed. A large number of fine meshes are also required to predict the corrosion initiation time robustly and to resolve the thin layer between the concrete surface and the reinforcement. The purpose of this study is to propose a more realistic physical model of coupled hygro-chemo transport and to implement the model with a parallel finite element algorithm. Furthermore, a microclimate model with environmental humidity and seasonal temperature is adopted. As a result, a prediction model for chloride diffusion under unsaturated conditions was developed with parallel algorithms and applied to an existing bridge to validate the model with multiple boundary conditions. As the number of processors increased, the computational time decreased until the processor count reached an optimum; beyond that point, the computational time increased because the communication time between processors grew. The framework of the present model can be extended in future work to simulate the ingress of multi-species de-icing salts into non-saturated concrete structures. PMID:28772714
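The reported scaling behaviour, runtime improving up to an optimal processor count and degrading beyond it, matches the classic compute-versus-communication trade-off in parallel solvers. A toy model of that U-shaped curve (illustrative constants, not values from the study):

```python
def wall_time(processors, compute_s=100.0, comm_per_proc_s=0.5):
    """Toy parallel-runtime model: compute work divides across processors
    while inter-processor communication cost grows with their number."""
    return compute_s / processors + comm_per_proc_s * processors

# The optimum sits where the two terms balance (near sqrt(compute/comm)).
optimal = min(range(1, 65), key=wall_time)
```

Past `optimal`, adding processors shrinks each one's share of the mesh but the growing communication term dominates, exactly the behaviour the abstract describes.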

  20. Renormalization group analysis of the 2000-2002 anti-bubble in the US S&P500 index: explanation of the hierarchy of five crashes and prediction

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    2003-12-01

    We propose a straightforward extension of our previously proposed log-periodic power-law model of the “anti-bubble” regime of the US stock market since the summer of 2000, in terms of the renormalization group framework used to model critical points. Using a previous work by Gluzman and Sornette (Phys. Rev. E 65 (2003) 036142) on the classification of the class of Weierstrass-like functions, we show that the five crashes that have occurred since August 2000 can be accurately modeled by this approach, in a fully consistent way with no additional parameters. Our theory suggests an overall consistent organization of the investors, who form a collective network that interacts to produce the pessimistic bearish “anti-bubble” regime, with intermittent accelerations of the positive feedback of pessimistic sentiment leading to these crashes. We develop retrospective predictions that confirm the existence of significant arbitrage opportunities for a trader using our model. Finally, we offer a prediction for the unknown future of the US S&P500 index extending over 2003 and 2004, which refines the previous prediction of Sornette and Zhou (Quant. Finance 2 (2002) 468).
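For orientation, the simplest member of the model family this work extends is the first-order log-periodic power law (the paper itself uses a richer Weierstrass-type generalization, not reproduced here). For an anti-bubble that begins at a critical time $t_c$, the log-price is fitted for $t > t_c$ as

$$\ln p(t) \approx A + B\,(t - t_c)^{\alpha}\left[1 + C\cos\!\big(\omega \ln(t - t_c) + \phi\big)\right],$$

with $B < 0$ for a bearish regime, $\omega$ the log-periodic angular frequency, and $\phi$ a phase; the cosine term produces the accelerating oscillations associated with the sequence of crashes.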
